WO2019188079A1 - Person transit time measurement system and person transit time measurement method - Google Patents

Person transit time measurement system and person transit time measurement method

Info

Publication number
WO2019188079A1
WO2019188079A1 (PCT Application No. PCT/JP2019/008995)
Authority
WO
WIPO (PCT)
Prior art keywords
person
image
time
feature amount
entrance
Prior art date
Application number
PCT/JP2019/008995
Other languages
French (fr)
Japanese (ja)
Inventor
小倉 慎矢
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気 filed Critical 株式会社日立国際電気
Priority to JP2020509778A (granted as JP7108022B2)
Publication of WO2019188079A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/04 - Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a person passage time measurement system that measures the passage time of a person from one point to another point.
  • for example, Patent Document 1 discloses summing the estimated processing time of each person, obtained by multiplying a reference time by a coefficient corresponding to the characteristics of each person queuing at an airport security checkpoint, to calculate the estimated total processing time of the people in the queue. Patent Document 2 discloses that, when the dwell time required for movement through a measurement area is acquired for each moving body based on its flow line within the area, any missing portion of a flow line is substituted with the dwell time of another person. Patent Document 3 discloses calculating the total length of each queue, based on a queue line generated for each queue, for a monitoring area containing one or more naturally forming or guided queues, and calculating the waiting time of a queue based on that total length.
  • the conventional techniques cannot accurately measure the passage time of a person from one point to another, and a technique for measuring a person's passage time with higher accuracy has been demanded.
  • in addition, accurate data on visitors cannot be obtained if measurement includes specific persons such as security guards or staff.
  • furthermore, in a scheme that measures passage time including time spent in a queue, when only a group's representative stands in the queue and the remaining members of the group join partway through, the passage time of the members who joined partway is measured as shorter than the passage time under the actual congestion conditions.
  • there is also the problem that the passage time of a person who stood in the queue, left it once, and then lined up again is measured as longer than the passage time under the actual congestion conditions.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide a technique capable of measuring the passage time of a person from one point to another with higher accuracy.
  • the person passage time measurement system is configured as follows. That is, a person passage time measurement system that measures the passage time required for a person to pass from a first point to a second point comprises: first calculating means for calculating a feature amount of a person included in a first image obtained by imaging the first point; second calculating means for calculating a feature amount of a person included in a second image obtained by imaging the second point; determination means for comparing the feature amount calculated from the first image with the feature amount calculated from the second image and determining whether the same person as the person included in the second image is included in the first image; and recording means for recording, when it is determined that the same person as the person included in the second image is included in the first image, the difference between the shooting time of the first image and the shooting time of the second image as the passage time of that person.
  • Such a configuration makes it possible to accurately measure the passage time required for the person to pass from the first point to the second point. Further, even when there are a plurality of persons, each person can be identified and the passing time can be individually measured. Furthermore, it is not necessary to track the movement of the person between the first point and the second point.
  • the shooting time of the first image and the shooting time of the second image need not be the exact instant at which each image was captured by the imaging device; any time that can be regarded as equivalent may be used, such as the time the image was transmitted from the imaging device, the time it arrived at the first or second calculating means, or the time at which the feature amount was extracted. In short, it is sufficient that the difference between the shooting times of the first and second images can be determined, and that the times are obtained under the same conditions for both images.
  • when the difference between the shooting time of the first image and the shooting time of the second image exceeds a first threshold, or is less than a second threshold smaller than the first threshold, it is preferably excluded from recording as a passage time. This makes it possible to exclude records of passage times that deviate significantly from the standard passage time, that is, to exclude from the survey persons whose behavior would add noise to the passage time measurement, such as those who cut into the queue or who left the queue and lined up again, thereby increasing the reliability of the survey results.
  • the system preferably further includes storage means that stores feature amounts of persons excluded from passage time measurement; the feature amount calculated from the first image is compared with the stored feature amounts, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the first image is excluded from passage time measurement. This makes it possible to exclude from the survey persons whose passage time does not need to be investigated, such as security guards and staff, increasing the reliability of the survey results.
  • a person passing time measurement system will be described with reference to the drawings.
  • a description will be given of a case in which the passage time required for a person to move from an entrance to an exit is measured in an environment that has an entrance and an exit, such as an airport security checkpoint or an event venue ticket office.
  • FIG. 1 shows a configuration example of a person passing time measuring system according to a first embodiment.
  • the person passage time measurement system of this example is configured such that an entrance-side imaging device 101A, an exit-side imaging device 101B, and a person passage time measurement device 102 are connected to a network 100 and can communicate with one another.
  • the network 100 is a communication means that interconnects the devices, such as a dedicated data communication network, an intranet, the Internet, or a wireless LAN.
  • the entrance-side imaging device 101A and the exit-side imaging device 101B are devices such as network cameras or surveillance cameras that digitize images captured with a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and output the converted image data over the network 100.
  • the person passage time measuring device 102 is a device that processes the image data output from the imaging devices 101A and 101B via the network and measures and records person passage times.
  • the device also has a function for extracting and recording person feature amounts.
  • the person passage time measuring apparatus 102 includes an entrance side image processing unit 121 and an exit side image processing unit 122.
  • the entrance-side image processing unit 121 performs image processing on the image data output from the entrance-side imaging device 101A. Detailed processing contents will be described later with reference to FIG.
  • the exit side image processing unit 122 performs image processing on the image data output from the exit side imaging device 101B. Detailed processing contents will be described later with reference to FIG.
  • FIG. 2 shows a schematic diagram of the measurement environment of the person passing time.
  • FIG. 2 assumes a security checkpoint at an airport or a ticket office at an event venue.
  • the gate 251 is a gate where people form a line, and is actually a security check gate or a ticket window.
  • a plurality of gates 251 may be provided, and the present invention can be applied to a case where the gate 251 is not provided.
  • an entrance-side imaging device 101A is installed so that a person passing through the entrance can be photographed from the front.
  • an exit-side imaging device 101B is installed so that a person passing through the exit can be photographed from the front.
  • Person 252A is a person whose passage time is to be measured.
  • in the measurement environment of this example, a person 252A photographed by the entrance-side imaging device 101A at the entrance of the area passes through the gate 251, at which many people are queued, and after some time appears in the exit-side imaging device 101B at the exit of the area.
  • in FIG. 2, the state in which the person 252A has moved to the exit side is indicated by reference numeral 252B.
  • FIG. 3 shows an example of the processing procedure of the entrance-side image processing unit 121 in the person passage time measuring apparatus 102 of the first embodiment.
  • the entrance-side image processing unit 121 first performs initialization processing (S301) for securing memory and initializing parameters.
  • next, entrance-side image reception processing (S302) is performed, in which image data is received from the entrance-side imaging device 101A and held in the memory of the person passage time measuring device 102.
  • when the entrance-side imaging device 101A compresses (encodes) the image before transmission, a process of decompressing (decoding) it after reception is also performed.
  • next, person area detection processing (S303) is performed to detect person areas in the entrance-side image, for example by searching the image for regions presumed to be faces.
  • next, person detection determination processing (S304) is performed to determine whether a person area was detected in the entrance-side image. If a person area was detected, the process proceeds to person feature amount extraction processing (S305); if not, it returns to entrance-side image reception processing (S302) to receive a new image.
  • in the person feature amount extraction processing (S305), a person feature amount is calculated as a numerical value for each person detected in the person area detection processing (S303). In this example, a face feature amount is extracted as the person feature amount.
  • a person feature / time recording process (S306) is performed in which both the person feature and the current time (entrance passage time) are stored in a memory area that can be used in common with the exit-side image processing unit 122.
  • next, end determination processing (S307) is performed to determine whether a termination condition is met. If so, termination processing such as releasing the allocated memory is performed; if not, the process returns to entrance-side image reception processing (S302) to receive a new entrance-side image.
  • a termination condition is, for example, the switch of the person passage time measuring device 102 being turned off by an operator.
  • FIG. 4 shows an example of the processing procedure of the exit-side image processing unit 122 in the person passage time measuring apparatus 102 of the first embodiment.
  • the exit-side image processing unit 122 first performs initialization processing (S401) for securing memory and initializing parameters.
  • next, exit-side image reception processing (S402) is performed, in which image data is received from the exit-side imaging device 101B and held in the memory of the person passage time measuring device 102.
  • when the exit-side imaging device 101B compresses (encodes) the image before transmission, a process of decompressing (decoding) it after reception is also performed.
  • next, person area detection processing (S403) is performed to detect person areas in the exit-side image, for example by searching the image for regions presumed to be faces.
  • next, person detection determination processing (S404) is performed to determine whether a person area was detected in the exit-side image. If a person area was detected, the process proceeds to person feature amount extraction processing (S405); if not, it returns to exit-side image reception processing (S402) to receive a new image.
  • in the person feature amount extraction processing (S405), a person feature amount is calculated as a numerical value for each person detected in the person area detection processing (S403). In this example, a face feature amount is extracted as the person feature amount.
  • next, same-person determination processing (S406) is performed, which compares the person feature amount calculated in the person feature amount extraction processing (S405) with the plurality of person feature amounts stored by the person feature amount/time recording processing (S306) of FIG. 3. That is, the person who has arrived at the exit is searched for among the persons who passed through the entrance by comparing the feature amounts obtained at the entrance with the feature amount obtained at the exit.
  • here, for each of the person feature amounts obtained at the entrance, the similarity with the feature amount obtained at the exit is calculated; the person with the highest similarity (the smallest feature amount difference) is determined to be the same person as the person arriving at the exit, and the process proceeds to time difference recording processing (S407).
  • however, even for the highest similarity, if that similarity is below a predetermined threshold, it is determined that no person matching the person arriving at the exit was found among those who passed through the entrance, and the process proceeds to end determination processing (S409).
  • in the time difference recording processing (S407), the difference between the entrance passage time stored by the person feature amount/time recording processing (S306) of FIG. 3 for the person determined to be the same person in the same-person determination processing (S406) and the current time (exit passage time) is calculated and recorded in a database or the like as the passage time required for that person to move from the entrance to the exit.
  • after the passage time has been calculated, the person feature amount and entrance passage time for that person are no longer needed, so person feature amount deletion processing (S408) deletes them. As a result, only the feature amounts and entrance passage times of persons still in the area between the entrance and the exit remain in memory.
  • next, end determination processing (S409) is performed to determine whether a termination condition is met. If so, termination processing such as releasing the allocated memory is performed; if not, the process returns to exit-side image reception processing (S402) to receive a new exit-side image.
  • a termination condition is, for example, the switch of the person passage time measuring device 102 being turned off by an operator.
  • as described above, in the person passage time measurement system of the first embodiment, the entrance-side image processing unit 121 performs processing for calculating person feature amounts from the entrance-side image (S302 to S305), and the exit-side image processing unit 122 performs processing for calculating person feature amounts from the exit-side image (S402 to S405), processing for comparing the feature amount calculated from the entrance-side image with the feature amount calculated from the exit-side image to determine whether the same person as the person included in the exit-side image is included in the entrance-side image (S406), and processing for recording, when the same person is determined to be included in the entrance-side image, the difference between the shooting time of the entrance-side image and the shooting time of the exit-side image as that person's passage time (S407).
  • FIG. 5 shows a configuration example of a person passage time measuring system according to a second embodiment.
  • the person passage time measurement system of the second embodiment differs from the system configuration of the first embodiment (FIG. 1) in that the person passage time measurement device 102 further includes a non-target person feature amount storage unit 123.
  • the non-target person feature amount storage unit 123 stores in advance the person feature amounts of persons to be excluded, so that the entrance-side image processing unit 121 can identify persons who are not targets of passage time measurement.
  • FIG. 6 shows an example of the processing procedure of the entrance-side image processing unit 121 in the person passage time measuring apparatus 102 of the second embodiment.
  • Processes S301 to S307 in FIG. 6 are the same as the processes (FIG. 3) of the entrance-side image processing unit 121 in the first embodiment, and thus description thereof is omitted.
  • the non-measurement person determination process (S501) is performed after the person feature amount extraction process (S305).
  • in the non-measurement person determination processing (S501), the difference between a person feature amount stored in the non-target person feature amount storage unit 123 and the person feature amount calculated in the person feature amount extraction processing (S305) is calculated and compared with a preset reference value (threshold). If the calculated difference between the feature amounts is smaller than the reference value, it is determined that a non-target person appears in the entrance-side imaging device 101A, and the process proceeds to the end determination processing (S307) without performing the person feature amount/time recording processing (S306).
  • this determination could also be made on the exit side, but performing it on the entrance side makes the person feature amount/time recording processing (S306) unnecessary for such persons, which reduces the processing load and memory usage of the person passage time measuring device 102.
  • FIG. 7 shows an example of the processing procedure of the exit-side image processing unit 122 in the person passage time measuring apparatus 102 of the second embodiment.
  • Processes S401 to S409 in FIG. 7 are the same as the process of the exit side image processing unit 122 (FIG. 4) in the first embodiment, and thus description thereof is omitted.
  • standard time difference calculation processing (S701) is performed after initialization processing (S401).
  • the standard time difference calculation process (S701) is a process for calculating the standard time difference from the average value of the passage times of a plurality of persons in the past.
  • a plurality of persons serving as samples for calculating the standard time difference can be selected by an arbitrary method.
  • for example, the standard time difference may be calculated from the average passage time of the most recent several dozen persons, or from the average passage time of several dozen persons on the same day of the week and in the same time period in the past.
  • next, feature amount over-limit determination processing (S702) is performed, which compares the elapsed time since the time stored by the person feature amount/time recording processing (S306) against a specified upper limit time obtained by adding a predetermined value to the standard time difference calculated in the standard time difference calculation processing (S701). Since multiple times may be stored by the person feature amount/time recording processing (S306), this processing is repeated for each of them.
  • the predetermined value added to the standard time difference may be a fixed preset value, or a dynamically changing value such as 3σ, obtained by calculating the standard deviation when the average is computed in the standard time difference calculation.
  • if there is a person feature amount whose elapsed time since storage exceeds the specified upper limit time, the process proceeds to person feature amount deletion processing (S703); if there is no such feature amount, the process proceeds to exit-side image reception processing (S402).
  • in the person feature amount deletion processing (S703), the person feature amount and time are deleted in the same manner as in the person feature amount deletion processing (S408); the difference from S408 is that the feature amounts deleted here are those whose elapsed time since storage exceeds the specified upper limit time.
  • in addition, under-limit determination processing (S704) is performed, in which the difference between the entrance passage time and the current time (exit passage time) is compared with a specified lower limit time obtained by subtracting a predetermined value from the standard time difference calculated in the standard time difference calculation processing (S701). If the calculated time difference is determined to be less than the specified lower limit time, the process proceeds to the person feature amount deletion processing (S408) without performing the time difference recording processing (S407).
  • the predetermined value subtracted from the standard time difference may be a fixed preset value, or a dynamically changing value such as 3σ, obtained by calculating the standard deviation when the average is computed in the standard time difference calculation.
  • as described above, the person passage time measurement system of the second embodiment further includes the non-target person feature amount storage unit 123, which stores the feature amounts of persons excluded from passage time measurement; the feature amount calculated from the entrance-side image is compared with the feature amounts stored in the non-target person feature amount storage unit 123, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the entrance-side image is excluded from passage time measurement.
  • in addition, when the difference between the shooting time of the entrance-side image and the shooting time of the exit-side image exceeds the specified upper limit time, or is less than the specified lower limit time, it is excluded from recording as a passage time.
  • in the above description, a face feature amount is used as the feature amount for determining whether persons are identical, but feature amounts may also be calculated for other person-related elements, such as hairstyle, clothing, and belongings, and used in the same-person determination.
  • in the above description, the functions according to the present invention are realized in a single device, the person passage time measuring device 102, but they may instead be realized by multiple devices. The entrance-side imaging device 101A and the exit-side imaging device 101B may also cooperate with each other to realize the functions according to the present invention.
  • an imaging device may be arranged for each entrance and each exit.
  • the present invention can also be applied to measurement of transit time in an area where the entrance and exit are not clearly defined, such as measurement of transit time in a specific section of a passage.
  • while the present invention has been described in detail above, it goes without saying that the present invention is not limited to the systems described here and can be widely applied to other systems.
  • the present invention can also be provided as, for example, a method or technique for executing the processing according to the present invention, a program for realizing such a method or technique, and a storage medium storing the program.
  • the present invention can be used in a person passage time measurement system that measures the passage time of a person from one point to another point.
  • 100: Network, 101A: Entrance-side imaging device, 101B: Exit-side imaging device, 102: Person transit time measuring device, 121: Entrance-side image processing unit, 122: Exit-side image processing unit, 123: Non-target person feature amount storage unit, 251: Gate, 252A, 252B: Person

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention provides a technique whereby a transit time of a person from an entrance to an exit can be measured more precisely. An entrance side image processing unit 121 performs a process of calculating a feature amount of each person on the basis of an entrance side image. An exit side image processing unit 122 performs a process of calculating a feature amount of each person on the basis of an exit side image; a process of determining whether the same person as the person included in the exit side image is included in the entrance side image by comparing the feature amount calculated from the entrance side image with the feature amount calculated from the exit side image; and a process of recording, if the same person as the person included in the exit side image is determined to be included in the entrance side image, a difference between a captured time of the entrance side image and a captured time of the exit side image as the transit time of the person.

Description

Person transit time measurement system and person transit time measurement method
The present invention relates to a person transit time measurement system that measures the transit time of a person from one point to another point.
Conventionally, in facilities such as airport security checkpoints, commercial facilities, art museums, amusement parks, and event venues, person transit time measurement systems have been used to measure the transit time of a person from one point to another, providing data for analysis aimed at improving customer satisfaction or presenting visitors with an estimate of the current waiting time.
For example, Patent Document 1 discloses summing the estimated processing time of each person, obtained by multiplying a reference time by a coefficient corresponding to the characteristics of each person queuing at an airport security checkpoint, to calculate the estimated total processing time of the people in the queue.
For example, Patent Document 2 discloses that, when the dwell time required for movement through a measurement area is acquired for each moving body based on its flow line within the area, any missing portion of a flow line is substituted with the dwell time of another person.
For example, Patent Document 3 discloses calculating the total length of each queue, based on a queue line generated for each queue, for a monitoring area containing one or more naturally forming or guided queues, and calculating the waiting time of a queue based on that total length.
Japanese Patent No. 5960864; Japanese Patent No. 5789776; JP 2007-317052 A
The conventional techniques cannot accurately measure the transit time of a person from one point to another, and a technique for measuring a person's transit time with higher accuracy has been demanded. In addition, accurate data on visitors cannot be obtained if measurement includes specific persons such as security guards or staff. Furthermore, in a scheme that measures transit time including time spent in a queue, when only a group's representative stands in the queue and the remaining members of the group join partway through, the transit time of the members who joined partway is measured as shorter than the transit time under the actual congestion conditions. There is also the problem that the transit time of a person who stood in the queue, left it once, and then lined up again is measured as longer than the transit time under the actual congestion conditions.
The present invention has been made in view of the above circumstances, and an object thereof is to provide a technique capable of measuring the transit time of a person from one point to another with higher accuracy.
In the present invention, in order to achieve the above object, the person transit time measurement system is configured as follows.
That is, a person transit time measurement system that measures the transit time required for a person to pass from a first point to a second point comprises: first calculating means for calculating a feature amount of a person included in a first image obtained by imaging the first point; second calculating means for calculating a feature amount of a person included in a second image obtained by imaging the second point; determination means for comparing the feature amount calculated from the first image with the feature amount calculated from the second image and determining whether the same person as the person included in the second image is included in the first image; and recording means for recording, when it is determined that the same person as the person included in the second image is included in the first image, the difference between the shooting time of the first image and the shooting time of the second image as the transit time of that person.
Such a configuration makes it possible to accurately measure the transit time required for a person to pass from the first point to the second point. Even when there are multiple persons, each person can be identified and his or her transit time measured individually. Furthermore, there is no need to track the movement of a person between the first point and the second point.
Note that the shooting time of the first image and the shooting time of the second image need not be the exact instant at which each image was captured by the imaging device; any time that can be regarded as equivalent may be used, such as the time the image was transmitted from the imaging device, the time it arrived at the first or second calculating means, or the time at which the feature amount was extracted. In short, it is sufficient that the difference between the shooting times of the first and second images can be determined, and that the times are obtained under the same conditions for both images.
Here, when the difference between the shooting time of the first image and the shooting time of the second image exceeds a first threshold, or is less than a second threshold smaller than the first threshold, the difference is preferably excluded from recording as a transit time.
This makes it possible to exclude records of transit times that deviate significantly from the standard transit time. In other words, persons whose behavior would add noise to the transit time survey, such as those who cut into the queue or who left the queue and lined up again, can be excluded from the survey, increasing the reliability of the survey results.
The system preferably further comprises storage means that stores feature amounts of persons to be excluded from transit time measurement; the feature amount calculated from the first image is compared with the feature amounts stored in the storage means, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the first image is excluded from transit time measurement.
This makes it possible to exclude from the survey persons whose transit time does not need to be investigated, such as security guards and staff, increasing the reliability of the survey results.
According to the present invention, the transit time of a person from one point to another can be measured with higher accuracy.
FIG. 1 shows a configuration example of a person transit time measurement system according to a first embodiment of the present invention. FIG. 2 is a schematic diagram of the environment in which person transit time is measured. FIG. 3 shows an example of the processing procedure of the entrance-side image processing unit in the first embodiment. FIG. 4 shows an example of the processing procedure of the exit-side image processing unit in the first embodiment. FIG. 5 shows a configuration example of a person transit time measurement system according to a second embodiment of the present invention. FIG. 6 shows an example of the processing procedure of the entrance-side image processing unit in the second embodiment. FIG. 7 shows an example of the processing procedure of the exit-side image processing unit in the second embodiment.
A person transit time measurement system according to an embodiment of the present invention will be described with reference to the drawings. The following describes a case in which the transit time required for a person to move from an entrance to an exit is measured in an environment that has an entrance and an exit, such as an airport security checkpoint or an event venue ticket office.
[First Embodiment] FIG. 1 shows a configuration example of a person transit time measurement system according to the first embodiment.
As shown in FIG. 1, the person transit time measurement system of this example is configured such that an entrance-side imaging device 101A, an exit-side imaging device 101B, and a person transit time measuring device 102 are connected to a network 100 and can communicate with one another.
The network 100 is a communication means that interconnects the devices, such as a dedicated data communication network, an intranet, the Internet, or a wireless LAN.
The entrance-side imaging device 101A and the exit-side imaging device 101B are devices such as network cameras or surveillance cameras that digitize images captured with a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor and output the converted image data via the network 100 to a recording device (not shown).
The person transit time measuring device 102 processes the image data output from the imaging devices 101A and 101B via the network and measures and records person transit times. The device also has a function for extracting and recording person feature amounts. The person transit time measuring device 102 includes an entrance-side image processing unit 121 and an exit-side image processing unit 122.
The entrance-side image processing unit 121 performs image processing on the image data output from the entrance-side imaging device 101A; the details are described later with reference to FIG. 3.
The exit-side image processing unit 122 performs image processing on the image data output from the exit-side imaging device 101B; the details are described later with reference to FIG. 4.
FIG. 2 is a schematic diagram of the environment in which person transit time is measured. FIG. 2 assumes a security checkpoint at an airport, a ticket office at an event venue, or the like. The gate 251 is a gate at which people form a queue, in practice a security check gate or a ticket window. There may be multiple gates 251, and the present invention can also be applied when no gate 251 is present.
The area in which person transit time is measured has an entrance and an exit. On the entrance side, an entrance-side imaging device 101A is installed so that a person passing through the entrance can be photographed from the front. On the exit side, an exit-side imaging device 101B is installed so that a person passing through the exit can be photographed from the front.
The person 252A is a person whose transit time is to be measured. In the measurement environment of this example, a person 252A photographed by the entrance-side imaging device 101A at the entrance of the area passes through the gate 251, at which many people are queued, and after some time appears in the exit-side imaging device 101B at the exit of the area. In FIG. 2, the state in which the person 252A has moved to the exit side is indicated by reference numeral 252B.
FIG. 3 shows an example of the processing procedure of the entrance-side image processing unit 121 in the person transit time measuring device 102 of the first embodiment.
The entrance-side image processing unit 121 first performs initialization processing (S301), which allocates memory and initializes parameters.
Next, entrance-side image reception processing (S302) is performed, in which image data is received from the entrance-side imaging device 101A and held in the memory of the person transit time measuring device 102. When the entrance-side imaging device 101A compresses (encodes) the image before transmission, a process of decompressing (decoding) it after reception is also performed.
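As a hedged illustration of the reception-and-decode step (S302, and likewise S402), the sketch below shows one possible way to turn a compressed (e.g. JPEG) frame received over the network into an uncompressed image using OpenCV; how the bytes arrive from the camera is not specified by the patent, so the byte input is simply taken as a parameter.

```python
import numpy as np
import cv2  # OpenCV


def decode_frame(frame_bytes: bytes) -> np.ndarray | None:
    """Convert a compressed frame (e.g. JPEG from a network camera) into an
    uncompressed BGR image held in memory (cf. S302 / S402)."""
    buf = np.frombuffer(frame_bytes, dtype=np.uint8)
    image = cv2.imdecode(buf, cv2.IMREAD_COLOR)  # returns None if the data is not a valid image
    return image
```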
Next, person area detection processing (S303) is performed to detect person areas in the entrance-side image, for example by searching the image for regions presumed to be faces.
Next, person detection determination processing (S304) determines whether a person area was detected as a result of the person area detection processing (S303). If a person area was detected, the process proceeds to person feature amount extraction processing (S305); if not, it returns to entrance-side image reception processing (S302) to receive a new image.
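The patent only states that regions presumed to be faces are searched for. As one possible sketch, the person area detection (S303) and the detected/not-detected branch (S304) could be realized with OpenCV's bundled Haar-cascade frontal-face detector; any other face or person detector could be substituted.

```python
import cv2

# Haar cascade shipped with the opencv-python package; used here only as an example detector.
_FACE_DETECTOR = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def detect_person_areas(image) -> list[tuple[int, int, int, int]]:
    """S303: return (x, y, w, h) rectangles presumed to contain faces.
    An empty list corresponds to the 'no person detected' branch of S304."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = _FACE_DETECTOR.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [tuple(map(int, rect)) for rect in faces]
```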
In the person feature amount extraction processing (S305), a person feature amount is calculated as a numerical value for each person detected in the person area detection processing (S303). In this example, a face feature amount is extracted as the person feature amount.
Next, person feature amount/time recording processing (S306) stores both the person feature amount and the current time (entrance passage time) in a memory area shared with the exit-side image processing unit 122.
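The patent does not specify how the face feature amount is computed, so the sketch below uses a deliberately simple stand-in (a normalized grayscale thumbnail flattened into a vector); in practice a face-embedding model would take its place. The shared list plays the role of the memory area written by the person feature amount/time recording processing (S306).

```python
import time
import cv2
import numpy as np

# Shared store written at the entrance (S306) and read at the exit (S406):
# a list of (feature_vector, entrance_passage_time) pairs.
entrance_records: list[tuple[np.ndarray, float]] = []


def extract_feature(image, rect) -> np.ndarray:
    """S305/S405: turn a detected face region into a numeric feature vector.
    Placeholder feature: 32x32 grayscale thumbnail, L2-normalized."""
    x, y, w, h = rect
    face = cv2.cvtColor(image[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
    vec = cv2.resize(face, (32, 32)).astype(np.float32).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)


def record_entrance(feature: np.ndarray) -> None:
    """S306: store the feature together with the current time (entrance passage time)."""
    entrance_records.append((feature, time.time()))
```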
Next, end determination processing (S307) determines whether a termination condition is met. If so, termination processing such as releasing the allocated memory is performed; if not, the process returns to entrance-side image reception processing (S302) to receive a new entrance-side image. A termination condition is, for example, the switch of the person transit time measuring device 102 being turned off by an operator.
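Tying the steps together, the following is a minimal sketch of the entrance-side loop (S301 to S307), reusing the helper functions sketched above; `receive_entrance_frame()` and `should_stop()` are hypothetical stand-ins for image reception from the imaging device 101A and for the termination condition.

```python
def entrance_side_loop(receive_entrance_frame, should_stop) -> None:
    """Rough control flow of the entrance-side image processing unit 121 (FIG. 3)."""
    # S301: initialization would allocate buffers / load the detector here.
    while not should_stop():                        # S307: end determination
        frame_bytes = receive_entrance_frame()      # S302: entrance-side image reception
        image = decode_frame(frame_bytes)
        if image is None:
            continue
        rects = detect_person_areas(image)          # S303: person area detection
        if not rects:                               # S304: nothing detected -> next frame
            continue
        for rect in rects:
            feature = extract_feature(image, rect)  # S305: person feature extraction
            record_entrance(feature)                # S306: feature amount / time recording
    # termination processing (e.g. releasing resources) would go here
```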
FIG. 4 shows an example of the processing procedure of the exit-side image processing unit 122 in the person transit time measuring device 102 of the first embodiment.
The exit-side image processing unit 122 first performs initialization processing (S401), which allocates memory and initializes parameters.
Next, exit-side image reception processing (S402) is performed, in which image data is received from the exit-side imaging device 101B and held in the memory of the person transit time measuring device 102. When the exit-side imaging device 101B compresses (encodes) the image before transmission, a process of decompressing (decoding) it after reception is also performed.
Next, person area detection processing (S403) is performed to detect person areas in the exit-side image, for example by searching the image for regions presumed to be faces.
Next, person detection determination processing (S404) determines whether a person area was detected as a result of the person area detection processing (S403). If a person area was detected, the process proceeds to person feature amount extraction processing (S405); if not, it returns to exit-side image reception processing (S402) to receive a new image.
In the person feature amount extraction processing (S405), a person feature amount is calculated as a numerical value for each person detected in the person area detection processing (S403). In this example, a face feature amount is extracted as the person feature amount.
Next, same-person determination processing (S406) compares the person feature amount calculated in the person feature amount extraction processing (S405) with the plurality of person feature amounts stored by the person feature amount/time recording processing (S306) of FIG. 3. That is, the person who has arrived at the exit is searched for among the persons who passed through the entrance by comparing the feature amounts obtained at the entrance with the feature amount obtained at the exit. Here, for each person feature amount obtained at the entrance, the similarity with the feature amount obtained at the exit is calculated; the person with the highest similarity (the smallest feature amount difference) is determined to be the same person as the person arriving at the exit, and the process proceeds to time difference recording processing (S407). However, even for the highest similarity, if that similarity is below a predetermined threshold, it is determined that no person matching the person arriving at the exit was found among those who passed through the entrance, and the process proceeds to end determination processing (S409).
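As a hedged illustration of the same-person determination (S406), the sketch below compares the exit-side feature vector with each stored entrance-side feature by cosine similarity and applies the "highest similarity, but at least a threshold" rule described above. The 0.9 threshold is an arbitrary placeholder, not a value taken from the patent.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.9  # placeholder; the patent only refers to "a predetermined threshold"


def find_same_person(exit_feature: np.ndarray,
                     entrance_records: list[tuple[np.ndarray, float]]):
    """S406: return (index, entrance_time) of the best-matching entrance record,
    or None if even the best similarity is below the threshold."""
    best_index, best_similarity = None, -1.0
    for i, (feature, _entrance_time) in enumerate(entrance_records):
        # cosine similarity; the stored vectors are L2-normalized in extract_feature()
        similarity = float(np.dot(exit_feature, feature))
        if similarity > best_similarity:
            best_index, best_similarity = i, similarity
    if best_index is None or best_similarity < SIMILARITY_THRESHOLD:
        return None
    return best_index, entrance_records[best_index][1]
```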
In the time difference recording processing (S407), the difference between the entrance passage time stored by the person feature amount/time recording processing (S306) of FIG. 3 for the person determined to be the same person in the same-person determination processing (S406) and the current time (exit passage time) is calculated and recorded in a database or the like as the transit time required for that person to move from the entrance to the exit.
After the transit time has been calculated, the person feature amount and entrance passage time for that person are no longer needed, so person feature amount deletion processing (S408) deletes them. As a result, only the feature amounts and entrance passage times of persons still in the area between the entrance and the exit remain in memory.
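A minimal sketch of the time difference recording (S407) and the subsequent deletion of the matched record (S408) follows, assuming the entrance records and the matching function sketched above; `save_to_database` is a hypothetical stand-in for whatever database or log the transit time is recorded in.

```python
import time


def record_transit_time(exit_feature, entrance_records, save_to_database) -> None:
    """S407/S408: record exit_time - entrance_time for the matched person,
    then drop that person's record so only people still inside the area remain."""
    match = find_same_person(exit_feature, entrance_records)
    if match is None:
        return                                   # no same person found -> nothing to record
    index, entrance_time = match
    exit_time = time.time()                      # current time = exit passage time
    save_to_database(exit_time - entrance_time)  # transit time from entrance to exit
    del entrance_records[index]                  # S408: delete feature amount and entrance time
```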
Next, end determination processing (S409) determines whether a termination condition is met. If so, termination processing such as releasing the allocated memory is performed; if not, the process returns to exit-side image reception processing (S402) to receive a new exit-side image. A termination condition is, for example, the switch of the person transit time measuring device 102 being turned off by an operator.
As described above, in the person transit time measurement system according to the first embodiment, the entrance-side image processing unit 121 performs processing for calculating person feature amounts from the entrance-side image (S302 to S305), and the exit-side image processing unit 122 performs processing for calculating person feature amounts from the exit-side image (S402 to S405), processing for comparing the feature amount calculated from the entrance-side image with the feature amount calculated from the exit-side image to determine whether the same person as the person included in the exit-side image is included in the entrance-side image (S406), and processing for recording, when the same person is determined to be included in the entrance-side image, the difference between the shooting time of the entrance-side image and the shooting time of the exit-side image as that person's transit time (S407).
With this configuration, the transit time required for a person to move from the entrance to the exit can be measured accurately. Even when there are multiple persons, each person can be identified and his or her transit time measured individually. Furthermore, there is no need to track the movement of a person between the entrance and the exit.
In addition, after the transit time is calculated, the person feature amount and shooting time for that person are deleted.
With this configuration, only the feature amounts and entrance passage times of persons in the area between the entrance and the exit remain in memory, which speeds up the same-person determination processing.
[Second Embodiment] FIG. 5 shows a configuration example of a person transit time measurement system according to the second embodiment.
The person transit time measurement system of the second embodiment differs from the system configuration of the first embodiment (FIG. 1) in that the person transit time measuring device 102 further includes a non-target person feature amount storage unit 123.
The non-target person feature amount storage unit 123 stores in advance the person feature amounts of persons to be excluded, so that the entrance-side image processing unit 121 can identify persons who are not targets of transit time measurement.
 図6には、第2実施例の人物通過時間計測装置102における入口側画像処理部121の処理手順の例を示してある。図6の処理S301~307は、第1実施例における入口側画像処理部121の処理(図3)と同一であるので、説明を省略する。 FIG. 6 shows an example of the processing procedure of the entrance-side image processing unit 121 in the person passage time measuring apparatus 102 of the second embodiment. Processes S301 to S307 in FIG. 6 are the same as the processes (FIG. 3) of the entrance-side image processing unit 121 in the first embodiment, and thus description thereof is omitted.
 第2実施例では、人物特徴量抽出処理(S305)の後に、非計測人物判定処理(S501)を行う。非計測人物判定処理(S601)では、対象外人物特徴量記憶部123に記憶してある人物特徴量と、人物特徴量抽出処理(S305)で算出した人物特徴量との差を算出し、予め設定された基準の値(閾値)と比較する。そして、算出した人物特徴量の差が基準の値よりも小さければ、対象外の人物が入口側撮像装置101Aに映ったと判定し、人物特徴量・時刻記録処理(S306)を行わずに、終了判定処理(S307)に遷移する。 In the second embodiment, the non-measurement person determination process (S501) is performed after the person feature amount extraction process (S305). In the non-measurement person determination process (S601), a difference between the person feature quantity stored in the non-target person feature quantity storage unit 123 and the person feature quantity calculated in the person feature quantity extraction process (S305) is calculated in advance. Compare with the set reference value (threshold). If the calculated difference between the person feature amounts is smaller than the reference value, it is determined that a person who is not the subject is reflected on the entrance-side imaging apparatus 101A, and the process ends without performing the person feature amount / time recording process (S306). The process proceeds to a determination process (S307).
 In this way, predetermined specific persons such as security guards and staff can be excluded from transit time measurement. This determination may also be performed on the exit side, but performing it on the entrance side makes the person feature amount/time recording process (S306) unnecessary for such persons, reducing the processing load and memory usage of the person transit time measurement device 102.
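 A minimal sketch of the non-measurement person determination (S501), assuming the same vector representation of feature amounts as above; the function name and the reference value are illustrative only:

```python
import numpy as np

def is_excluded_person(feature, excluded_features, reference_value=0.5):
    """Compare the extracted person feature amount with every pre-registered
    feature amount of excluded persons (e.g. security guards or staff).
    If any difference is smaller than the preset reference value (threshold),
    the person is treated as out of scope and S306 is skipped."""
    return any(float(np.linalg.norm(feature - excluded)) < reference_value
               for excluded in excluded_features)
```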
 FIG. 7 shows an example of the processing procedure of the exit-side image processing unit 122 in the person transit time measurement device 102 of the second embodiment. Steps S401 to S409 in FIG. 7 are the same as the processing of the exit-side image processing unit 122 in the first embodiment (FIG. 4), and their description is omitted.
 In the second embodiment, a standard time difference calculation process (S701) is performed after the initialization process (S401). The standard time difference calculation process (S701) calculates a standard time difference from the average of the transit times of multiple persons in the past. The persons sampled for calculating the standard time difference can be selected by any method; for example, the standard time difference may be calculated from the average transit time of the most recent several dozen persons, or from the average transit time of several dozen persons on the same day of the week and in the same time slot in the past.
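 The standard time difference calculation (S701) and the derivation of the limits used in the following steps can be sketched as follows; the helper names are illustrative, and the 3σ margin is merely one of the dynamic options described below:

```python
import statistics

def standard_time_difference(past_transit_times):
    """Standard time difference calculation (S701): the average of the
    transit times of a sample of past persons, together with the standard
    deviation so that the limits can optionally be set dynamically."""
    mean = statistics.mean(past_transit_times)
    sigma = statistics.pstdev(past_transit_times)
    return mean, sigma

def specified_limits(mean, sigma, margin=None):
    """Derive the specified upper and lower limit times from the standard
    time difference; `margin` may be a preset fixed value, and when omitted
    a dynamic 3-sigma margin is used instead."""
    m = 3.0 * sigma if margin is None else margin
    return mean + m, max(mean - m, 0.0)   # (upper limit, lower limit clamped at zero)
```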
 Next, a specified-time-exceeded retained feature amount existence determination process (S702) is performed, in which a value obtained by adding a predetermined number to the standard time difference calculated in the standard time difference calculation process (S701) is used as a specified upper limit time and compared with the elapsed time since the time stored in the person feature amount/time recording process (S306). Since multiple times may have been stored in the person feature amount/time recording process (S306), this process is repeated for each of them. The predetermined number added to the standard time difference may be a preset fixed value, or a dynamically determined value such as 3σ obtained by calculating the standard deviation together with the average in the standard time difference calculation. If there is a person feature amount whose elapsed time since storage exceeds the specified upper limit time, the process proceeds to the person feature amount deletion process (S703); if no such person feature amount exists, the process proceeds to the exit-side image reception process (S402).
 In the person feature amount deletion process (S703), the person feature amount and time are deleted in the same manner as in the person feature amount deletion process (S408). It differs from the person feature amount deletion process (S408) in that it deletes person feature amounts whose elapsed time since storage exceeds the specified upper limit time.
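 A sketch of the pruning performed by S702 and S703, assuming that each stored entrance-side entry carries its capture time as in the earlier `EntranceRecord` sketch:

```python
def prune_expired_records(entrance_records, now, upper_limit):
    """Specified-time-exceeded retained feature check (S702) and deletion
    (S703): discard every stored entrance record whose elapsed time since
    storage already exceeds the specified upper limit time, since such a
    person is no longer expected to be matched at the exit."""
    return [rec for rec in entrance_records
            if now - rec.entry_time <= upper_limit]
```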
 In the second embodiment, a below-specified-time determination process (S704) is also performed after the same person determination process (S406). In the below-specified-time determination process (S704), for a person determined to be the same person in the same person determination process (S406), the difference between the time stored in the person feature amount/time recording process (S306) of FIG. 3 (entrance passage time) and the current time (exit passage time) is calculated and compared with a specified lower limit time obtained by subtracting a predetermined number from the standard time difference calculated in the standard time difference calculation process (S701). If the calculated time difference is determined to be less than the specified lower limit time, the process proceeds to the person feature amount deletion process (S408) without performing the time difference recording process (S407). The predetermined number subtracted from the standard time difference may be a preset fixed value, or a dynamically determined value such as 3σ obtained by calculating the standard deviation together with the average in the standard time difference calculation.
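 Taken together, the recording decision at the exit side reduces to a simple range check; this helper is illustrative only:

```python
def accept_transit_time(time_difference, lower_limit, upper_limit):
    """Record a time difference only when it is plausible: values under the
    specified lower limit are discarded by S704, and values over the
    specified upper limit never survive the pruning in S702/S703."""
    return lower_limit <= time_difference <= upper_limit
```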
 As described above, the person transit time measurement system according to the second embodiment further includes the non-target person feature amount storage unit 123 that stores the feature amounts of persons excluded from transit time measurement; the feature amount calculated from the entrance-side image is compared with the feature amounts stored in the non-target person feature amount storage unit 123, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the entrance-side image is excluded from transit time measurement.
 In addition, when the difference between the capture time of the entrance-side image and the capture time of the exit-side image exceeds the specified upper limit time, or is less than the specified lower limit time, it is excluded from being recorded as a transit time.
 With such a configuration, persons whose transit time does not need to be surveyed can be excluded from the survey, and records of transit times that deviate greatly from the standard transit time can also be excluded, so the reliability of the survey results can be further improved.
 In each of the above embodiments, the face feature amount is used as the feature amount for determining the identity of a person; however, feature amounts may also be calculated for other elements of a person, such as hairstyle, clothing, and belongings, and used in the identity determination.
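 As a purely hypothetical illustration of combining the face feature amount with such additional appearance features, a weighted similarity could look like the following; the weight and the cosine measure are assumptions, not part of the embodiments:

```python
import numpy as np

def combined_similarity(face_a, face_b, appearance_a=None, appearance_b=None,
                        face_weight=0.7):
    """Combine face-feature similarity with the similarity of other elements
    (hairstyle, clothing, belongings) when they are available."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    face_sim = cosine(face_a, face_b)
    if appearance_a is None or appearance_b is None:
        return face_sim                      # fall back to the face feature only
    return face_weight * face_sim + (1.0 - face_weight) * cosine(appearance_a, appearance_b)
```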
 In each of the above embodiments, the functions according to the present invention are realized within a single device, the person transit time measurement device 102, but these functions may be realized by multiple devices. The entrance-side imaging device 101A and the exit-side imaging device 101B may also be configured to cooperate with each other to realize the functions according to the present invention.
 There may also be multiple entrances and exits, in which case an imaging device may be arranged at each entrance and each exit. The present invention can also be applied to measuring transit time in an area where the entrance and exit are not clearly defined, such as measuring the transit time in a specific section of a passage.
 Although the present invention has been described in detail above, the present invention is not limited to the system described here and can, needless to say, be widely applied to systems other than the above.
 The present invention can also be provided as, for example, a method or scheme for executing the processing according to the present invention, a program for realizing such a method or scheme, or a storage medium storing the program.
 The present invention can be used in a person transit time measurement system that measures the transit time of a person from one point to another point.
 100: network, 101A: entrance-side imaging device, 101B: exit-side imaging device,
 102: person transit time measurement device, 121: entrance-side image processing unit, 122: exit-side image processing unit, 123: non-target person feature amount storage unit, 251: gate, 252A, 252B: persons

Claims (6)

  1.  A person transit time measurement system that measures the transit time required for a person to pass from a first point to a second point, the system comprising:
     first calculation means for calculating a feature amount of a person included in a first image capturing the first point;
     second calculation means for calculating a feature amount of a person included in a second image capturing the second point;
     determination means for comparing the feature amount calculated from the first image with the feature amount calculated from the second image to determine whether the same person as a person included in the second image is included in the first image; and
     recording means for recording, when it is determined that the same person as the person included in the second image is included in the first image, the difference between the capture time of the first image and the capture time of the second image as the transit time of that person.
  2.  The person transit time measurement system according to claim 1,
     wherein, when the difference between the capture time of the first image and the capture time of the second image exceeds a first threshold or is less than a second threshold smaller than the first threshold, the difference is excluded from being recorded as a transit time.
  3.  The person transit time measurement system according to claim 1, further comprising
     storage means for storing feature amounts of persons to be excluded from transit time measurement,
     wherein the feature amount calculated from the first image is compared with the feature amounts stored in the storage means, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the first image is excluded from transit time measurement.
  4.  A person transit time measurement method for measuring the transit time required for a person to pass from a first point to a second point, the method comprising the steps of:
     calculating a feature amount of a person included in a first image capturing the first point;
     calculating a feature amount of a person included in a second image capturing the second point;
     comparing the feature amount calculated from the first image with the feature amount calculated from the second image to determine whether the same person as a person included in the second image is included in the first image; and
     recording, when it is determined that the same person as the person included in the second image is included in the first image, the difference between the capture time of the first image and the capture time of the second image as the transit time of that person.
  5.  The person transit time measurement method according to claim 4,
     wherein, when the difference between the capture time of the first image and the capture time of the second image exceeds a first threshold or is less than a second threshold smaller than the first threshold, the difference is excluded from being recorded as a transit time.
  6.  The person transit time measurement method according to claim 4,
     wherein feature amounts of persons to be excluded from transit time measurement are stored in advance, the feature amount calculated from the first image is compared with the stored feature amounts, and when the similarity between the two is equal to or greater than a predetermined value, the person included in the first image is excluded from transit time measurement.
PCT/JP2019/008995 2018-03-26 2019-03-07 Person transit time measurement system and person transit time measurement method WO2019188079A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020509778A JP7108022B2 (en) 2018-03-26 2019-03-07 Person passing time measuring system and person passing time measuring method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-058528 2018-03-26
JP2018058528 2018-03-26

Publications (1)

Publication Number Publication Date
WO2019188079A1 true WO2019188079A1 (en) 2019-10-03

Family

ID=68059846

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008995 WO2019188079A1 (en) 2018-03-26 2019-03-07 Person transit time measurement system and person transit time measurement method

Country Status (2)

Country Link
JP (1) JP7108022B2 (en)
WO (1) WO2019188079A1 (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007317052A (en) * 2006-05-29 2007-12-06 Japan Airlines International Co Ltd System for measuring waiting time for lines
JP5789776B2 (en) * 2014-04-28 2015-10-07 パナソニックIpマネジメント株式会社 Residence time measuring device, residence time measuring system, and residence time measuring method
JP5960864B1 (en) * 2015-03-25 2016-08-02 三菱電機インフォメーションシステムズ株式会社 Processing time calculation apparatus and processing time calculation program

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017151832A (en) * 2016-02-26 2017-08-31 株式会社日立製作所 Wait time calculation system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022062675A (en) * 2020-10-08 2022-04-20 株式会社日立製作所 Method and apparatus for people flow analysis using similar-image search
JP7235820B2 (en) 2020-10-08 2023-03-08 株式会社日立製作所 People Flow Analysis Method and Apparatus by Similar Image Retrieval
US11657123B2 (en) 2020-10-08 2023-05-23 Hitachi, Ltd. Method and apparatus for people flow analysis using similar-image search

Also Published As

Publication number Publication date
JPWO2019188079A1 (en) 2021-03-25
JP7108022B2 (en) 2022-07-27

Similar Documents

Publication Publication Date Title
US10290162B2 (en) Information processing apparatus, information processing method, and storage medium
CA2804468C (en) System and method for face capture and matching
JP6854881B2 (en) Face image matching system and face image search system
CN110073407B (en) Face image processing method and face image processing apparatus
US20160191865A1 (en) System and method for estimating an expected waiting time for a person entering a queue
US20060067456A1 (en) People counting systems and methods
CN108229333A (en) For identifying the method for the event in sport video
US20230368559A1 (en) Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
JP2017151832A (en) Wait time calculation system
JP2018036782A (en) Information processing device, information processing method, and program
US10902355B2 (en) Apparatus and method for processing information and program for the same
CN112016731A (en) Queuing time prediction method and device and electronic equipment
WO2019188079A1 (en) Person transit time measurement system and person transit time measurement method
CN111353374A (en) Information processing apparatus, control method thereof, and computer-readable storage medium
JP7439897B2 (en) Stay management device, stay management method, program and stay management system
KR102099816B1 (en) Method and apparatus for collecting floating population data on realtime road image
US10796165B2 (en) Information processing apparatus, method for controlling the same, and non-transitory computer-readable storage medium
Al-Ahmadi et al. Statistical analysis of the crowd dynamics in Al-Masjid Al-Nabawi in the city of Medina, Saudi Arabia
JP6558178B2 (en) Nuisance agent estimation system, control method and control program for nuisance agent estimation system
US20220375227A1 (en) Counting system, counting method, and program
CN113591713A (en) Image processing method and device, electronic equipment and computer readable storage medium
JP6879336B2 (en) Annoying actor estimation system, control method and control program of annoying actor estimation system
JP6883345B2 (en) Customer number measurement method and customer number measurement device
JP2021081881A (en) Information processing device, information processing method, program, and camera system
JP2016035689A (en) Entry/exit management system and entry/exit management method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19775114

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020509778

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19775114

Country of ref document: EP

Kind code of ref document: A1