WO2014156733A1 - People counting device and people counting method - Google Patents
- Publication number: WO2014156733A1 (application PCT/JP2014/056954)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- edge
- circle
- candidate
- person
- luminance gradient
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/168—Segmentation; Edge detection involving transform domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
Definitions
- The present invention relates to a people counting device for counting the number of persons present in a target area.
- In a conventional people counting device, for example, the presence or absence of a person is detected with an infrared sensor or the like, and when a person is present, the number of persons in the target area is measured by incrementing a counter provided in the apparatus.
- In another approach, an image (input image) obtained by imaging the target area with an imaging device equipped with a solid-state image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor is processed, a predetermined object in the input image is detected automatically, the number of persons is measured based on attribute information such as the object's position and movement path, and a counter provided in the device is incremented to count the number of persons in the target area.
- Patent Document 1 discloses a people counting device that sets head models corresponding to the head size expected at each position in the image when a person is imaged (the ratio of the edge to the size of the head model) while shifting the position, extracts the human heads included in the image based on the set head models, associates the extracted heads across the plurality of images contained in the moving image, and counts the number of people in the moving image that match a predetermined condition (time, region) based on the number of trajectories of the associated heads.
- When the method is based on an infrared sensor, a person moving only in a specific direction cannot be counted because the direction of movement is not recognized, and there is the further problem that the correct number of people cannot be counted when several people appear in the field of view at the same time.
- The present invention has been made in view of this conventional situation, and its object is to propose a technique capable of accurately counting the persons included in an image.
- To achieve this object, the people counting device according to the present invention is configured as follows. It comprises: edge extraction means for extracting edges from a planar image of a target region; circle candidate detection means for detecting circle candidates included in the planar image based on the edges extracted by the edge extraction means; person determination means for calculating, for each circle candidate detected by the circle candidate detection means, a luminance gradient for each of the plurality of edge pixels constituting the edge of that circle candidate, and determining a circle candidate for which the uniformity of the luminance gradients of its edge pixels is higher than a reference to be a person's head; and people counting means for counting the circle candidates determined by the person determination means to be persons' heads.
- With this configuration, each circle candidate is judged on the uniformity of the luminance gradients of the edge pixels constituting its edge, so it can be determined effectively whether the candidate has a circular shape like a person's head, and the number of persons included in the planar image can be counted accurately.
- In one configuration, as an index of the uniformity of the luminance gradients of the edge pixels constituting a circle candidate's edge, the person determination means calculates the degree to which the directions of the luminance gradients of the edge pixels are distributed radially from the center of the circle, and compares the calculation result with a predetermined threshold to judge whether the uniformity is higher than the reference. This focuses on the direction of the luminance gradient at each edge pixel: a candidate whose gradient directions are distributed highly radially (i.e., one with high circularity) is determined to be a head.
- In another configuration, as the index of that uniformity the person determination means calculates the degree of variation in the magnitudes of the luminance gradients of the edge pixels, and compares the calculation result with a predetermined threshold to judge whether the uniformity is higher than the reference. This focuses on the magnitude of the luminance gradient at each edge pixel: a candidate with a low degree of variation in gradient magnitude (i.e., one with high circularity) is determined to be a head.
- According to the present invention, each circle candidate is judged on the uniformity of the luminance gradients of the edge pixels constituting its edge, so it can be determined effectively whether the candidate has such circularity, and the number of persons included in the planar image can be counted accurately.
- FIG. 1 is a diagram showing a configuration example of a monitoring device to which the people counting device according to the present invention is applied.
- FIG. 2 is a diagram showing an example flowchart of the people counting process according to the present invention.
- FIG. 3 (a) is a diagram showing an example of an input image, and (b) is a diagram showing an example of an edge image.
- FIG. 4 (a) is an example of a circle passing through the i-th edge pixel (xi, yi), and (b) is an example of expressing the distribution of circles passing through the i-th edge pixel (xi, yi) in a three-dimensional Hough space.
- FIG. 5 (a) is a diagram showing an example of a circle candidate whose luminance-gradient directions are distributed highly radially, and (b) is a diagram showing an example of a circle candidate with a low degree of such distribution.
- FIG. 6 (a) is a diagram showing an example of a circle candidate with a low degree of variation in luminance-gradient magnitude, and (b) is a diagram showing an example of a circle candidate with a high degree of such variation.
- FIG. 1 shows a configuration example of a monitoring device to which the people counting device according to the present invention is applied.
- The monitoring device of this example comprises an imaging device 101, a video input circuit 102, an image processor 103, a program memory 104, a work memory 105, an external I/F circuit 106, a video output circuit 107, a data bus 108, an instruction device 109, and a display device 110.
- the video input circuit 102, the image processor 103, the program memory 104, the work memory 105, the external I / F circuit 106, and the video output circuit 107 are connected to the data bus 108.
- In general, the configuration also includes components for controlling the imaging device and various external recording devices, but they are omitted from this example for the sake of simplicity.
- the imaging device 101 images a region to be monitored.
- the video of the monitoring area obtained by the imaging device 101 is given to the video input circuit 102 and recorded in the work memory 105 via the video input circuit 102.
- In this example, imaging is performed from a viewpoint looking down on the monitoring area from vertically above.
- The image processor 103 processes the video recorded in the work memory 105 according to the program recorded in the program memory 104, and displays the processing result on the display device 110 via the video output circuit 107. The image processor 103 also processes the video while changing or correcting program parameters based on operator instructions entered with the instruction device 109, such as a mouse or keyboard, via the external I/F circuit 106. In particular, the image processor 103 performs a people counting process that counts the number of persons included in the video.
- FIG. 2 shows an example flowchart of the people counting process performed by the monitoring device of this example.
- The head of a person in a planar image obtained by photographing the person from vertically above has a circular outline, as in the example of FIG. 3 (a); using this property, the number of persons included in the video is counted.
- An image input step 201, an object edge extraction step 202, a circular region extraction step 203, a person feature amount calculation step 204, an object candidate determination step 205, an object tracking/trajectory detection step 206, a trajectory evaluation/object counting step 207, and a background learning step 208 are repeated in order.
- FIG. 3 (a) shows an example of an input image, in which one person is seen from above.
- FIG. 3 (b) is an example of an edge image generated from the input image of FIG. 3 (a).
- the edge image is composed of edge pixels having a predetermined value representing an edge and non-edge pixels having another predetermined value.
- In the circular region extraction step 203, a region having a circular edge shape is identified from the edge image generated in the object edge extraction step 202 and extracted as a circle candidate.
- In this example, regions having a circular edge shape are identified using the known generalized Hough transform, as follows. Among the N edge pixels contained in the edge image, consider the circles passing through the i-th edge pixel (xi, yi) (i is an integer from 1 to N). As shown in FIG. 4 (a), such a circle can be expressed by three parameters: its center coordinates (x, y) and its radius r.
- Infinitely many circles passing through the edge pixel (xi, yi) can be assumed; expressed in a three-dimensional Hough space spanned by the three parameters (x, y, r), the distribution of the parameters of the circles passing through (xi, yi) is as shown in FIG. 4 (b). Therefore, by examining, for every edge pixel in the edge image, the distribution of the parameters of the circles assumed to pass through it, and identifying the parameters where the distribution concentrates, regions having a circular edge shape (circle candidates) can be detected.
- Concretely, a parameter array h[x][y][r] representing the coordinates of the three-dimensional Hough space is prepared and initialized to all zeros. All edge pixels in the edge image are then visited in order, and for each one, 1 is added (voted) to every element h[x][y][r] corresponding to the parameters (x, y, r) of a circle assumed to pass through that pixel. After voting for all edge pixels, a predetermined number of elements (for example, up to 10) are selected in descending order of the value of h[x][y][r] (in descending order of the number of votes), thereby detecting regions having a circular edge shape (circle candidates).
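The voting procedure above can be sketched as follows; the function name, the sampling of 64 directions per radius, and the array layout are illustrative choices, since the text prescribes only the 3-D accumulator h[x][y][r] and the top-vote selection:

```python
import numpy as np

def hough_circle_candidates(edge, r_min, r_max, top_k=10):
    """Vote every circle (x, y, r) assumed to pass through an edge pixel
    into the 3-D parameter array h[x][y][r]; return the top_k elements
    by vote count as circle candidates (x, y, r, votes)."""
    h_img, w_img = edge.shape
    n_r = r_max - r_min + 1
    h = np.zeros((w_img, h_img, n_r), dtype=np.int32)  # h[x][y][r]
    thetas = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
    ys, xs = np.nonzero(edge)
    for xi, yi in zip(xs, ys):
        for ri in range(n_r):
            r = r_min + ri
            # centres of all circles of radius r passing through (xi, yi)
            cx = np.round(xi - r * np.cos(thetas)).astype(int)
            cy = np.round(yi - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w_img) & (cy >= 0) & (cy < h_img)
            np.add.at(h, (cx[ok], cy[ok], ri), 1)  # one vote per centre
    order = np.argsort(h, axis=None)[::-1][:top_k]  # descending votes
    xs_c, ys_c, rs_c = np.unravel_index(order, h.shape)
    return [(int(x), int(y), int(r_min + r), int(h[x, y, r]))
            for x, y, r in zip(xs_c, ys_c, rs_c)]
```

On a synthetic edge image containing a single ring, the top-voted parameters land on the ring's center and radius, which is exactly the concentration of votes the text describes.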
- In the example of FIG. 3, the regions of the person's head 302, left shoulder 303, and right shoulder 304 each have a circular edge shape in part or in whole, so each of these regions is detected as a circle candidate.
- In the person feature amount calculation step 204, a person feature amount is calculated for each circle candidate extracted in the circular region extraction step 203. This person feature amount is an index of how much a circle candidate (a region having a circular edge shape) resembles a person's head.
- For the X-direction component dxi of the luminance gradient at edge pixel (xi, yi), for example, the difference between the luminance of the preceding pixel (xi-1, yi) and that of the following pixel (xi+1, yi) in the X direction can be used; the Y-direction component dyi can be obtained in the same way from the pixels (xi, yi-1) and (xi, yi+1).
- From these, the uniformity S of the luminance gradients of the edge pixels is obtained and used as the circle candidate's person feature amount.
- The uniformity S of the luminance gradients of the edge pixels constituting a circle candidate's edge can be expressed, for example, by whether the gradient direction at each edge pixel, viewed from the center of the circle, is the same direction; in other words, by a measure of whether the gradient directions of the edge pixels are distributed radially from the center of the circle.
- Specifically, the angle θi representing the direction of each edge pixel (xi, yi) viewed from the circle center (x0, y0) is calculated by (Equation 1), the angle φi representing the direction of the luminance gradient at each edge pixel (xi, yi) is calculated by (Equation 2), and the score S1 obtained by summing the absolute differences between these angles θi and φi is calculated by (Equation 3) and used as the person feature amount. The smaller the score S1 calculated in this way, the more radially from the circle center the gradient directions of the edge pixels are distributed, that is, the higher the uniformity of the luminance gradients. For this reason, the smaller the score S1, the higher the circularity (the more head-like the candidate).
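A concrete sketch of the score S1 follows. Equations 1 to 3 are not reproduced in this text, so the function name, the central-difference gradient, and the folding of the angle difference to ignore gradient polarity (so that dark-on-bright and bright-on-dark heads score alike) are our assumptions:

```python
import numpy as np

def score_s1(gray, edge_pixels, x0, y0):
    """S1: sum over edge pixels of the absolute difference between the
    radial direction theta_i seen from the circle centre (x0, y0) and
    the luminance-gradient direction phi_i at that pixel.  Smaller S1
    means more radial gradients, i.e. a more head-like candidate."""
    s1 = 0.0
    for xi, yi in edge_pixels:
        dxi = float(gray[yi, xi + 1]) - float(gray[yi, xi - 1])  # X gradient
        dyi = float(gray[yi + 1, xi]) - float(gray[yi - 1, xi])  # Y gradient
        theta_i = np.arctan2(yi - y0, xi - x0)   # direction from the centre
        phi_i = np.arctan2(dyi, dxi)             # gradient direction
        diff = abs(theta_i - phi_i) % np.pi      # fold out gradient polarity
        s1 += min(diff, np.pi - diff)            # accumulate angle difference
    return s1
```

On a radially symmetric patch (a circular luminance ramp), the gradients line up with the radial directions and S1 stays near zero, matching the interpretation above.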
- In the object candidate determination step 205, based on the person feature amount calculated in the person feature amount calculation step 204, it is determined whether each circle candidate extracted in the circular region extraction step 203 is a human head. In this example, the score S1 described above is calculated as the person feature amount, and the smaller its value, the higher the circularity (head-likeness). Therefore, when the score S1 calculated for a circle candidate is smaller than a predetermined threshold T1 (below, or at most, the threshold T1), the uniformity of the luminance gradients of the edge pixels constituting the candidate is judged to be higher than the reference, and the circle candidate is determined to be a person's head.
- In the example of FIG. 3, the region 302 has high circularity (a small score S1 value) and is determined to be a human head, while the regions 303 and 304 have low circularity (large score S1 values) and are determined not to be human heads.
- Furthermore, in this example, a background image from which moving objects have been removed is prepared from the input images, the object edge extraction step 202 and the circular region extraction step 203 are also applied to this background image to extract circle candidates (regions having a circular edge shape), and the object candidate determination step 205 performs the above determination while excluding the circle candidates extracted from the background image. In other words, a circle candidate also extracted from the background image is regarded as a background element (such as a floor pattern) and is not determined to be a human head, even if its circularity is very high.
- In the object tracking/trajectory detection step 206, the movement of a person (circular edge region) is tracked based on the circle candidates determined to be persons in past input images and those determined to be persons in the current input image, and its trajectory is detected.
- In the trajectory evaluation/object counting step 207, the movement trajectories detected in the object tracking/trajectory detection step 206 are evaluated and the number of persons included in the input images is counted. In this example, the number of trajectories is counted and taken as the number of persons.
- In the background learning step 208, the background image is updated using the current input image. Specifically, an image obtained by multiplying the current input image by a predetermined coefficient α and an image obtained by multiplying the current background image by (1-α) are added together, and the result is used as the next background image. This yields an up-to-date background image from which moving objects (such as persons) in the input images have been removed.
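The update above is a standard running average; a minimal sketch, where the value of α (0.05 here) is illustrative and tunable:

```python
import numpy as np

def learn_background(background, frame, alpha=0.05):
    """Background learning step: next background = alpha * current frame
    + (1 - alpha) * current background.  Static scene content is kept,
    while transient foreground (a passing person) decays away."""
    return alpha * frame.astype(np.float64) + (1.0 - alpha) * background
```

Applied repeatedly to a static scene, the background converges to that scene; a person who crosses the view contributes only briefly and fades out of the model.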
- As described above, the monitoring device of this example performs the following processes:
- a process of capturing a planar image of the monitoring area photographed from above (image input step 201);
- a process of extracting edges from the planar image (object edge extraction step 202);
- a process of detecting circle candidates (regions having a circular edge shape) included in the planar image (circular region extraction step 203);
- a process of calculating a luminance gradient for each of the plurality of edge pixels constituting a candidate's edge and an index value of the uniformity of those gradients (person feature amount calculation step 204), and comparing the calculated index value with a predetermined threshold to determine whether the circle candidate is a person's head (object candidate determination step 205);
- and a process of tracking and counting the circle candidates determined to be persons' heads (object tracking/trajectory detection step 206 and trajectory evaluation/object counting step 207), while updating the background image (background learning step 208).
- In this way, each circle candidate is judged on the uniformity of the luminance gradients of the edge pixels constituting its edge, so it can be determined effectively whether the candidate has a circular shape like a person's head, and the number of persons included in the planar image can be counted accurately.
- The above steps 201 to 208 may be performed for every one-frame image in the video of the monitoring area captured by the imaging device 101, or once every predetermined number of frames. In the above example, to increase counting accuracy, the number of movement trajectories detected in the object tracking/trajectory detection step 206 is counted and taken as the number of persons, but the number of circular edge regions determined to be human heads in the object candidate determination step 205 may instead be counted and taken as the number of persons.
- In the person feature amount calculation step 204 above, the index value representing the degree to which the gradient directions of the edge pixels constituting a candidate's edge are distributed radially from the circle center is calculated as the candidate's person feature amount, but another index value may be used instead, as follows: an index value representing the degree of variation in the magnitudes of the luminance gradients of the edge pixels constituting the candidate's edge is calculated as the person feature amount.
- Specifically, the X-direction component dxi and the Y-direction component dyi of the luminance gradient between adjacent pixels in the input image are calculated. For the X-direction component dxi at edge pixel (xi, yi), for example, the difference between the luminance of the preceding pixel (xi-1, yi) and that of the following pixel (xi+1, yi) in the X direction can be used; for the Y-direction component dyi, the difference between the luminance of the preceding pixel (xi, yi-1) and that of the following pixel (xi, yi+1) in the Y direction can be used.
- Then the magnitude di of the luminance gradient at edge pixel (xi, yi) is calculated by (Equation 4). Next, the circle candidate region is divided into a plurality of fan-shaped regions at a predetermined angle, and the magnitudes di of the luminance gradients of the edge pixels in each fan-shaped region are summed. With M denoting the number of divisions (the number of fan-shaped regions) of the candidate region, a score S2 representing the entropy of the sums gj calculated for the fan-shaped regions is computed by (Equation 5), and this score S2 is used as the circle candidate's person feature amount.
- The smaller the score S2 calculated in this way, the lower the degree of variation in the magnitudes of the luminance gradients of the edge pixels constituting the candidate's edge, and the higher the uniformity of the luminance gradients. For this reason, the smaller the score S2, the higher the circularity (head-likeness). Therefore, in the subsequent object candidate determination step 205, when the score S2 calculated for a circle candidate is smaller than a predetermined threshold T2 (below, or at most, the threshold T2), the uniformity of the luminance gradients of the candidate's edge pixels may be judged to be higher than the reference, and the candidate determined to be a person's head.
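A sketch of this sector-based score follows. Equations 4 and 5 are not reproduced in this text, so the exact form here is our assumption: the gradient magnitudes are summed per sector into gj, and log(M) minus the Shannon entropy of the normalized gj is used, which matches the stated ordering (smaller S2 = lower variation = more head-like):

```python
import numpy as np

def score_s2(edge_pixels, grad_mags, x0, y0, m=8):
    """S2: divide the candidate into m fan-shaped sectors around the
    centre (x0, y0), sum the gradient magnitudes d_i per sector into
    g_j, and measure how far the g_j distribution is from uniform as
    log(m) - entropy.  S2 = 0 for perfectly uniform magnitudes."""
    g = np.zeros(m)
    sector = 2.0 * np.pi / m
    for (xi, yi), di in zip(edge_pixels, grad_mags):
        a = np.arctan2(yi - y0, xi - x0) % (2.0 * np.pi)
        g[int(a / sector) % m] += di             # per-sector sum g_j
    p = g / g.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log(p))             # max log(m) when uniform
    return np.log(m) - entropy                   # small = low variation
```

A full ring with equal magnitudes scores near zero, while magnitudes concentrated in one sector score near log(M), reproducing the contrast between FIG. 6 (a) and (b).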
- As in FIG. 6 (a), when the degree of variation in gradient magnitude among the edge pixels constituting a candidate's edge is low, the score S2 is small, so the candidate is determined to be a human head. As in FIG. 6 (b), when the degree of variation is high, the score S2 is large, so the candidate is determined not to be a human head.
- Both the score S1, an index based on the directions of the luminance gradients of a candidate's edge pixels, and the score S2, an index based on their magnitudes, may also be calculated. The candidate may then be determined to be a person's head when either of the scores S1 and S2 is below its threshold T1 or T2, or only when both are below their thresholds. Alternatively, the scores S1 and S2 may be multiplied by predetermined weights α1 and α2, the results added, and the sum compared with a predetermined threshold T12 to determine whether the circle candidate is a human head.
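The three combination rules just described can be written compactly; the mode names, default weights, and joint threshold value below are illustrative stand-ins, not values from the patent:

```python
def is_head(s1, s2, t1, t2, mode="either", a1=0.5, a2=0.5, t12=1.0):
    """Combine scores S1 (gradient direction) and S2 (gradient
    magnitude): accept when either score is below its threshold, when
    both are, or when the weighted sum a1*S1 + a2*S2 is below a joint
    threshold t12."""
    if mode == "either":
        return s1 < t1 or s2 < t2
    if mode == "both":
        return s1 < t1 and s2 < t2
    return a1 * s1 + a2 * s2 < t12  # mode == "weighted"
```

The "either" rule favors recall (a head missed by one score can still pass), while "both" and the weighted sum favor precision; which is preferable depends on the monitoring scene.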
- The processing of the edge extraction means according to the present invention is realized by the object edge extraction step 202; that of the circle candidate detection means by the circular region extraction step 203; that of the person determination means by the person feature amount calculation step 204 and the object candidate determination step 205; and that of the people counting means by the object tracking/trajectory detection step 206 and the trajectory evaluation/object counting step 207.
- As another configuration example, FIG. 7 shows an example flowchart of the people counting process when a fisheye lens is used as the lens of the imaging device 101.
- When a fisheye lens is used, an image with a wide angle of view can be captured, but distortion increases with distance from the image center, so near the image periphery the head of a person, which should essentially appear as a circle, is imaged in a distorted state, and this affects the extraction of circle candidates in the circular region extraction step 203.
- Therefore, in this example, a fisheye correction step 211 is performed that applies to the edge image a distortion in the direction opposite to that of the fisheye lens. As a result, circle (person's head) candidates can be detected accurately even from an image with a wide angle of view, so a wide monitoring area can be set and the number of persons present in that area can be counted accurately.
- In the example of FIG. 7, the fisheye correction step 211 is placed after the object edge extraction step 202, so that the fisheye correction is applied to the edge image generated in the object edge extraction step 202. Alternatively, a fisheye correction step 211 may be placed after the image input step 201, the fisheye correction applied to the input image captured in the image input step 201, and the edge image generated from the corrected image in the object edge extraction step 202.
- The present invention can also be provided as, for example, a method for executing the processing according to the present invention, a program for realizing such a method, or a storage medium storing the program.
- The people counting device (or its method or program) according to the present invention can be applied to various scenes in which the number of persons present in a target area is counted.
- 101 Imaging device
- 102 Video input circuit
- 103 Image processor
- 104 Program memory
- 105 Work memory
- 106 External I / F circuit
- 107 Video output circuit
- 108 Data bus
- 109 Instruction device
- 110 Display device
- 201 Image input step
- 202 Object edge extraction step
- 203 Circular region extraction step
- 204 Person feature amount calculation step
- 205 Object candidate determination step
- 206 Object tracking/trajectory detection step
- 207 Trajectory evaluation/object counting step
- 208 Background learning step
- 211 Fisheye correction step
Claims (6)
- [Claim 1] A people counting device comprising: edge extraction means for extracting edges from a planar image of a target region; circle candidate detection means for detecting circle candidates included in the planar image based on the edges extracted by the edge extraction means; person determination means for calculating, for each circle candidate detected by the circle candidate detection means, a luminance gradient for each of a plurality of edge pixels constituting the edge of that circle candidate, and determining a circle candidate for which the uniformity of the luminance gradients of the edge pixels is higher than a reference to be a person's head; and people counting means for counting the circle candidates determined by the person determination means to be persons' heads.
- [Claim 2] The people counting device according to claim 1, wherein the person determination means calculates, as an index of the uniformity of the luminance gradients of the edge pixels constituting a circle candidate's edge, the degree to which the directions of the luminance gradients of the edge pixels are distributed radially from the center of the circle, and compares the calculation result with a predetermined threshold to judge whether the uniformity of the luminance gradients of the edge pixels constituting the candidate's edge is higher than the reference.
- [Claim 3] The people counting device according to claim 1, wherein the person determination means calculates, as an index of the uniformity of the luminance gradients of the edge pixels constituting a circle candidate's edge, the degree of variation in the magnitudes of the luminance gradients of the edge pixels, and compares the calculation result with a predetermined threshold to judge whether the uniformity of the luminance gradients of the edge pixels constituting the candidate's edge is higher than the reference.
- [Claim 4] A people counting method comprising: an edge extraction process of extracting edges from a planar image of a target region; a circle candidate detection process of detecting circle candidates included in the planar image based on the extracted edges; a person determination process of calculating, for each detected circle candidate, a luminance gradient for each of a plurality of edge pixels constituting the edge of that circle candidate, and determining a circle candidate for which the uniformity of the luminance gradients of the edge pixels is higher than a reference to be a person's head; and a people counting process of counting the circle candidates determined to be persons' heads.
- [Claim 5] The people counting method according to claim 4, wherein the person determination process calculates, as an index of the uniformity of the luminance gradients of the edge pixels constituting a circle candidate's edge, the degree to which the directions of the luminance gradients of the edge pixels are distributed radially from the center of the circle, and compares the calculation result with a predetermined threshold to judge whether the uniformity of the luminance gradients of the edge pixels constituting the candidate's edge is higher than the reference.
- [Claim 6] The people counting method according to claim 4, wherein the person determination process calculates, as an index of the uniformity of the luminance gradients of the edge pixels constituting a circle candidate's edge, the degree of variation in the magnitudes of the luminance gradients of the edge pixels, and compares the calculation result with a predetermined threshold to judge whether the uniformity of the luminance gradients of the edge pixels constituting the candidate's edge is higher than the reference.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/779,044 US9704259B2 (en) | 2013-03-26 | 2014-03-14 | People counting device and people counting method |
GB1516929.5A GB2527697B (en) | 2013-03-26 | 2014-03-14 | People counting device and people counting method |
JP2015508305A JP5976198B2 (ja) | 2013-03-26 | 2014-03-14 | 人数計数装置および人数計数方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013063652 | 2013-03-26 | ||
JP2013-063652 | 2013-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014156733A1 true WO2014156733A1 (ja) | 2014-10-02 |
Family
ID=51623714
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/056954 WO2014156733A1 (ja) | 2013-03-26 | 2014-03-14 | 人数計数装置および人数計数方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9704259B2 (ja) |
JP (1) | JP5976198B2 (ja) |
GB (1) | GB2527697B (ja) |
WO (1) | WO2014156733A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017021718A (ja) * | 2015-07-14 | 2017-01-26 | Konica Minolta, Inc. | Head detection device, head detection method, and monitored-person monitoring device |
JP2017129376A (ja) * | 2016-01-18 | 2017-07-27 | Hitachi, Ltd. | Moving object measurement system and method for identifying the number of persons in a measurement target area |
US10346688B2 (en) | 2016-01-12 | 2019-07-09 | Hitachi Kokusai Electric Inc. | Congestion-state-monitoring system |
CN114092890A (zh) * | 2022-01-20 | 2022-02-25 | Changsha Hisense Intelligent System Research Institute Co., Ltd. | Method, apparatus, device, and medium for determining the number of people an area can accommodate |
KR20220047947A (ko) * | 2015-04-03 | 2022-04-19 | Hanwha Techwin Co., Ltd. | People counting method and apparatus |
KR20230137203A (ko) * | 2022-03-21 | 2023-10-04 | 주식회사 켈스 | Apparatus and method for position and color correction of diagnostic kit images using machine learning and image processing |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160259980A1 (en) * | 2015-03-03 | 2016-09-08 | Umm Al-Qura University | Systems and methodologies for performing intelligent perception based real-time counting |
JP6568374B2 (ja) * | 2015-03-27 | 2019-08-28 | Canon Inc. | Information processing apparatus, information processing method, and program |
CN106845318B (zh) * | 2015-12-03 | 2019-06-21 | Hangzhou Hikvision Digital Technology Co., Ltd. | Passenger flow information collection method and device, and passenger flow information processing method and device |
US10652459B2 (en) | 2016-03-07 | 2020-05-12 | Ricoh Company, Ltd. | Information processing system, information processing method, and non-transitory computer-readable storage medium |
US10650249B2 (en) * | 2016-10-25 | 2020-05-12 | Shenzhen University | Method and device for counting pedestrians based on identification of head top of human body |
US10789502B2 (en) * | 2018-01-24 | 2020-09-29 | Denso Ten Limited | Extraneous-matter detecting apparatus and extraneous-matter detecting method |
US11488379B2 (en) * | 2018-03-13 | 2022-11-01 | Harman International Industries, Incorporated | Apparatus and method for automatic failure threshold detection for images |
JP6703679B1 (ja) * | 2019-02-01 | 2020-06-03 | 株式会社計数技研 | Counting device, learner production device, counting method, learner production method, and program |
CN110096959B (zh) * | 2019-03-28 | 2021-05-28 | Shanghai PPDai Financial Information Service Co., Ltd. | People flow calculation method, device, and computer storage medium |
CN110210603A (zh) * | 2019-06-10 | 2019-09-06 | Changsha University of Science and Technology | Crowd counting model construction method, counting method, and device |
CN110717885A (zh) * | 2019-09-02 | 2020-01-21 | Ping An Technology (Shenzhen) Co., Ltd. | Customer counting method and device, electronic device, and readable storage medium |
JP7151675B2 (ja) * | 2019-09-20 | 2022-10-12 | Denso Ten Limited | Adhering matter detection device and adhering matter detection method |
CN112733677B (zh) * | 2020-12-31 | 2021-11-30 | Guilin Haiwei Technology Co., Ltd. | People flow statistics system and method |
CN113343882A (zh) * | 2021-06-21 | 2021-09-03 | Ping An Puhui Enterprise Management Co., Ltd. | Crowd counting method, device, electronic device, and storage medium |
CN113990214A (zh) * | 2021-10-22 | 2022-01-28 | 深圳倜傥国际设计有限公司 | Modular area-adjustable display system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005135339A (ja) * | 2003-10-31 | 2005-05-26 | Konica Minolta Holdings Inc | Passerby detection method and device, and passerby counting device |
JP2005316978A (ja) * | 2004-03-29 | 2005-11-10 | Fuji Photo Film Co Ltd | Image output device, image output method, and program therefor |
JP2008204479A (ja) * | 2001-12-03 | 2008-09-04 | Microsoft Corp | Method and computer-readable storage medium for automatic detection and tracking of multiple individuals using multiple cues |
JP2012212968A (ja) * | 2011-03-30 | 2012-11-01 | Secom Co Ltd | Image monitoring device |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3550874B2 (ja) * | 1996-04-12 | 2004-08-04 | Omron Corporation | Monitoring device |
JPH11259625A (ja) * | 1998-03-09 | 1999-09-24 | Mitsubishi Electric Corp | Device for detecting the number of people and detecting the number of elevator passengers |
JPH11296653A (ja) * | 1998-04-06 | 1999-10-29 | Sanyo Electric Co Ltd | Image processing device and human body detection device using the same |
JP3879848B2 (ja) | 2003-03-14 | 2007-02-14 | Matsushita Electric Works, Ltd. | Autonomous mobile device |
US7995239B2 (en) | 2004-03-29 | 2011-08-09 | Fujifilm Corporation | Image output apparatus, method and program |
JP2010198566A (ja) * | 2009-02-27 | 2010-09-09 | NEC Corp | People counting device, method, and program |
JP5469391B2 (ja) * | 2009-07-21 | 2014-04-16 | Nippon Signal Co., Ltd. | Platform door confirmation system |
JP5371685B2 (ja) * | 2009-10-20 | 2013-12-18 | Canon Inc. | Information processing apparatus, control method, and program |
JP5520203B2 (ja) * | 2010-12-01 | 2014-06-11 | Hitachi, Ltd. | Congestion degree estimation device |
JP4893863B1 (ja) * | 2011-03-11 | 2012-03-07 | Omron Corporation | Image processing device and image processing method |
JP5726596B2 (ja) * | 2011-03-30 | 2015-06-03 | Secom Co., Ltd. | Image monitoring device |
-
2014
- 2014-03-14 US US14/779,044 patent/US9704259B2/en active Active
- 2014-03-14 WO PCT/JP2014/056954 patent/WO2014156733A1/ja active Application Filing
- 2014-03-14 GB GB1516929.5A patent/GB2527697B/en active Active
- 2014-03-14 JP JP2015508305A patent/JP5976198B2/ja active Active
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20220047947A (ko) * | 2015-04-03 | 2022-04-19 | Hanwha Techwin Co., Ltd. | People counting method and apparatus |
KR102584527B1 (ko) * | 2015-04-03 | 2023-10-05 | Hanwha Vision Co., Ltd. | People counting method and apparatus |
JP2017021718A (ja) * | 2015-07-14 | 2017-01-26 | Konica Minolta, Inc. | Head detection device, head detection method, and monitored-person monitoring device |
US10346688B2 (en) | 2016-01-12 | 2019-07-09 | Hitachi Kokusai Electric Inc. | Congestion-state-monitoring system |
JP2017129376A (ja) * | 2016-01-18 | 2017-07-27 | Hitachi, Ltd. | Moving object measurement system and method for identifying the number of persons in a measurement target area |
CN114092890A (zh) * | 2022-01-20 | 2022-02-25 | Changsha Hisense Intelligent System Research Institute Co., Ltd. | Method, apparatus, device, and medium for determining the number of people an area can accommodate |
KR20230137203A (ko) * | 2022-03-21 | 2023-10-04 | 주식회사 켈스 | Apparatus and method for position and color correction of diagnostic kit images using machine learning and image processing |
KR102629904B1 (ko) * | 2022-03-21 | 2024-01-30 | 주식회사 켈스 | Apparatus and method for position and color correction of diagnostic kit images using machine learning and image processing |
Also Published As
Publication number | Publication date |
---|---|
GB2527697A (en) | 2015-12-30 |
JP5976198B2 (ja) | 2016-08-23 |
US9704259B2 (en) | 2017-07-11 |
JPWO2014156733A1 (ja) | 2017-02-16 |
US20160055645A1 (en) | 2016-02-25 |
GB2527697B (en) | 2019-07-10 |
GB201516929D0 (en) | 2015-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5976198B2 (ja) | People counting device and people counting method | |
JP6525453B2 (ja) | Object position estimation system and program therefor | |
CN108537112A (zh) | Image processing apparatus, image processing system, image processing method, and storage medium | |
JP6549797B2 (ja) | Method and system for identifying passerby heads | |
US10212324B2 (en) | Position detection device, position detection method, and storage medium | |
US9377861B2 (en) | Information processing device and method, program and recording medium for identifying a gesture of a person from captured image data | |
JP6509027B2 (ja) | Subject tracking device, optical apparatus, imaging device, control method for subject tracking device, and program | |
US10096091B2 (en) | Image generating method and apparatus | |
JP6273685B2 (ja) | Tracking processing device, tracking processing system including the same, and tracking processing method | |
US11049259B2 (en) | Image tracking method | |
WO2012066785A1 (ja) | People counting device, people counting method, and people counting program | |
EP2662827B1 (en) | Video analysis | |
JP2013542528A (ja) | Night scene image blur detection system | |
Teng et al. | Detection of service activity in a badminton game | |
JP6558831B2 (ja) | Object tracking device, method, and program | |
JP5217917B2 (ja) | Object detection and tracking device, object detection and tracking method, and object detection and tracking program | |
Hadi et al. | Fusion of thermal and depth images for occlusion handling for human detection from mobile robot | |
JP5419925B2 (ja) | Method, device, and program for measuring the number of passing objects | |
JP6543546B2 (ja) | Specific motion detection device and specific motion detection method | |
CN110781712A (zh) | Human head spatial localization method based on face detection and recognition | |
JP6516609B2 (ja) | Moving object detection device and background model construction method therefor | |
US11587325B2 (en) | System, method and storage medium for detecting people entering and leaving a field | |
Niżałowska et al. | Indoor head detection and tracking on RGBD images | |
Yuthong et al. | Lung volume monitoring using flow-oriented incentive spirometer with video processing | |
JP6705316B2 (ja) | Moving object identification device, moving object identification system, moving object identification method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14774219 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015508305 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14779044 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 1516929 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20140314 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1516929.5 Country of ref document: GB |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14774219 Country of ref document: EP Kind code of ref document: A1 |