WO2005024726A1 - Method for detecting object traveling direction (対象物進行方向検出方法) - Google Patents
Method for detecting object traveling direction (対象物進行方向検出方法)
- Publication number
- WO2005024726A1 (PCT/JP2004/012689; JP2004012689W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- time
- pixel
- pixel value
- template
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Definitions
- the present invention relates to an object traveling direction detection method that detects an advancing direction of an object by processing an image.
- Conventionally, there has been proposed a passer-by number detection device that measures the number of passers-by entering and exiting through the entrances of facilities such as department stores and exhibition halls (see, for example, Patent Document 1).
- In the passer-by number detection device disclosed in Patent Document 1, the difference between a first captured image and a second captured image is binarized, the binarized difference is compressed to reduce the number of pixels, and the movement direction and the number of people are calculated by counting and discriminating the number and positions of the resulting pixels.
- The present invention has been made in view of the above-described conventional circumstances, and an object of the present invention is to provide an object traveling direction detection method capable of efficiently detecting the traveling direction of an object with a small amount of information.
- The object traveling direction detection method of the present invention includes an image acquisition step of acquiring images that include an object and are taken at a predetermined time interval, and an extraction image creation step of creating extraction images in which an arbitrary component is extracted from those images.
- The method also includes a pixel value storing step of storing, in a template, the pixel values of the extraction image at time t, and a matching pixel detection step of scanning the template storing those pixel values over the extraction image at time t + 1 to detect the positions of pixels that match the pixel values of the template.
- The method further includes a traveling direction extraction image creation step of creating a traveling direction extraction image by plotting, at the same coordinate positions as the pixel positions detected in the matching pixel detection step, pixel values set according to the moving direction of the object.
- In one aspect, a plurality of traveling direction extraction images are created by plotting, at the same coordinate positions as the respective detected pixel positions, pixel values respectively set according to the moving direction of the object, and the plurality of traveling direction extraction images are used in combination.
- The method may also include a speed calculation step of obtaining the speed of the object based on the distance between the position of a first center of gravity, which is the center of gravity of the object in a first traveling direction extraction image created from the image photographed at time t and the image photographed at time t + 1, and the position of a second center of gravity, which is the center of gravity of the object in a second traveling direction extraction image created from the image photographed at time t + 1 and the image photographed at time t + 2.
- The extraction image creation step selects and extracts at least one of a lightness component, a hue component, and a saturation component from the color image.
- the traveling direction of the object can be detected efficiently with a small amount of information using the three elements of color.
- The object traveling direction detection method of the present invention may further include a spatio-temporal image creation step of creating a spatio-temporal image in which images of a predetermined region, extracted from each of the traveling direction extraction images created in the traveling direction extraction image creation step, are arranged in time series.
- FIG. 1 is a diagram for explaining the operation of the object traveling direction detection device according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram for explaining the operation of the target object traveling direction detection device according to the second embodiment of the present invention.
- FIG. 3 is a diagram for explaining the operation of the target object traveling direction detection device according to the third embodiment of the present invention.
- FIG. 4 is a diagram for explaining the operation of the target object traveling direction detection device according to the third embodiment of the present invention.
- FIG. 5 is a diagram for explaining the operation of the object traveling direction detection device according to Embodiment 4 of the present invention.
- 101, 102, 107, 108, 120, and 121 are color images
- 103a, 103b, 104a, 104b, 109a, 109b, 110a, 110b, 122a, 122b, 123a, and 123b are brightness images
- 111a, 111b, 112a, 112b, 124a, 124b, 125a, and 125b are saturation images
- 105, 113, 114, 128, 129, and 130 are templates
- 106 is a traveling direction extraction image
- 115 and 131 are lightness information traveling direction extraction images
- 116 and 132 are saturation information traveling direction extraction images
- 117 is a lightness-saturation traveling direction extraction image
- the present invention detects the traveling direction of an object based on a color image including the object.
- In the present embodiment, the case of detecting the traveling direction of a person based on color images obtained by photographing, from above, people passing along a road will be described with reference to the drawings.
- From a color image 101 at time t and a color image 102 at time t + 1, taken at a predetermined time interval by an imaging device such as a CCD camera, brightness images are created by extracting only the brightness information, which is one of the three elements of color.
- a brightness image 103 is created from the color image 101 at time t
- a brightness image 104 is created from the color image 102 at time t + 1.
- images (object extraction images) from which only the object is extracted are created.
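To make the component extraction step concrete, the following sketch (not part of the patent; a minimal Python/OpenCV example under the assumption that frames arrive as BGR arrays) splits a color frame into hue, saturation, and lightness images via the HSV color space. Function and variable names are illustrative, and the object extraction step itself (for example, background subtraction) is not shown.

```python
import cv2          # OpenCV; assumed available
import numpy as np

def split_color_components(color_bgr: np.ndarray):
    """Split a BGR color frame into hue, saturation and lightness (value) images.

    This mirrors the idea of extracting one of the 'three elements of color';
    HSV is used here as one convenient representation.
    """
    hsv = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2HSV)
    hue, saturation, lightness = cv2.split(hsv)
    return hue, saturation, lightness

# Example usage with two frames taken at a predetermined time interval
# (file names are placeholders):
# frame_t  = cv2.imread("frame_t.png")
# frame_t1 = cv2.imread("frame_t_plus_1.png")
# _, _, lightness_t  = split_color_components(frame_t)
# _, _, lightness_t1 = split_color_components(frame_t1)
```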
- The brightness image (object extraction image) 103 at time t is scanned using an n × n matrix template 105 (n is an odd number of 3 or more) to detect the position where the object exists.
- the values of surrounding pixels (pixel values) around the pixel at that position are stored in the template 105.
- Next, the brightness image 104 at time t + 1 is scanned using the template 105 in which the pixel values are stored, and pixels that match the pixel values of the template 105 are detected (matching pixels are detected). A point is then plotted at the same coordinate position as each matched pixel position. Since there are three objects in this figure, a template is created for each object, and each template is scanned and its points plotted.
- The pixel value of each coordinate corresponding to the area where an object exists is determined based on the matching pixels, and the traveling direction extraction image 106 is created. Pixels at positions where the template 105 could not be matched are given the pixel value "125".
- The pixels of the object moving from the right to the left of the screen (the top object in the figure) are given the pixel value "0", and the pixels of the objects moving from the left to the right of the screen (the second and third objects from the top in the figure) are given the pixel value "255".
- As described above, a brightness image is generated from each of the two color images photographed at a predetermined time interval, the n × n template 105 is scanned over the brightness image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 105.
- The template 105 storing the pixel values is then scanned over the brightness image (object extraction image) at time t + 1 to detect pixels that match the pixel values of the template 105.
- A pixel value is set for each moving direction of the object in the area where the object exists, and the traveling direction extraction image 106 is created.
- the traveling direction of the object can be detected efficiently with a small amount of information.
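As an illustration of the procedure just summarized, the following sketch (an interpretation written for this description, not the patent's implementation) stores an n × n block of pixel values from the image at time t, searches the image at time t + 1 for a matching block, and fills a traveling direction extraction image with 255, 0, or 125 according to whether the matched position moved from left to right, from right to left, or no match was found. It handles a single object with exact-equality matching for brevity.

```python
import numpy as np

def store_template(image_t: np.ndarray, center_y: int, center_x: int, n: int = 3) -> np.ndarray:
    """Store the n x n pixel values centred on the detected object position (n odd, >= 3)."""
    half = n // 2
    return image_t[center_y - half:center_y + half + 1,
                   center_x - half:center_x + half + 1].copy()

def find_matching_position(image_t1: np.ndarray, template: np.ndarray):
    """Scan image_t1 for the position whose n x n neighbourhood matches the template.

    Exact equality is used here for simplicity; a tolerance (e.g. sum of absolute
    differences below a threshold) would be more robust in practice.
    """
    n = template.shape[0]
    h, w = image_t1.shape
    for y in range(h - n + 1):
        for x in range(w - n + 1):
            if np.array_equal(image_t1[y:y + n, x:x + n], template):
                return y + n // 2, x + n // 2   # centre of the matched block
    return None

def direction_extraction_image(shape, old_x: int, match, n: int = 3) -> np.ndarray:
    """Build a traveling direction extraction image: 255 for left-to-right motion,
    0 for right-to-left motion, 125 where no match was found."""
    out = np.full(shape, 125, dtype=np.uint8)
    if match is None:
        return out
    new_y, new_x = match
    value = 255 if new_x > old_x else 0
    half = n // 2
    out[new_y - half:new_y + half + 1, new_x - half:new_x + half + 1] = value
    return out
```

In the patent's figures, a template is created for every detected object and the pixel value is set over the entire area where the object exists; the sketch above only marks the matched neighbourhood and uses strict equality, which a practical system would relax to a tolerance.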
- only lightness information is extracted from a color image for processing, but saturation or hue information may be used.
- In addition, the moving speed may also be detected.
- In that case, the center of gravity of the object is detected in each of the traveling direction extraction image created from the images at time t and time t + 1 and the traveling direction extraction image created from the images at time t + 1 and time t + 2.
- The moving speed of the object can be obtained by detecting the moving distance of the center of gravity and dividing that distance by the elapsed time.
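The speed calculation described above can be sketched as follows (illustrative only; pixel values 0 and 255 are treated as object pixels and 125 as "no match", following the convention of this description, and the pixel-to-distance scale is an assumed parameter).

```python
import numpy as np

def center_of_gravity(direction_image: np.ndarray):
    """Centre of gravity of object pixels (values 0 or 255; 125 means 'no match')."""
    ys, xs = np.nonzero(direction_image != 125)
    if len(xs) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def moving_speed(direction_img_a, direction_img_b, frame_interval_s: float,
                 metres_per_pixel: float = 1.0):
    """Speed = distance the centre of gravity moved / elapsed time."""
    cg_a = center_of_gravity(direction_img_a)
    cg_b = center_of_gravity(direction_img_b)
    if cg_a is None or cg_b is None:
        return None
    dy = cg_b[0] - cg_a[0]
    dx = cg_b[1] - cg_a[1]
    distance = np.hypot(dy, dx) * metres_per_pixel
    return distance / frame_interval_s
```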
- FIG. 2 is a diagram for explaining the operation of the target object traveling direction detection device according to the second embodiment of the present invention.
- The object traveling direction detection device according to the present embodiment has the same configuration as the object traveling direction detection device according to the first embodiment; only its operation differs.
- The traveling direction extraction method in the object traveling direction detection device according to the present embodiment improves accuracy by creating a saturation image, using saturation, another of the three elements of color, in addition to the lightness image.
- From a color image 107 at time t and a color image 108 at time t + 1, taken at a predetermined time interval, lightness images 109 (time t) and 110 (time t + 1) from which only the lightness information is extracted and saturation images 111 (time t) and 112 (time t + 1) from which only the saturation information is extracted are created.
- An n × n template 113 is created for the lightness image 109 at time t, and an n × n template 114 is created for the saturation image 111 at time t. Then, using the templates 113 and 114, the lightness image 109 and the saturation image 111 at time t are scanned to detect the position where the object exists.
- The values of the surrounding pixels (pixel values), centered on the pixel at the position where the object is detected, are stored in the templates 113 and 114, respectively.
- The template 113 is used to scan the lightness image 110 at time t + 1, and the template 114 is used to scan the saturation image 112 at time t + 1, to detect pixels that match the pixel values of the templates 113 and 114 (matching pixels are detected).
- the pixel values are distinguished for each traveling direction.
- If the position of the pixel matched by the template 113, created based on the image at time t, moves from the left to the right of the screen in the image at time t + 1, the pixels corresponding to the area where the object exists are set to the pixel value "255"; conversely, if it moves from the right to the left of the screen, they are set to the pixel value "0". Pixels at positions where matching by the template 113 could not be performed are set to the pixel value "125".
- Similarly, if the position of the pixel matched by the template 114, created based on the image at time t and the image at time t + 1, moves from the left to the right of the screen, the pixels corresponding to the area where the object exists are set to the pixel value "255"; conversely, if it moves from the right to the left, they are set to the pixel value "0". Pixels at positions where the template 114 could not be matched are set to the pixel value "125".
- Since the pixels of the object moving from the right to the left of the screen (the top object in the figure) are "0", that object appears white in the saturation information traveling direction extraction image 116, while the objects moving from the left to the right of the screen (the second and third objects from the top in the figure) appear black.
- Then, a lightness-saturation traveling direction extraction image 117 is created using the brightness information and the saturation information.
- As described above, a brightness image and a saturation image are generated from each of the two color images photographed at a predetermined time interval. In the brightness images, the n × n template 113 is scanned over the brightness image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 113. The template 113 storing the pixel values is then scanned over the brightness image at time t + 1 to detect pixels that match its pixel values, and the brightness information traveling direction extraction image 115 is created by setting a pixel value for each moving direction of the object in the area where the object exists.
- Likewise, in the saturation images, the n × n template 114 is scanned over the saturation image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 114. The template 114 storing the pixel values is then scanned over the saturation image (object extraction image) at time t + 1 to detect pixels that match its pixel values, and the saturation information traveling direction extraction image 116 is created by setting a pixel value for each moving direction of the object in the area where the object exists.
- Then, based on the brightness information traveling direction extraction image 115 and the saturation information traveling direction extraction image 116, the lightness-saturation traveling direction extraction image 117 is created.
- the traveling direction of the object can be detected efficiently with a small amount of information, and the accuracy can be improved.
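The description does not spell out the rule by which the brightness-based and saturation-based images are combined into the lightness-saturation traveling direction extraction image 117. The sketch below therefore uses one plausible rule, assumed for illustration only: keep the direction value where the two component images agree, fall back to whichever component produced a match, and mark everything else as unmatched.

```python
import numpy as np

UNMATCHED = 125  # pixel value used above for "template could not be matched"

def combine_direction_images(lightness_dir: np.ndarray,
                             saturation_dir: np.ndarray) -> np.ndarray:
    """Combine two per-component traveling direction extraction images.

    Assumed rule (not stated in the patent): keep the direction value where the
    two components agree, otherwise fall back to whichever component matched,
    and mark the pixel as UNMATCHED if neither did.
    """
    combined = np.full_like(lightness_dir, UNMATCHED)
    agree = lightness_dir == saturation_dir
    combined[agree] = lightness_dir[agree]
    only_l = (saturation_dir == UNMATCHED) & (lightness_dir != UNMATCHED)
    only_s = (lightness_dir == UNMATCHED) & (saturation_dir != UNMATCHED)
    combined[only_l] = lightness_dir[only_l]
    combined[only_s] = saturation_dir[only_s]
    return combined
```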
- In the present embodiment, the lightness information and the saturation information are used for processing, but the lightness information and the hue information, or the saturation information and the hue information, may be used instead.
- In addition, the moving speed may also be detected.
- In that case, the center of gravity of the object is detected in each of the lightness-saturation traveling direction extraction image that extracts the traveling direction of the object between time t and time t + 1 and the lightness-saturation traveling direction extraction image that extracts the traveling direction of the object between time t + 1 and time t + 2.
- The moving speed of the object can be obtained by detecting the moving distance of the center of gravity and dividing that distance by the elapsed time.
- FIGS. 3 and 4 are diagrams for explaining the operation of the object traveling direction detection device according to the third embodiment of the present invention.
- The object traveling direction detection device according to the present embodiment has the same configuration as the object traveling direction detection device according to the first embodiment; only its operation differs.
- The traveling direction extraction method in the object traveling direction detection device according to the present embodiment improves accuracy by creating a saturation image and a hue image, using saturation and hue, the remaining two of the three elements of color, in addition to the lightness image.
- From a color image 120 at time t and a color image 121 at time t + 1, taken at a predetermined time interval, brightness images 122 (time t) and 123 (time t + 1) from which only the brightness information is extracted, saturation images 124 (time t) and 125 (time t + 1) from which only the saturation information is extracted, and hue images 126 (time t) and 127 (time t + 1) from which only the hue information is extracted are created.
- An n × n template 128 is created for the brightness image 122 at time t, an n × n template 129 is created for the saturation image 124 at time t, and an n × n template 130 is created for the hue image 126 at time t.
- Then, using the templates 128, 129, and 130, the lightness image 122, the saturation image 124, and the hue image 126 at time t are scanned to detect the position where the object exists.
- The values of the surrounding pixels (pixel values), centered on the pixel at the position where the object is detected, are stored in the templates 128, 129, and 130, respectively.
- The template 128 is used to scan the brightness image 123 at time t + 1, the template 129 is used to scan the saturation image 125 at time t + 1, and the template 130 is used to scan the hue image 127 at time t + 1, to detect pixels that match the pixel values of the templates 128, 129, and 130 (matching pixels are detected).
- A point is plotted at the same coordinate position as each pixel position that matches the pixel values of the template 128 to create the brightness information traveling direction extraction image 131. Likewise, points are plotted at the same coordinate positions as the pixel positions that match the pixel values of the template 129 to create the saturation information traveling direction extraction image 132, and at the same coordinate positions as the pixel positions that match the pixel values of the template 130 to create the hue information traveling direction extraction image 133.
- the pixel values are distinguished for each traveling direction.
- If the position of the pixel matched by the template 128, created based on the image at time t, moves from the left to the right of the screen in the image at time t + 1, the pixels corresponding to the area where the object exists are set to the pixel value "255"; if it moves from the right to the left of the screen, they are set to the pixel value "0". Pixels at positions where the template 128 could not be matched are set to the pixel value "125".
- Since the pixels of the object moving from the right to the left of the screen are "0", that object appears white in the brightness information traveling direction extraction image 131, while the objects moving from the left to the right of the screen (the second and third objects from the top in the figure) appear black.
- Similarly, if the position of the pixel matched by the template 129 moves from the left to the right of the screen, the pixels in the area where the object exists are set to the pixel value "255"; conversely, if it moves from the right to the left of the screen, they are set to the pixel value "0". Pixels at positions where the template 129 could not be matched are set to the pixel value "125".
- Since the pixels of the object moving from the right to the left of the screen are "0", that object appears white in the saturation information traveling direction extraction image 132, while the objects moving from the left to the right of the screen (the second and third objects from the top in the figure) appear black.
- Likewise, if the position of the pixel matched by the template 130, created based on the image at time t and the image at time t + 1, moves from the left to the right of the screen, the pixels corresponding to the area where the object exists are set to the pixel value "255"; conversely, if it moves from the right to the left, they are set to the pixel value "0". Pixels at positions where the template 130 could not be matched are set to the pixel value "125".
- Since the pixels of the object moving from the right to the left of the screen are "0", that object appears white in the hue information traveling direction extraction image 133, while the objects moving from the left to the right of the screen (the second and third objects from the top in the figure) appear black.
- Then, a lightness-saturation-hue traveling direction extraction image 134 is created using the lightness information, the saturation information, and the hue information.
- As described above, a brightness image, a saturation image, and a hue image are generated from each of the two color images photographed at a predetermined time interval.
- In the brightness images, the n × n template 128 is scanned over the brightness image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 128. The template 128 storing the pixel values is then scanned over the brightness image at time t + 1 to detect pixels that match its pixel values, and the brightness information traveling direction extraction image 131 is created by setting a pixel value for each moving direction of the object in the area where the object exists.
- Likewise, in the saturation images, the n × n template 129 is scanned over the saturation image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 129. The template 129 storing the pixel values is then scanned over the saturation image (object extraction image) at time t + 1 to detect pixels that match its pixel values, and the saturation information traveling direction extraction image 132 is created by setting a pixel value for each moving direction of the object in the area where the object exists.
- Further, in the hue images, the n × n template 130 is scanned over the hue image (object extraction image) at the earlier time t to detect the position where the object exists, and the pixel values centered on the pixel at the detected position are stored in the template 130. The template 130 storing the pixel values is then scanned over the hue image (object extraction image) at time t + 1 to detect pixels that match its pixel values, and the hue information traveling direction extraction image 133 is created by setting a pixel value for each moving direction of the object in the area where the object exists.
- Then, based on the brightness information traveling direction extraction image 131, the saturation information traveling direction extraction image 132, and the hue information traveling direction extraction image 133, the lightness-saturation-hue traveling direction extraction image 134 is created.
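For the three-component case, the sketch below (illustrative only; it reuses the hypothetical helper functions split_color_components, store_template, find_matching_position, direction_extraction_image, and combine_direction_images defined in the earlier sketches) runs the same per-component procedure on the lightness, saturation, and hue channels and fuses the three results pairwise with the assumed agreement rule. The actual fusion rule of the patent is not specified.

```python
# Requires the helper functions defined in the earlier sketches in this document.

def per_channel_direction_image(channel_t, channel_t1, object_y, object_x, n=3):
    """Apply the single-component procedure (store template, match, encode direction)."""
    template = store_template(channel_t, object_y, object_x, n)
    match = find_matching_position(channel_t1, template)
    return direction_extraction_image(channel_t.shape, object_x, match, n)

def lightness_saturation_hue_direction_image(frame_t, frame_t1, object_y, object_x, n=3):
    """Run the procedure on hue, saturation and lightness and fuse the results."""
    h_t, s_t, v_t = split_color_components(frame_t)
    h_1, s_1, v_1 = split_color_components(frame_t1)
    dir_v = per_channel_direction_image(v_t, v_1, object_y, object_x, n)
    dir_s = per_channel_direction_image(s_t, s_1, object_y, object_x, n)
    dir_h = per_channel_direction_image(h_t, h_1, object_y, object_x, n)
    # Fuse pairwise with the assumed agreement rule from the previous sketch.
    return combine_direction_images(combine_direction_images(dir_v, dir_s), dir_h)
```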
- In addition, the moving speed may also be detected.
- In that case, the center of gravity of the object is detected in each of the lightness-saturation-hue traveling direction extraction image that extracts the traveling direction of the object between time t and time t + 1 and the lightness-saturation-hue traveling direction extraction image that extracts the traveling direction of the object between time t + 1 and time t + 2.
- The moving speed of the object can be obtained by detecting the moving distance of the center of gravity and dividing that distance by the elapsed time.
- FIG. 5 is a diagram for explaining the operation of the spatiotemporal image creation device according to Embodiment 4 of the present invention. It should be noted that the spatiotemporal image creation device according to the present embodiment creates a spatiotemporal image from the traveling direction extraction image created by the method described in the first to third embodiments.
- Images 146, 147, 148, 149, and 150 are created by extracting the pixels of an arbitrarily defined region from the traveling direction extraction images 141, 142, 143, 144, and 145, respectively. Note that this region has the same position and size in each traveling direction extraction image. Then, the spatio-temporal image 151 is created by arranging the images 146, 147, 148, 149, and 150 in time series.
- For example, the pixels of a strip-shaped region that crosses the center of each traveling direction extraction image from left to right are extracted, and the spatio-temporal image 151 is created by arranging the resulting images, from the image based on the traveling direction extraction image at time t to the image based on the traveling direction extraction image at time t + 5, in chronological order from the back to the front.
- the spatio-temporal image shows an object that has passed through the area from time t to time t + 5.
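The spatio-temporal image construction can be sketched as follows (a minimal NumPy interpretation with an assumed strip position and height): the same strip-shaped region is cut out of each traveling direction extraction image and the strips are stacked in time order, so that each band of the result corresponds to one instant from time t onward.

```python
import numpy as np

def extract_strip(direction_image: np.ndarray, top: int, height: int) -> np.ndarray:
    """Extract a strip-shaped region (same position and size for every frame)."""
    return direction_image[top:top + height, :]

def spatiotemporal_image(direction_images, top: int, height: int) -> np.ndarray:
    """Stack the strips of successive traveling direction extraction images in
    time series; the first frame ends up at the top (the 'back') of the result."""
    strips = [extract_strip(img, top, height) for img in direction_images]
    return np.vstack(strips)

# Example (placeholder data): six direction images for times t ... t+5
# frames = [np.full((120, 160), 125, dtype=np.uint8) for _ in range(6)]
# st_image = spatiotemporal_image(frames, top=55, height=10)   # 60 x 160 image
```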
- the traveling direction of the object can be detected efficiently with a small amount of information, and the accuracy can be improved.
- In addition, the moving speed may also be detected.
- In that case, the center of gravity of the object is detected in each of the lightness-saturation-hue traveling direction extraction image that extracts the traveling direction of the object between time t and time t + 1 and the lightness-saturation-hue traveling direction extraction image that extracts the traveling direction of the object between time t + 1 and time t + 2.
- The moving speed of the object can be obtained by detecting the moving distance of the center of gravity and dividing that distance by the elapsed time.
- In each of the above embodiments, the target object is a person, but a vehicle or the like may also be used as the target object.
- The object traveling direction detection method of the present invention has the effect of efficiently detecting the traveling direction of an object with a small amount of information, and is useful for measuring the number of people entering and leaving the entrances and exits of department stores, exhibition halls, concert halls, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/550,954 US7346193B2 (en) | 2003-09-02 | 2004-08-26 | Method for detecting object traveling direction |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003309902A JP3787783B2 (ja) | 2003-09-02 | 2003-09-02 | 対象物進行方向検出方法 |
JP2003-309902 | 2003-09-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005024726A1 true WO2005024726A1 (ja) | 2005-03-17 |
Family
ID=34269624
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/012689 WO2005024726A1 (ja) | 2003-09-02 | 2004-08-26 | 対象物進行方向検出方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US7346193B2 (ja) |
JP (1) | JP3787783B2 (ja) |
CN (1) | CN100527166C (ja) |
WO (1) | WO2005024726A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2007259040B2 (en) * | 2006-06-16 | 2012-04-05 | Bae Systems Plc | Improvements relating to target tracking |
EP1909229B1 (en) * | 2006-10-03 | 2014-02-19 | Nikon Corporation | Tracking device and image-capturing apparatus |
US20090086027A1 (en) * | 2007-10-01 | 2009-04-02 | Benjamin Antonio Chaykin | Method And System For Providing Images And Graphics |
JP5353196B2 (ja) * | 2008-11-17 | 2013-11-27 | 株式会社リコー | Movement measuring device and movement measuring program |
JP4911395B1 (ja) * | 2010-09-28 | 2012-04-04 | Necエンジニアリング株式会社 | Image reading device |
CN112329722B (zh) * | 2020-11-26 | 2021-09-28 | 上海西井信息科技有限公司 | Driving direction detection method, system, device and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6472286A (en) * | 1987-09-14 | 1989-03-17 | Japan Broadcasting Corp | Image analyzing device |
JPH1196376A (ja) * | 1997-09-24 | 1999-04-09 | Oki Electric Ind Co Ltd | Moving object tracking device and method |
JP2001118182A (ja) * | 1999-10-18 | 2001-04-27 | Mitsubishi Electric Corp | Moving object detection device and moving object detection method |
JP2002092613A (ja) * | 2000-09-11 | 2002-03-29 | Sony Corp | Image processing apparatus, image processing method, and recording medium |
JP2002170096A (ja) * | 2000-11-30 | 2002-06-14 | Sumitomo Osaka Cement Co Ltd | Passing object counting device and counting method |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5019975A (en) * | 1986-08-08 | 1991-05-28 | Fuji Photo Film Co., Ltd. | Method for constructing a data base in a medical image control system |
JP3189870B2 (ja) * | 1996-12-24 | 2001-07-16 | シャープ株式会社 | Image processing device |
JPH1191169A (ja) * | 1997-09-19 | 1999-04-06 | Fuji Photo Film Co Ltd | Image processing device |
DE69937476T2 (de) * | 1998-07-31 | 2008-08-28 | Canon K.K. | Image processing apparatus and method, and storage medium |
US6217520B1 (en) * | 1998-12-02 | 2001-04-17 | Acuson Corporation | Diagnostic medical ultrasound system and method for object of interest extraction |
JP2000357234A (ja) * | 1999-06-16 | 2000-12-26 | Canon Inc | Image processing apparatus and method |
US6470151B1 (en) * | 1999-06-22 | 2002-10-22 | Canon Kabushiki Kaisha | Camera, image correcting apparatus, image correcting system, image correcting method, and computer program product providing the image correcting method |
US7756567B2 (en) * | 2003-08-29 | 2010-07-13 | Accuray Incorporated | Image guided radiosurgery method and apparatus using registration of 2D radiographic images with digitally reconstructed radiographs of 3D scan data |
- 2003-09-02 JP JP2003309902A patent/JP3787783B2/ja not_active Expired - Fee Related
- 2004-08-26 CN CNB2004800085610A patent/CN100527166C/zh not_active Expired - Fee Related
- 2004-08-26 WO PCT/JP2004/012689 patent/WO2005024726A1/ja active Application Filing
- 2004-08-26 US US10/550,954 patent/US7346193B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN100527166C (zh) | 2009-08-12 |
JP3787783B2 (ja) | 2006-06-21 |
US7346193B2 (en) | 2008-03-18 |
CN1768353A (zh) | 2006-05-03 |
US20060193495A1 (en) | 2006-08-31 |
JP2005078482A (ja) | 2005-03-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8254643B2 (en) | Image processing method and device for object recognition | |
Marin et al. | Learning appearance in virtual scenarios for pedestrian detection | |
KR101355974B1 (ko) | 복수의 객체를 추적하는 객체 추적 방법 및 장치 | |
JP5774425B2 (ja) | 画像解析装置および画像評価装置 | |
CN111753782B (zh) | 一种基于双流网络的假脸检测方法、装置及电子设备 | |
CN101605209A (zh) | 摄像装置及图像再生装置 | |
US7260243B2 (en) | Intruding-object detection apparatus | |
JP2017201745A (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP4975801B2 (ja) | 階層的外観モデルを用いた監視方法及び監視装置 | |
CN103186905A (zh) | 车用颜色检测器 | |
JP7092615B2 (ja) | 影検出装置、影検出方法、影検出プログラム、学習装置、学習方法、及び学習プログラム | |
JPH09153137A (ja) | 画像認識方法 | |
WO2005024726A1 (ja) | 対象物進行方向検出方法 | |
JP2010136207A (ja) | 歩行者検出表示システム | |
CN113435514A (zh) | 一种基于元深度学习的建筑垃圾精细分类方法、装置 | |
JP2002342758A (ja) | 視覚認識システム | |
JPH11306348A (ja) | 対象物検出装置及び対象物検出方法 | |
JP2000030033A (ja) | 人物検出方法 | |
WO2022022809A1 (en) | Masking device | |
JP2005140754A (ja) | 人物検知方法、監視システム、およびコンピュータプログラム | |
JP3421456B2 (ja) | 画像処理装置 | |
JPH06348991A (ja) | 移動車の走行環境認識装置 | |
JP6527729B2 (ja) | 画像処理装置及び画像処理プログラム | |
CN110264491A (zh) | 客流统计方法、装置、计算机设备和可读存储介质 | |
JP2003179930A (ja) | 動オブジェクト抽出方法及び抽出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006193495 Country of ref document: US Ref document number: 10550954 Country of ref document: US Ref document number: 20048085610 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 10550954 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |