WO2005093657A1 - Road Landscape Analysis Device and Method - Google Patents
Road Landscape Analysis Device and Method
- Publication number
- WO2005093657A1 (PCT/JP2005/005050; JP2005005050W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road
- landscape
- image
- analysis
- vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/40—Analysis of texture
- G06T7/41—Analysis of texture based on statistical description of texture
- G06T7/48—Analysis of texture based on statistical description of texture using fractals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present invention relates to a road landscape analysis device and method for analyzing a road landscape in front of a vehicle.
- a road landscape image captured by a camera is taken into a computer and the entire image is analyzed.
- fractal analysis is performed on the entire image to determine the complexity of the landscape, the green visibility of vegetation and the like is calculated, and the proportion of the image occupied by roads is calculated as the road ratio.
- An object of the present invention is to provide a road landscape analysis device and method capable of obtaining an accurate analysis result of a road landscape ahead of a vehicle.
- the road landscape analysis device of the present invention includes a camera mounted on a vehicle that captures an image of the area ahead of the vehicle, and analyzes the road landscape shown in the captured image.
- it comprises image dividing means for dividing the image of the area ahead of the vehicle captured by the camera into a plurality of areas by diagonal lines, and analyzing means for individually analyzing the image content of each of the plurality of areas.
- the road landscape analysis method of the present invention analyzes the road landscape shown in an image obtained by photographing the area ahead of the vehicle, and comprises a dividing step of dividing the image into a plurality of regions by diagonal lines and an analysis step of individually analyzing the image content of each of the plurality of regions.
- FIG. 1 is a block diagram showing an embodiment of the present invention.
- FIG. 2 is a flowchart showing the road landscape analysis processing.
- FIG. 3 is a diagram showing the four divisions of the road landscape image.
- FIG. 4 is a flowchart showing the road analysis processing.
- FIG. 5 is a flowchart showing the scenery analysis processing.
- FIG. 6 is a flowchart showing the background analysis processing.
- FIG. 7 is a diagram showing each index and road comfort level as a result of the road analysis processing.
- FIG. 8 is a diagram showing each index and the scenery comfort level as a result of the scenery analysis process.
- FIG. 9 is a diagram showing each index and background comfort level as a result of the background analysis processing.
- FIG. 1 shows a road landscape analysis device according to the present invention.
- the road scene analysis device is mounted on a vehicle and includes a camera 1, an image processing device 2, an input device 3, and an output device 4.
- the camera 1 is, for example, a CCD camera, and is attached to the vehicle so as to photograph the front of the vehicle.
- the output of the camera 1 is connected to the image processing device 2.
- the image processing device 2 includes, for example, a microcomputer, and inputs image data supplied from the camera 1 and analyzes a road landscape indicated by the image data. Details of the analysis processing will be described later.
- the input device 3 and the output device 4 are connected to the image processing device 2.
- the input device 3 includes, for example, a keyboard, and supplies a command corresponding to an input operation to the image processing device 2.
- the output device 4 includes, for example, a display, and displays the analysis process of the road scene by the image processing device 2.
- as shown in FIG. 2, a processor (not shown) in the image processing device 2 first captures image data from the camera 1 (step S1), and determines whether an obstacle exists in the captured still image (step S2).
- an obstacle is an object other than the road scene itself, such as a preceding vehicle or a parked vehicle.
- the image data acquired this time is compared with the several frames of image data immediately preceding it to determine whether an obstacle exists. If an obstacle is present, the flow returns to step S1 to fetch new image data. Since an image entirely free of obstacles is rare, step S2 may instead determine whether the total area of the obstacle portions shown in the image exceeds a threshold.
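The frame-comparison check of step S2 can be sketched as follows. The median-background model, the function names, and the specific thresholds are illustrative assumptions; the patent only states that the current image is compared with the immediately preceding images:

```python
import numpy as np

def obstacle_area_ratio(frame, prev_frames, diff_threshold=30):
    """Fraction of pixels that differ markedly from the recent frames.
    A per-pixel median over the previous frames serves as a crude
    background model; large deviations count as obstacle pixels."""
    background = np.median(np.stack(prev_frames).astype(float), axis=0)
    changed = np.abs(frame.astype(float) - background) > diff_threshold
    return float(changed.mean())

def needs_new_frame(frame, prev_frames, area_threshold=0.1):
    # Step S2 variant: refetch when the obstacle portion exceeds a threshold.
    return obstacle_area_ratio(frame, prev_frames) > area_threshold
```

With this variant, a frame is only passed on to the division and analysis steps when the changed-pixel fraction stays below the area threshold.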
- the image is divided into four areas diagonally (step S3).
- the image is a quadrilateral and is divided into four areas, upper and lower, and left and right by diagonal lines A and B.
- the upper area is the background area
- the lower area is the road area
- the left and right areas are the landscape area.
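The four-way division of step S3 and FIG. 3 can be sketched as a per-pixel label map. The geometric reading of the two diagonals is straightforward; the function and label names are ours, not the patent's:

```python
import numpy as np

def split_by_diagonals(h, w):
    """Label each pixel of an h x w image by the triangle formed by the
    two diagonals: 'background' (top), 'road' (bottom), and the 'left'
    and 'right' landscape triangles."""
    y, x = np.mgrid[0:h, 0:w]
    # Diagonal A: top-left to bottom-right; a pixel is below it when y/h > x/w.
    below_a = y * w > x * h
    # Diagonal B: top-right to bottom-left; below it when y/h > (w-1-x)/w (approx.).
    below_b = y * w > (w - 1 - x) * h
    labels = np.empty((h, w), dtype=object)
    labels[~below_a & ~below_b] = "background"  # upper triangle
    labels[below_a & below_b] = "road"          # lower triangle
    labels[below_a & ~below_b] = "left"         # left landscape triangle
    labels[~below_a & below_b] = "right"        # right landscape triangle
    return labels
```

Each of the subsequent analyses (steps S4 to S6) then operates only on the pixels carrying the corresponding label.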
- road analysis processing is performed on the image in the lower area (step S4), landscape analysis processing is performed on the images in the left and right areas (step S5), and background analysis processing is performed on the image in the upper area (step S6).
- white line recognition and approximate straight line calculation are performed (step S41). That is, a white line on the road is recognized, and an approximate straight line of the white line is calculated.
- as a white line recognition method, there is, for example, the method disclosed in Japanese Patent Application Laid-Open No. Hei 6-333192.
- white line candidate points are extracted based on image data, and a frequency distribution of an angle with respect to a reference line of each two-point line of the white line candidate points is obtained.
- the actual angle of the white line with respect to the reference line and the actual candidate points included in the white line are extracted, and an approximate straight line of the white line is determined based on the actual angle and the actual candidate points.
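A minimal sketch of the angle-histogram idea described above: vote over the angles of all candidate-point pairs, take the dominant bin as the actual angle, and keep the points whose pairs voted for it. The bin width and the helper name are our assumptions:

```python
import math
from itertools import combinations

def dominant_line_points(candidates, bin_deg=2.0):
    """Histogram the angle of every pair of white-line candidate points,
    pick the most frequent angle bin, and return the candidate points
    belonging to pairs in that bin; a line fitted through these points
    is the approximate straight line of the white line."""
    votes = {}
    for p, q in combinations(candidates, 2):
        # Angle of segment p->q with respect to the x-axis, folded into [0, 180).
        ang = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0])) % 180.0
        votes.setdefault(int(ang // bin_deg), []).append((p, q))
    best_pairs = max(votes.values(), key=len)
    return sorted({pt for pair in best_pairs for pt in pair})
```

An ordinary least-squares fit through the returned points would then give the approximate straight line used in step S42.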
- next, straight-line distance measurement and lane width measurement are performed (step S42).
- the point where the recognized white line deviates from the approximate straight line is obtained.
- the distance to that point is taken as the straight-line distance; the longer the straight-line distance, the easier and more comfortable the road is to drive, and the higher the score that is set. Conversely, a low score indicates a sharply curved road for which no straight-line approximation can be made.
- as a lane width measuring method, there is, for example, the method disclosed in Japanese Patent Application Laid-Open No. 2002-166364: the lane position on the road is identified, and the lane width is estimated from the current lane position and its past history.
- next, road surface condition recognition and score conversion are performed (step S43).
- recognition of the road surface condition identifies, by color distribution analysis, whether or not the road surface is paved. Weather-dependent road surface conditions such as dry, wet, and snow-covered may also be recognized.
- Japanese Patent Application Laid-Open No. 2001-888636 discloses a method for recognizing road surface conditions such as snowy and gravel roads, and this method may be used. In scoring, a paved surface is given a high score and an unpaved road a low score.
- the straightness of the road, the width of the road, and the cleanness of the road surface are set according to the road parameter values obtained by executing steps S41 to S43 (step S44). That is, the straightness of the road is set according to the straight-line distance, the width of the road is set according to the lane width, and the cleanness of the road is set according to the road surface condition value.
- the straightness of the road, the width of the road, and the cleanness of the road surface are set in the range of 0 to 100 in accordance with the similarity with each reference value.
- the average value of the straightness of the road, the width of the road, and the cleanness of the road surface set in step S44 is calculated (step S45). This average value indicates the road area comfort level.
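Steps S44 and S45 can be sketched as mapping each measured parameter to a 0-100 score by its closeness to a reference and then averaging. The reference values and tolerances below are purely illustrative, since the patent does not publish them:

```python
def similarity_score(value, reference, tolerance):
    """0-100 score that falls off linearly with distance from a reference."""
    score = 100.0 * (1.0 - abs(value - reference) / tolerance)
    return max(0.0, min(100.0, score))

# Step S44: score the three road parameters (hypothetical references).
straightness = similarity_score(180.0, reference=200.0, tolerance=200.0)  # straight-line distance [m]
width = similarity_score(3.4, reference=3.5, tolerance=2.0)               # lane width [m]
cleanness = similarity_score(1.0, reference=1.0, tolerance=1.0)           # 1.0 = paved
# Step S45: road area comfort is the average of the three scores.
road_comfort = (straightness + width + cleanness) / 3.0
```

The same score-then-average pattern recurs for the landscape areas (step S55) and the background area (step S65).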
- the green ratio and the blue ratio of each of the left and right regions are analyzed (step S51).
- the number of pixels in the green portion (including similar colors) in the region is extracted, and the ratio of the number of pixels in the green portion to the total number of pixels in the region is defined as the green ratio.
- the number of pixels in the blue portion (including similar colors) in the area is extracted, and the ratio of the number of pixels in the blue portion to the total number of pixels in the region is defined as the blue ratio.
- the green ratio is the ratio of the forest in each of the left and right regions
- the blue ratio is the ratio of the sea in each of the left and right regions.
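A minimal sketch of the pixel-ratio computation of step S51. The dominance-margin test standing in for "green (including similar colors)" is our assumption, since the patent does not define the color criterion:

```python
import numpy as np

def color_ratio(rgb_region, channel, margin=30):
    """Fraction of pixels in an (H, W, 3) RGB region whose given channel
    exceeds both other channels by `margin` — used as the green ratio
    (channel=1) or blue ratio (channel=2) of a landscape region."""
    rgb = rgb_region.reshape(-1, 3).astype(int)
    others = [c for c in range(3) if c != channel]
    dominant = ((rgb[:, channel] - rgb[:, others[0]] > margin)
                & (rgb[:, channel] - rgb[:, others[1]] > margin))
    return float(dominant.mean())
```

The same routine applied to the upper region with channel 2 gives the blue sky ratio of step S61.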
- a color distribution analysis is performed (step S52).
- the color distribution is obtained by calculating the number of pixels of each color in each of the left and right regions as a histogram.
- a fractal dimension analysis is performed for each of the left and right regions (step S53).
- the quality of the landscape is evaluated by the value of the fractal dimension.
- Japanese Patent Application Laid-Open No. 2000-57353 discloses a method for evaluating the quality of a landscape using fractal dimension analysis. In that document, when the fractal dimension, which takes a value between 0 and 2, lies in the range of 1.50 to 1.65, the landscape is evaluated as having high quality.
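Fractal dimension analysis of the kind used in step S53 is commonly done by box counting. The sketch below (restricted to square images whose side is a power of two) estimates the dimension as the slope of log(box count) versus log(1/box size); the exact procedure used in the patent is not specified:

```python
import numpy as np

def box_counting_dimension(edges):
    """Estimate the fractal dimension of a square binary edge image by
    box counting: count the boxes containing at least one edge pixel at
    several box sizes, then fit a line to log(count) vs log(1/size)."""
    n = edges.shape[0]
    sizes, counts = [], []
    size = n // 2
    while size >= 1:
        # Collapse the image into (n/size) x (n/size) boxes and mark occupancy.
        reduced = edges.reshape(n // size, size, n // size, size).any(axis=(1, 3))
        sizes.append(size)
        counts.append(reduced.sum())
        size //= 2
    slope = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]
    return float(slope)
```

A space-filling region comes out near 2 and a simple contour near 1, so values in the cited 1.50-1.65 band correspond to moderately complex edge structure.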
- next, the proportion of forest and sea, the number of signboards, and the complexity of the landscape are set (step S54). That is, the proportion of forest and sea is set according to the green and blue ratios, the number of signboards is set according to the color distribution, and the complexity is set according to the value of the fractal dimension. Each is set in the range of 0 to 100 according to its similarity to a reference value. Then, for each of the left and right regions, the average of the proportion of forest and sea, the number of signboards, and the complexity set in step S54 is calculated (step S55). This average value indicates the comfort level of the left or right scenery.
- the blue ratio in the upper region is analyzed (step S61).
- the number of pixels in the blue region (including similar colors) in the upper region is extracted, and the ratio of the number of pixels in the blue region to the total number of pixels in the region is defined as the blue ratio.
- the blue ratio is the ratio of the blue sky in the upper region.
- a color distribution analysis is performed (step S62).
- the color distribution is obtained by computing a histogram of the number of pixels of each color in the upper region, from which signboards, overpasses, and distant mountain ranges are analyzed.
- a distance measurement is performed (step S63). The distance to major background objects identified in the color distribution analysis, such as the sky, distant mountains, overpasses, and tunnels, is measured. Using the captured image and the image of the previous frame, the optical flow is determined and the distance to each object in the region is measured. An object at infinity is treated as absent.
- Japanese Patent Application Laid-Open No. Hei 6-107096 discloses an optical flow scheme in which the movement of the same point on a target object appearing in two temporally successive images of a captured moving foreground sequence is detected as a vector.
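A minimal stand-in for the optical-flow distance cue of step S63: phase correlation recovers the dominant shift between two frames. In the scheme above, larger motion between consecutive frames indicates a nearer object, and near-zero motion is treated as infinity, i.e. no object. Note that the cited method computes per-point flow vectors rather than this single global estimate:

```python
import numpy as np

def dominant_shift(prev, curr):
    """Estimate the (dy, dx) shift of `curr` relative to `prev` by FFT
    phase correlation. Both inputs are 2-D grayscale arrays of the same
    shape; the peak of the normalized cross-power spectrum gives the
    translation, with wrap-around corrected to signed offsets."""
    cross = np.conj(np.fft.fft2(prev)) * np.fft.fft2(curr)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The magnitude of the recovered shift can then be thresholded: a region whose content barely moves between frames is classed as sky or distant background.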
- the blue sky ratio of the background, the number of signs, and the degree of openness are set according to the respective background parameter values obtained in steps S61 to S63 (step S64). That is, the blue sky ratio is set according to the blue color ratio, the number of signs is set according to the color distribution, and the openness is set according to the distance to the sky, distant mountains, overpasses, and tunnels.
- the values of the blue sky ratio, the number of signs, and the degree of openness are set in the range of 0 to 100 according to their similarity to reference values. Then, the average value of the blue sky ratio, the number of signs, and the degree of openness set in step S64 is calculated (step S65). This average value indicates the background comfort level.
- after the analysis processing for each area, the average of the obtained road area comfort, left and right landscape comfort, and background comfort is calculated as the road landscape comfort ahead of the vehicle (step S7), and document data describing the characteristics of each area is created and output to the output device 4 (step S8).
- in step S8, the characteristic items scoring 90 or more in each area are detected, and document data is created by connecting those items. The content of the created document data is displayed on the output device 4 together with the road landscape comfort level.
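Steps S7 and S8 reduce to an average of the four area comfort levels plus a phrase built from the items scoring over 90. A sketch, with illustrative item names (the natural-language templating of the final sentence is omitted):

```python
def road_landscape_comfort(road, left, right, background):
    # Step S7: the overall comfort is the average of the four area levels.
    return (road + left + right + background) / 4.0

def high_scoring_items(item_scores, threshold=90.0):
    # Step S8: feature items above the threshold are connected into the
    # document data describing the characteristics of each area.
    return [name for name, score in item_scores.items() if score > threshold]
```

With the figures quoted below (80, 80, 78.3, and 83.3), the average comes out to the stated 80.4.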
- as a result of the road analysis processing in step S4, the values of the straightness of the road, the width of the road, and the cleanness of the road surface, together with the road area comfort, are obtained as shown in FIG. 7.
- as a result of the landscape analysis processing in step S5, the values of the proportion of forest and sea, the number of signboards, and the complexity of the landscape for each of the left and right regions, together with the landscape comfort, are obtained as shown in FIG. 8.
- as a result of the background analysis processing in step S6, the values of the background blue sky ratio, the number of signs, and the degree of openness, together with the background comfort, are obtained as shown in FIG. 9.
- in step S7, the road area comfort level in FIG. 7, the left landscape comfort level 80 and right landscape comfort level 78.3 in FIG. 8, and the background comfort level 83.3 in FIG. 9 are averaged.
- the resulting road landscape comfort level is 80.4.
- since the feature items scoring over 90 in each area are "width of road", "ratio of forest and sea", and "blue sky ratio", the document data created in step S8 reads "wide road with two or more lanes surrounded by blue sky and forest".
- in step S9, it is determined whether or not to continue the road landscape analysis processing. If, for example, the processing is to be continued in response to an operation input from the input device 3, the process returns to step S1 and steps S1 to S9 are repeated. Otherwise, the road landscape analysis processing ends.
- as described above, obstacles in the image are detected and the road landscape analysis is performed on images containing few obstacles, so the road landscape can be determined automatically and accurately.
- the analysis of the road landscape may be performed after removing the obstacle from the image, instead of performing the analysis of the road landscape on the image having few obstacles.
- in the embodiment above, the screen areas are fixed by the diagonal lines, but they may be made variable.
- for example, the road area may be defined by white line recognition as extending to the outermost white line, with the landscape and background areas divided accordingly.
- other techniques such as frequency analysis may be used for the image analysis technique for each area.
- as described above, the image obtained by photographing the area ahead of the vehicle is divided into a plurality of areas by diagonal lines, and the image content of each area is analyzed individually, so an accurate analysis result can be obtained for the road landscape.
- the present invention can be applied to a car audio device and a car navigation device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/594,946 US8310545B2 (en) | 2004-03-29 | 2005-03-15 | Apparatus and method for analyzing road view |
JP2006511451A JP4185545B2 (ja) | 2004-03-29 | 2005-03-15 | 道路景観解析装置及び方法 |
EP05721200A EP1734476A4 (en) | 2004-03-29 | 2005-03-15 | ROAD VISION ANALYSIS DEVICE AND METHOD |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004095734 | 2004-03-29 | ||
JP2004-095734 | 2004-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005093657A1 true WO2005093657A1 (ja) | 2005-10-06 |
Family
ID=35056399
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/005050 WO2005093657A1 (ja) | 2004-03-29 | 2005-03-15 | 道路景観解析装置及び方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8310545B2 (ja) |
EP (1) | EP1734476A4 (ja) |
JP (1) | JP4185545B2 (ja) |
WO (1) | WO2005093657A1 (ja) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5100845B2 (ja) * | 2008-10-28 | 2012-12-19 | 株式会社パスコ | 道路計測装置及び道路計測方法 |
KR101603261B1 (ko) * | 2009-07-24 | 2016-03-14 | 엘지이노텍 주식회사 | 프랙탈차원을 이용한 자동초점조절 기능을 가지는 영상촬상장치 및 영상촬상장치의 자동초점조절방법 |
WO2012026322A1 (ja) * | 2010-08-27 | 2012-03-01 | 富士フイルム株式会社 | オブジェクトのレイアウト編集方法及び装置 |
JP5501477B2 (ja) * | 2010-11-19 | 2014-05-21 | 三菱電機株式会社 | 環境推定装置及び車両制御装置 |
DE102012112724A1 (de) * | 2012-12-20 | 2014-06-26 | Continental Teves Ag & Co. Ohg | Verfahren zur Bestimmung eines Fahrbahnzustands aus Umfeldsensordaten |
DE102013223367A1 (de) | 2013-11-15 | 2015-05-21 | Continental Teves Ag & Co. Ohg | Verfahren und Vorrichtung zur Bestimmung eines Fahrbahnzustands mittels eines Fahrzeugkamerasystems |
US11281915B2 (en) * | 2019-12-06 | 2022-03-22 | Black Sesame Technologies Inc. | Partial frame perception |
KR20210148756A (ko) | 2020-06-01 | 2021-12-08 | 삼성전자주식회사 | 경사 추정 장치 및 이의 동작 방법 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06333192A (ja) * | 1993-05-21 | 1994-12-02 | Mitsubishi Electric Corp | 自動車用白線検出装置 |
JP2003067727A (ja) * | 2001-08-28 | 2003-03-07 | Toyota Central Res & Dev Lab Inc | 環境複雑度演算装置、環境認識度合推定装置及び障害物警報装置 |
JP2004054751A (ja) * | 2002-07-23 | 2004-02-19 | Panasonic Communications Co Ltd | 画像処理システム及び画像処理方法 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2674354A1 (fr) * | 1991-03-22 | 1992-09-25 | Thomson Csf | Procede d'analyse de sequences d'images routieres, dispositif pour sa mise en óoeuvre et son application a la detection d'obstacles. |
DE4332612C2 (de) * | 1992-09-25 | 1996-02-22 | Yazaki Corp | Außenansichts-Überwachungsverfahren für Kraftfahrzeuge |
JP2863381B2 (ja) | 1992-09-25 | 1999-03-03 | 矢崎総業株式会社 | 車両用監視方法 |
JP3619628B2 (ja) * | 1996-12-19 | 2005-02-09 | 株式会社日立製作所 | 走行環境認識装置 |
JP2000057353A (ja) | 1998-08-07 | 2000-02-25 | Tomoyuki Oikawa | 景観のディジタル評価方式とそれを用いた視覚装置、知的ロボットビジョン |
JP3463858B2 (ja) * | 1998-08-27 | 2003-11-05 | 矢崎総業株式会社 | 周辺監視装置及び方法 |
JP3575346B2 (ja) * | 1999-09-03 | 2004-10-13 | 日本電気株式会社 | 道路白線検出システム、道路白線検出方法および道路白線検出用プログラムを記録した記録媒体 |
JP3272701B2 (ja) | 1999-09-22 | 2002-04-08 | 富士重工業株式会社 | 車外監視装置 |
JP4707823B2 (ja) | 2000-11-24 | 2011-06-22 | 富士重工業株式会社 | 車線変位補正システムおよび車線変位補正方法 |
TWI246665B (en) * | 2001-07-12 | 2006-01-01 | Ding-Jang Tzeng | Method for aiding the driving safety of road vehicle by monocular computer vision |
JP3868876B2 (ja) * | 2002-09-25 | 2007-01-17 | 株式会社東芝 | 障害物検出装置及び方法 |
US7764808B2 (en) * | 2003-03-24 | 2010-07-27 | Siemens Corporation | System and method for vehicle detection and tracking |
-
2005
- 2005-03-15 EP EP05721200A patent/EP1734476A4/en not_active Withdrawn
- 2005-03-15 WO PCT/JP2005/005050 patent/WO2005093657A1/ja active Application Filing
- 2005-03-15 US US10/594,946 patent/US8310545B2/en not_active Expired - Fee Related
- 2005-03-15 JP JP2006511451A patent/JP4185545B2/ja not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP1734476A4 * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190103523A (ko) * | 2018-02-13 | 2019-09-05 | 코가플렉스 주식회사 | 자율 주행 장치 및 방법 |
KR102048999B1 (ko) * | 2018-02-13 | 2019-11-27 | 코가플렉스 주식회사 | 자율 주행 장치 및 방법 |
JP2021121960A (ja) * | 2020-07-17 | 2021-08-26 | ベイジン バイドゥ ネットコム サイエンス アンド テクノロジー カンパニー リミテッド | 画像注釈方法、装置、電子設備、記憶媒体、及びプログラム |
CN114074657A (zh) * | 2020-08-18 | 2022-02-22 | 现代摩比斯株式会社 | 基于视觉的避免碰撞的系统和方法 |
EP3957548A1 (en) * | 2020-08-18 | 2022-02-23 | Hyundai Mobis Co., Ltd. | System and mehtod for avoiding collision based on vision |
US11767033B2 (en) | 2020-08-18 | 2023-09-26 | Hyundai Mobis Co., Ltd. | System and method for avoiding collision based on vision |
CN114074657B (zh) * | 2020-08-18 | 2023-12-29 | 现代摩比斯株式会社 | 基于视觉的避免碰撞的系统和方法 |
Also Published As
Publication number | Publication date |
---|---|
EP1734476A1 (en) | 2006-12-20 |
US20070211144A1 (en) | 2007-09-13 |
JPWO2005093657A1 (ja) | 2008-02-14 |
JP4185545B2 (ja) | 2008-11-26 |
US8310545B2 (en) | 2012-11-13 |
EP1734476A4 (en) | 2007-06-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4185545B2 (ja) | 道路景観解析装置及び方法 | |
CN109977812B (zh) | 一种基于深度学习的车载视频目标检测方法 | |
CN110175576B (zh) | 一种结合激光点云数据的行驶车辆视觉检测方法 | |
US7062071B2 (en) | Apparatus, program and method for detecting both stationary objects and moving objects in an image using optical flow | |
WO2019114036A1 (zh) | 人脸检测方法及装置、计算机装置和计算机可读存储介质 | |
JP4157620B2 (ja) | 移動物体検出装置及びその方法 | |
JP4587038B2 (ja) | 車両位置検出方法、並びに車両速度検出方法及び装置 | |
US7508983B2 (en) | Method and device for dividing target image, device for image recognizing process, program and storage media | |
US7536035B2 (en) | Object velocity measuring apparatus and object velocity measuring method | |
CN110956069B (zh) | 一种行人3d位置的检测方法及装置、车载终端 | |
CN108280450A (zh) | 一种基于车道线的高速公路路面检测方法 | |
CN103366572B (zh) | 一种交叉口的视频交通参数检测方法 | |
JP2003123197A (ja) | 道路標示等認識装置 | |
US8170284B2 (en) | Apparatus and method for displaying image of view in front of vehicle | |
KR100965800B1 (ko) | 차량 영상 검지 및 속도 산출방법 | |
CN111932496A (zh) | 车牌图像质量的确定方法、装置、存储介质及电子装置 | |
Charbonnier et al. | Road markings recognition using image processing | |
JP3807651B2 (ja) | 白線認識装置 | |
JP5189556B2 (ja) | 車線検出装置 | |
JP2829934B2 (ja) | 移動車の環境認識装置 | |
JP2010136207A (ja) | 歩行者検出表示システム | |
JP4070450B2 (ja) | 前方車両認識装置及び認識方法 | |
JPH08320998A (ja) | レーンマーカ検出装置 | |
JP2020095621A (ja) | 画像処理装置および画像処理方法 | |
JP3194301B2 (ja) | ガイドライン検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006511451 Country of ref document: JP |
|
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005721200 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10594946 Country of ref document: US Ref document number: 2007211144 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2005721200 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10594946 Country of ref document: US |