WO2017145541A1 - Moving body - Google Patents
Moving body
- Publication number
- WO2017145541A1 (PCT/JP2017/000694)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- calculated
- image
- moving body
- feature points
- speed
- Prior art date
Links
- 238000003384 imaging method Methods 0.000 claims description 27
- 238000004364 calculation method Methods 0.000 abstract description 10
- 239000000284 extract Substances 0.000 abstract description 3
- 238000000034 method Methods 0.000 description 13
- 238000001914 filtration Methods 0.000 description 7
- 230000008569 process Effects 0.000 description 7
- 238000010586 diagram Methods 0.000 description 6
- 230000008859 change Effects 0.000 description 4
- 238000009434 installation Methods 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 239000007787 solid Substances 0.000 description 2
- 238000001514 detection method Methods 0.000 description 1
- 238000006073 displacement reaction Methods 0.000 description 1
- 238000001454 recorded image Methods 0.000 description 1
- 230000009466 transformation Effects 0.000 description 1
- 230000007704 transition Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P3/00—Measuring linear or angular speed; Measuring differences of linear or angular speeds
- G01P3/36—Devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a moving body such as a robot or an automobile.
- autonomous driving technology and driving support technology have been developed to detect information on the surrounding environment and control driving according to the situation in order to improve safety and convenience.
- an object of the present invention is to provide a moving body capable of calculating the movement amount with a low processing load and high accuracy even when a stationary or moving three-dimensional object appears in the camera image.
- a representative mobile body of the present invention includes an imaging device that captures an image of a road surface, and an image processing unit that calculates the amount of movement of the mobile body based on an image captured by the imaging device.
- the image processing unit extracts a plurality of first feature points from a first image captured at a first timing, extracts a plurality of second feature points from a second image captured at a second timing after the first timing, tracks each of the plurality of first feature points to each of the plurality of second feature points, calculates the movement amount and the speed of each of the plurality of second feature points, and calculates the movement amount of the moving body based on the feature points whose speed is within a predetermined range among the plurality of second feature points.
- according to the present invention, it is possible to provide a moving body capable of calculating the movement amount with a low processing load and high accuracy even when a stationary or moving three-dimensional object appears in the camera image.
- FIG. 1: block diagram of a moving body. FIG. 2: flowchart of the image processing unit. FIG. 3: details of velocity filtering. FIG. 4: explanatory diagram of the movement amount calculation on a road. FIG. 5: explanatory diagram of the height and distance calculation of a three-dimensional object.
- FIG. 1 is a configuration diagram of a moving body.
- the moving body 1 includes an imaging device 2 that captures images of the surrounding environment, and a processing device 3 that processes the images captured by the imaging device 2, calculates the movement amount of the moving body 1, and outputs a display or a control signal according to the calculation result.
- the processing device 3 is configured as a computer system or the like, and includes an image processing unit 4 that processes the images captured by the imaging device 2, a control unit (CPU) 5 that performs various controls, such as the movement amount calculation, based on the processed images, a memory 6 that stores various data used by the control unit, a display unit 7 that outputs the calculation results of the control unit 5, and a bus 8 that connects these components to each other.
- the imaging device 2 is, for example, a monocular camera or a stereo camera installed facing the front of the moving body 1.
- when the imaging device 2 is a monocular camera, the relationship between a pixel position on the image and the actual position (x, y) on the road surface is constant, so the positions of feature points can be calculated geometrically.
- when the imaging device 2 is a stereo camera, the distance to each feature point can be measured more accurately.
- any standard camera, wide-angle camera, or stereo camera may be used as long as it has a viewing angle from which feature points can be extracted while driving. Each camera ultimately produces a single image, and the imaging device 2 may be configured by combining a plurality of cameras.
- the imaging device 2 acquires an image when a command is input from the control unit 5 or at regular time intervals, and outputs the acquired image and the acquisition time to the image processing unit 4 via the memory 6.
- the original image and its acquisition time are stored in the memory 6, and intermediate processed images are created in the image processing unit 4. These intermediate images are also stored in the memory 6 as necessary and used for determination and processing by the control unit 5 and the like. The result data used for processing in the control unit 5 are also stored in the memory 6 as appropriate.
- the bus 8 for transmitting data between the blocks can be composed of an IEBUS (Inter Equipment Bus), a LIN (Local Interconnect Network), a CAN (Controller Area Network), or the like.
- the image processing unit 4 calculates the movement amount of the moving body 1 based on the images captured by the imaging device 2 while the moving body 1 is traveling. First, feature points are extracted from an image transmitted from the imaging device 2. Feature points are then also extracted from the next transmitted image. The previously extracted feature points and the currently extracted feature points are tracked (a process that associates the feature points with each other; see FIG. 4), the movement amount of the moving body is calculated, and the result is output to the control unit 5.
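As a concrete illustration of this extract-track-estimate loop, here is a minimal Python sketch using OpenCV; it is not the patent's implementation, and the function name, parameter values, and the pairing of goodFeaturesToTrack with pyramidal Lucas-Kanade tracking are illustrative assumptions.

```python
import cv2
import numpy as np

def estimate_movement(prev_gray, curr_gray):
    """One iteration of the loop: extract features in the previous frame,
    track them into the current frame, and return a robust 2D displacement."""
    # Extract corner-like feature points from the previous frame (cf. step 22).
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return None
    # Track them into the current frame with pyramidal Lucas-Kanade (cf. step 23).
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    ok = status.ravel() == 1
    if not ok.any():
        return None  # tracking failed (cf. step 24)
    # Per-feature displacement in pixels; a real system would first convert
    # pixel coordinates (u, v) to road coordinates (x, y) in meters.
    disp = (curr_pts[ok] - prev_pts[ok]).reshape(-1, 2)
    return np.median(disp, axis=0)  # robust aggregate (median filtering)
```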
- the control unit 5 calculates the position of the moving body 1 based on the movement amount calculated by the image processing unit 4, determines the future moving direction and speed, and controls the moving body 1. By displaying necessary detection results on the display unit 7, it provides information to the operator of the moving body 1.
- FIG. 2 is a diagram showing a flowchart of the image processing unit 4.
- the image processing unit 4 acquires an image captured by the imaging device 2 from the memory 6 (step 20).
- a region of interest for extracting feature points is set on the image acquired in step 20 (step 21).
- a region of the traveling road surface whose pixel positions in the image can be accurately converted to metric positions on the road surface is set as the region of interest.
- feature points are extracted from the region of interest set in step 21 (step 22).
- the feature points are edges, corners, and the like on the image, and are extracted with techniques such as Canny, Sobel, FAST, Hessian, and Gaussian operators.
- the feature points extracted from the image captured at the first timing (previous frame), that is, the image captured before the image captured at the second timing (current frame), are tracked in the currently captured image (step 23). For tracking, techniques such as the Lucas-Kanade method and the Shi-Tomasi method are used.
- in step 24, it is checked whether the tracking in step 23 succeeded. If it succeeded, the process proceeds to step 25; if not, the process ends. Since tracking requires two images captured in different frames, the first image captured after the moving body 1 is activated cannot be tracked in step 23, and the process ends after step 24.
- in step 25, the relative movement amount Δdmn of each feature point is calculated (where n is the number of feature points that could be tracked and m is the number of times the movement amount of the moving body 1 has been calculated), and the relative speed vmn = Δdmn/Δtm of each feature point is calculated using the difference Δtm between the acquisition time of the previous frame and the current imaging time.
- the movement amount ⁇ D m of the moving body 1 is calculated using the feature points not filtered in step 26 (step 27).
- o the number of feature points not filtered
- a calculation method for example, Rigid Body Transformation, Sliding Window, least square method, median filter, or the like can be used.
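To illustrate one of the methods named above, the following is a hedged least-squares rigid-body-transformation sketch that recovers the heading change and translation of the moving body from matched feature points in road coordinates; the Kabsch-style SVD formulation and the function name are assumptions rather than the patent's own procedure.

```python
import numpy as np

def rigid_transform_2d(prev_xy, curr_xy):
    """Least-squares fit of curr_xy ~= R @ prev_xy + t for (N, 2) point sets;
    returns the heading change dtheta [rad] and the translation t [m]."""
    pc, cc = prev_xy.mean(axis=0), curr_xy.mean(axis=0)
    H = (prev_xy - pc).T @ (curr_xy - cc)   # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cc - R @ pc
    dtheta = np.arctan2(R[1, 0], R[0, 0])   # rotation angle of the body
    return dtheta, t
```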
- in step 28, it is checked whether any feature points were filtered in step 26. If there are filtered feature points, the process proceeds to step 29; if there are none, the process ends.
- when there are filtered feature points, the height of the three-dimensional object, the distance to the three-dimensional object, and the speed of the three-dimensional object are calculated using each feature point filtered in step 26 and the movement amount ΔDm of the moving body 1 calculated in step 27 (step 29).
- FIG. 3 is a diagram showing details of the speed filtering in step 26.
- FIG. 3(A) shows the time transition of the movement amount ΔDm of the moving body 1 calculated in step 27. Since the processing time of the movement amount calculation varies with the number of feature points extracted in each frame, the difference Δtm in imaging time between frames fluctuates irregularly, and the movement amount ΔDm calculated over Δtm also changes irregularly. For this reason, there is no correlation between the movement amount calculated this time and the movement amount calculated in the previous frame.
- FIG. 3(B) shows the speed Vm of the moving body 1 calculated in step 25.
- the number of data points and the differences Δtm in FIG. 3(B) are the same as in FIG. 3(A), but dividing the irregularly changing movement amount ΔDm by Δtm converts it into a speed Vm that changes regularly. Physically, the speed does not change significantly within a short time (on the order of milliseconds), so if Vm-1 is known, Vm lies within a threshold Vthreshold around Vm-1. Therefore, in step 26, the relative speed vmn of each feature point calculated in step 25 can be filtered by comparing it with the speeds Vm-1, Vm-2, ... of the moving body stored in time series.
- the function f for calculating the current estimated speed Ve of the moving body 1 can be composed of an average of the time-series speed information (Vm-1, Vm-2, Vm-3, ...) calculated in the past, or of a polynomial that interpolates the current speed.
- alternatively, Ve = f(Sensor_Vm-1, Sensor_Vm-2, Sensor_Vm-3, ...) may be calculated based on the speed Sensor_Vm of the moving body 1 obtained from a speed sensor such as a wheel encoder, radar, or laser.
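A minimal sketch of this filtering step, assuming Ve is taken as a plain mean over the last few moving-body speeds; the window length of 3, the startup value, and the threshold handling are illustrative assumptions, not the patent's definition of f.

```python
def estimate_ve(past_speeds, v_init=0.0):
    """f(V_{m-1}, V_{m-2}, ...): here simply the mean of the last 3 speeds."""
    recent = past_speeds[-3:]
    return sum(recent) / len(recent) if recent else v_init

def speed_filter(feature_speeds, past_speeds, v_threshold):
    """Split the per-feature relative speeds v_mn into road-surface candidates
    (kept for the movement calculation of step 27) and outliers attributed to
    three-dimensional objects (passed on to step 29)."""
    ve = estimate_ve(past_speeds)
    kept = [v for v in feature_speeds if abs(v - ve) <= v_threshold]
    filtered = [v for v in feature_speeds if abs(v - ve) > v_threshold]
    return kept, filtered
```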
- FIG. 4 is a diagram showing calculation of the amount of movement on the road.
- the frame 41 is an image captured by the imaging device 2.
- (u, v) are the pixel coordinates on the image, and (x, y) are the road coordinates of the imaging device 2 in meters.
- the road 42 is a road surface on which the moving body 1 travels.
- the feature point 43a is a feature point extracted in frame A in steps 20 to 22. After the feature point 43a is tracked in frame B (step 23), it moves to the feature point 43b in frame B (the white circle in frame B marks the position of 43a). After the feature point 43b is tracked in frame C (step 23), it moves to the feature point 43c in frame C (the white circle in frame C marks the position of 43b).
- the movement amount 44b is the relative movement amount ΔdBn (where n is the feature point number) calculated in step 25 based on the positions of the feature points 43a and 43b, and the movement amount 44c is the relative movement amount ΔdCn calculated in step 25 based on the positions of the feature points 43b and 43c.
- the relative speed vBn = ΔdBn/ΔtB of each feature point 43b is calculated. Since the relative movement amounts ΔdBn are all calculated from feature points 43a on the same road surface, there is no significant difference among them (small differences remain because of the turning component).
- the feature points 43b are filtered in step 26 with the speed filtering condition |vBn − Ve| > Vthreshold, and in step 27 the movement amount ΔDm = ΔDB and the speed VB = ΔDB/ΔtB are calculated using the unfiltered vBn. The heading change ΔθB can also be calculated from the distribution of the movement amounts ΔdBn in road coordinates.
- the three-dimensional object 45 is a dynamic three-dimensional object shown in the frame B image.
- the feature point 46b is a feature point of the three-dimensional object 45 in the frame B extracted in steps 20-22.
- the feature point 46c is a point obtained by tracking the feature point 46b in step 23 in the frame C.
- the movement amount 47 is the movement amount from the feature point 46b to 46c calculated in step 25.
- the frame C includes a movement amount 44c from the feature points 43b to 43c calculated in steps 20 to 24 and a movement amount 47 of the three-dimensional object 45 from the feature points 46b to 46c.
- the relative speed vCn of each feature point is calculated using the time ΔtC between frames. When the speed VB of the moving body 1 calculated in frame B is used as the estimated speed Ve, VB is substituted for Ve and step 26 is executed. Since the movement amount 47 of the three-dimensional object 45 from the feature point 46b to 46c differs from the movement amount 44c of the road surface from the feature point 43b to 43c, the feature point 46c satisfies the condition |vCn − Ve| > Vthreshold and is filtered out.
- FIG. 5 is a diagram showing details of step 29.
- steps 20 to 22 are executed at the position P1, and a feature point 53a on the road surface 51 is extracted. If the road surface 51 is flat, the relationship between a pixel position (u, v) on the image and the actual position (x, y) is constant; therefore, the pixel position (u, v) of the feature point 53a on the image can be converted to its actual position (x, y), and the distance d1 from the position P1 to the feature point 53a can be calculated.
- in FIG. 5(B), the three-dimensional object 55 has a height h above the road surface 51.
- when the feature point 53b of the three-dimensional object 55 appears at a pixel position (u, v) on the image at the position P1, it is converted to the erroneous road coordinate 53a′ (x, y), so the distance from the position P1 to the feature point 53b is also erroneously calculated as d1.
- when the imaging device 2 moves to the position P2, the distance from the position P2 to the feature point 53b is likewise calculated as an erroneous value d2′.
- ⁇ db is an error of the movement amount ⁇ d caused by tracking the feature point 53b of the three-dimensional object 55.
- the distance dist is the distance from the position P2 to the three-dimensional object 55, the distance l1 is the distance from the three-dimensional object 55 to the point 53a′, and the distance l2 is the difference between the distance d2′ and the distance dist.
- Formula (3) and Formula (4) are obtained from Formula (1) and Formula (2).
- the movement amount ⁇ d” calculated erroneously when the three-dimensional object 55 is present can be expressed by the installation height H of the camera and the height h of the three-dimensional object 55.
- h H * ( ⁇ d′ ⁇ d) / ⁇ d ′ (4)
- thus the height h of the three-dimensional object can be calculated; the installation height H of the camera is fixed and known.
- the distance from the position P2 to the three-dimensional object 55 is calculated geometrically by Equation (5), so the distance dist to each feature point can be calculated.
- the height h and the distance dist of the three-dimensional object 55 can be calculated by combining the information acquired by the GPS or speed sensor and the information acquired by the imaging device 2.
- when a more accurate movement amount Δd is obtained from the GPS, a more accurate height h can be calculated by substituting the movement amount Δd into Equation (3), and the distance dist to the three-dimensional object 55 can then be calculated more accurately by substituting the height h into Equation (5).
- when a speed sensor is mounted, the height h is calculated from Equation (4) and the distance dist is calculated from Equation (5).
- FIG. 5(C) shows a case where the three-dimensional object 55 is a dynamic three-dimensional object with a movement amount Δdobject between frames.
- ⁇ d is the amount of movement of the moving object
- d1 is a converted value of the distance to the three-dimensional object when the moving object is at the point P1
- d5 is a converted value of the distance to the three-dimensional object when the moving object is at the point P2
- H is the installation height of the camera
- h is the height of the three-dimensional object. Equation (6) is obtained from the relationship shown in the figure.
- ⁇ d, d1, and d5 are known, and when two or more feature points of a three-dimensional object can be acquired and acquired from Equation (6), the movement amount and height of the three-dimensional object can be calculated simultaneously.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Power Engineering (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
The image processing unit 4 calculates the movement amount of the moving body 1 based on the images captured by the imaging device 2 while the moving body 1 is traveling. First, feature points are extracted from an image transmitted from the imaging device 2. Feature points are then also extracted from the next transmitted image. The previously extracted feature points and the currently extracted feature points are tracked (a process that associates the feature points with each other; see FIG. 4), the movement amount of the moving body is calculated, and the result is output to the control unit 5.
Immediately after the movement amount calculation device starts, there is no past speed information for the moving body 1, so the current speed is calculated using most of the extracted feature points. That is, the initial value of the speed Ve is set to 0 and Vthreshold is set large.
h/H = l2/d2’ = l1/d1 (1)
Δdb = l1 − l2 = h*(d1 − d2’)/H (2)
Δd’ = Δdb + Δd = d1 − d2’
  = h*(d1 − d2’)/H + Δd
  = h*Δd’/H + Δd
Therefore, Δd’ = Δd*H/(H − h) (3)
h = H*(Δd’ − Δd)/Δd’ (4)
From the above, the height h of the three-dimensional object can be calculated by substituting into Equation (3) the movement amount Δd’ = Δdb + Δd calculated erroneously when the three-dimensional object 55 is present and the movement amount Δd calculated correctly when it is absent. The installation height H of the camera is fixed and known. The erroneous movement amount Δd’ = Δdb + Δd when the three-dimensional object 55 is present can be calculated in steps 20 to 25, and the correct movement amount Δd when no three-dimensional object is present can be calculated in step 27 using the feature points not filtered in step 26. When all feature points are filtered because of the influence of three-dimensional objects, the movement amount of the moving body 1 is calculated as Δd = Ve*Δtm using the Ve calculated in step 26 and the difference Δtm in imaging time between frames. In step 29, the movement amounts Δd and Δd’ are substituted into Equation (4) to calculate the height h of the three-dimensional object 55. Substituting the relative movement amount Δd’ = Δdmn of each feature point into Equation (4) gives the height of each feature point. However, since calculating the heights of all feature points takes time, it suffices to calculate only the heights of the feature points filtered in step 26.
dist = d2’-l2 = d2’-d2’*h/H = d2’*(H-h)/H (5)
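A small numeric check of Equations (3) to (5); the values of H, Δd, h, and d2’ below are purely illustrative.

```python
H = 1.5        # camera installation height [m], fixed and known
dd = 0.10      # correct movement Δd from unfiltered road features [m]
h = 0.5        # assumed height of the three-dimensional object [m]

dd_err = dd * H / (H - h)            # Eq. (3): erroneous Δd' = 0.15 m
h_back = H * (dd_err - dd) / dd_err  # Eq. (4): recovers h = 0.5 m
d2p = 6.0                            # erroneous converted distance d2' [m]
dist = d2p * (H - h_back) / H        # Eq. (5): actual distance dist = 4.0 m
print(h_back, dist)                  # -> 0.5 4.0
```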
By mounting a GPS or a speed sensor on the moving body 1, the information acquired by the GPS or speed sensor can be combined with the information acquired by the imaging device 2 to calculate the height h of and the distance dist to the three-dimensional object 55. For example, when a more accurate movement amount Δd is obtained from the GPS, a more accurate height h can be calculated by substituting the movement amount Δd into Equation (3). Furthermore, by substituting the height h into Equation (5), the distance dist to the three-dimensional object 55 can also be calculated more accurately. When a speed sensor is mounted, the movement amount Δd = V*Δtm is calculated using the speed V of the moving body 1 acquired by the speed sensor, the height h is calculated from Equation (4), and the distance dist is calculated from Equation (5).
Δdobject = Δd + d4 − d3 = Δd + (d1 − d5)*(H − h)/H (6)
As described above, Δd, d1, and d5 are known, and when two or more feature points of the three-dimensional object can be acquired, the movement amount and the height of the three-dimensional object can be calculated simultaneously from Equation (6).
2 imaging device
3 processing device
4 image processing unit
5 control unit
6 memory
7 display unit
8 bus
Claims (3)
- A moving body comprising:
an imaging device that captures images of a road surface; and
an image processing unit that calculates a movement amount of the moving body based on the images captured by the imaging device, wherein
the image processing unit
extracts a plurality of first feature points from a first image captured at a first timing,
extracts a plurality of second feature points from a second image captured at a second timing after the first timing,
tracks each of the plurality of first feature points to each of the plurality of second feature points,
calculates a movement amount and a speed of each of the plurality of second feature points, and
calculates the movement amount of the moving body based on those of the plurality of second feature points whose speed is within a predetermined range. - The moving body according to claim 1, wherein the height of a three-dimensional object on the road surface imaged by the imaging device, and/or the distance from the moving body to the three-dimensional object, and/or the speed of the three-dimensional object is calculated based on those of the plurality of second feature points whose speed is greater than the predetermined range.
- The moving body according to claim 1 or 2, wherein the speed of the moving body is calculated based on the movement amount and the times at which the images were acquired by the imaging device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018501027A JP6708730B2 (ja) | 2016-02-23 | 2017-01-12 | Moving body
EP17755983.8A EP3422293B1 (en) | 2016-02-23 | 2017-01-12 | Mobile object |
US16/079,372 US10740908B2 (en) | 2016-02-23 | 2017-01-12 | Moving object |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016031570 | 2016-02-23 | ||
JP2016-031570 | 2016-02-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017145541A1 true WO2017145541A1 (ja) | 2017-08-31 |
Family
ID=59686123
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/000694 WO2017145541A1 (ja) | 2017-01-12 | Moving body
Country Status (4)
Country | Link |
---|---|
US (1) | US10740908B2 (ja) |
EP (1) | EP3422293B1 (ja) |
JP (1) | JP6708730B2 (ja) |
WO (1) | WO2017145541A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019073772A1 (ja) * | 2017-10-11 | 2019-04-18 | 日立オートモティブシステムズ株式会社 | 移動体の位置推定装置及び位置推定方法 |
JP2020071202A (ja) * | 2018-11-02 | 2020-05-07 | コーデンシ株式会社 | 移動体検知システム及び移動体検知システム用プログラム |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10482610B2 (en) * | 2017-11-01 | 2019-11-19 | Adobe Inc. | Detection of partially motion-blurred video frames |
CN109537455A (zh) * | 2019-01-10 | 2019-03-29 | 上海市机械施工集团有限公司 | 3d打印建筑的装置及方法 |
CN111104920B (zh) * | 2019-12-27 | 2023-12-01 | 深圳市商汤科技有限公司 | 视频处理方法及装置、电子设备和存储介质 |
JP2022036537A (ja) * | 2020-08-24 | 2022-03-08 | 富士通株式会社 | 移動体速度導出方法及び移動体速度導出プログラム |
JP2022142515A (ja) * | 2021-03-16 | 2022-09-30 | 本田技研工業株式会社 | 移動体の移動量を推定する情報処理装置、情報処理方法、及びプログラム |
DE102021207834A1 (de) * | 2021-07-22 | 2023-01-26 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren für Kraftfahrzeuge zur Höhenerkennung von erhabenen Objekten |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013003110A (ja) * | 2011-06-21 | 2013-01-07 | Denso Corp | 車両状態検出装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10222665A (ja) * | 1997-01-31 | 1998-08-21 | Fujitsu Ten Ltd | 画像認識装置 |
JP2003178309A (ja) | 2001-10-03 | 2003-06-27 | Toyota Central Res & Dev Lab Inc | 移動量推定装置 |
JP4800163B2 (ja) * | 2006-09-29 | 2011-10-26 | 株式会社トプコン | 位置測定装置及びその方法 |
US8605947B2 (en) * | 2008-04-24 | 2013-12-10 | GM Global Technology Operations LLC | Method for detecting a clear path of travel for a vehicle enhanced by object detection |
JP5830876B2 (ja) * | 2011-02-18 | 2015-12-09 | 富士通株式会社 | 距離算出プログラム、距離算出方法及び距離算出装置 |
US9357208B2 (en) * | 2011-04-25 | 2016-05-31 | Magna Electronics Inc. | Method and system for dynamically calibrating vehicular cameras |
JP2013031310A (ja) * | 2011-07-29 | 2013-02-07 | Sanyo Electric Co Ltd | 制御装置、バッテリシステム、電動車両、移動体、電力貯蔵装置および電源装置 |
JP6340957B2 (ja) * | 2014-07-02 | 2018-06-13 | 株式会社デンソー | 物体検出装置および物体検出プログラム |
JP6281460B2 (ja) * | 2014-09-24 | 2018-02-21 | 株式会社デンソー | 物体検出装置 |
JP6281459B2 (ja) * | 2014-09-24 | 2018-02-21 | 株式会社デンソー | 物体検出装置 |
US10282914B1 (en) * | 2015-07-17 | 2019-05-07 | Bao Tran | Systems and methods for computer assisted operation |
US9996981B1 (en) * | 2016-03-07 | 2018-06-12 | Bao Tran | Augmented reality system |
-
2017
- 2017-01-12 EP EP17755983.8A patent/EP3422293B1/en active Active
- 2017-01-12 WO PCT/JP2017/000694 patent/WO2017145541A1/ja active Application Filing
- 2017-01-12 US US16/079,372 patent/US10740908B2/en active Active
- 2017-01-12 JP JP2018501027A patent/JP6708730B2/ja active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013003110A (ja) * | 2011-06-21 | 2013-01-07 | Denso Corp | 車両状態検出装置 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019073772A1 (ja) * | 2017-10-11 | 2019-04-18 | 日立オートモティブシステムズ株式会社 | 移動体の位置推定装置及び位置推定方法 |
JP2019070983A (ja) * | 2017-10-11 | 2019-05-09 | 日立オートモティブシステムズ株式会社 | 移動体の位置推定装置及び位置推定方法 |
US11151729B2 (en) | 2017-10-11 | 2021-10-19 | Hitachi Automotive Systems, Ltd. | Mobile entity position estimation device and position estimation method |
JP2020071202A (ja) * | 2018-11-02 | 2020-05-07 | コーデンシ株式会社 | 移動体検知システム及び移動体検知システム用プログラム |
JP7148126B2 (ja) | 2018-11-02 | 2022-10-05 | コーデンシ株式会社 | 移動体検知システム及び移動体検知システム用プログラム |
Also Published As
Publication number | Publication date |
---|---|
US10740908B2 (en) | 2020-08-11 |
US20190066312A1 (en) | 2019-02-28 |
JPWO2017145541A1 (ja) | 2018-11-08 |
EP3422293A1 (en) | 2019-01-02 |
EP3422293B1 (en) | 2024-04-10 |
EP3422293A4 (en) | 2019-10-16 |
JP6708730B2 (ja) | 2020-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017145541A1 (ja) | Moving body | |
EP3280976B1 (en) | Object position measurement with automotive camera using vehicle motion data | |
JP6623044B2 (ja) | ステレオカメラ装置 | |
US9599706B2 (en) | Fusion method for cross traffic application using radars and camera | |
US20170337434A1 (en) | Warning Method of Obstacles and Device of Obstacles | |
CN107121132B (zh) | 求取车辆环境图像的方法和设备及识别环境中对象的方法 | |
US20180137376A1 (en) | State estimating method and apparatus | |
CN110023951B (zh) | 信息处理设备、成像设备、装置控制系统、信息处理方法和计算机可读记录介质 | |
CN106447730B (zh) | 参数估计方法、装置和电子设备 | |
JP6776202B2 (ja) | 車載カメラのキャリブレーション装置及び方法 | |
EP2924655B1 (en) | Disparity value deriving device, equipment control system, movable apparatus, robot, disparity value deriving method, and computer-readable storage medium | |
US20090297036A1 (en) | Object detection on a pixel plane in a digital image sequence | |
CN111164648B (zh) | 移动体的位置推断装置及位置推断方法 | |
JP6032034B2 (ja) | 物体検知装置 | |
US10554951B2 (en) | Method and apparatus for the autocalibration of a vehicle camera system | |
KR20160077684A (ko) | 객체 추적 장치 및 방법 | |
JP6035095B2 (ja) | 車両の衝突判定装置 | |
CN114419098A (zh) | 基于视觉变换的运动目标轨迹预测方法及装置 | |
JP6564127B2 (ja) | 自動車用視覚システム及び視覚システムを制御する方法 | |
EP3486871B1 (en) | A vision system and method for autonomous driving and/or driver assistance in a motor vehicle | |
CN113643355B (zh) | 一种目标车辆位置和朝向的检测方法、系统及存储介质 | |
US12106492B2 (en) | Computer vision system for object tracking and time-to-collision | |
JP4231883B2 (ja) | 画像処理装置及びその方法 | |
JP6704307B2 (ja) | 移動量算出装置および移動量算出方法 | |
JP4144464B2 (ja) | 車載用距離算出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018501027 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017755983 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2017755983 Country of ref document: EP Effective date: 20180924 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17755983 Country of ref document: EP Kind code of ref document: A1 |