WO2018159398A1 - Device and method for estimating the location of a moving body - Google Patents

Device and method for estimating the location of a moving body

Info

Publication number
WO2018159398A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
image
imaging device
moving
processing unit
Prior art date
Application number
PCT/JP2018/006148
Other languages
English (en)
Japanese (ja)
Inventor
アレックス益男 金子
山本 健次郎
茂規 早瀬
Original Assignee
日立オートモティブシステムズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日立オートモティブシステムズ株式会社
Publication of WO2018159398A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Definitions

  • the present invention relates to a technique for estimating the position of a moving body such as a robot or an automobile.
  • a control device mounted on a moving body estimates the position of the moving body itself (self-position) by integrating the speed or angular velocity of the moving body calculated by, for example, an IMU, or by using GPS positioning. Furthermore, the control device collects surrounding information with a laser sensor or a camera and detects landmarks that serve as references for position estimation, such as road surface paint or signs. The control device then corrects the current position of the moving body by comparing the positions of the detected landmarks with map information.
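  • as a concrete illustration of this dead-reckoning step, a minimal sketch (the function name, sensor values, and update rate are hypothetical, not taken from the publication):

```python
import math

def dead_reckon(pose, speed, yaw_rate, dt):
    """Integrate speed and angular velocity (e.g. from an IMU)
    over one time step to propagate the self-position (x, y, heading)."""
    x, y, heading = pose
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Hypothetical readings: 10 m/s, gentle left turn, 100 Hz updates.
pose = (0.0, 0.0, 0.0)
for _ in range(100):                 # one second of motion
    pose = dead_reckon(pose, speed=10.0, yaw_rate=0.05, dt=0.01)
print(pose)  # integration drift is why landmark/map correction is needed
```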
  • the odometry method is used to estimate the relative position of objects around the moving body.
  • the first vehicle position estimation unit estimates the position and orientation of the vehicle on the ground-fixed coordinate system, together with the accuracy of that estimate, based on an image of the vehicle's surrounding environment.
  • the relative positional relationship between the reference point and direction and the vehicle position is estimated by integrating the amounts of change in the vehicle position that the second vehicle position estimation unit estimates based on the operation amounts and the movement state of the vehicle.
  • a section having a high degree of difficulty in position estimation by the first vehicle position estimation unit is estimated on the target travel route.
  • a landmark that is highly likely to enter the imaging range of the surrounding environment imaging unit in the area just before the section estimated to be highly difficult is retrieved from the map information.
  • the vehicle position estimation switching unit switches from the first vehicle position estimation unit to the second vehicle position estimation unit while the vehicle travels through a section where position estimation is difficult. Based on the result of this switching, travel control of the host vehicle is executed using the output of the second vehicle position estimation unit while traveling in a high-difficulty section, and using the output of the first vehicle position estimation unit in all other sections.
  • a mobile body position estimation device comprises an imaging device attached to the mobile body, and an image processing unit that identifies a plurality of position candidates of the mobile body based on images captured by the imaging device and estimates the position of the mobile body based on the plurality of position candidates and the moving speed of the mobile body.
  • FIG. 1 is a configuration diagram of a position estimation device for a moving body according to an embodiment of the present invention.
  • FIG. 2 is a flowchart showing the image processing procedure performed by the image processing unit 14.
  • FIG. 3 is an explanatory drawing of the maximum moving speed of the moving body 100 determined from the imaging range of the imaging device 12.
  • FIG. 4 is an example of one frame image in the captured video.
  • FIG. 5 is a specific example of weighting by speed.
  • FIG. 6 is an example of weighting by obstacle.
  • FIG. 7 is an example in which the speed of the moving body 100 changes under the control of the control unit 15.
  • FIG. 8 is an example in the case of a single imaging device 12.
  • FIG. 1 is a configuration diagram of a position estimation apparatus 1 according to an embodiment of the present invention.
  • the position estimation device 1 is mounted on a moving body 100 such as an automobile or a robot.
  • the position estimation device 1 includes one or more imaging devices 12 (12a, 12b,... 12n) and an information processing device 13.
  • the imaging device 12 may be a still camera or a video camera, for example.
  • the imaging device 12 may be a monocular camera or a stereo camera.
  • the information processing device 13 processes the image picked up by the image pickup device 12 and calculates the position or amount of movement of the moving body 100.
  • the information processing apparatus 13 may perform display according to the calculated position or movement amount, or may output a signal related to control of the moving body 100.
  • the information processing device 13 is, for example, a general-purpose computer, and includes an image processing unit 14 that processes images captured by the imaging device 12, a control unit (CPU) 15 that performs processing based on the image processing results, a memory 16, a display unit 17 such as a display, and a bus 18 that connects these components to each other.
  • the information processing apparatus 13 may perform the following processing by causing the image processing unit 14 and the control unit 15 to execute a predetermined computer program.
  • the imaging device 12a is installed in front of the moving body 100, for example.
  • the lens of the imaging device 12 a is directed to the front of the moving body 100.
  • the imaging device 12a images, for example, a distant view in front of the moving body 100.
  • the other imaging devices 12b,..., And the imaging device 12n are installed at positions different from the imaging device 12a, and take images in an imaging direction or area different from the imaging device 12a.
  • the imaging device 12b may be installed, for example, downward behind the moving body 100.
  • the imaging device 12b may capture a close view behind the moving body 100.
  • if the imaging device 12 is a monocular camera and the road surface is flat, the pixel position on the image corresponds to a fixed actual position (x, y), so the distance from the imaging device 12 to a feature point can be calculated geometrically, as sketched below.
  • if the imaging device 12 is a stereo camera, the distance to a feature point on the image can be measured more accurately.
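  • as a rough sketch of that flat-road geometry (an illustration only; the pinhole model, camera height, pitch, and intrinsics below are hypothetical):

```python
import numpy as np

def pixel_to_ground(u, v, fx, fy, cx, cy, cam_height, pitch_rad):
    """Intersect the viewing ray of pixel (u, v) with a flat road plane.

    Pinhole model, camera x right / y down / z forward, mounted at
    cam_height above the road and pitched down by pitch_rad.
    Returns (forward, lateral) road coordinates, or None if the
    ray never meets the road.
    """
    ray_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    rot = np.array([[1.0, 0.0, 0.0],     # rotate the camera frame into a
                    [0.0,   c,   s],     # road-aligned frame (y down)
                    [0.0,  -s,   c]])
    ray = rot @ ray_cam
    if ray[1] <= 0:                      # ray points at or above horizon
        return None
    t = cam_height / ray[1]              # scale until the ray drops H
    return ray[2] * t, ray[0] * t

# Hypothetical setup: camera 1.2 m up, pitched 10 degrees down.
print(pixel_to_ground(640, 500, fx=800, fy=800, cx=640, cy=360,
                      cam_height=1.2, pitch_rad=np.radians(10)))
```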
  • hereinafter, the case where a monocular camera with a standard lens is adopted is described, but other cameras (such as a camera with a wide-angle lens, or a stereo camera) may be used.
  • the imaging devices 12a, 12b, …, 12n may capture different objects at a certain time.
  • the imaging device 12a may capture a distant view in front of the moving body 100.
  • feature points such as obstacles or landmarks for position estimation may be extracted from an image obtained by capturing a distant view.
  • the imaging device 12b may image a close view such as a road surface around the moving body 100.
  • a white line around the moving body 100, road surface paint, or the like may be detected from an image obtained by capturing a near view.
  • the imaging device 12a, the imaging device 12b,..., The imaging device 12n may be installed on the moving body 100 under conditions that are not affected by environmental disturbances such as rain and sunlight.
  • the imaging device 12a may be installed forward in front of the moving body 100, whereas the imaging device 12b may be installed backward or downward in the rear of the moving body 100.
  • the imaging device 12a, the imaging device 12b,..., The imaging device 12n may shoot under different imaging conditions (aperture value, white balance, etc.).
  • an image pickup apparatus that adjusts parameters for a bright place and an image pickup apparatus that adjusts parameters for a dark place may be mounted to enable image pickup regardless of the brightness of the environment.
  • the imaging device 12a, the imaging device 12b,..., The imaging device 12n may capture images when receiving a command to start shooting from the control unit 15, or at regular time intervals.
  • the data of the captured image and the imaging time are stored in the memory 16.
  • the memory 16 includes a main storage device (main memory) of the information processing device 13 and an auxiliary storage device such as a storage.
  • the image processing unit 14 performs various image processing based on the image data stored in the memory 16 and the imaging time.
  • in the image processing, for example, an intermediate image is created and stored in the memory 16.
  • the intermediate image may be used for determination and processing by the control unit 15 in addition to processing by the image processing unit 14.
  • the image processing unit 14 specifies a plurality of position candidates of the moving body based on the images captured by the imaging device 12, and estimates the position of the moving body 100 based on the plurality of position candidates and the moving speed of the moving body 100.
  • the image processing unit 14 processes, for example, an image captured by the imaging device 12 while the moving body 100 is traveling, and estimates the position of the moving body 100. For example, the image processing unit 14 may calculate the moving amount of the moving body 100 from the video image captured by the imaging device 12, and may add the moving amount to the start point to estimate the current position.
  • the image processing unit 14 may extract feature points from each frame image of the video. It then extracts the same feature points in subsequent frame images, and calculates the amount of movement of the moving body 100 by tracking the feature points, as in the sketch below.
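  • a minimal sketch of this extract-and-track loop using OpenCV (the corner detector, the pyramidal Lucas-Kanade tracker, the parameters, and the input file are illustrative assumptions; the publication does not prescribe them):

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("drive.mp4")          # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
# Feature point extraction in the first frame (cf. steps S23a..S23n).
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                              qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track the same feature points into the next frame.
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = nxt[status.flatten() == 1]
    if len(good_new) == 0:
        break
    flow = np.linalg.norm(good_new - good_old, axis=2).mean()
    print(f"tracked {len(good_new)} points, mean flow {flow:.1f} px")
    prev_gray, pts = gray, good_new.reshape(-1, 1, 2)
```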
  • the control unit 15 may output a command regarding the moving speed to the moving body 100 based on the image processing results of the image processing unit 14. For example, according to the number of obstacle pixels in the image, the number of outliers among the feature points in the image, the type of image processing, or the like, the control unit 15 may output a command to decrease or to maintain the moving speed of the moving body 100.
  • FIG. 2 is a flowchart showing an image processing procedure performed by the image processing unit 14.
  • the image processing unit 14 performs a plurality of moving body position candidate calculation processes 21 (21a, 21b,... 21n) and an estimation process 26 for estimating a moving body position from the plurality of moving body position candidates.
  • the plurality of mobile object position candidate calculation processes 21 may be executed in parallel with each other.
  • in the moving body position candidate calculation process 21a, an image captured by the imaging device 12a is processed.
  • in the moving body position candidate calculation processes 21b, …, 21n, images captured by the imaging devices 12b, …, 12n are processed, respectively.
  • that is, the image processing unit 14 may specify a position candidate of the moving body 100 from the image captured by each imaging device 12.
  • the plurality of moving object position candidate calculation processes 21 (21a, 21b,... 21n) execute common steps. Hereinafter, each step will be described.
  • the image processing unit 14 acquires image data from the memory 16 (S22a, S22b,... S22n).
  • the image data acquired here may be data of a plurality of frame images.
  • the image processing unit 14 extracts feature points in each acquired frame image (S23a, S23b,... S23n).
  • the feature point may be, for example, an edge or a corner in the image.
  • for feature point extraction, techniques such as Canny, Sobel, FAST, Hessian, and Gaussian may be used.
  • a specific algorithm is appropriately selected according to the feature of the image.
  • the image processing unit 14 tracks the extracted feature points across the frame images, calculates the amount of movement of the moving body 100 based on the tracking result, and calculates the current position candidate of the moving body 100 (S25a, S25b, … S25n).
  • for the movement amount calculation, for example, rigid body transformation, a sliding window, the fundamental matrix, the median, and the like can be used. Further, outlier removal and optimization processing such as a median filter, RANSAC, bundle adjustment, and key frame adjustment may be performed. One possible realization is sketched below.
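  • one possible monocular realization, using OpenCV's essential-matrix estimation with RANSAC outlier rejection (the intrinsic matrix K is a hypothetical placeholder; monocular recovery leaves the translation scale undetermined, so t is a unit vector):

```python
import cv2
import numpy as np

def relative_motion(pts_prev, pts_curr, K):
    """Estimate rotation R and unit-scale translation t between two frames
    from tracked feature points (Nx2 arrays), rejecting outliers by RANSAC."""
    E, inliers = cv2.findEssentialMat(pts_prev, pts_curr, K,
                                      method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts_prev, pts_curr, K, mask=inliers)
    inlier_ratio = inliers.sum() / len(pts_prev)   # usable as an outlier measure
    return R, t, inlier_ratio

K = np.array([[800.0, 0.0, 640.0],   # hypothetical camera intrinsics
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
```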
  • a plurality of position candidates are calculated by processing different video images in the plurality of moving body position candidate calculation processes 21a, 21b, …, 21n.
  • the position estimation process in step S26 estimates the current position of the moving body 100 by integrating the position candidates calculated by the respective moving body position candidate calculation processes 21a, 21b, …, 21n. Details of the position estimation process will be described later.
  • the maximum moving speed of the moving body 100 determined from the imaging range of the imaging device 12 will be described with reference to FIG.
  • FIG. 3 is a side view schematically showing the imaging device 12 attached to the moving body 100. The imaging device 12 is installed at a height H above the road surface 30 so that its optical axis makes an angle θ with the road surface 30. Further, the angle of view of the imaging device 12 is φ, the maximum imaging distance from the imaging device 12 to the road surface is D, and the maximum imaging range is R.
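  • a back-of-the-envelope version of this geometry, writing θ for the optical-axis angle and φ for the angle of view as above, under the reading that a feature point must stay inside the road imaging range R for several consecutive frames to be trackable (all numbers hypothetical):

```python
import math

def imaging_range(H, theta_deg, phi_deg):
    """Far and near road intersections of the view cone for a camera at
    height H whose optical axis makes angle theta with the road, with
    angle of view phi (requires theta > phi / 2)."""
    theta, phi = math.radians(theta_deg), math.radians(phi_deg)
    far = H / math.tan(theta - phi / 2)    # upper edge of the view cone
    near = H / math.tan(theta + phi / 2)   # lower edge of the view cone
    return far, near

H, theta, phi, fps, n_frames = 1.2, 30.0, 40.0, 10.0, 5
far, near = imaging_range(H, theta, phi)
R = far - near                     # maximum imaging range on the road
v_max = R * fps / (n_frames - 1)   # feature stays visible for n_frames
print(f"D = {far:.1f} m, R = {R:.1f} m, Vmax = {3.6 * v_max:.0f} km/h")
```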
  • FIG. 4 shows an example of one frame image I in the captured video image.
  • the figure shows an example in which an obstacle 34 is shown in the image 31.
  • the obstacle 34 is not a target for feature point extraction. Therefore, the amount of information obtained from the image 31 decreases as the area occupied by the obstacle 34 in the image increases: the number of feature points detected from the image 31 decreases, and the tracking accuracy falls. The image processing unit 14 may therefore perform position estimation according to the area of the obstacle in step S26, for example using a weighting factor W_ob due to the obstacle.
  • similarly, the image processing unit 14 may perform position estimation in step S26 according to the outliers 37 among the feature points in the image 31, for example using a weighting factor W_fp based on outliers, determined from the proportion of outliers among the extracted feature points.
  • the image processing unit 14 can use one or more of these in step S26.
  • the image processing unit 14 may use one of these weighting factors individually, or may integrate two or more of them into a single weighting factor W for each frame image I.
  • the weighting factor W(t) for the frame image I(t) at a certain time t is determined from the speed-based weighting factor W_v(t), the obstacle weighting factor W_ob(t), and the feature-point outlier weighting factor W_fp(t) at the same time, for example by the following equation (6):
  • W(t) = average(W_v(t), W_ob(t), W_fp(t)) … (6)
  • the image processing unit 14 estimates the vehicle position from the plurality of vehicle position candidates in step S26. For example, in order to estimate the vehicle position Pos(t) at a certain time t, the image processing unit 14 uses the weighting factor W(t) for the frame image I(t) at that time.
  • the host vehicle position Pos(t) at a certain time t may be calculated, for example, by equation (7):
  • Pos(t) = (Pa(t)·Wa(t) + Pb(t)·Wb(t) + … + Pn(t)·Wn(t)) / (Wa(t) + Wb(t) + … + Wn(t)) … (7)
  • in this way, the current position candidates calculated from a plurality of images can be integrated to estimate the current position of the moving body 100, as in the sketch below.
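  • a direct transcription of equations (6) and (7) (the numeric weights and the two-camera scenario in the example are illustrative assumptions):

```python
def combined_weight(w_v, w_ob, w_fp):
    """Equation (6): per-frame weight W(t) as the average of the speed,
    obstacle, and feature-point outlier weighting factors."""
    return (w_v + w_ob + w_fp) / 3.0

def fuse_positions(candidates, weights):
    """Equation (7): weighted average of the position candidates
    Pa(t)..Pn(t) produced by the per-image calculation processes."""
    total = sum(weights)
    x = sum(p[0] * w for p, w in zip(candidates, weights)) / total
    y = sum(p[1] * w for p, w in zip(candidates, weights)) / total
    return x, y

# Hypothetical frame: the front camera is trusted more than the rear one.
wa = combined_weight(w_v=0.9, w_ob=0.6, w_fp=0.8)   # imaging device 12a
wb = combined_weight(w_v=0.4, w_ob=1.0, w_fp=0.9)   # imaging device 12b
print(fuse_positions([(10.2, 4.1), (10.6, 3.9)], [wa, wb]))
```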
  • FIG. 5A shows an example of the moving body 100 equipped with the imaging device 12a and the imaging device 12b.
  • the imaging device 12a is an imaging device that is installed toward the front of the moving body 100 and images a distant view.
  • the imaging device 12a has an imaging range R_S.
  • the imaging device 12b is an imaging device that is installed to face downward behind the moving body 100 and images a close-up view.
  • the imaging device 12b has an imaging range R_A.
  • FIG. 5B is a graph showing the relationship between the time t and the vehicle speed V.
  • the speed Vmax_S is the maximum speed at which feature point tracking is possible with the images captured by the imaging device 12a, and the speed Vmax_A is the maximum speed at which feature point tracking is possible with the images captured by the imaging device 12b.
  • FIG. 5C is a graph showing the relationship between the time t and the weighting factor Wv depending on the speed.
  • time t_S is the time when the speed of the moving body 100 reaches Vmax_S, and time t_A is the time when the speed reaches Vmax_A.
  • the weight W_vS is the speed-based weight for the images of the imaging device 12a, and the weight W_vA is the speed-based weight for the images of the imaging device 12b.
  • FIG. 6A shows an example of the moving body 100 on which the imaging device 12a and the imaging device 12b are mounted, as in FIG. 5A.
  • in this example, an obstacle 51 is additionally present within the imaging range of the imaging device 12a.
  • FIG. 6B is a graph showing the relationship between the time t and the vehicle speed V, as in FIG. 5B.
  • FIG. 6C is a graph showing the relationship between time t and the speed-based weighting factor W_v and the obstacle weighting factor W_ob.
  • the weighting factor W_v due to speed is the same as in FIG. 5C.
  • time t_ob is the time when the obstacle 51 enters the imaging range R_S and begins to appear in the captured image.
  • time t_ob′ is the time when the obstacle 51 leaves the imaging range R_S and no longer appears in the captured image.
  • the control unit 15 may control the moving body 100 according to the processing result of the image processing unit 14.
  • An example is shown in FIG.
  • here, the detection of an obstacle is used as a trigger, but other triggers may be used; for example, the ratio of outliers among the feature points exceeding a predetermined value may also serve as a trigger.
  • FIG. 7 shows an example in which the speed of the moving body 100 changes under the control of the control unit 15 triggered by the detection of an obstacle under the same situation as FIG.
  • FIG. 7A is a graph showing the relationship between time t and host vehicle speed V, as in FIGS. 5B and 6B.
  • at time t_ob, the obstacle 51 enters the imaging range R_S of the imaging device 12a and appears in the image; the image processing unit 14 detects this and notifies the control unit 15.
  • the detection criterion may be, for example, that the number of pixels of the obstacle 51 is greater than or equal to a predetermined number, or that the ratio of obstacle pixels to the total number of pixels in the image is greater than or equal to a predetermined value.
  • upon receiving this notification, the control unit 15 outputs a command to maintain the speed at that time.
  • while the velocity of the moving body 100 is maintained, the speed-based weighting factor W_v is also maintained. Instead of maintaining the speed of the moving body, the speed may be decreased.
  • at time t_ob′, the obstacle 51 leaves the imaging range R_S of the imaging device 12a and disappears from the image.
  • the image processing unit 14 detects this and notifies the control unit 15.
  • the criterion may be, for example, that the number of pixels of the obstacle 51 falls to or below a predetermined number, or that the ratio of obstacle pixels to the total number of pixels falls to or below a predetermined value.
  • upon receiving this notification, the control unit 15 outputs a command to cancel the speed hold. The moving body 100 then accelerates again, and as the speed increases, the weighting factor due to speed also increases. A sketch of this hold/release logic follows.
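  • the hold/release trigger described above might look like the following sketch (the thresholds, the names, and the source of the obstacle pixel count are assumptions):

```python
def speed_command(obstacle_pixels, total_pixels, holding,
                  min_pixels=2000, min_ratio=0.05):
    """Decide whether to hold or release the current speed based on how
    much of the frame the detected obstacle occupies. Returns the command
    for the moving body and the new holding state."""
    ratio = obstacle_pixels / total_pixels
    detected = obstacle_pixels >= min_pixels or ratio >= min_ratio
    if not holding and detected:
        return "HOLD_SPEED", True      # obstacle entered the view: keep V
    if holding and not detected:
        return "RELEASE_HOLD", False   # obstacle gone: allow acceleration
    return "NO_CHANGE", holding

# Example: obstacle fills 8 % of a 1280x720 frame, not currently holding.
print(speed_command(int(0.08 * 1280 * 720), 1280 * 720, holding=False))
```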
  • the current position can be accurately estimated according to the surrounding conditions observed by the moving body 100.
  • the speed is kept constant or reduced to prevent further deterioration in position estimation accuracy due to speed increase.
  • FIG. 8 shows an example in the case of a single imaging device 12.
  • FIG. 8A shows an example of the moving body 100 equipped with one imaging device 12a.
  • here, the imaging device 12a is attached to the front of the moving body 100, but the attachment position of the imaging device is not limited to the front.
  • when there are a plurality of imaging devices 12, the moving body position candidate calculation process 21 is applied to the image captured by each imaging device 12 to calculate a position candidate for each image; when there is only one imaging device 12, different moving body position candidate calculation processes 21 are applied to the common image.
  • the moving body position candidate calculation process 21a is a non-robust process whose position estimation is more accurate than that of process 21b but more easily affected by an obstacle.
  • the moving body position candidate calculation process 21b is a robust process that is less affected by obstacles in position estimation but has lower position estimation accuracy.
  • here, the proportion of the pixels of the obstacle 51 in the total number of pixels of the image is assumed to be constant.
  • FIG. 8B shows the relationship between time t and the accuracy of the moving object position candidate calculation process 21a.
  • FIG. 8C shows the relationship between the time t and the accuracy of the moving object position candidate calculation process 21b.
  • the accuracy of the current position candidate calculation in the moving body position candidate calculation processes 21a and 21b changes according to the feature of the image.
  • in step S26, the image processing unit 14 may adopt the position candidate P_2 calculated by the more accurate moving body position candidate calculation process 21b as the estimated position Pos(t).
  • an example in which two processes having different susceptibility to an obstacle are applied to a common image may be used.
  • one of the processing results may be prioritized according to the speed of the moving body.
  • priority may be given to either processing result depending on the number of outliers. Further, these may be combined.
  • the number of processes is not limited as long as it is two or more.
  • 1 position estimation device, 12 imaging device, 13 information processing device, 14 image processing unit, 15 control unit, 16 memory, 17 display unit, 100 moving body

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present invention estimates the current location of a moving body with high accuracy. According to the present invention, a moving body location estimation device (1) comprises: an imaging device (12) attached to a moving body (100); and an image processing unit (14) that identifies a plurality of position candidates for the moving body (100) from images captured by the imaging device (12), and that estimates the location of the moving body (100) on the basis of the plurality of position candidates and the moving speed of the moving body (100).
PCT/JP2018/006148 2017-03-03 2018-02-21 Device and method for estimating the location of a moving body WO2018159398A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017040234A JP6865064B2 (ja) 2017-03-03 Position estimation device and method for a moving body
JP2017-040234 2017-03-03

Publications (1)

Publication Number Publication Date
WO2018159398A1 (fr)

Family

ID=63371255

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/006148 WO2018159398A1 (fr) 2017-03-03 2018-02-21 Device and method for estimating the location of a moving body

Country Status (2)

Country Link
JP (1) JP6865064B2 (fr)
WO (1) WO2018159398A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110220524A (zh) * 2019-04-23 2019-09-10 炬星科技(深圳)有限公司 Path planning method, electronic device, robot, and computer-readable storage medium
JP7347194B2 (ja) * 2019-12-18 2023-09-20 株式会社豊田自動織機 Travel control device
JP7393987B2 (ja) * 2020-03-18 2023-12-07 株式会社豊田中央研究所 Map creation device, position estimation device, vehicle control system, map creation method, computer program, and position estimation method
JP7402993B2 (ja) * 2020-08-11 2023-12-21 日立Astemo株式会社 Position estimation system
JP2024049400A (ja) * 2021-01-29 2024-04-10 株式会社Nttドコモ Information processing system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002197470A (ja) * 2000-12-27 2002-07-12 Nissan Motor Co Ltd Lane detection device
JP2007255979A (ja) * 2006-03-22 2007-10-04 Nissan Motor Co Ltd Object detection method and object detection device
JP2008175717A (ja) * 2007-01-19 2008-07-31 Xanavi Informatics Corp Current position calculation device and current position calculation method
JP2011065308A (ja) * 2009-09-16 2011-03-31 Hitachi Ltd Autonomous mobile system and autonomous mobile device
JP2012150655A (ja) * 2011-01-19 2012-08-09 Toyota Central R&D Labs Inc Motion estimation device and program

Also Published As

Publication number Publication date
JP6865064B2 (ja) 2021-04-28
JP2018146326A (ja) 2018-09-20

Similar Documents

Publication Publication Date Title
WO2018159398A1 (fr) 2018-09-07 Device and method for estimating the location of a moving body
KR101188588B1 (ko) 모노큘러 모션 스테레오 기반의 주차 공간 검출 장치 및방법
JP6567659B2 (ja) レーン検出装置およびレーン検出方法
JP6872128B2 (ja) 情報処理装置、情報処理方法、およびプログラム
JP6129981B2 (ja) 移動体位置推定装置および移動体位置推定方法
US11132813B2 (en) Distance estimation apparatus and method
JP5966747B2 (ja) 車両走行制御装置及びその方法
US20040125207A1 (en) Robust stereo-driven video-based surveillance
US11004233B1 (en) Intelligent vision-based detection and ranging system and method
EP2924655B1 (fr) Dispositif et procédé de déduction d'une valeur de disparité, système de commande d'équipement, appareil mobile, robot et support de stockage lisible par ordinateur
US11151729B2 (en) Mobile entity position estimation device and position estimation method
JP6032034B2 (ja) 物体検知装置
WO2019208101A1 (fr) Dispositif d'estimation de position
JP6780648B2 (ja) 情報処理装置、情報処理方法、およびプログラム
US11346670B2 (en) Position estimating device
BR102015005652A2 (pt) Apparatus for controlling imaging operations of an optical camera, and system for recognizing a vehicle number from a vehicle license plate
US11120292B2 (en) Distance estimation device, distance estimation method, and distance estimation computer program
WO2020230410A1 (fr) Objet mobile
US10643077B2 (en) Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program
JP6704307B2 (ja) 移動量算出装置および移動量算出方法
US20230023651A1 (en) Information processing apparatus, control system for mobile object, information processing method, and storage medium
KR20160090650A (ko) 설정 가능한 센서네트워크에서의 사용자 위치 추정 장치 및 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 18760888
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 18760888
Country of ref document: EP
Kind code of ref document: A1