US20150292891A1 - Vehicle position estimation system - Google Patents
Vehicle position estimation system
- Publication number
- US20150292891A1 (application US 14/251,015)
- Authority
- US
- United States
- Prior art keywords
- marker
- edge
- image
- vehicle
- map data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/16—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/802—Longitudinal distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/804—Relative longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Electromagnetism (AREA)
- Mathematical Physics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
Description
- 1. Field of the Invention
- The present invention generally relates to an autonomous vehicle. More specifically, the present invention relates to a system for determining or estimating a position of an autonomous vehicle.
- 2. Background Information
- Conventional vehicle position determination systems compute the position of a vehicle by comparing an image of a stored three dimensional map and a camera image. In particular, an edge image is extracted from the actual image acquired by a vehicle camera with which the vehicle is equipped. The position and attitude angle of the vehicle camera are adjusted so that a virtual image, projected from a three dimensional map that records the position and type of the edges of the environment in three dimensions, aligns with the edge image obtained at that position and attitude of the vehicle camera. Accordingly, the position and attitude angle of the vehicle camera in three dimensional space can be estimated.
- Moreover, successive images from cameras can be compared to determine the movement of the vehicle. Specifically, by comparing the location of a plurality of matching pixels from successive images, distance information can be obtained. The distance information can be compiled to determine movement of the vehicle in various directions and angles.
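- To make the pixel-matching idea concrete, the following sketch (plain NumPy; the sample point coordinates and function name are purely illustrative and not taken from the patent) fits a planar rotation and translation to the locations of matched pixels in two successive ground-plane images, which is one simple way distance and heading information can be compiled from frame to frame.

```python
import numpy as np

def rigid_motion_from_matches(pts_prev, pts_curr):
    """Least-squares (Kabsch) fit of a 2-D rotation and translation that maps
    matched pixel locations in the previous frame onto the current frame.
    Both inputs are (N, 2) arrays of corresponding points."""
    c_prev = pts_prev.mean(axis=0)
    c_curr = pts_curr.mean(axis=0)
    h = (pts_prev - c_prev).T @ (pts_curr - c_curr)   # cross-covariance
    u, _, vt = np.linalg.svd(h)
    rot = vt.T @ u.T
    if np.linalg.det(rot) < 0:                        # guard against a reflection
        vt[-1, :] *= -1
        rot = vt.T @ u.T
    trans = c_curr - rot @ c_prev
    return rot, trans   # pixel units; a camera calibration would convert to metres

# Hypothetical matched pixel locations from two successive frames.
prev_pts = np.array([[120.0, 300.0], [480.0, 310.0], [300.0, 150.0]])
curr_pts = np.array([[118.0, 305.0], [478.0, 315.0], [298.0, 155.0]])
R, t = rigid_motion_from_matches(prev_pts, curr_pts)
yaw_change_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
```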
- It has been discovered that, in vehicle position determination systems, expanding the matching target from an edge to specific markers, such as white line markers or stop line markers, increases accuracy. That is, if the pixels of a marker extracted from an image match a marker from a virtual map, those pixels are assigned a higher likelihood of being correct. Such a system results in increased accuracy in the determination of position.
- In one disclosed embodiment, a system for determining a position of a vehicle includes a camera, a marker detection device, a storage device, a positioning system, and a controller. The camera is configured to capture an image of an area adjacent the vehicle, the image including an edge. The marker detection device is configured to detect a marker in the area adjacent the vehicle. The storage device is configured to store map data, the stored map data including edge data. The positioning system is configured to determine the location of the camera relative to the stored map data. The controller is configured to combine the marker detected by the marker detection device and the edge in the image captured by the camera, and to compare the combined marker and edge to the stored map data.
- In another embodiment, a method for determining a position of a vehicle includes capturing an image of an area adjacent the vehicle, the image including an edge, detecting a marker in the area adjacent the vehicle, reading stored map data, the stored map data including edge data, determining the location of the vehicle relative to the stored map data, combining the detected marker and the edge in the image, and comparing the combined detected marker and the edge in the image captured by the camera to the stored map data.
- Referring now to the attached drawings which form a part of this original disclosure:
- FIG. 1 is a schematic top view of an autonomous vehicle having a vehicle position determining system according to one embodiment;
- FIG. 2 is a top plan view of the vehicle of FIG. 1 illustrating camera views of the vehicle position determining system according to one embodiment;
- FIG. 3 is a schematic view of an image captured by a camera from the vehicle position determining system;
- FIG. 4 is a schematic view of the image captured by the camera of FIG. 3 with the edge images illustrated;
- FIG. 5 is a schematic view of an image generated by a positioning system, including edge images; and
- FIG. 6 is a flow chart illustrating steps executed by a controller according to a disclosed embodiment.
- Selected embodiments will now be explained with reference to the drawings. It will be apparent to those skilled in the art from this disclosure that the following descriptions of the embodiments are provided for illustration only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
- The disclosed embodiments are for a vehicle position determining or estimating system 12 (e.g., a vehicle map matching system) disposed on a host autonomous vehicle, and configured to determine or estimate the position of the host autonomous vehicle 10 relative to a virtual map. It is noted that the vehicle position determining system 12 may be used in non-autonomous vehicles, to assist drivers, if desired. The vehicle position determining system 12 enables detection of markers and edges adjacent the host vehicle 10 to accurately calculate the estimated position of the vehicle 10 relative to the virtual map.
- Referring initially to FIG. 1, an autonomous vehicle 10 having a vehicle position determining system 12 is illustrated in accordance with a first embodiment. The vehicle position determining system 12 includes a controller 14, a plurality of cameras 16, 18, 20, 22, a positioning system 24, an image display device 26, and a marker detection device 28.
- The controller 14 preferably includes a microcomputer with a control program that controls the vehicle position determining system 12 as discussed below. The controller 14 can also include other conventional components such as an input interface circuit, an output interface circuit, and storage devices such as a ROM (Read Only Memory) device and a RAM (Random Access Memory) device. The microcomputer of the controller 14 is programmed to control one or more of the plurality of cameras 16, 18, 20, 22, the image display device 26, the marker detection device 28 and the positioning system 24, and to make determinations or decisions, as discussed herein. The memory circuit stores processing results and control programs, such as ones for operating the plurality of cameras 16, 18, 20, 22, the image display device 26, the marker detection device 28 and the positioning system 24, that are run by the processor circuit. The controller 14 is operatively coupled to the plurality of cameras 16, 18, 20, 22, the image display device 26, the marker detection device 28 and the positioning system 24 in a conventional manner, as well as to other electrical systems in the vehicle, such as the turn signals, windshield wipers, lights and any other suitable systems. Such a connection enables the controller 14 to monitor and control any of these systems as desired. The internal RAM of the controller 14 stores statuses of operational flags and various control data. The internal ROM of the controller 14 stores the information for various operations. The controller 14 is capable of selectively controlling any of the components of the vehicle position determining system 12 in accordance with the control program. It will be apparent to those skilled in the art from this disclosure that the precise structure and algorithms for the controller 14 can be any combination of hardware and software that will carry out the functions of the present invention.
- As illustrated in FIGS. 1 and 2, in one disclosed embodiment, a plurality of cameras 16, 18, 20, 22 are disposed on the external surface of the vehicle. The optical sensors are preferably cameras 16, 18, 20, 22; however, the optical sensors may be any type of suitable optical sensors. In this embodiment, the cameras 16, 18, 20, 22 include four digital cameras disposed in a front 30 of the vehicle 10, a rear 32 of the vehicle 10, on the left side mirror 34 of the vehicle 10 and the right side mirror 36. However, the cameras 16, 18, 20, 22 may be mounted on any suitable external portion of the host vehicle, including the front and rear quarter panels, or any combination of suitable areas. The cameras 16, 18, 20, 22 are preferably solid state image pickup devices, such as charge coupled devices (CCD). Additionally, as illustrated in FIG. 2, the cameras 16, 18, 20, 22 are arranged around the vehicle 10 and have lenses that enable imaging substantially surrounding or completely surrounding the host vehicle 10 (e.g., fish-eye cameras 16, 18, 20, 22, which have an enlarged angular field).
- In one embodiment, the positioning system 24 can include a plurality of vehicle sensors 38, 40, 42, and 44 that are configured to detect a remote object in proximity to the vehicle. As shown in FIG. 1, the remote vehicle sensors 38, 40, 42, and 44 are preferably mounted externally on the front quarter panels 46 a and 46 b and the rear quarter panels 48 a and 48 b of the vehicle 10. However, the sensors 38, 40, 42, and 44 may be mounted on any suitable external portion of the vehicle 10, including the front and rear bumpers, the external mirrors or any combination of suitable areas. The sensors 38, 40, 42, and 44 transmit data to the positioning system 24, which is then capable of using the sensor data to calculate the position of the vehicle 10 using odometry.
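- A minimal dead-reckoning sketch of the odometry calculation referred to above is shown below, assuming the sensors supply wheel speed and yaw rate at a fixed interval; the variable names, noise-free motion model and 10 Hz rate are illustrative assumptions rather than details from the patent.

```python
import math

def propagate_pose(x, y, heading, speed, yaw_rate, dt):
    """Advance a planar vehicle pose (x [m], y [m], heading [rad]) by one time
    step using simple dead reckoning from wheel-speed and yaw-rate readings."""
    heading += yaw_rate * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Example: three 10 Hz samples of (speed [m/s], yaw rate [rad/s]).
pose = (0.0, 0.0, 0.0)
for speed, yaw_rate in [(13.9, 0.00), (13.9, 0.02), (13.8, 0.05)]:
    pose = propagate_pose(*pose, speed, yaw_rate, dt=0.1)
```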
- The vehicle sensors 38, 40, 42, and 44 can be any type of sensors desirable. For example, the front sensors can include a long-range radar device for object detection in front of the host vehicle. The front sensor may be configured to detect objects at a predetermined distance (e.g., distances up to 200 m), and thus may have a narrow field of view angle (e.g., around 15°). Due to the narrow field of view angle, the long range radar may not detect all objects in the front of the host vehicle. The front corner sensors can include short-range radar devices to assist in monitoring the region in front of the host vehicle. The rear sensors may include short-range radar devices to assist in monitoring oncoming traffic beside and behind the host vehicle. Placement of the aforementioned sensors permits monitoring of traffic flow, including remote vehicles and other objects around the host vehicle, and the position of the vehicle 10 with respect to maintaining lane position or lane departure. However, the sensors 38, 40, 42, and 44 can be disposed in any position of the vehicle 10 and may include any type and/or combination of sensors to enable detection of remote objects. In addition, the sensors may be cameras, radar sensors, photo sensors or any combination thereof. Although FIG. 1 illustrates four sensors 38, 40, 42, and 44, there can be as few or as many sensors as desirable or suitable.
- Although the sensors 38, 40, 42, and 44 preferably are electronic detection devices that transmit electromagnetic waves (e.g., radar), these sensors can be any suitable sensors that, for example, take computer-processed images with a digital camera and analyze the images, or emit lasers, as is known in the art. The sensors may be capable of detecting at least the speed, direction, yaw, acceleration and distance of the vehicle 10 relative to a remote object. Further, the sensors 38, 40, 42, and 44 may include object-locating sensing devices including range sensors, such as FM-CW (Frequency Modulated Continuous Wave) radars, pulse and FSK (Frequency Shift Keying) radars, sonar and Lidar (Light Detection and Ranging) devices, and ultrasonic devices which rely upon effects such as Doppler-effect measurements to locate forward objects. The object-locating devices may include charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) video image sensors, and other known camera/video image processors which utilize digital photographic methods to “view” forward objects including one or more remote vehicles. The sensors are in communication with the controller 14 through the positioning system 24, and are capable of transmitting information to the controller 14.
- Moreover, as illustrated in FIGS. 1 and 2, the positioning system 24 may include a wireless communications device, such as a GPS (Global Positioning System) receiver. In one embodiment, the vehicle 10 receives a GPS satellite signal. As is understood, the GPS processes the GPS satellite signal to determine positional information (such as location, speed, acceleration, yaw, and direction, just to name a few) of the vehicle 10. As noted herein, the positioning system 24 is in communication with the controller 14, and is capable of transmitting such positional information regarding the vehicle 10 to the controller 14.
- The positioning system 24 also can include a storage device that stores map data. Thus, in determining the position of the vehicle 10 using any of the herein described methods, devices or systems, the position of the vehicle 10 may be compared to the known data stored in the storage device. The storage device may also store any additional information, including the current or predicted vehicle position, any past vehicle position, or any other suitable information.
- Preferably, the vehicle 10 is provided with a marker detection device 28 that detects the position of the vehicle 10 in the driving lane in order to detect the lane departure tendency of the host vehicle. The marker detection device 28 includes lane detection software in a lane departure device. The lane departure device generally includes an imaging device that has a picture processing function and preferably includes a camera. In one embodiment, the lane departure device may use the cameras 16, 18, 20, 22. In other words, the cameras in the lane departure device can be any suitable camera, and may be a stand-alone camera or any one or more of the cameras 16, 18, 20, 22. The imaging unit is designed to detect the position of the vehicle 10 in the driving lane in order to detect the lane departure tendency of the host vehicle. Moreover, as discussed herein, the lane departure device is configured to detect markers on the road surface or any area adjacent the vehicle.
- The controller 14 communicates with the imaging device in the lane departure device and is preferably configured and arranged to detect white lines or other markers, for example, from the imaging picture, preferably from the front of the vehicle 10. Thus, the driving lane is detected based on the detected lane markers. Furthermore, the imaging device can calculate the angle (yaw angle) formed by the driving lane and the longitudinal axis of the vehicle 10, the lateral displacement from the center of the driving lane, the driving lane curvature, the lane width, and so forth. The imaging device outputs the calculated yaw angle, the calculated lateral displacement, the calculated driving lane curvature, the lane width, and the like to the controller 14.
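- As an illustration of how the yaw angle, lateral displacement and lane width can be derived, the sketch below fits each detected lane boundary with a straight line in a vehicle coordinate frame (x forward, y to the left); the coordinate convention, straight-line model and sample layout are assumptions made for the example, not specifics from the patent.

```python
import numpy as np

def lane_geometry(left_pts, right_pts):
    """left_pts / right_pts: (N, 2) arrays of lane-marker points in vehicle
    coordinates (x forward, y left, metres). Returns the yaw angle of the lane
    relative to the vehicle axis, the lateral offset of the vehicle from the
    lane centre, and an approximate lane width."""
    def fit_line(points):                      # y = slope * x + intercept
        slope, intercept = np.polyfit(points[:, 0], points[:, 1], 1)
        return slope, intercept

    sl, il = fit_line(np.asarray(left_pts, dtype=float))
    sr, ir = fit_line(np.asarray(right_pts, dtype=float))
    yaw_angle = np.arctan(0.5 * (sl + sr))     # rad; 0 when driving parallel to the lane
    lateral_offset = 0.5 * (il + ir)           # lane-centre offset at the vehicle origin
    lane_width = abs(il - ir) * np.cos(yaw_angle)  # perpendicular width, approximately
    return yaw_angle, lateral_offset, lane_width
```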
- Moreover, the vehicle position determining system 12 further includes a display device 26 (i.e., an image displaying device) that is mounted in an interior of the vehicle 10, such as in an instrument panel of the vehicle 10 as illustrated in FIG. 1. The display device 26 is configured and arranged to display the display image generated by the controller 14 for a driver of the vehicle. Thus, the display device 26 is operatively connected to the controller 14 in a conventional manner, such as using wireless communication or wires, such that the controller 14 can control the operations of the display device 26. More specifically, the controller 14 is configured to generate a video image including the regions directly forward, rearward and laterally of the vehicle 10 based on the images captured by the cameras 16, 18, 20, 22, and to display the generated image on the display device 26. In other words, the display device 26 is operatively connected to the cameras 16, 18, 20, 22 via the controller 14 to display images captured by the cameras 16, 18, 20, 22. In this embodiment, the controller 14 is programmed to process the images of the cameras 16, 18, 20, 22 to display a peripheral view of the vehicle 10 (i.e., a composite 360 degree top view image) around the vehicle.
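- One common way to build a composite top view like the one described above is to warp each camera image onto the ground plane with a pre-calibrated homography and overlay the results; the sketch below assumes such 3x3 image-to-ground homographies already exist (they are not part of the patent's disclosure) and simply lets later images overwrite earlier ones where they overlap.

```python
import cv2
import numpy as np

def build_top_view(images, homographies, out_size=(800, 800)):
    """images: list of BGR camera frames; homographies: matching list of 3x3
    image-to-ground-plane matrices. Returns a schematic bird's-eye composite."""
    top = np.zeros((out_size[1], out_size[0], 3), dtype=np.uint8)
    for img, H in zip(images, homographies):
        warped = cv2.warpPerspective(img, H, out_size)   # project onto the ground plane
        mask = warped.any(axis=2)                        # pixels this camera actually covers
        top[mask] = warped[mask]
    return top
```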
- As illustrated in FIGS. 2 and 3, in one embodiment, at least one of the cameras 16, 18, 20, 22 captures an image of an area adjacent the vehicle. The image is formed from an array of pixels and preferably includes an edge 50 or a plurality of edges. An edge in this image may refer to a part of the image in which the luminance of a pixel sharply changes (e.g., curbs, lane markers or edges of roads). In one embodiment, the Canny edge detecting method may be used. However, it is noted that any suitable edge detection method or device may be used. The marker detection system then detects a marker 52 in the area adjacent the vehicle. As stated above, the marker 52 may be a white line, a stop line, a traffic signal, a car pool lane (high occupancy vehicle lane), a crosswalk or any other suitable marker. Further, if desired, when there are two markers (i.e., a first marker 52 a and a second marker 52 b), as shown in FIGS. 2 and 3, the first marker being, for example, a stop line and the second marker being, for example, a line identifying the lane, the controller 14 can be configured to add more value to the pixels in the array of pixels that correspond to the stop line. Thus, when there are multiple markers, the value of any desired marker can be more, less or equal to any other marker.
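- A hedged sketch of the edge extraction and marker weighting just described is given below, using OpenCV's Canny detector; the thresholds, the pre-computed marker masks and the particular weights (a stop line counted higher than a lane line) are illustrative choices, not values stated in the patent.

```python
import cv2
import numpy as np

def edge_and_marker_weights(gray_image, stop_line_mask, lane_line_mask):
    """Return a per-pixel weight map: ordinary edge pixels get a base value,
    lane-line pixels a larger one, and stop-line pixels the largest.
    gray_image is a single-channel image; the masks are same-size binary arrays."""
    edges = cv2.Canny(gray_image, 50, 150)                # binary edge image (0 or 255)
    weights = np.zeros(gray_image.shape, dtype=np.float32)
    weights[edges > 0] = 1.0                              # plain edge
    weights[(edges > 0) & (lane_line_mask > 0)] = 2.0     # lane-identifying marker
    weights[(edges > 0) & (stop_line_mask > 0)] = 3.0     # stop line weighted highest
    return weights
```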
- As shown in FIG. 4, the controller 14 then combines the marker detected by the marker detection device 28 and the edges in the image captured by the camera. That is, the controller 14 compares information stored in the storage device in the positioning system 24 to the marker data and edges in the image captured by the camera.
- The vehicle position determining system 12 determines the position of the vehicle 10 using the positioning system 24. That is, as would be understood, a resampling of a particle filter based on a previous predicted vehicle location can be used to determine the vehicle location. Additionally, if desired, the positioning system 24 may use a GPS to determine the vehicle location, or any suitable system or method or combination of systems or methods. The controller 14 then predicts the position of the vehicle 10 based on odometry information. Such information may be acquired via the sensors or in any other suitable manner.
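- The particle-filter step mentioned here might look roughly like the sketch below: each particle is a pose hypothesis that is propagated with the odometry prediction and then resampled in proportion to its matching score. The process-noise levels and the systematic-resampling scheme are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, speed, yaw_rate, dt):
    """particles: (N, 3) array of (x, y, heading). Propagate every hypothesis
    with the odometry reading plus a little process noise."""
    n = len(particles)
    particles[:, 2] += yaw_rate * dt + rng.normal(0.0, 0.01, n)
    step = speed * dt + rng.normal(0.0, 0.05, n)
    particles[:, 0] += step * np.cos(particles[:, 2])
    particles[:, 1] += step * np.sin(particles[:, 2])
    return particles

def resample(particles, weights):
    """Systematic resampling: draw particles in proportion to their weights."""
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    positions = (rng.random() + np.arange(len(weights))) / len(weights)
    indices = np.searchsorted(np.cumsum(weights), positions)
    return particles[indices].copy()
```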
- Moreover, as shown in FIG. 5, the prediction of the vehicle location includes the controller 14 reading stored map data from the storage device and projecting the 3D map to a hypothetical camera. In this embodiment, the 3D map includes projected edge images. The controller 14 then compares the combined detected marker and edge in the image captured by the camera to the 3D map from the stored map data. Since the vehicle position has been determined by the positioning system 24, the 3D map from the stored map data generally corresponds to the image captured by the camera.
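- The projection of the stored 3-D map into a hypothetical camera can be sketched with a basic pinhole model, as below; the intrinsic matrix, the world-to-camera pose convention and the map points are placeholders rather than values from the patent.

```python
import numpy as np

def project_map_points(map_points_world, R, t, K):
    """Project (N, 3) map edge/marker points into pixel coordinates of a
    hypothetical camera with intrinsics K and world-to-camera pose (R, t)."""
    cam = (R @ map_points_world.T).T + t       # world frame -> camera frame
    visible = cam[:, 2] > 0.1                  # keep points in front of the camera
    uv = (K @ cam[visible].T).T
    uv = uv[:, :2] / uv[:, 2:3]                # perspective divide
    return uv, visible

# Illustrative intrinsics for a 1280x720 image.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
```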
- Thus, in this embodiment, each pixel (e.g., pixel 54; see FIGS. 4 and 5) in the pixel array is compared to the 3D map. A value is given to each pixel in the camera image that matches a hypothetical pixel in the 3D map, and when the pixel 54 corresponds to a marker 52, additional value is added. That is, the controller 14 is configured to add value to a pixel 54 in the array of pixels that corresponds to the marker 52 when the pixel that corresponds to the marker matches a marker 56 in the image generated by the controller 14. Additionally, if desired, the controller 14 is configured to add value to a pixel in the array of pixels that corresponds to the edge when the pixel that corresponds to the edge matches an edge in the image generated by the controller 14.
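- Scoring a candidate pose with the pixel comparison described above might be sketched as follows, where the projected map edges and markers have already been rasterized into binary images of the same size as the camera image; the particular bonus given to marker pixels is an illustrative assumption.

```python
import numpy as np

def pose_likelihood(camera_edges, camera_markers, map_edges, map_markers,
                    edge_value=1.0, marker_bonus=2.0):
    """All inputs are boolean arrays of identical shape. Every camera edge pixel
    that coincides with a projected map edge adds edge_value; if that pixel also
    lies on a detected marker that matches a map marker, marker_bonus is added."""
    edge_matches = camera_edges & map_edges
    marker_matches = edge_matches & camera_markers & map_markers
    return edge_value * edge_matches.sum() + marker_bonus * marker_matches.sum()
```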
- Value may be added to both the position likelihood and the angular likelihood (i.e., the angle of the vehicle). In other words, the controller 14 determines whether at least one pixel from the combined detected marker and edge in the image matches a pixel from the stored map data. When at least one pixel from the combined detected marker and edge in the image matches a pixel from the stored map data, value is added to that pixel. The controller 14 uses this data to calculate an estimated position of the vehicle. The value of the matching pixel may be increased to raise the predicted position likelihood and/or angular likelihood. When no pixels match, the controller 14 restarts the process and causes the cameras 16, 18, 20, 22 to capture another image adjacent the vehicle.
- In other words, the controller 14 estimates the position and attitude angle of the vehicle 10 using the three-dimensional image captured by the cameras 16, 18, 20, 22 and the three-dimensional map data stored in the storage device. For example, in one embodiment, the controller 14 compares the captured image from the camera with a virtual image obtained by converting the three dimensional map data into an image viewed from the virtual position and virtual attitude angle, and estimates the position and attitude angle of the vehicle. The detected marker and the detected edge can be used to increase the likelihood of a correct vehicle position estimate.
- Basically, as illustrated in FIG. 6, the vehicle position determining system 12 captures an image adjacent the vehicle 10 using a camera. The marker detection device 28 detects markers 52 in an area adjacent the vehicle. The positioning system 24 reads map data stored in the storage device, and determines the location of the vehicle 10 relative to the stored map data. The controller 14 combines the detected marker 52 and the edge 50 in the image, and compares the combined detected marker 52 and edge 50 in the image to the stored map data. If at least one pixel in the combined detected marker and edge in the image does not match a pixel from the stored map data, the controller 14 instructs the cameras 16, 18, 20, 22 to capture another image, and restarts the procedure. If at least one pixel 54 in the combined detected marker and edge in the image does match a pixel from the stored map data, the controller 14 adds value to the pixels in the array of pixels that correspond to the marker or markers. The value can be related to position and/or vehicle angle. The estimated position of the vehicle 10 is then calculated. This vehicle position determining system 12 is capable of accurately estimating a position of an autonomous vehicle 10 or any other system or device.
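- Tying the FIG. 6 flow together, one high-level iteration might look like the sketch below. Every callable passed in is a hypothetical stand-in for the pieces sketched earlier (edge extraction, marker detection, map projection, pixel scoring and particle resampling); none of these names come from the patent.

```python
def estimate_position_step(image, particles,
                           extract_edges, detect_markers, project_map,
                           score_pose, resample_particles, mean_pose):
    """One FIG. 6 style iteration: score every pose hypothesis against the
    combined edge/marker image, resample, and return an updated estimate.
    Returns (particles, None) when nothing matched, signalling that another
    image should be captured and the procedure restarted."""
    edges = extract_edges(image)                  # e.g. Canny edges, as sketched above
    markers = detect_markers(image)               # stop lines, lane lines, ...
    scores = [score_pose(edges, markers, project_map(pose)) for pose in particles]
    if max(scores) == 0:                          # no pixel matched the stored map data
        return particles, None
    particles = resample_particles(particles, scores)
    return particles, mean_pose(particles, scores)
```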
- In understanding the scope of the present invention, the term “comprising” and its derivatives, as used herein, are intended to be open ended terms that specify the presence of the stated features, elements, components, groups, integers, and/or steps, but do not exclude the presence of other unstated features, elements, components, groups, integers and/or steps. The foregoing also applies to words having similar meanings such as the terms “including”, “having” and their derivatives. Also, the terms “part,” “section,” “portion,” or “element” when used in the singular can have the dual meaning of a single part or a plurality of parts. Also, as used herein to describe the above embodiment(s), the directional terms “front” and “rear”, as well as any other similar directional terms, refer to those directions of a vehicle equipped with the vehicle position determining system. Accordingly, these terms, as utilized to describe the present invention, should be interpreted relative to a vehicle equipped with the vehicle position determining system.
- The term “detect” as used herein to describe an operation or function carried out by a component, a section, a device or the like includes a component, a section, a device or the like that does not require physical detection, but rather includes determining, measuring, modeling, predicting or computing or the like to carry out the operation or function.
- The term “configured” as used herein to describe a component, section or part of a device includes hardware and/or software that is constructed and/or programmed to carry out the desired function.
- The terms of degree such as “substantially”, as used herein mean a reasonable amount of deviation of the modified term such that the end result is not significantly changed.
- While only selected embodiments have been chosen to illustrate the present invention, it will be apparent to those skilled in the art from this disclosure that various changes and modifications can be made herein without departing from the scope of the invention as defined in the appended claims. For example, the size, shape, location or orientation of the various components can be changed as needed and/or desired. Components that are shown directly connected or contacting each other can have intermediate structures disposed between them. The functions of one element can be performed by two, and vice versa. The structures and functions of one embodiment can be adopted in another embodiment. It is not necessary for all advantages to be present in a particular embodiment at the same time. Every feature which is unique from the prior art, alone or in combination with other features, also should be considered a separate description of further inventions by the applicant, including the structural and/or functional concepts embodied by such feature(s). Thus, the foregoing descriptions of the embodiments according to the present invention are provided for illustration only, and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/251,015 US9151626B1 (en) | 2014-04-11 | 2014-04-11 | Vehicle position estimation system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/251,015 US9151626B1 (en) | 2014-04-11 | 2014-04-11 | Vehicle position estimation system |
Publications (2)
Publication Number | Publication Date |
---|---|
US9151626B1 US9151626B1 (en) | 2015-10-06 |
US20150292891A1 true US20150292891A1 (en) | 2015-10-15 |
Family
ID=54203738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/251,015 Active US9151626B1 (en) | 2014-04-11 | 2014-04-11 | Vehicle position estimation system |
Country Status (1)
Country | Link |
---|---|
US (1) | US9151626B1 (en) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9625264B1 (en) * | 2016-01-20 | 2017-04-18 | Denso Corporation | Systems and methods for displaying route information |
JP2017116363A (en) * | 2015-12-24 | 2017-06-29 | アイシン・エィ・ダブリュ株式会社 | Vehicle position estimation system, method, and program |
JP2017117386A (en) * | 2015-12-25 | 2017-06-29 | 学校法人千葉工業大学 | Self-motion estimation system, control method and program of self-motion estimation system |
JP2017223483A (en) * | 2016-06-14 | 2017-12-21 | 日立オートモティブシステムズ株式会社 | Own vehicle position estimation device |
EP3293488A3 (en) * | 2016-08-19 | 2018-06-13 | Dura Operating, LLC | System and method of simultaneously generating a multiple lane map and localizing a vehicle in the generated map |
EP3421936A1 (en) | 2017-06-30 | 2019-01-02 | Panasonic Automotive & Industrial Systems Europe GmbH | Optical marker element for geo location information |
WO2019128496A1 (en) * | 2017-12-29 | 2019-07-04 | 北京三快在线科技有限公司 | Device motion control |
US10430968B2 (en) | 2017-03-14 | 2019-10-01 | Ford Global Technologies, Llc | Vehicle localization using cameras |
WO2020146102A1 (en) * | 2019-01-08 | 2020-07-16 | Qualcomm Incorporated | Robust lane association by projecting 2-d image into 3-d world using map information |
CN112013859A (en) * | 2020-10-19 | 2020-12-01 | 四川京炜数字科技有限公司 | Method for rapidly acquiring accurate position of road marking |
US11009356B2 (en) * | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization and fusion |
US11009365B2 (en) | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization |
WO2021115455A1 (en) * | 2019-12-13 | 2021-06-17 | 上海商汤临港智能科技有限公司 | Traffic information identification and smart traveling method, device, apparatus, and storage medium |
EP3893150A1 (en) * | 2020-04-09 | 2021-10-13 | Tusimple, Inc. | Camera pose estimation techniques |
US11573095B2 (en) | 2017-08-22 | 2023-02-07 | Tusimple, Inc. | Verification module system and method for motion-based lane detection with multiple sensors |
US20230408264A1 (en) * | 2018-02-14 | 2023-12-21 | Tusimple, Inc. | Lane marking localization and fusion |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9613386B1 (en) | 2015-12-01 | 2017-04-04 | Google Inc. | Pickup and drop off zones for autonomous vehicles |
EP3190022B1 (en) * | 2016-01-11 | 2018-08-29 | Delphi Technologies, Inc. | Lane extension for vision steered automated vehicle |
DE102016213782A1 (en) | 2016-07-27 | 2018-02-01 | Volkswagen Aktiengesellschaft | A method, apparatus and computer readable storage medium having instructions for determining the lateral position of a vehicle relative to the lanes of a lane |
DE102016213817B4 (en) | 2016-07-27 | 2019-03-07 | Volkswagen Aktiengesellschaft | A method, apparatus and computer readable storage medium having instructions for determining the lateral position of a vehicle relative to the lanes of a lane |
US10377375B2 (en) * | 2016-09-29 | 2019-08-13 | The Charles Stark Draper Laboratory, Inc. | Autonomous vehicle: modular architecture |
US10599150B2 (en) | 2016-09-29 | 2020-03-24 | The Charles Stark Draper Laboratory, Inc. | Autonomous vehicle: object-level fusion |
US10101745B1 (en) | 2017-04-26 | 2018-10-16 | The Charles Stark Draper Laboratory, Inc. | Enhancing autonomous vehicle perception with off-vehicle collected data |
US10202048B2 (en) | 2017-06-28 | 2019-02-12 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for adjusting operation of a vehicle according to HOV lane detection in traffic |
DE102017216954A1 (en) * | 2017-09-25 | 2019-03-28 | Robert Bosch Gmbh | Method and device for determining a highly accurate position and for operating an automated vehicle |
FR3078398B1 (en) * | 2018-02-27 | 2020-08-07 | Renault Sas | METHOD FOR ESTIMATING THE POSITION OF A VEHICLE ON A MAP |
KR102420476B1 (en) * | 2018-05-25 | 2022-07-13 | 에스케이텔레콤 주식회사 | Apparatus and method for estimating location of vehicle and computer recordable medium storing computer program thereof |
US11249184B2 (en) | 2019-05-07 | 2022-02-15 | The Charles Stark Draper Laboratory, Inc. | Autonomous collision avoidance through physical layer tracking |
NL2023628B9 (en) * | 2019-08-09 | 2021-08-20 | Wilhelmus Maria Van Bentum Johannes | System for controlling an autonomous driving vehicle or (air)vehicle, autonomously driving vehicle or (air)vehicle, which can be controlled on the basis of steering and acceleration values, provided with such a system. |
US11189007B2 (en) * | 2019-12-03 | 2021-11-30 | Imagry (Israel) Ltd | Real-time generation of functional road maps |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5757289A (en) * | 1994-09-14 | 1998-05-26 | Aisin Aw Co., Ltd. | Vehicular navigation system |
US5642106A (en) * | 1994-12-27 | 1997-06-24 | Siemens Corporate Research, Inc. | Visual incremental turn detector |
KR100224326B1 (en) * | 1995-12-26 | 1999-10-15 | 모리 하루오 | Car navigation system |
JP2001289654A (en) * | 2000-04-11 | 2001-10-19 | Equos Research Co Ltd | Navigator, method of controlling navigator and memory medium having recorded programs |
JP5910180B2 (en) | 2012-03-06 | 2016-04-27 | 日産自動車株式会社 | Moving object position and orientation estimation apparatus and method |
- 2014-04-11: US application 14/251,015 filed (US); issued as patent US9151626B1, legal status: Active
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017116363A (en) * | 2015-12-24 | 2017-06-29 | アイシン・エィ・ダブリュ株式会社 | Vehicle position estimation system, method, and program |
JP2017117386A (en) * | 2015-12-25 | 2017-06-29 | 学校法人千葉工業大学 | Self-motion estimation system, control method and program of self-motion estimation system |
US9625264B1 (en) * | 2016-01-20 | 2017-04-18 | Denso Corporation | Systems and methods for displaying route information |
JP2017223483A (en) * | 2016-06-14 | 2017-12-21 | 日立オートモティブシステムズ株式会社 | Own vehicle position estimation device |
EP3293488A3 (en) * | 2016-08-19 | 2018-06-13 | Dura Operating, LLC | System and method of simultaneously generating a multiple lane map and localizing a vehicle in the generated map |
US10210406B2 (en) | 2016-08-19 | 2019-02-19 | Dura Operating, Llc | System and method of simultaneously generating a multiple lane map and localizing a vehicle in the generated map |
US11216972B2 (en) | 2017-03-14 | 2022-01-04 | Ford Global Technologies, Llc | Vehicle localization using cameras |
US10430968B2 (en) | 2017-03-14 | 2019-10-01 | Ford Global Technologies, Llc | Vehicle localization using cameras |
EP3421936A1 (en) | 2017-06-30 | 2019-01-02 | Panasonic Automotive & Industrial Systems Europe GmbH | Optical marker element for geo location information |
US11573095B2 (en) | 2017-08-22 | 2023-02-07 | Tusimple, Inc. | Verification module system and method for motion-based lane detection with multiple sensors |
US11874130B2 (en) | 2017-08-22 | 2024-01-16 | Tusimple, Inc. | Verification module system and method for motion-based lane detection with multiple sensors |
WO2019128496A1 (en) * | 2017-12-29 | 2019-07-04 | 北京三快在线科技有限公司 | Device motion control |
US11009356B2 (en) * | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization and fusion |
US11009365B2 (en) | 2018-02-14 | 2021-05-18 | Tusimple, Inc. | Lane marking localization |
US20210278221A1 (en) * | 2018-02-14 | 2021-09-09 | Tusimple, Inc. | Lane marking localization and fusion |
US11852498B2 (en) | 2018-02-14 | 2023-12-26 | Tusimple, Inc. | Lane marking localization |
US20230408264A1 (en) * | 2018-02-14 | 2023-12-21 | Tusimple, Inc. | Lane marking localization and fusion |
US11740093B2 (en) * | 2018-02-14 | 2023-08-29 | Tusimple, Inc. | Lane marking localization and fusion |
WO2020146102A1 (en) * | 2019-01-08 | 2020-07-16 | Qualcomm Incorporated | Robust lane association by projecting 2-d image into 3-d world using map information |
US11227168B2 (en) | 2019-01-08 | 2022-01-18 | Qualcomm Incorporated | Robust lane association by projecting 2-D image into 3-D world using map information |
WO2021115455A1 (en) * | 2019-12-13 | 2021-06-17 | 上海商汤临港智能科技有限公司 | Traffic information identification and smart traveling method, device, apparatus, and storage medium |
US11810322B2 (en) | 2020-04-09 | 2023-11-07 | Tusimple, Inc. | Camera pose estimation techniques |
EP3893150A1 (en) * | 2020-04-09 | 2021-10-13 | Tusimple, Inc. | Camera pose estimation techniques |
CN112013859A (en) * | 2020-10-19 | 2020-12-01 | 四川京炜数字科技有限公司 | Method for rapidly acquiring accurate position of road marking |
Also Published As
Publication number | Publication date |
---|---|
US9151626B1 (en) | 2015-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9151626B1 (en) | Vehicle position estimation system | |
US9863775B2 (en) | Vehicle localization system | |
JP7301909B2 (en) | Determination of yaw error from map data, lasers, and cameras | |
US11836989B2 (en) | Vehicular vision system that determines distance to an object | |
US11312353B2 (en) | Vehicular control system with vehicle trajectory tracking | |
US10962638B2 (en) | Vehicle radar sensing system with surface modeling | |
US11315348B2 (en) | Vehicular vision system with object detection | |
US11508122B2 (en) | Bounding box estimation and object detection | |
US8559674B2 (en) | Moving state estimating device | |
JP2019128350A (en) | Image processing method, image processing device, on-vehicle device, moving body and system | |
KR20190067578A (en) | Collision warning device and method using heterogeneous cameras having overlapped capture area | |
US10249056B2 (en) | Vehicle position estimation system | |
JP2020047210A (en) | Object detection device | |
US20220176960A1 (en) | Vehicular control system with vehicle control based on stored target object position and heading information | |
US20240227693A1 (en) | Object detector and method for object detection | |
CN114868150A (en) | Information processing device, sensing device, moving object, and information processing method | |
JP2006286010A (en) | Obstacle detecting device and its method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NISSAN NORTH AMERICA, INC., TENNESSEE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KOJO, NAOKI;REEL/FRAME:032660/0147 Effective date: 20140410 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
AS | Assignment |
Owner name: NISSAN MOTOR CO., LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NISSAN NORTH AMERICA, INC.;REEL/FRAME:038059/0146 Effective date: 20160308 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |