US20090252380A1 - Moving object trajectory estimating device - Google Patents
Moving object trajectory estimating device
- Publication number
- US20090252380A1 (U.S. application Ser. No. 12/413,659)
- Authority
- US
- United States
- Prior art keywords
- moving object
- trajectory
- vehicle
- specified
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
Abstract
A moving object trajectory estimating device has: a surrounding information acquisition part that acquires information on surroundings of a moving object; a trajectory estimating part that specifies another moving object around the moving object based on the acquired surrounding information and estimates a trajectory of the specified moving object; and a recognition information acquisition part that acquires recognition information on a recognizable area of the specified moving object, and the trajectory estimating part estimates a trajectory of the specified moving object, based on the acquired recognition information of the specified moving object.
Description
- The disclosure of Japanese Patent Application No. 2008-099447 filed on Apr. 7, 2008 including the specification, drawings and abstract is incorporated herein by reference in its entirety.
- 1. Field of the Invention
- The invention relates to a moving object trajectory estimating device, which estimates the trajectory of a vehicle or other moving object.
- 2. Description of the Related Art
- A moving object trajectory estimating device is described in, for example, Japanese Patent Application Publication No. 2007-230454 (JP-A-2007-230454). The device estimates the trajectory that a specified object out of a plurality of objects may follow: changes in position that the objects might take with the lapse of time are generated as tracks in a space-time constituted of time and space; the tracks are used to predict the trajectories of the plurality of objects; and, based on the prediction result, the degree of interference between the trajectory that the specified object may follow and the trajectories that the other objects may follow is quantitatively calculated.
- However, in the moving object trajectory estimating device according to the related art, the estimation is performed in consideration of the movements of the other objects present around the specified object whose trajectory is to be estimated. Therefore, the movements of other objects that are invisible to the specified object are also taken into consideration. As a result, appropriate trajectory estimation might not be performed.
- The invention provides a moving object trajectory estimating device that estimates an appropriate trajectory.
- A moving object trajectory estimating device according to the invention includes: a surrounding information acquisition part that acquires information on the surroundings of a moving object; a trajectory estimating part that specifies another moving object around the moving object based on the surrounding information acquired by the surrounding information acquisition part and estimates the trajectory of the specified moving object; and a recognition information acquisition part that acquires recognition information on a recognizable area of the specified moving object, wherein the trajectory estimating part estimates the trajectory of the specified moving object based on the recognition information of the specified moving object acquired by the recognition information acquisition part.
- According to this aspect, by estimating the trajectory of the specified moving object based on the recognition information of the specified moving object acquired by the recognition information acquisition part, the trajectory of the specified moving object can be estimated more accurately. Therefore, estimation of the trajectory of the specified moving object from the perspective of the specified moving object allows appropriate trajectory estimation. In addition, because it is not necessary to take into consideration any information other than the information recognized by the specified moving object in this case, the speed of the estimation processing can be improved, and the accuracy of the trajectory estimation can be enhanced. The recognition information here includes not only information that is directly visible to the specified moving object but also information that is not directly visible but can be obtained through communication.
- According to this invention, a moving object trajectory estimating device that performs appropriate trajectory estimation can be provided.
- The foregoing and further objects, features and advantages of the invention will become more apparent from the following description of preferred embodiments with reference to the accompanying drawings, in which like numerals are used to represent like elements and wherein:
- FIG. 1 is a block diagram showing the structure of a moving object trajectory estimating device according to a first embodiment of the invention;
- FIG. 2 is an explanatory diagram showing a situation in which the moving object trajectory estimating device according to the first and second embodiments of the invention is applied at a T intersection;
- FIG. 3 is a flowchart showing an operation of the moving object trajectory estimating device according to the first embodiment of the invention;
- FIG. 4 is a block diagram showing the structure of a moving object trajectory estimating device according to the second embodiment of the invention;
- FIG. 5 is a flowchart showing an operation of the moving object trajectory estimating device according to the second embodiment of the invention;
- FIG. 6 is a block diagram showing the structure of a moving object trajectory estimating device according to a third embodiment of the invention;
- FIG. 7 is an explanatory diagram showing a situation in which the moving object trajectory estimating device according to the third embodiment of the invention is applied at a T intersection; and
- FIG. 8 is a flowchart showing an operation of the moving object trajectory estimating device according to the third embodiment of the invention.
- Embodiments of the invention will be described in detail below with reference to the accompanying drawings. Note that like numerals are used to represent like elements in the descriptions of the drawings, and overlapping descriptions are omitted.
- A moving object trajectory estimating device 1 according to a first embodiment may be applied to a controller of an automatically driven vehicle and estimates the trajectories of other vehicles.
- FIG. 1 is a block diagram showing the structure of the moving object trajectory estimating device according to the first embodiment of the invention. As shown in FIG. 1, the moving object trajectory estimating device 1 has an object detection electronic control unit (ECU) 5, a position calculation ECU 6, an observable object extraction ECU 7, and an object trajectory prediction ECU 8. The ECUs each execute their own control and are each configured by, for example, a central processing unit (CPU), read only memory (ROM), random access memory (RAM), an input signal circuit, an output signal circuit, a power circuit, and the like. The object detection ECU 5 is connected to a camera 2 and a laser radar 3. The position calculation ECU 6 is connected to a global positioning system (GPS) receiver 4.
- The camera 2 may be a monocular camera, a stereo camera, an infrared camera, or the like, and is used to acquire the situation around the host vehicle by capturing images of objects such as other vehicles, pedestrians, and roadside objects.
- The laser radar 3 transmits a laser beam to the surroundings of the host vehicle while scanning in the horizontal direction of the host vehicle, and receives the wave reflected from the surface of another vehicle or a pedestrian to detect the distance to, as well as the bearing and speed of, that vehicle or pedestrian. The bearing, the distance, and the speed are detected by using the angle of the reflected wave, the time from when the wave is emitted until the reflected wave returns, and changes in the frequency of the reflected wave, respectively.
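- For illustration only (the patent states these three measurement principles but gives no formulas), the conversions just described can be sketched in Python as follows; the function names are hypothetical, and the propagation speed of the emitted wave is taken to be the speed of light:

```python
import math

C = 299_792_458.0  # assumed propagation speed of the emitted wave (m/s)

def distance_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to the reflecting surface: the wave travels out and back,
    so the one-way distance is half the round-trip path."""
    return C * round_trip_s / 2.0

def radial_speed_from_doppler(f_emitted_hz: float, f_received_hz: float) -> float:
    """Relative speed along the beam from the change in frequency of the
    reflected wave (positive when the target approaches)."""
    return C * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

def position_from_scan(bearing_rad: float, distance_m: float) -> tuple[float, float]:
    """Target position in the host-vehicle frame, using the horizontal scan
    angle at which the reflection was received as the bearing."""
    return (distance_m * math.cos(bearing_rad), distance_m * math.sin(bearing_rad))
```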
- The GPS receiver 4 receives a GPS satellite signal and detects the position of the host vehicle based on the received signal. The GPS receiver 4 outputs the determined position of the host vehicle to the position calculation ECU 6.
- The object detection ECU 5, which serves as the surrounding information acquisition means for acquiring information on the surroundings of the host vehicle, acquires the image signal output by the camera 2 and the signals of a plurality of other vehicles output by the laser radar 3, and detects the plurality of other vehicles. The object detection ECU 5 then outputs the detected other vehicles to the position calculation ECU 6.
- The position calculation ECU 6 is connected to the object detection ECU 5, and may specify an object from the plurality of other vehicles detected by the object detection ECU 5 (this vehicle is referred to below as the "specified vehicle"). For example, from a plurality of oncoming vehicles traveling in an oncoming lane, the vehicle closest to the host vehicle may be selected. Furthermore, the position calculation ECU 6 calculates the absolute position of the specified vehicle based on information on the specified vehicle and the absolute position of the host vehicle output by the GPS receiver 4. The position calculation ECU 6 then outputs the calculated absolute position of the specified vehicle to the observable object extraction ECU 7.
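- The patent does not spell out how the absolute position of the specified vehicle is obtained from the host position and the sensor-relative measurement; a minimal sketch of one plausible computation, assuming the host heading in the map frame is also known, is:

```python
import math

def specified_vehicle_absolute_position(host_xy: tuple[float, float],
                                        host_heading_rad: float,
                                        relative_xy: tuple[float, float]) -> tuple[float, float]:
    """Rotate the sensor-measured offset (host-vehicle frame) into the map
    frame and translate it by the GPS-derived host position."""
    dx, dy = relative_xy
    c, s = math.cos(host_heading_rad), math.sin(host_heading_rad)
    return (host_xy[0] + c * dx - s * dy,
            host_xy[1] + s * dx + c * dy)
```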
- The observable object extraction ECU 7 is connected to the position calculation ECU 6 and a map information storage device 9. Road information and map information, including information on structures around the road, are stored in the map information storage device 9. For example, the map information storage device 9 reads the map information on the surroundings of the host vehicle based on the signal output by the GPS receiver 4, and outputs the read map information to the observable object extraction ECU 7. Examples of the information on a structure around a road include the shape, length, height, and the like of the structure.
- The observable object extraction ECU 7, serving as the recognition information acquisition means, extracts the objects observable from the specified vehicle based on the absolute position of the specified vehicle output from the position calculation ECU 6 and the map information on the surroundings of the host vehicle output from the map information storage device 9. Here, an object observable from the specified vehicle means an object that is visible from the driver's seat of the specified vehicle; examples of such objects include other vehicles, two-wheeled vehicles, pedestrians, and the like. The observable object extraction ECU 7 then outputs information on the extracted observable objects of the specified vehicle to the object trajectory prediction ECU 8.
- The object trajectory prediction ECU 8, which serves as the trajectory estimating means, generates a predicted trajectory of each observable object based on the information on the objects observable from the specified vehicle extracted by the observable object extraction ECU 7, and predicts the trajectory of the specified object based on the generated result. The object trajectory prediction ECU 8 then outputs the predicted trajectory of the specified object to an output part 10. The output part 10 determines the trajectory of the host vehicle in response to, for example, the predicted trajectory of the specified object, and automatically controls a steering actuator or a drive actuator.
- Next, an operation of the moving object trajectory estimating device 1 according to the first embodiment is described.
- FIG. 2 is an explanatory diagram showing a scenario in which the moving object trajectory estimating device according to the first embodiment is applied at a T intersection. As shown in FIG. 2, a host vehicle M11 and an oncoming vehicle M12, which are both equipped with the moving object trajectory estimating device 1, travel on the priority road of a T intersection, and another vehicle M13 travels on the non-priority road. A motorcycle M14 travels behind the host vehicle M11. A large building T is present at a corner on the left-hand side of the oncoming vehicle M12.
- FIG. 3 is a flowchart showing the operation of the moving object trajectory estimating device according to the first embodiment. The control steps shown in FIG. 3 are executed at predetermined intervals (e.g., 100 to 1000 ms) after the ignition is turned on.
- First, in step S11, objects such as other vehicles or pedestrians around the host vehicle M11 are detected. Any conventional method may be used for this detection. For example, the surroundings of the host vehicle M11 may be scanned using the laser radar 3 to measure the positions of the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14, and the speed of each of these vehicles may be measured based on positional changes occurring over continuous time. In addition, objects such as other vehicles and pedestrians in the surroundings, including the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14, are detected based on the images captured by the camera 2.
- Next, in step S12, one object whose trajectory is to be predicted is selected from among the plurality of objects detected in step S11. For example, out of a plurality of oncoming vehicles traveling in an oncoming lane, the oncoming vehicle M12 closest to the host vehicle M11 may be selected.
- In step S13, a base position is detected based on the GPS satellite signal received by the GPS receiver 4, and the absolute position of the host vehicle M11 is thereby obtained. Next, in step S14, the absolute position of the oncoming vehicle M12 is determined based on the position of the oncoming vehicle M12 relative to the host vehicle M11 and the absolute position of the host vehicle M11.
- Once the absolute position of the oncoming vehicle M12 has been calculated in step S14, the map information on the surroundings of the oncoming vehicle M12 is read from the map information storage device 9 in step S15. The map information indicates whether the visual field from the oncoming vehicle M12 is blocked by a road structure on the map, and includes information on at least the height of the road structure.
- In step S16, it is determined whether, from the perspective of the oncoming vehicle M12, each other surrounding object is blocked by a road structure; blocked, invisible objects are eliminated, and only the objects that are not blocked are extracted. Specifically, when the oncoming vehicle M12 is selected as the specified object, as shown in FIG. 2, it is determined whether each other object is visible to the oncoming vehicle M12.
- Thus, for example, by drawing a straight line L1 passing from the driver's seat P1 of the oncoming vehicle M12 to a top point P2 of a corner of the building T, the visual field on the left-hand side of the straight line L1 is blocked by the building T, whereby a blocked area H1 is formed. It is determined that the other vehicle M13 is not visible to the oncoming vehicle M12, because the other vehicle M13 is positioned within this blocked area H1. On the other hand, it is determined that the host vehicle M11 is visible to the oncoming vehicle M12, because there is no object between the host vehicle M11 and the oncoming vehicle M12.
- Furthermore, when drawing straight lines L2 and L3 passing from the driver's seat P1 of the oncoming vehicle M12 to the right and left ends of the host vehicle M11 as seen from the driver's seat P1, the section between the straight lines L2 and L3 and behind the host vehicle M11 is blocked by the host vehicle M11, thereby forming a blocked area H2. It is determined that the motorcycle M14 is not visible to the oncoming vehicle M12, because the motorcycle M14 is positioned within the blocked area H2. Therefore, the host vehicle M11 is the only object visible to the oncoming vehicle M12. The other vehicle M13 and the motorcycle M14 are then eliminated, and the host vehicle M11 is extracted.
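- To make the line-drawing construction concrete, the following sketch (not part of the disclosure) decides visibility by testing whether the sight line from the driver's seat P1 to a target crosses any occluding edge, such as a wall of the building T or the body of the host vehicle M11; collinear borderline cases are ignored for brevity:

```python
def cross(o, a, b):
    """z-component of (a - o) x (b - o); its sign tells which side of the
    ray o->a the point b lies on."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def edge_blocks(p1, q1, q2, target):
    """True if the segment q1-q2 (an occluding edge) properly intersects
    the sight line from the driver's seat p1 to the target."""
    d1, d2 = cross(p1, target, q1), cross(p1, target, q2)
    d3, d4 = cross(q1, q2, p1), cross(q1, q2, target)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)

def visible_objects(p1, occluding_edges, objects):
    """Keep only the objects whose sight line from p1 crosses no edge;
    the rest lie in a blocked area such as H1 or H2."""
    return [obj for obj in objects
            if not any(edge_blocks(p1, q1, q2, obj) for q1, q2 in occluding_edges)]
```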
- In step S17, a predicted trajectory of the object extracted in step S16 is generated. Because only the host vehicle M11 is extracted in step S16, a predicted trajectory of the host vehicle M11 is generated. Here, because the host vehicle M11 appears merely as an object to the oncoming vehicle M12, the trajectory generation is carried out using the same method as for any other object, regardless of the trajectory the host vehicle M11 actually follows. Note that any conventional method may be used as the trajectory generation method. Examples of such a method include a method for stochastically expressing the tracks of the positions that sequentially change with the lapse of time, as sketched below.
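- One common realization of such a stochastic expression, offered purely as an illustration with hypothetical parameters, is Monte Carlo sampling of position tracks under a simple constant-velocity motion model with random acceleration noise:

```python
import random

def sample_tracks(x0, y0, vx, vy, n_tracks=100, n_steps=20, dt=0.1, sigma=0.5):
    """Each track is one random realization of where the object may move:
    the velocity is perturbed by Gaussian acceleration noise at every step."""
    tracks = []
    for _ in range(n_tracks):
        x, y, vxi, vyi = x0, y0, vx, vy
        track = [(x, y)]
        for _ in range(n_steps):
            vxi += random.gauss(0.0, sigma) * dt
            vyi += random.gauss(0.0, sigma) * dt
            x, y = x + vxi * dt, y + vyi * dt
            track.append((x, y))
        tracks.append(track)
    return tracks
```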
- Step S18 determines a predicted trajectory of the specified object. Specifically, a predicted trajectory of the oncoming vehicle M12 is determined based on the predicted trajectory of the other object around the oncoming vehicle M12 (i.e., the host vehicle M11) generated in step S17. Note that any conventional method may be used as this trajectory determination method. One such method reduces the probability that a track on which the oncoming vehicle M12 and the host vehicle M11 interfere with each other is taken.
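- The patent prescribes no specific algorithm for this determination; as one hedged illustration, candidate tracks of the specified object can be down-weighted in proportion to how often they come close to the tracks generated for the objects it can observe:

```python
def interference_fraction(track_a, track_b, safe_dist=2.0):
    """Fraction of common time steps at which two equally sampled tracks
    come closer than safe_dist meters."""
    hits = sum(1 for (ax, ay), (bx, by) in zip(track_a, track_b)
               if (ax - bx) ** 2 + (ay - by) ** 2 < safe_dist ** 2)
    return hits / max(min(len(track_a), len(track_b)), 1)

def reweight_candidates(candidate_tracks, observed_tracks):
    """Reduce the probability of interfering candidate tracks, then
    renormalize so the weights again sum to one."""
    weights = []
    for cand in candidate_tracks:
        w = 1.0
        for other in observed_tracks:
            w *= 1.0 - interference_fraction(cand, other)
        weights.append(w)
    total = sum(weights) or 1.0
    return [w / total for w in weights]
```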
- In step S19, it is determined whether predicted trajectories have been determined for all of the detected objects. The other vehicle M13 and the motorcycle M14 are sequentially selected after the predicted trajectory of the oncoming vehicle M12 is determined, and the trajectories of these objects are generated by repeating the above-described steps. The series of control steps then ends once the predicted trajectories of all of the detected objects have been determined.
- As described above, according to the moving object trajectory estimating device 1 of the first embodiment, because the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14 are selected and their predicted trajectories are estimated based on the recognition information of these vehicles, the predicted trajectories are estimated more accurately. Appropriate estimation may be performed by estimating the predicted trajectory of a vehicle from the perspective of the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14. Furthermore, because it is not necessary to take into consideration any information outside the recognizable range of these vehicles, not only is it possible to reduce the amount of estimation processing needed, but the speed of the estimation processing may also be improved and the accuracy of the trajectory estimation enhanced.
-
- FIG. 4 is a block diagram showing the structure of a moving object trajectory estimating device according to the second embodiment. As shown in FIG. 4, a moving object trajectory estimating device 11 according to the second embodiment differs from the moving object trajectory estimating device 1 according to the first embodiment in that the moving object trajectory estimating device 11 has an observed object specifying ECU 12 and a receiving device 13. Specifically, the moving object trajectory estimating device 11 has the object detection ECU 5, the position calculation ECU 6, the observed object specifying ECU 12, and the object trajectory prediction ECU 8, and the receiving device 13 is connected to the observed object specifying ECU 12.
- The receiving device 13 communicates with other vehicles around the host vehicle. For example, the receiving device 13 receives vehicle information from oncoming vehicles traveling in an oncoming lane and from a vehicle following the host vehicle (including two-wheeled vehicles). The receiving device 13 then outputs the received information on the other vehicles to the observed object specifying ECU 12.
- The observed object specifying ECU 12, which serves as the recognition information acquisition means, is provided between the position calculation ECU 6 and the object trajectory prediction ECU 8. The observed object specifying ECU 12 specifies the observed objects of a specified vehicle based on the absolute position of the specified vehicle output from the position calculation ECU 6 and the information on the specified vehicle output from the receiving device 13. Here, an object observed from the specified vehicle may be an object visible from the driver's seat of the specified vehicle; examples of such objects include other vehicles, two-wheeled vehicles, pedestrians, and the like. The observed object specifying ECU 12 outputs the information on the specified observed objects of the specified vehicle to the object trajectory prediction ECU 8.
- On the other hand, a controller 14 installed in another vehicle that communicates with the host vehicle may be configured by, for example, the camera 2, the laser radar 3, the GPS receiver 4, the object detection ECU 5, the position calculation ECU 6, and a transmitter 15. The transmitter 15 is connected to the position calculation ECU 6 and transmits the calculated absolute position and the base position of the surrounding other vehicle.
- Next, an operation of the moving object trajectory estimating device 11 according to the second embodiment will be described. The operation described below is based on the scenario shown in FIG. 2.
- FIG. 5 is a flowchart showing an operation of the moving object trajectory estimating device according to the second embodiment. The control steps shown in FIG. 5 are executed at predetermined intervals (e.g., 100 to 1000 ms) after the ignition is turned on.
- First, in step S21, objects such as other vehicles or pedestrians around the host vehicle M11 are detected. An existing method may be used for this detection. For example, the surroundings of the host vehicle M11 may be scanned using the laser radar 3 to determine the positions of the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14, and the speed of each of these vehicles may be measured based on positional changes that occur over time. In addition, objects such as other vehicles and pedestrians in the surroundings, including the oncoming vehicle M12, the other vehicle M13, and the motorcycle M14, are detected based on the images captured by the camera 2.
- In step S22, one specified object is selected from the plurality of vehicles detected in step S21, and the trajectory of the selected object is predicted. For example, out of a plurality of oncoming vehicles traveling in an oncoming lane, the oncoming vehicle M12 closest to the host vehicle M11 is selected.
- In step S23, the information received from the oncoming vehicle M12 is read. The information includes the information of the oncoming vehicle M12 itself and of the objects detected by the oncoming vehicle M12. The objects detected by the oncoming vehicle M12 include not only those objects that are directly observed by the oncoming vehicle M12, but also those objects that cannot be directly observed by the oncoming vehicle M12 but whose information can be obtained through inter-vehicle communication. In the situation shown in FIG. 2, although the other vehicle M13 and the motorcycle M14 cannot be directly observed from the oncoming vehicle M12, because they are positioned within the blocked areas H1 and H2, respectively, the oncoming vehicle M12 can detect these vehicles by means of inter-vehicle communication with the other vehicle M13 and the motorcycle M14.
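- The patent does not define a message format for this inter-vehicle communication; the following dataclasses are one hypothetical layout of what the receiving device 13 reads, covering both directly observed objects and objects relayed over a further communication hop:

```python
from dataclasses import dataclass, field

@dataclass
class DetectedObject:
    object_id: str
    position: tuple[float, float]   # absolute (map-frame) coordinates
    speed_mps: float
    directly_observed: bool         # False if known only via another vehicle

@dataclass
class VehicleMessage:
    """Information received from the oncoming vehicle: its own state plus
    every object it has detected, directly or through communication."""
    sender_id: str
    sender_position: tuple[float, float]
    detected: list[DetectedObject] = field(default_factory=list)
```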
- Step S24 selects, from the objects detected by the oncoming vehicle M12, the objects that can be observed by the oncoming vehicle M12. In the situation shown in FIG. 2, because the only object that can be observed by the oncoming vehicle M12 is the host vehicle M11, the host vehicle M11 is selected.
- The predicted trajectory of the object selected in step S24 is then generated in step S25. Because only the host vehicle M11 is selected, the predicted trajectory of the host vehicle M11 is generated. Note that any conventional method may be used as the trajectory generation method. Examples of such a method include a method for stochastically expressing the tracks of the positions that sequentially change with the lapse of time.
- Steps S26 and S27 are the same as steps S18 and S19 of the first embodiment described above; accordingly, overlapping descriptions are omitted. The series of control steps then ends after the predicted trajectory of each detected object has been determined.
- As described above, according to the moving object trajectory estimating device 11 of the second embodiment, not only is it possible to obtain the same operational effects as those obtained by the moving object trajectory estimating device 1 according to the first embodiment, but it is also possible to obtain the information on the objects observable from the oncoming vehicle M12 via communication with the oncoming vehicle M12. Therefore, the trajectories that the oncoming vehicle M12 may take are estimated more accurately, and appropriate trajectory estimation can be performed.
- Next, a moving object trajectory estimating device according to a third embodiment of the invention will be described.
-
FIG. 6 is a block diagram showing the structure of the moving object trajectory estimating device according to the third embodiment. As shown inFIG. 6 , a trajectory estimating method for a movingobject 16 according to the third embodiment differs from the moving objecttrajectory estimating device 1 according to the first embodiment in that the trajectory estimating method for a movingobject 16 includes a blindspot calculation ECU 17, observedobject selecting ECU 18,individual authentication ECU 19, and individual blind spot information database (DB) 20. - The
individual authentication ECU 19 is connected to theobject detection ECU 5 and performs individual authentication on the plurality of other vehicles detected by theobject detection ECU 5. For example, theindividual authentication ECU 19 authenticates the vehicle model by reading an image or license plate of the other vehicle captured by thecamera 2. Blind spot information for each vehicle model is stored in the individual blindspot information DB 20. The individual blindspot information DB 20 is connected to theindividual authentication ECU 19, so that blind spot information unique to a vehicle is extracted in accordance with the result of vehicle model output by theindividual authentication ECU 19. Theindividual authentication ECU 19 then outputs the extracted blind spot information to the blindspot calculation ECU 17. - The blind
spot calculation ECU 17 is connected to the individual blindspot information DB 20 and theposition calculation ECU 6, and calculates the blind spot of the specified vehicle based on the blind spot information for the vehicle that is output from the individual blindspot information DB 20 and the absolute position of the specified vehicle that is output from theposition calculation ECU 6. The blindspot calculation ECU 17 then outputs the calculated blind spot of the specified vehicle to the observedobject selecting ECU 18. The observedobject selecting ECU 18, which serves as the recognition information acquisition means, selects an object that is not present in the blind spot of the specified vehicle and can be observed from the specified vehicle, based on the results of the blind spot of the specified vehicle in the area that is output from the blind spot ofcalculation ECU 17. The observedobject selecting ECU 18 then outputs the selected result to the objecttrajectory prediction ECU 8. - Next, the operation of the moving object
trajectory estimating device 16 according to the third embodiment is described. -
FIG. 7 is an explanatory diagram showing a scenario in which the moving object trajectory estimating device according to the third embodiment of the invention is applied on a T intersection. As shown inFIG. 7 , a host vehicle M15 and an oncoming vehicle M16, which that are both equipped with the moving objecttrajectory estimating device 16, travel in a priority road of a T intersection, and motorcycles M17 and M18 travel on the left-hand side of the oncoming vehicle M16 and behind the oncoming vehicle M16 respectively. The motorcycle M17 is located within a blind spot of the oncoming vehicle M16 in area H3. -
FIG. 8 is a flowchart showing an operation of the moving object trajectory estimating device according to the third embodiment. The control steps shown inFIG. 8 are executed at predetermined intervals (e.g., 100 to 1000 ms) after the ignition is turned on. - First, in step S31, objects such as other vehicles or pedestrians around the host vehicle M15 are detected. Conventional methods may be used as the method of this detection. For example, the surroundings of the host vehicle M15 may be scanned using the
laser radar 3 to measure the positions of the oncoming vehicle M16 and motorcycles M17, M18, and the speed of the oncoming vehicle M16 and motorcycles M17, M18 may be measured based on positional changes occurring over time. Also, the oncoming vehicle M16 and motorcycles M17, M18 are detected based on the images captured by thecamera 2. - In step S32, the object from the plurality of objects detected in step S31, for which the trajectory is predicted, is then selected. For example, out of a plurality of oncoming vehicles traveling in an oncoming lane, the oncoming vehicle M16 closest to the host vehicle M15 is selected.
- Then in step S33, specified individual information of the oncoming vehicle M16 selected in step S32. For example, the vehicle model of the oncoming vehicle M16 is specified. A general method may be used as the method for specifying the vehicle model. For example, based on an image of the oncoming vehicle M16 captured by the
camera 2, the vehicle model is specified through pattern matching of the image, or the license plate is read, to specify the appropriate vehicle model in the database. - Next in step S34, the blind spot information for the vehicle model of the oncoming vehicle M16 is read from the individual blind
spot information DB 20 in accordance with the individual information of the oncoming vehicle M16 specified in step S33, and then specifies a blind spot. For example, as shown inFIG. 7 , the blind spot H3 of the oncoming vehicle M16 is specified. - Then, the objects present in the blind spot specified in step S34 are eliminated in step 35, and only the objects that are not present in the blind spot are extracted. In
FIG. 7 , although the host vehicle M15 and motorcycle M18 are visible to the oncoming vehicle M16, the motorcycle M17 located within the blind spot H3 is not visible to the oncoming vehicle M16. - In step S36, a predicted trajectory of the objects visible to the oncoming vehicle M16 are generated. Because the host vehicle M15 and motorcycle M18 are extracted in step S35, the predicted trajectories of the host vehicle M15 and motorcycle M18 are generated. Note that any conventional method may be used as the trajectory generation method. Examples include stochastically expressing the tracks of the positions that sequentially change over time.
- Step S37, subsequent to step S36, determines the predicted trajectory of the specified object. Specifically, the predicted trajectory of the oncoming vehicle M16 is determined based on the predicted trajectories of the host vehicle M15 and the motorcycle M18 generated in step S36. Note that any conventional method may be used as this trajectory determination method. Examples include reducing the probability that a track on which the oncoming vehicle M16 interferes with the host vehicle M15 or the motorcycle M18 is taken.
- In step S38, it is determined whether predicted trajectories have been determined for all of the detected objects. After the predicted trajectory of the oncoming vehicle M16 is determined, the motorcycles M17 and M18 are selected in turn, and the trajectory of each is determined in accordance with the above-described steps. The series of control steps ends once the predicted trajectories of all detected objects have been determined.
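Tying the steps together, this last sketch reuses extract_visible and determine_trajectory from the sketches above; specify_model, generate_tracks, and predict_track are caller-supplied stand-ins for steps S33 and S36, since the embodiment does not fix those methods.

```python
def estimate_all(detected, specify_model, generate_tracks, predict_track):
    """Step S38 (sketch): determine a predicted trajectory for every detected
    object in turn, each time using only the objects that object can see."""
    results = {}
    for obj in detected:
        model = specify_model(obj)                          # step S33
        # Others' positions are assumed already expressed in obj's frame.
        others = [o for o in detected if o is not obj]
        seen = extract_visible(model, others)               # steps S34-S35
        visible_tracks = [predict_track(o) for o in seen]   # step S36
        results[obj["id"]] = determine_trajectory(
            generate_tracks(obj), visible_tracks)           # step S37
    return results
```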
- As described above, according to the moving object trajectory estimating device 16 of the third embodiment, not only is it possible to obtain the same operational effects as those of the moving object trajectory estimating device 1 according to the first embodiment, but it is also possible to specify the blind spot unique to the oncoming vehicle M16 in accordance with the individual information of the oncoming vehicle M16 and to eliminate the objects contained in that blind spot. Therefore, the trajectories that may be taken by the oncoming vehicle M16 are estimated more accurately, and appropriate trajectory estimation can be performed.
- In the third embodiment, the observed object selecting ECU 18 not only specifies the objects that can be observed from the specified vehicle based on its blind spot, but may also specify the objects observable from each object based on detection capability information provided to the specified vehicle. The detection capability information may include the type and presence/absence of the sensors installed in each object, as well as each sensor's observable distance, observable environment, blind spot, visual field, and the like (an illustrative data layout is sketched after the next paragraph).
- In addition, examples of methods for specifying a vehicle model include reading the license plate or processing the images and then acquiring the vehicle model from the database, as described above, and acquiring the vehicle model by means of direct communication. Moreover, the individual information does not necessarily have to be the vehicle model information; instead, the size of the vehicle or the pillar position information may be acquired by the camera or via communication.
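To make the shape of the detection capability information described above concrete, here is a small illustrative container; every field name is an assumption drawn from the enumeration in that paragraph, not from the actual device.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectionCapability:
    """Hypothetical record of what a given object can sense: installed
    sensors, observable distance, visual field, and blind-spot regions."""
    sensors: List[str] = field(default_factory=list)  # e.g. ["camera", "laser_radar"]
    observable_distance_m: float = 0.0                # maximum detection range
    visual_field_deg: float = 0.0                     # horizontal field of view
    blind_spots: List[Tuple[float, float, float, float]] = field(default_factory=list)

cap = DetectionCapability(sensors=["camera"], observable_distance_m=80.0,
                          visual_field_deg=40.0)
```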
- Note that the embodiments described above are merely examples of the moving object trajectory estimating device according to the invention, which is not limited to the described embodiments. For example, the moving object trajectory estimating device according to the invention may be applied not only to the automatic operation of a vehicle but also to predicting and warning about the movement of other moving bodies, including robots.
- While the invention has been described with reference to example embodiments thereof, it should be understood that the invention is not limited to the example embodiments or constructions. To the contrary, the invention is intended to cover various modifications and equivalent arrangements. In addition, while the various elements of the example embodiments are shown in various combinations and configurations, which are exemplary, other combinations and configurations, including more, fewer, or only a single element, are also within the spirit and scope of the invention.
Claims (11)
1. A moving object trajectory estimating device, comprising:
a surrounding information acquisition part that acquires information on surroundings of a moving object;
a trajectory estimating part that specifies another moving object around the moving object based on the surrounding information acquired by the surrounding information acquisition part and estimates a trajectory of the specified moving object; and
a recognition information acquisition part that acquires recognition information on a recognizable area of the specified moving object,
wherein the trajectory estimating part estimates the trajectory of the specified moving object based on the recognition information of the specified moving object acquired by the recognition information acquisition part.
2. The moving object trajectory estimating device according to claim 1 , wherein the recognition information acquisition part acquires, from the specified moving object, information that includes the recognizable area.
3. The moving object trajectory estimating device according to claim 2 , wherein the recognizable area of the specified moving object is a visible area of the specified moving object.
4. The moving object trajectory estimating device according to claim 2 , wherein the recognition information acquisition part acquires information that includes the recognizable area of the specified moving object through communication with the specified moving object.
5. The moving object trajectory estimating device according to claim 4 , wherein the recognizable area of the specified moving object is a visible area of the specified moving object.
6. The moving object trajectory estimating device according to claim 2 , wherein the recognition information acquisition part acquires information that includes the recognizable area of the specified moving object, based on individual information on the specified moving object.
7. The moving object trajectory estimating device according to claim 6 , wherein the recognizable area of the specified moving object is a visible area of the specified moving object.
8. The moving object trajectory estimating device according to claim 6 , wherein the individual information on the specified moving object is individual blind spot information on the specified moving object.
9. The moving object trajectory estimating device according to claim 1 , wherein the recognition information acquisition part acquires information that includes the recognizable area of the specified moving object, based on map information that includes information on the height of a road structure.
10. The moving object trajectory estimating device according to claim 9 , wherein the recognizable area of the specified moving object is a visible area of the specified moving object.
11. The moving object trajectory estimating device according to claim 1 , wherein the moving object trajectory estimating device is applied to an automatically driven vehicle that determines a trajectory of the moving object based on the estimated trajectory of the specified moving object and automatically controls the moving object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/157,835 US8615109B2 (en) | 2008-04-07 | 2011-06-10 | Moving object trajectory estimating device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-099447 | 2008-04-07 | ||
JP2008099447A JP4561863B2 (en) | 2008-04-07 | 2008-04-07 | Mobile body path estimation device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/157,835 Continuation US8615109B2 (en) | 2008-04-07 | 2011-06-10 | Moving object trajectory estimating device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090252380A1 true US20090252380A1 (en) | 2009-10-08 |
Family
ID=41051711
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/413,659 Abandoned US20090252380A1 (en) | 2008-04-07 | 2009-03-30 | Moving object trajectory estimating device |
US13/157,835 Active 2030-01-05 US8615109B2 (en) | 2008-04-07 | 2011-06-10 | Moving object trajectory estimating device |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/157,835 Active 2030-01-05 US8615109B2 (en) | 2008-04-07 | 2011-06-10 | Moving object trajectory estimating device |
Country Status (3)
Country | Link |
---|---|
US (2) | US20090252380A1 (en) |
JP (1) | JP4561863B2 (en) |
DE (1) | DE102009016568B4 (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120130588A1 (en) * | 2010-11-22 | 2012-05-24 | Ramadev Burigsay Hukkeri | Object detection system having interference avoidance strategy |
US20120194680A1 (en) * | 2009-10-09 | 2012-08-02 | Clarion Co., Ltd. | Pedestrian detection system |
US8744693B2 (en) | 2010-11-22 | 2014-06-03 | Caterpillar Inc. | Object detection system having adjustable focus |
WO2015038048A1 (en) * | 2013-09-10 | 2015-03-19 | Scania Cv Ab | Detection of an object by use of a 3d camera and a radar |
US20150104073A1 (en) * | 2013-10-16 | 2015-04-16 | Xerox Corporation | Delayed vehicle identification for privacy enforcement |
US9501932B2 (en) | 2009-05-18 | 2016-11-22 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US20170023404A1 (en) * | 2015-07-21 | 2017-01-26 | Topcon Corporation | Management System For Illumination Facility |
US9786177B2 (en) | 2015-04-10 | 2017-10-10 | Honda Motor Co., Ltd. | Pedestrian path predictions |
US20180233049A1 (en) * | 2017-02-16 | 2018-08-16 | Panasonic Intellectual Property Corporation Of America | Information processing apparatus and non-transitory recording medium |
EP3614362A4 (en) * | 2017-04-19 | 2020-05-13 | Nissan Motor Co., Ltd. | Travel assistance method and travel assistance device |
US10814840B2 (en) * | 2016-12-30 | 2020-10-27 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US10821946B2 (en) * | 2016-12-30 | 2020-11-03 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US10891864B2 (en) * | 2019-08-07 | 2021-01-12 | Lg Electronics Inc. | Obstacle warning method for vehicle |
US20220335727A1 (en) * | 2021-03-05 | 2022-10-20 | Tianjin Soterea Automotive Technology Limited Company | Target determination method and apparatus, electronic device, and computer-readable storage medium |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102010044631A1 (en) | 2010-09-07 | 2012-03-08 | Volkswagen Ag | Method for determining collision probability of motor car with turning motor car in e.g. crossing area, involves determining probability values associated with surface elements, and using values for determining collision probability |
US9180882B1 (en) | 2012-06-20 | 2015-11-10 | Google Inc. | Avoiding blind spots of other vehicles |
US9779318B1 (en) | 2014-06-27 | 2017-10-03 | Blinker, Inc. | Method and apparatus for verifying vehicle ownership from an image |
US9818154B1 (en) | 2014-06-27 | 2017-11-14 | Blinker, Inc. | System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate |
US9760776B1 (en) | 2014-06-27 | 2017-09-12 | Blinker, Inc. | Method and apparatus for obtaining a vehicle history report from an image |
US10733471B1 (en) | 2014-06-27 | 2020-08-04 | Blinker, Inc. | Method and apparatus for receiving recall information from an image |
US9607236B1 (en) | 2014-06-27 | 2017-03-28 | Blinker, Inc. | Method and apparatus for providing loan verification from an image |
US9558419B1 (en) | 2014-06-27 | 2017-01-31 | Blinker, Inc. | Method and apparatus for receiving a location of a vehicle service center from an image |
US10515285B2 (en) | 2014-06-27 | 2019-12-24 | Blinker, Inc. | Method and apparatus for blocking information from an image |
US10579892B1 (en) | 2014-06-27 | 2020-03-03 | Blinker, Inc. | Method and apparatus for recovering license plate information from an image |
US9594971B1 (en) | 2014-06-27 | 2017-03-14 | Blinker, Inc. | Method and apparatus for receiving listings of similar vehicles from an image |
US9754171B1 (en) | 2014-06-27 | 2017-09-05 | Blinker, Inc. | Method and apparatus for receiving vehicle information from an image and posting the vehicle information to a website |
US10867327B1 (en) | 2014-06-27 | 2020-12-15 | Blinker, Inc. | System and method for electronic processing of vehicle transactions based on image detection of vehicle license plate |
US10572758B1 (en) | 2014-06-27 | 2020-02-25 | Blinker, Inc. | Method and apparatus for receiving a financing offer from an image |
US9773184B1 (en) | 2014-06-27 | 2017-09-26 | Blinker, Inc. | Method and apparatus for receiving a broadcast radio service offer from an image |
US9600733B1 (en) | 2014-06-27 | 2017-03-21 | Blinker, Inc. | Method and apparatus for receiving car parts data from an image |
US9563814B1 (en) | 2014-06-27 | 2017-02-07 | Blinker, Inc. | Method and apparatus for recovering a vehicle identification number from an image |
US9589202B1 (en) | 2014-06-27 | 2017-03-07 | Blinker, Inc. | Method and apparatus for receiving an insurance quote from an image |
US9589201B1 (en) | 2014-06-27 | 2017-03-07 | Blinker, Inc. | Method and apparatus for recovering a vehicle value from an image |
US10540564B2 (en) | 2014-06-27 | 2020-01-21 | Blinker, Inc. | Method and apparatus for identifying vehicle information from an image |
US9892337B1 (en) | 2014-06-27 | 2018-02-13 | Blinker, Inc. | Method and apparatus for receiving a refinancing offer from an image |
DE102015214689A1 (en) * | 2014-08-04 | 2016-02-04 | Continental Teves Ag & Co. Ohg | System for automated cooperative driving |
DE102015105784A1 (en) * | 2015-04-15 | 2016-10-20 | Denso Corporation | Distributed system for detecting and protecting vulnerable road users |
KR20170014556A (en) * | 2015-07-30 | 2017-02-08 | 삼성전자주식회사 | Method and photographing device for photographing a moving object |
US10776636B2 (en) | 2015-12-29 | 2020-09-15 | Faraday&Future Inc. | Stereo camera-based detection of objects proximate to a vehicle |
US9707961B1 (en) | 2016-01-29 | 2017-07-18 | Ford Global Technologies, Llc | Tracking objects within a dynamic environment for improved localization |
US11261642B2 (en) | 2016-01-29 | 2022-03-01 | Faraday & Future Inc. | System and method for tracking moving objects to avoid interference with vehicular door operations |
US10115025B2 (en) | 2016-06-13 | 2018-10-30 | Ford Global Technologies, Llc | Detecting visibility of a vehicle to driver of other vehicles |
KR102014144B1 (en) * | 2017-09-26 | 2019-08-26 | 엘지전자 주식회사 | Method for controlling the driving system of a vehicle |
MX2020011583A (en) * | 2018-05-11 | 2021-03-25 | Prec Point Systems Llc | Photographic method and system for aiding officials in locating an object. |
DE102018210280A1 (en) * | 2018-06-25 | 2020-01-02 | Robert Bosch Gmbh | Adaptation of the trajectory of an ego vehicle to moving foreign objects |
EP3657460B1 (en) * | 2018-11-23 | 2024-08-07 | Bayerische Motoren Werke Aktiengesellschaft | Method, computer program product, and driver assistance system for determining one or more lanes of a road in an environment of a vehicle |
CA3081212A1 (en) | 2019-05-31 | 2020-11-30 | Indiana Mills & Manufacturing, Inc. | Dual-web retractor arrangement |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6353785B1 (en) * | 1999-03-12 | 2002-03-05 | Navigation Technologies Corp. | Method and system for an in-vehicle computer architecture |
US6396535B1 (en) * | 1999-02-16 | 2002-05-28 | Mitsubishi Electric Research Laboratories, Inc. | Situation awareness system |
US6421463B1 (en) * | 1998-04-01 | 2002-07-16 | Massachusetts Institute Of Technology | Trainable system to search for objects in images |
US6791471B2 (en) * | 2002-10-01 | 2004-09-14 | Electronic Data Systems | Communicating position information between vehicles |
US20040246114A1 (en) * | 2003-06-05 | 2004-12-09 | Stefan Hahn | Image processing system for a vehicle |
US7610146B2 (en) * | 1997-10-22 | 2009-10-27 | Intelligent Technologies International, Inc. | Vehicle position determining system and method |
US20100030472A1 (en) * | 2007-03-29 | 2010-02-04 | Toyota Jidosha Kabushiki Kaisha | Collision possibility acquiring device, and collision possibility acquiring method |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4193266B2 (en) * | 1999-02-22 | 2008-12-10 | Equos Research Co., Ltd. | Peripheral vehicle notification device |
JP2004145479A (en) * | 2002-10-22 | 2004-05-20 | Aisin Seiki Co Ltd | Device for providing peripheral vehicle information |
JP4239856B2 (en) * | 2004-03-02 | 2009-03-18 | Denso Corporation | Communication apparatus and program |
DE102005015088B4 (en) * | 2004-04-02 | 2015-06-18 | Denso Corporation | Vehicle environment monitoring system |
US7639841B2 (en) * | 2004-12-20 | 2009-12-29 | Siemens Corporation | System and method for on-road detection of a vehicle using knowledge fusion |
JP4645891B2 (en) * | 2005-03-24 | 2011-03-09 | Nippon Seiki Co., Ltd. | Vehicle driving support apparatus and vehicle driving support method |
JP4585356B2 (en) | 2005-03-31 | 2010-11-24 | Honda Motor Co., Ltd. | Inter-vehicle communication system |
JP2007140647A (en) | 2005-11-15 | 2007-06-07 | Yamaguchi Univ | Clinical research support system |
JP2007140674A (en) * | 2005-11-15 | 2007-06-07 | Fuji Heavy Ind Ltd | Dead angle information providing device |
JP4353192B2 (en) * | 2006-03-02 | 2009-10-28 | Toyota Jidosha Kabushiki Kaisha | Course setting method, apparatus, program, and automatic driving system |
JP4735346B2 (en) * | 2006-03-09 | 2011-07-27 | Toyota Central R&D Labs., Inc. | Driving support device and driving support system |
ITTO20060214A1 (en) * | 2006-03-22 | 2007-09-23 | Kria S R L | Vehicle detection system |
JP4602277B2 (en) * | 2006-03-28 | 2010-12-22 | Honda Motor Co., Ltd. | Collision determination device |
JP4906437B2 (en) * | 2006-08-22 | 2012-03-28 | Alpine Electronics, Inc. | Perimeter monitoring system |
US7609174B2 (en) * | 2006-12-12 | 2009-10-27 | Nissan Technical Center North America, Inc. | Vehicle information communication system |
JP2008299676A (en) * | 2007-05-31 | 2008-12-11 | Toyota Motor Corp | Dead angle information requesting/providing devices and inter-vehicle communication system using the same |
US8885039B2 (en) * | 2008-07-25 | 2014-11-11 | Lg Electronics Inc. | Providing vehicle information |
US8947219B2 (en) * | 2011-04-22 | 2015-02-03 | Honda Motor Co., Ltd. | Warning system with heads up display |
-
2008
- 2008-04-07 JP JP2008099447A patent/JP4561863B2/en active Active
-
2009
- 2009-03-30 US US12/413,659 patent/US20090252380A1/en not_active Abandoned
- 2009-04-06 DE DE102009016568.1A patent/DE102009016568B4/en not_active Expired - Fee Related
-
2011
- 2011-06-10 US US13/157,835 patent/US8615109B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7610146B2 (en) * | 1997-10-22 | 2009-10-27 | Intelligent Technologies International, Inc. | Vehicle position determining system and method |
US6421463B1 (en) * | 1998-04-01 | 2002-07-16 | Massachusetts Institute Of Technology | Trainable system to search for objects in images |
US6396535B1 (en) * | 1999-02-16 | 2002-05-28 | Mitsubishi Electric Research Laboratories, Inc. | Situation awareness system |
US6353785B1 (en) * | 1999-03-12 | 2002-03-05 | Navigation Technologies Corp. | Method and system for an in-vehicle computer architecture |
US6791471B2 (en) * | 2002-10-01 | 2004-09-14 | Electronic Data Systems | Communicating position information between vehicles |
US20040246114A1 (en) * | 2003-06-05 | 2004-12-09 | Stefan Hahn | Image processing system for a vehicle |
US20100030472A1 (en) * | 2007-03-29 | 2010-02-04 | Toyota Jidosha Kabushiki Kaisha | Collision possibility acquiring device, and collision possibility acquiring method |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11568746B2 (en) * | 2009-05-18 | 2023-01-31 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US11941985B2 (en) | 2009-05-18 | 2024-03-26 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US9501932B2 (en) | 2009-05-18 | 2016-11-22 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US11995988B2 (en) | 2009-05-18 | 2024-05-28 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US20170032675A1 (en) * | 2009-05-18 | 2017-02-02 | Toyota Jidosha Kabushiki Kaisha | Vehicular environment estimation device |
US20120194680A1 (en) * | 2009-10-09 | 2012-08-02 | Clarion Co., Ltd. | Pedestrian detection system |
US20120130588A1 (en) * | 2010-11-22 | 2012-05-24 | Ramadev Burigsay Hukkeri | Object detection system having interference avoidance strategy |
US8744693B2 (en) | 2010-11-22 | 2014-06-03 | Caterpillar Inc. | Object detection system having adjustable focus |
US8751103B2 (en) * | 2010-11-22 | 2014-06-10 | Caterpillar Inc. | Object detection system having interference avoidance strategy |
WO2015038048A1 (en) * | 2013-09-10 | 2015-03-19 | Scania Cv Ab | Detection of an object by use of a 3d camera and a radar |
US10114117B2 (en) | 2013-09-10 | 2018-10-30 | Scania Cv Ab | Detection of an object by use of a 3D camera and a radar |
US20150104073A1 (en) * | 2013-10-16 | 2015-04-16 | Xerox Corporation | Delayed vehicle identification for privacy enforcement |
US9412031B2 (en) * | 2013-10-16 | 2016-08-09 | Xerox Corporation | Delayed vehicle identification for privacy enforcement |
US9786177B2 (en) | 2015-04-10 | 2017-10-10 | Honda Motor Co., Ltd. | Pedestrian path predictions |
US9952091B2 (en) * | 2015-07-21 | 2018-04-24 | Topcon Corporation | Management system for illumination facility |
US20170023404A1 (en) * | 2015-07-21 | 2017-01-26 | Topcon Corporation | Management System For Illumination Facility |
US10814840B2 (en) * | 2016-12-30 | 2020-10-27 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US10821946B2 (en) * | 2016-12-30 | 2020-11-03 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US11584340B2 (en) * | 2016-12-30 | 2023-02-21 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US20210031737A1 (en) * | 2016-12-30 | 2021-02-04 | Hyundai Motor Company | Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method |
US20180233049A1 (en) * | 2017-02-16 | 2018-08-16 | Panasonic Intellectual Property Corporation Of America | Information processing apparatus and non-transitory recording medium |
US10453344B2 (en) * | 2017-02-16 | 2019-10-22 | Panasonic Intellectual Property Corporation Of America | Information processing apparatus and non-transitory recording medium |
US10994730B2 (en) | 2017-04-19 | 2021-05-04 | Nissan Motor Co., Ltd. | Traveling assistance method and traveling assistance device |
EP3614362A4 (en) * | 2017-04-19 | 2020-05-13 | Nissan Motor Co., Ltd. | Travel assistance method and travel assistance device |
US10891864B2 (en) * | 2019-08-07 | 2021-01-12 | Lg Electronics Inc. | Obstacle warning method for vehicle |
US20220335727A1 (en) * | 2021-03-05 | 2022-10-20 | Tianjin Soterea Automotive Technology Limited Company | Target determination method and apparatus, electronic device, and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
JP4561863B2 (en) | 2010-10-13 |
US20110235864A1 (en) | 2011-09-29 |
US8615109B2 (en) | 2013-12-24 |
DE102009016568A1 (en) | 2009-10-08 |
JP2009251953A (en) | 2009-10-29 |
DE102009016568B4 (en) | 2014-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8615109B2 (en) | Moving object trajectory estimating device | |
US11703876B2 (en) | Autonomous driving system | |
JP4420011B2 (en) | Object detection device | |
JP7413935B2 (en) | In-vehicle sensor system | |
TW201704067A (en) | Collision avoidance method, computer program product for said collision avoidance method and collision avoidance system | |
US11351997B2 (en) | Collision prediction apparatus and collision prediction method | |
RU2720501C1 (en) | Method for determining interference, method of parking assistance, method of departure assistance and device for determining interference | |
JP7077967B2 (en) | Driving lane estimation device, driving lane estimation method, and control program | |
JP2008037361A (en) | Obstacle recognition device | |
CN112771591B (en) | Method for evaluating the influence of an object in the environment of a vehicle on the driving maneuver of the vehicle | |
JP6828655B2 (en) | Own vehicle position estimation device | |
CN109318894A (en) | Vehicle drive assist system, vehicle drive assisting method and vehicle | |
WO2022070250A1 (en) | Information processing device, information processing method, and program | |
KR101779963B1 (en) | Method of enhancing performance for road-environment recognition device based on learning and apparatus for the same | |
US11420624B2 (en) | Vehicle control apparatus and vehicle control method | |
JP2003276538A (en) | Obstacle predicting device | |
WO2017013692A1 (en) | Travel lane determination device and travel lane determination method | |
JP2010072947A (en) | Obstacle detection device | |
US11933900B2 (en) | Recognition device, vehicle system, recognition method, and storage medium | |
US11884265B1 (en) | Parking assistance method and parking assistance device | |
US20240125604A1 (en) | Method for operating a sensor circuit in a motor vehicle, a sensor circuit, and a motor vehicle with the sensor circuit | |
US20240312058A1 (en) | Robust lidar-to-camera sensor alignment | |
WO2023002863A1 (en) | Driving assistance device, driving assistance method | |
CN117849826A (en) | External recognition device | |
CN117901786A (en) | System and method for verifying presence of target object, vehicle and program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIMIZU, HIROAKI;REEL/FRAME:022466/0746 Effective date: 20090305 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |