WO2008146951A1 - Object recognition device and object recognition method, and lane determination device and lane determination method using them

Info

Publication number
WO2008146951A1
Authority
WO
WIPO (PCT)
Prior art keywords
object type
host vehicle
recognized
information
image
Application number
PCT/JP2008/060411
Other languages
French (fr)
Inventor
Masaki Nakamura
Tomoaki Ishikawa
Koichi Nakao
Osamu Aisaka
Motoki Kanba
Kiyokazu Okada
Original Assignee
Aisin Aw Co., Ltd.
Toyota Jidosha Kabushiki Kaisha
Application filed by Aisin Aw Co., Ltd., Toyota Jidosha Kabushiki Kaisha filed Critical Aisin Aw Co., Ltd.
Publication of WO2008146951A1 publication Critical patent/WO2008146951A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V 20/582 Recognition of traffic signs
    • G06V 30/00 Character recognition; recognising digital ink; document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/24 Character recognition characterised by the processing or recognition method
    • G06V 30/248 Character recognition involving plural approaches, e.g. verification by template match; resolving confusion among similar patterns, e.g. "O" versus "Q"

Definitions

  • the present invention relates to an object recognition device and an object recognition method capable of determining an object type of a targeted object included in image information that is taken by an imaging device, and also relates to a lane determination device and a lane determination method using them.
  • For the purpose of appropriate route guidance by a navigation system, a lane determination device has been known in recent years that determines the host vehicle lane, i.e., the lane on the road in which the host vehicle is traveling, based on various information obtained from inside and outside the host vehicle.
  • One example of such a lane determination device is described in Japanese Patent Application Publication No. JP-A-2006-162409, where a structure is described that specifies the lane position of the host vehicle and outputs a determination result.
  • the specification is based on information including the following: light beacon information from a vehicle information processing system such as a vehicle information and communication system (VICS); estimated information from a current location management unit; an event such as steering information or turning signal information from a driver input information management unit; a number of recognized lanes from an image recognition device; a host lane position among the number of recognized lanes; a lane internal position (whether the host vehicle is positioned more leftward or rightward within the lane); increases and decreases in the number of lanes; increases and decreases in the number of lane directions; road shoulder information (whether a road shoulder exists and so on); crossing condition (whether the lane or white line is being crossed and so on); and road indicator (paint) information.
  • a structure is described in which the lane position of the host vehicle is specified by collation of an image recognition result for road indicators such as a crosswalk or arrows indicating traffic sections by the travel direction of lanes, e.g. straight travel and right and left turns, with information obtained from a database regarding an object type, object position, and the like of the applicable object.
  • image recognition processing is performed for road indicators such as arrows indicating traffic sections by the travel directions of lanes.
  • the image recognition result is then collated with information obtained from a database regarding an object type and an object position of the applicable road indicator, thus enabling a determination of a host vehicle lane.
  • the object type of road indicators such as arrows present in the host vehicle lane must be accurately recognized by the image recognition processing.
  • in some cases, however, the image recognition result is actually false due to reasons such as partial fading of the road indicator, or a portion of the road indicator not being included in the captured image information.
  • if host vehicle lane determination is performed in such cases based on the false image recognition result, a lane other than the actual host vehicle lane may be determined as the host vehicle lane.
  • the present invention was devised in light of the foregoing problem, and it is an object of the present invention to provide an object recognition device that, when determining an object type of a targeted object included in image information taken by an imaging device, is capable of appropriately determining the object type in consideration of the possibility of false recognition of the object type.
  • a characteristic configuration of an object recognition device includes: an image information obtaining unit that obtains image information taken by an imaging device; an image recognizing unit that performs image recognition processing of an object type of a targeted object included in the image information; a false recognition table that, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image with regard to a plurality of object types applicable as the targeted object, prescribes a relation of the first object type with an object type designated as a second object type different from the first object type that may be falsely recognized; and an object type determining unit that determines the object type of the targeted object included in the image information, wherein the object type determining unit, based on a recognized object type indicated in an image recognition result from the image recognizing unit and the false recognition table, determines that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table.
  • determination of the object type of the targeted object included in the image information taken by the imaging device is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like.
  • the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object included in the image information.
  • the object type can be appropriately determined in consideration of the possibility of false recognition of the object type.
  • the object type determining unit preferably has a configuration wherein the object type determining unit determines that if the recognized object type is the first object type, then the object type of the targeted object included in the image information is the recognized object type, and determines that if the recognized object type is the second object type, then the object type of the targeted object included in the image information is potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table.
  • the relation between the first object type and the second object type prescribed in the false recognition table is one where, if a portion of the form of the targeted object of the first object type cannot be recognized in an image, then the first object type may be falsely recognized as the second object type. Therefore, if the recognized object type specified in the image recognition result is the second object type, then it is difficult to determine if the image recognition result is correct, or if the second object type has been falsely recognized due to a portion of the form of the targeted object of the first object type being faded and unrecognizable in an image. According to this configuration, if it is difficult to determine the correctness of the image recognition result specifying that the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined including the possibility of false recognition of the first object type.
  • if the recognized object type is the first object type, then the possibility of false recognition is low, and the object type of the targeted object included in the image information is determined as the recognized object type.
  • the object type can thus be appropriately determined according to the possibility of false recognition of the recognized object type specified in the image recognition result.
  • the relation between the object types prescribed in the false recognition table is preferably a relation where the form of the object of the first object type has at least two or more characteristic parts, and the form of the object of the second object type resembles the object of the first object type except for a portion of those characteristic parts.
  • the image recognizing unit is structured to perform image recognition processing of the object type by recognizing the characteristic form of the targeted object in an image
  • the structure of the object recognition device is particularly well suited to the determination of object types related to arrow-shaped road indicators that represent traffic sections by travel directions and for which there is a plurality of object types that are prone to being falsely recognized as one another.
  • the object type of the arrow-shaped road indicator provided in the lanes of the road on which the host vehicle is traveling can be appropriately determined.
  • the lane in which the host vehicle is traveling, i.e., the host vehicle lane, can also be appropriately determined.
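  • to make this relation concrete, the following is a minimal sketch (not from the patent; the Python function name and part labels are assumptions) that derives the false-recognition relation by modeling each object type as a set of characteristic parts: a first object type with two or more parts may be falsely recognized as a second object type whose parts are a proper subset of its own, i.e., what remains visible when some parts fade:

    # Sketch only (not from the patent): object types modeled as sets of
    # characteristic parts; part labels are illustrative assumptions.
    CHARACTERISTIC_PARTS = {
        "straight": {"straight_head"},
        "right_turn": {"right_head"},
        "left_turn": {"left_head"},
        "straight_right_turn": {"straight_head", "right_head"},
        "straight_left_turn": {"straight_head", "left_head"},
        "right_left_turn": {"right_head", "left_head"},
    }

    def may_be_falsely_recognized(first: str, second: str) -> bool:
        """True if the first object type could be falsely recognized as the
        second when a portion of its form cannot be recognized in the image:
        the first type has two or more characteristic parts, and the second
        type's parts are a proper subset of them."""
        p_first = CHARACTERISTIC_PARTS[first]
        p_second = CHARACTERISTIC_PARTS[second]
        return len(p_first) >= 2 and p_second < p_first

    # A faded straight/right-turn arrow may be taken for a straight arrow:
    assert may_be_falsely_recognized("straight_right_turn", "straight")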
  • a straight/right-turn arrow and a straight/left-turn arrow designated as the first object type are preferably related with a straight arrow designated as the second object type
  • a right/left-turn arrow designated as the first object type is preferably related with a right-turn arrow and a left-turn arrow designated as the second object type.
  • the object recognition device is also preferably structured further including: a host vehicle position information obtaining unit that obtains host vehicle position information that indicates a current position of the host vehicle; and an object information obtaining unit that obtains object information regarding one, two, or more targeted objects present in a traveling direction of the host vehicle, based on the host vehicle position information, wherein the imaging device is installed in the host vehicle, and the object type determining unit determines the object type of the targeted object included in the image information from among the object types of the one, two, or more targeted objects indicated in the object information obtained by the object information obtaining unit.
  • an object type other than the object types specified in the object information for one, two, or more targeted objects in the traveling direction of the host vehicle is not determined as the object type of the targeted object included in the image information.
  • an object type incapable of existing in the traveling direction of the host vehicle, in view of the object information obtained based on the host vehicle position information, can be prevented from being determined as the object type of the targeted object included in the image information.
  • the object type can be determined with greater accuracy by the object type determining unit.
  • a characteristic configuration of a lane determination device includes: the object recognition device having the above structures; a host vehicle position information obtaining unit that obtains host vehicle position information that indicates a current position of a host vehicle; an object information obtaining unit that obtains object information of targeted objects present in lanes in a traveling direction of the host vehicle based on the host vehicle position information when a road on which the host vehicle is traveling has a plurality of lanes; and a lane determining unit that determines a host vehicle lane, which is a lane where the host vehicle is traveling, from among the plurality of lanes, wherein the image recognizing unit of the object recognition device performs image recognition processing of the object type of the targeted object in the host vehicle lane included in the image information, and the lane determining unit determines that the host vehicle lane is one, two, or more lanes for which the obtained object information indicates an object type that matches the object type indicated in the determination result made by the object type determining unit of the object recognition device.
  • the host vehicle lane is determined as the lane with a matching object type, based on the determination result made by the object type determining unit of the object recognition device and based on the object information of the targeted objects in the lanes in the traveling direction of the host vehicle as obtained by the object information obtaining unit.
  • the host vehicle lane can be appropriately determined using the determination result made by the object type determining unit of the object recognition device.
  • a characteristic configuration of a navigation system includes: the above lane determination device; a map database that stores map information including the object information; an application program that operates in reference to the map information and information regarding the host vehicle lane determined by the lane determination device; and a guidance information output unit that operates in accordance with the application program and outputs guidance information.
  • a characteristic configuration of an object recognition method includes the steps of: obtaining image information taken by an imaging device; performing image recognition processing of an object type of a targeted object included in the image information; and, with regard to a plurality of object types applicable as the targeted object, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image, determining, based on a false recognition table that prescribes a relation of the first object type with an object type that may be falsely recognized, designated as a second object type different from the first object type, and also based on a recognized object type indicated in an image recognition result from the image recognizing step, that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table.
  • determination of the object type of the targeted object included in the image information taken by the imaging device is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like.
  • the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object included in the image information.
  • the object type can be appropriately determined in consideration of the possibility of false recognition of the object type.
  • the step for determining the object type is preferably configured such that if the recognized object type is the first object type, then the object type of the targeted object included in the image information is determined as the recognized object type, and if the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined as potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table. According to this configuration, if it is difficult to determine the correctness of the image recognition result specifying that the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined including the possibility of false recognition of the first object type.
  • if the recognized object type is the first object type, then the possibility of false recognition is low, and the object type of the targeted object included in the image information is determined as the recognized object type.
  • the object type can thus be appropriately determined according to the possibility of false recognition of the recognized object type specified in the image recognition result.
  • a characteristic configuration of a lane determination method includes the steps of: obtaining image information taken by an imaging device installed in a host vehicle; obtaining host vehicle position information indicating a current position of the host vehicle; when a road on which the host vehicle is traveling has a plurality of lanes based on the host vehicle position information, obtaining object information of targeted objects present in the lanes in a traveling direction of the host vehicle; performing image recognition processing of an object type of the targeted object in a host vehicle lane, which is a lane where the host vehicle is traveling; with regard to a plurality of object types applicable as the targeted object, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image, determining, based on a false recognition table that prescribes a relation of the first object type with an object type that may be falsely recognized, designated as a second object type different from the first object type, and also based on a recognized object type indicated in an image recognition result from the image recognizing step, that the object type of the targeted object in the host vehicle lane is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table; and determining that the host vehicle lane is one, two, or more lanes for which the obtained object information indicates an object type that matches the object type indicated in the determination result of the object type determining step.
  • determination of the object type of the targeted object in the host vehicle lane is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like.
  • the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object in the host vehicle lane.
  • the object type can be appropriately determined in consideration of the possibility of false recognition of the object type.
  • the host vehicle lane can be appropriately determined based on such a determination result of the object type and based on the object information of the targeted objects present in the lanes in the traveling direction of the host vehicle.
  • FIG. 1 is a block diagram showing a general configuration of a navigation system that includes an object recognition device and a lane determination device according to an embodiment of the present invention;
  • FIG. 2 is an explanatory drawing showing an example of the structure of map information and object information stored in a map database;
  • FIG. 3 is a drawing showing an example of a layout configuration of an imaging device in a host vehicle;
  • FIG. 4 is a drawing showing an example of a false recognition table T according to the embodiment of the present invention;
  • FIG. 5 shows drawings of examples of relations between object types with the potential for false recognition;
  • FIG. 6 is an explanatory drawing showing a specific example of host vehicle lane determination processing according to the embodiment of the present invention;
  • FIG. 7 is an explanatory drawing showing a specific example of host vehicle lane determination processing according to the embodiment of the present invention;
  • FIG. 8 is a flowchart showing an entire processing sequence of a lane determination method that includes an object type determination method according to the embodiment of the present invention; and
  • FIG. 9 is a flowchart showing a detailed processing sequence of the object type determination method according to the embodiment of the present invention.
  • FIG. 1 is a block diagram showing a general configuration of the navigation system 1 that includes the object recognition device 2 and the lane determination device 3 according to the present embodiment.
  • the object recognition device 2 uses a false recognition table T to determine an object type of a targeted object within a host vehicle lane, which is included in image information G taken by an imaging device 21 installed in a host vehicle 30 (see FIG. 3).
  • the lane determination device 3 determines the host vehicle lane from multiple lanes on the road on which the host vehicle 30 is traveling (hereinafter referred to as the "traveled road"), based on a determination result made by an object type determining unit 8 that structures the object recognition device 2, and based on object information F obtained from a map database 22.
  • the navigation system 1 refers to a determination result made by a lane determining unit 9 that structures the lane determination device 3, and performs a predetermined navigation operation.
  • the functional parts of the navigation system 1, i.e., an image information obtaining unit 4, an image recognizing unit 5, a host vehicle position information obtaining unit 6, a data extracting unit 7, the object type determining unit 8, the lane determining unit 9, and a navigation computing unit 10, are structured with a computation processing device such as a CPU acting as a core member.
  • the functional parts for performing various processing with respect to input data are structured with hardware, software (a program), or both.
  • the map database 22 is provided with a device that has a storage medium capable of storing information and a driving unit therefor, such as a hard disk drive, a DVD drive equipped with a DVD-ROM, or a CD drive equipped with a CD-ROM, for example.
  • the map database 22 is a database that stores map information M divided into predetermined areas, and a plurality of object information F associated with the map information M.
  • FIG. 2 is an explanatory drawing showing an example of the structure of the map information M and the object information F stored in the map database 22.
  • the map database 22 holds a road network layer m1, a road configuration layer m2, and an object layer m3.
  • the road network layer m1 specifies information about connections among roads. More specifically, the road network layer m1 is structured with information regarding a plurality of nodes n that have positional information on a map expressed by longitude and latitude, and information regarding a plurality of links k that structure roads connecting two nodes n.
  • the links k also have information regarding the road class (classes such as expressway, toll road, national road, and prefectural road), link length, and the like as link information.
  • the road configuration layer m2 is stored in association with the road network layer m1 and specifies the configurations of roads. More specifically, the road configuration layer m2 is structured with information regarding the road width and the like, as well as with information regarding a plurality of road configuration supplemental points s that have positional information on a map expressed by longitude and latitude and are arranged between two nodes n (on the link k).
  • the map information M is structured by information stored in the road network layer m1 and the road configuration layer m2.
  • the object layer m3 is stored in association with the road network layer m1 and the road configuration layer m2, and stores information regarding various objects found on and around roads, namely, the object information F.
  • the objects of the object information F stored in the object layer m3 include road indicators provided on the surfaces of roads.
  • Objects pertaining to such road indicators include arrow-shaped road indicators (hereinafter simply referred to as "arrow indicators") that represent traffic sections by the travel direction of lanes. More specifically, a straight arrow, a straight/right-turn arrow, a straight/left-turn arrow, a right-turn arrow, a left-turn arrow, and a right/left-turn arrow are included.
  • the arrow indicator is an object that can be targeted.
  • objects pertaining to road indicators include various painted indicators.
  • objects stored in the object information F can include various objects in addition to the above road indicators, such as traffic signals, signs, overpasses, and tunnels.
  • the contents of the object information F also include positional information, object type information, form information, and attribute information for each object.
  • the positional information possesses information regarding a position (longitude and latitude) on a map with representative points for each object and regarding an orientation for each object.
  • the representative point of the object is set to a center position in a width direction and a length direction of the object, for example.
  • the object type information represents an object type for each object.
  • one object type is prescribed for objects with the same shape as a general rule.
  • the information regarding the object type represents a specific type of road indicator, such as the straight arrow, the right-turn arrow, the stop line, and the crosswalk. In addition, the form information possesses information such as the shape, size, and color for each object.
  • the attribute information includes lane information that expresses which road lane an object is disposed on, when the road on which the object is provided has multiple lanes. This lane information is represented as "2/3", for example, when the applicable object is provided in the center lane of a road with three lanes going in one direction of traffic.
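  • as an illustration only (the field names below are assumptions, not the patent's data layout), one entry of the object information F might be held in a record such as the following:

    from dataclasses import dataclass

    @dataclass
    class ObjectInfo:
        """One entry of object information F (illustrative sketch)."""
        longitude: float        # positional information: representative point
        latitude: float
        orientation_deg: float  # orientation of the object on the map
        object_type: str        # e.g. "straight_arrow", "right_turn_arrow"
        form: dict              # form information: shape, size, color, etc.
        lane: str               # attribute information, e.g. "2/3"

    # An arrow indicator in the center lane of a three-lane road:
    entry = ObjectInfo(139.69, 35.68, 0.0, "straight_arrow",
                       {"shape": "arrow", "size_m": 5.0, "color": "white"},
                       "2/3")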
  • the image information obtaining unit 4 functions as image information obtaining unit for obtaining the image information G around the host vehicle position taken by the imaging device 21.
  • the imaging device 21 is an onboard camera or the like equipped with an image sensor, and is provided at a position where at least the road surface of a road around the host vehicle 30 can be imaged.
  • Such an imaging device 21 preferably uses a back camera that images the road surface of the rear of the host vehicle 30, for example, as illustrated in FIG. 3.
  • the image information obtaining unit 4 imports analog imaging information taken by the imaging device 21 at a predetermined time interval, which is then converted into a digital signal to obtain the image information G.
  • the time interval at which the image information G is imported can be set to approximately 10 to 50 ms, for example.
  • the image information obtaining unit 4 can continuously obtain the image information G from multiple frames taken by the imaging device 21.
  • the image information G obtained here is output to the image recognizing unit 5.
  • the host vehicle position information obtaining unit 6 functions as host vehicle position information obtaining unit for obtaining the host vehicle position information P specifying the current position of the host vehicle 30.
  • the host vehicle position information obtaining unit 6 is connected with a GPS receiver 23, an orientation sensor 24, and a distance sensor 25.
  • the GPS receiver 23 is a device that receives a GPS signal from a GPS (Global Positioning System) satellite. The GPS signal is normally received every second, and output to the host vehicle position information obtaining unit 6.
  • the signal received by the GPS receiver 23 from the GPS satellite can be analyzed to obtain the current position (latitude and longitude), direction of travel, speed of movement, and the like of the host vehicle 30.
  • the orientation sensor 24 detects the traveling direction of the host vehicle 30 and changes in the direction of travel.
  • the orientation sensor 24 is structured from a gyro sensor, a geomagnetic sensor, an optical rotation sensor or rotation type resistance volume attached to a rotating portion of a steering wheel, and an angular sensor attached to a vehicle wheel portion, for example. Also, the orientation sensor 24 outputs a detection result thereof to the host vehicle position information obtaining unit 6.
  • the distance sensor 25 detects a vehicle speed and a movement distance of the host vehicle 30.
  • the distance sensor 25 is structured from a vehicle speed pulse sensor that outputs a pulse signal every time a vehicle drive shaft, wheel, or the like rotates a certain amount, a yaw/G sensor that detects an acceleration of the host vehicle 30, and a circuit that integrates the detected acceleration, for example. Also, the distance sensor 25 outputs information regarding the vehicle speed and the movement distance as a detection result thereof to the host vehicle position information obtaining unit 6. Based on the output from the GPS receiver 23, the orientation sensor 24, and the distance sensor 25, the host vehicle position information obtaining unit 6 performs a computation according to a known method to specify the host vehicle position.
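  • the patent leaves this position computation as a known method; as one conventional possibility, a dead-reckoning sketch like the following (the function name, degrees-per-meter factors, and blending constant are assumptions) could combine the orientation and distance sensor outputs, with a GPS fix blended in to correct drift:

    import math

    def update_position(lon, lat, heading_deg, distance_m,
                        gps_fix=None, alpha=0.2):
        """Advance the estimated position by dead reckoning, then blend in
        a GPS fix when available. Sketch only; factors are approximate."""
        dlat = distance_m * math.cos(math.radians(heading_deg)) / 111_320.0
        dlon = distance_m * math.sin(math.radians(heading_deg)) / (
            111_320.0 * math.cos(math.radians(lat)))
        lon, lat = lon + dlon, lat + dlat
        if gps_fix is not None:  # simple complementary filter toward the fix
            glon, glat = gps_fix
            lon = (1 - alpha) * lon + alpha * glon
            lat = (1 - alpha) * lat + alpha * glat
        return lon, lat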
  • the host vehicle position information obtaining unit 6 obtains the map information M around the host vehicle position, which is extracted from the map database 22 by the data extracting unit 7. By performing known map matching based thereon, the host vehicle position information obtaining unit 6 also corrects the host vehicle position to match the road specified in the map information M. In this manner, the host vehicle position information obtaining unit 6 obtains the host vehicle position information P, which includes information regarding the current position of the host vehicle 30 expressed in latitude and longitude, and information regarding the traveling direction of the host vehicle 30. However, the host vehicle position information P thus obtained is not information capable of specifying the lane in which the host vehicle 30 is traveling, i.e., the host vehicle lane, when the road the host vehicle 30 is traveling on has multiple lanes.
  • the navigation system 1 is structured such that determination of the host vehicle lane is made in the lane determining unit 9 described later. Furthermore, the host vehicle position information P obtained by the host vehicle position information obtaining unit 6 is output to the data extracting unit 7, the lane determining unit 9, and the navigation computing unit 10.
4. Data Extracting Unit
  • the data extracting unit 7 extracts the required map information M and the object information F from the map database 22, based on the host vehicle position information P obtained by the host vehicle position information obtaining unit 6 and so on. According to the present embodiment, the data extracting unit 7 extracts and obtains the object information F for one or more targeted objects present in the traveling direction of the host vehicle 30, based on the host vehicle position information P. More specifically, when the traveled road has multiple lanes, the data extracting unit 7 extracts and obtains the object information F for the targeted objects present in the lanes on the traveled road in the traveling direction of the host vehicle 30. Also, the obtained object information F is output to the image recognizing unit 5 and the lane determining unit 9.
  • the object, that is, the targeted object, is subject to image recognition processing by the image recognizing unit 5, and is also an object of an object type subject to object type determination processing by the object type determining unit 8.
  • various arrow indicators to be described later are applicable as such objects.
  • the data extracting unit 7 functions as object information obtaining unit in the present invention. Additionally, the data extracting unit 7 extracts the map information M around the host vehicle position to be used in the map matching performed by the host vehicle position information obtaining unit 6, and outputs the map information M to the host vehicle position information obtaining unit 6.
  • the data extracting unit 7 further extracts the map information M from the map database 22 for an area requested by the navigation computing unit 10, and outputs such map information M to the navigation computing unit 10.
5. Image Recognizing Unit
  • the image recognizing unit 5 functions as image recognizing unit for performing image recognition processing of a targeted object included in the image information G obtained by the image information obtaining unit 4.
  • the image recognizing unit 5 uses the object information F of the targeted object extracted by the data extracting unit 7 to perform image recognition processing for the object type of the targeted object in the lane in which the host vehicle 30 is traveling, i.e., the host vehicle lane.
  • the object information F of the targeted object to be used here is a plurality of object information F pertaining to targeted objects present in the lanes in the traveling direction of the host vehicle 30 when the traveled road has multiple lanes.
  • the image recognizing unit 5 performs binarization processing, edge detection processing, and the like with respect to the obtained image information G, and extracts outline information for the object (road indicator) included in the image information G.
  • the image recognizing unit 5 subsequently extracts outline information that matches any of the forms respectively specified in the form information for a plurality of targeted objects, which are included in the object information F for the plurality of targeted objects extracted by the data extracting unit 7.
  • the object type specified in the object information F pertaining to the form information that matches with the outline information is recognized as the object type of the targeted object in the host vehicle lane included in the image information G.
  • the object type of the targeted object in the host vehicle lane recognized by the image recognizing unit 5 becomes a recognized object type specified in an image recognition result.
  • image recognition processing is performed using the object information F of the targeted object extracted from the map database 22 by the data extracting unit 7, based on the host vehicle position information P. Accordingly, the image recognition result is prevented from becoming an object type that is inapplicable to the targeted object in the host vehicle lane.
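  • the patent does not prescribe a specific matching algorithm; as a minimal sketch of the flow above using OpenCV (the shape-comparison approach and all names are assumptions, and the OpenCV 4 findContours signature is assumed), the extracted outline could be matched against the candidates' form templates like this:

    import cv2

    def recognize_object_type(image_gray, candidates):
        """Match the road-marking outline in a grayscale image against the
        template contours of candidate objects (from object information F);
        return the best-matching object type, or None if nothing is found."""
        _, binary = cv2.threshold(image_gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        outline = max(contours, key=cv2.contourArea)  # largest outline
        # Lower matchShapes score = more similar (Hu-moment comparison).
        best_type, _ = min(
            candidates.items(),
            key=lambda kv: cv2.matchShapes(outline, kv[1],
                                           cv2.CONTOURS_MATCH_I1, 0.0))
        return best_type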
  • FIG. 4 is a drawing showing an example of the false recognition table T according to the present embodiment.
  • the false recognition table T pertains to a plurality of object types that are applicable as targeted objects. If a portion of the form of the targeted object of a predetermined first object type fc1 cannot be recognized in an image, then there is a possibility that a second object type fc2 different from the first object type fc1 may be falsely recognized instead.
  • the table prescribes such relations between object types.
  • the false recognition table T classifies the plurality of object types applicable as targeted objects into the first object type fc1 or the second object type fc2, and prescribes the first object type fc1 as having the potential to be falsely recognized as the second object type fc2.
  • the examples shown in the figure are eight types of arrow indicators, which are object types applicable as targeted objects, and consist of a straight arrow, a straight/right-turn arrow, a straight/left-turn arrow, a right-turn arrow, a left-turn arrow, a right/left-turn arrow, a right-turn type 2 arrow, and a left-turn type 2 arrow.
  • the five types of arrow indicators consisting of the straight/right-turn arrow, the straight/left-turn arrow, the right/left-turn arrow, the right-turn type 2 arrow, and the left-turn type 2 arrow are classified as first object types fc1, while the three types of arrow indicators consisting of the straight arrow, the right-turn arrow, and the left-turn arrow are classified as second object types fc2.
  • a two-dimensional matrix is used in the false recognition table T to prescribe the relations among the five first object types fc1 arranged in the horizontal direction on the top side of the figure and the three second object types fc2 arranged in the vertical direction on the left side of the figure.
  • the relation indicated by a dot in the table between the first object type fc1 and the second object type fc2 is defined as one where the first object type fc1 may be falsely recognized as the corresponding second object type fc2.
  • the relations prescribed by the false recognition table T for object types that may be falsely recognized are object type relations where, if a portion of the form of the targeted object of the first object type fc1 cannot be recognized in an image, there is a possibility that the second object type fc2 different from the first object type fc1 may be falsely recognized instead.
  • object type relations are ones where the form of the object of the first object type fc1 has at least two or more characteristic parts, and the form of the object of the second object type fc2 resembles the object of the first object type fc1 except for a portion of the two or more characteristic parts.
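  • one simple in-memory form of the table in FIG. 4 (a sketch; the patent prescribes the relations, not a data structure) maps each second object type fc2 to the first object types fc1 that may be falsely recognized as it, per FIGS. 5A to 5F:

    # False recognition table T as a mapping:
    # second object type (fc2) -> first object types (fc1) that may be
    # falsely recognized as it when part of their form is faded.
    FALSE_RECOGNITION_TABLE = {
        "straight": {"straight_right_turn", "straight_left_turn"},
        "right_turn": {"right_left_turn", "right_turn_type2"},
        "left_turn": {"right_left_turn", "left_turn_type2"},
    }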
  • FIGS. 5A to 5F pertain to the eight types of arrow indicators that are object types applicable as targeted objects, and show examples of relations between object types with the potential for false recognition.
  • Square frames C1 to C4 in the figures indicate characteristic parts of the arrow indicators.
  • the characteristic part is a part that is relatively easily recognizable in image recognition processing performed by the image recognizing unit 5.
  • the characteristic parts correspond to triangular portions indicating the arrow direction or the like, which are enclosed by the frames C1 to C3 in the figures.
  • the relation shown in FIG. 5A is one between the straight/right-turn arrow, which is classified as a first object type fc1, and the straight arrow, which is classified as a second object type fc2.
  • the straight/right-turn arrow has two characteristic parts, the straight arrow part C1 and the right-turn arrow part C2. Meanwhile, the straight arrow only has the straight arrow part C1 as a characteristic part.
  • if the right-turn arrow part C2, which is one characteristic part of the straight/right-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the straight arrow part C1 is recognized in the image.
  • the straight/right-turn arrow may therefore be falsely recognized as a straight arrow classified as the second object type fc2.
  • the right-turn arrow part C2 of the straight/right-turn arrow is arranged on a right side with respect to a center of the lane in the width direction. Therefore, the right-turn arrow part C2 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the right-turn arrow part C2 will be unrecognizable in an image. Accordingly, there is a possibility that the straight/right-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the straight arrow, which is classified as the second object type fc2.
  • the relation shown in FIG. 5B is one between the straight/left-turn arrow, which is classified as a first object type fc1, and the straight arrow, which is classified as a second object type fc2.
  • the straight/left-turn arrow has two characteristic parts, the straight arrow part C1 and the left-turn arrow part C3. Meanwhile, the straight arrow only has the straight arrow part C1 as a characteristic part.
  • if the left-turn arrow part C3, which is one characteristic part of the straight/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the straight arrow part C1 is recognized in the image.
  • the straight/left-turn arrow may therefore be falsely recognized as a straight arrow classified as the second object type fc2.
  • the left-turn arrow part C3 of the straight/left-turn arrow is arranged on a left side with respect to a center of the lane in the width direction. Therefore, the left-turn arrow part C3 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the left-turn arrow part C3 will be unrecognizable in an image. Accordingly, there is a possibility that the straight/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the straight arrow, which is classified as the second object type fc2.
  • the relation shown in FIG. 5C is one between the right/left-turn arrow, which is classified as a first object type fc1, and the right-turn arrow, which is classified as a second object type fc2.
  • the right/left-turn arrow has two characteristic parts, the right-turn arrow part C2 and the left-turn arrow part C3.
  • the right-turn arrow only has the right-turn arrow part C2 as a characteristic part.
  • if the left-turn arrow part C3, which is one characteristic part of the right/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the right-turn arrow part C2 is recognized in the image.
  • the right/left-turn arrow may therefore be falsely recognized as a right-turn arrow classified as the second object type fc2.
  • the left-turn arrow part C3 of the right/left-turn arrow is arranged on a left side with respect to a center of the lane in the width direction. Therefore, the left-turn arrow part C3 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the left-turn arrow part C3 will be unrecognizable in an image. Accordingly, there is a possibility that the right/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the right-turn arrow, which is classified as the second object type fc2.
  • the relation shown in FIG. 5D is one between the right/left-turn arrow, which is classified as a first object type fc1, and the left-turn arrow, which is classified as a second object type fc2.
  • the right/left-turn arrow has two characteristic parts, the right-turn arrow part C2 and the left-turn arrow part C3.
  • the left-turn arrow only has the left-turn arrow part C3 as a characteristic part.
  • if the right-turn arrow part C2, which is one characteristic part of the right/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the left-turn arrow part C3 is recognized in the image.
  • the right/left-turn arrow may therefore be falsely recognized as a left-turn arrow classified as the second object type fc2.
  • the right-turn arrow part C2 of the right/left-turn arrow is arranged on a right side with respect to a center of the lane in the width direction. Therefore, the right-turn arrow part C2 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the right-turn arrow part C2 will be unrecognizable in an image. Accordingly, there is a possibility that the right/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the left-turn arrow, which is classified as the second object type fc2.
  • the relation shown in FIG. 5E is one between the right-turn type 2 arrow, which is classified as a first object type fc1, and the right-turn arrow, which is classified as a second object type fc2.
  • the right-turn type 2 arrow has two characteristic parts, the right-turn arrow part C2 and a straight part C4. Meanwhile, the right-turn arrow only has the right-turn arrow part C2 as a characteristic part.
  • if the straight part C4, which is one characteristic part of the right-turn type 2 arrow classified as the first object type fc1, cannot be recognized in an image, then only the right-turn arrow part C2 is recognized in the image.
  • the right-turn type 2 arrow may therefore be falsely recognized as a right-turn arrow classified as the second object type fc2.
  • the straight part C4 of the right-turn type 2 arrow is less easily recognizable in image recognition processing than the triangular parts C1 to C3 indicating the arrow direction. For this reason, there is a relatively high possibility that the straight part C4 will be unrecognizable in an image. Accordingly, there is a possibility that the right-turn type 2 arrow, which is classified as the first object type fc1, will be falsely recognized as the right-turn arrow, which is classified as the second object type fc2.
  • the relation shown in FIG. 5F is one between the left-turn type 2 arrow, which is classified as a first object type fc1, and the left-turn arrow, which is classified as a second object type fc2.
  • the left-turn type 2 arrow has two characteristic parts, the left-turn arrow part C3 and the straight part C4. Meanwhile, the left-turn arrow only has the left-turn arrow part C3 as a characteristic part.
  • if the straight part C4, which is one characteristic part of the left-turn type 2 arrow classified as the first object type fc1, cannot be recognized in an image, then only the left-turn arrow part C3 is recognized in the image.
  • the left-turn type 2 arrow may therefore be falsely recognized as a left-turn arrow classified as the second object type fc2.
  • the straight part C4 of the left-turn type 2 arrow is less easily recognizable in image recognition processing than the triangular parts C1 to C3 indicating the arrow direction. For this reason, there is a relatively high possibility that the straight part C4 will be unrecognizable in an image. Accordingly, there is a possibility that the left-turn type 2 arrow, which is classified as the first object type fc1, will be falsely recognized as the left-turn arrow, which is classified as the second object type fc2.
  • if the recognized object type is prescribed as a first object type fc1, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type. Accordingly, object types other than the above eight types of arrow indicators that have no risk of being falsely recognized as another object type, such as the second object type fc2, are preferably prescribed in the false recognition table T as first object types fc1. Thus, object types for which there is no possibility of false recognition as another object type can be prescribed in the false recognition table T so that the recognized object type specified in the image recognition result is thereby determined as the object type of the targeted object in the host vehicle lane.
  • the object type determining unit 8 functions as object type determining unit for determining the object type of the targeted object included in the image information G. According to the present embodiment, the object type determining unit 8 determines the object type of the targeted object in the host vehicle lane included in the image information G based on the false recognition table T and the recognized object type specified in the image recognition result from the image recognizing unit 5. At this time, when the false recognition table T has another object type whose prescribed relation with the recognized object type has the risk of false recognition, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type itself specified in the image recognition result, or another object type prescribed as related with the recognized object type in the false recognition table T.
  • when the false recognition table T has no object type prescribed as related with the recognized object type, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type itself specified in the image recognition result. More specifically, in cases where the recognized object type specified in the image recognition result is the first object type fc1 based on the false recognition table T, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type.
  • in cases where the recognized object type is the second object type fc2, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type, or the first object type fc1 prescribed as related with the recognized object type in the false recognition table T. In other words, if the recognized object type specified in the image recognition result is the first object type fc1, which has many characteristic parts and no possibility of false recognition as another object type, then the object type of the targeted object in the host vehicle lane is determined as the recognized object type.
  • if the recognized object type is the second object type fc2, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type itself, or the first object type fc1 that may have been falsely recognized as the recognized object type. For example, if the recognized object type specified in the image recognition result is the straight/right-turn arrow, i.e., the first object type fc1, then the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the straight/right-turn arrow, i.e., the recognized object type itself.
  • on the other hand, if the recognized object type is the straight arrow, i.e., the second object type fc2, the object type determining unit 8 refers to the false recognition table T, namely the first row from the top in FIG. 4, and finds that the straight/right-turn arrow and the straight/left-turn arrow classified as first object types fc1 are prescribed as related with the recognized object type.
  • accordingly, the object type determining unit 8 determines that there is a possibility of the object type of the targeted object in the host vehicle lane being the straight arrow, i.e., the recognized object type itself, or the straight/right-turn arrow or the straight/left-turn arrow, i.e., the first object types fc1 prescribed as related with the recognized object type in the false recognition table T.
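  • put as code (a sketch; it reuses the table mapping shown earlier and illustrative type names), the determination amounts to:

    def determine_object_types(recognized, table):
        """Return the set of possible object types for the targeted object,
        given the recognized type and the false recognition table
        (second object type -> possibly confused first object types)."""
        if recognized in table:   # a second object type fc2: ambiguous
            return {recognized} | table[recognized]
        return {recognized}       # a first object type fc1: taken as-is

    # The straight-arrow example above yields three candidates:
    table = {"straight": {"straight_right_turn", "straight_left_turn"}}
    assert determine_object_types("straight", table) == {
        "straight", "straight_right_turn", "straight_left_turn"}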
  • the lane determining unit 9 functions as lane determining unit for determining the lane in which the host vehicle is traveling, i.e., the host vehicle lane, from multiple lanes in cases where the road on which the host vehicle 30 is traveling has multiple lanes. According to the present embodiment, based on the object type specified in the determination result made by the object type determining unit 8 and the object information F of the targeted object extracted by the data extracting unit 7, the lane determining unit 9 determines the host vehicle lane as one, two, or more lanes for which the obtained object information F specifies an object type matching the object type specified in the determination result made by the object type determining unit 8.
  • the lane determining unit 9 performs processing to determine the host vehicle lane when determination of the host vehicle lane is required, namely, only when the traveled road has multiple lanes in the traveling direction (on one side) based on the host vehicle position information P. Also, the lane determining unit 9 outputs host vehicle lane information S as a determination result to the navigation computing unit 10. Thus, the navigation computing unit 10 can perform operations for guide functions such as route guidance and route searching with reference to the host vehicle lane information S.
  • FIGS. 6 and 7 are explanatory drawings for describing specific examples of host vehicle lane determination processing performed by the lane determining unit 9.
  • the object type in the rectangular box on the left side is a recognized object type fa specified in an image recognition result made by the image recognizing unit 5 regarding the targeted object in the host vehicle lane.
  • object types Ft1 to Ft4 of the targeted objects in the lanes on the traveled road specified in the object information F, which was obtained by the data extracting unit 7, are shown to the right of the recognized object type fa.
  • information that specifies the host vehicle lane and is framed at the bottom of the figures corresponds to the host vehicle lane information S.
  • the object information F obtained by the data extracting unit 7 regarding the object types of the targeted objects in the lanes of the traveled road specifies the following: the object type Ft1 of a targeted object in a first lane L1 is a straight/left-turn arrow, the object type Ft2 of a targeted object in a second lane L2 is a straight arrow, the object type Ft3 of a targeted object in a third lane L3 is a straight arrow, and the object type Ft4 of a targeted object in a fourth lane L4 is a right-turn arrow.
  • a straight arrow is specified as the recognized object type fa specified in the image recognition result of the targeted object in the host vehicle lane according to the image recognizing unit 5.
  • the straight arrow is a second object type fc2 as prescribed in the false recognition table T.
  • the object type determining unit 8 determines that there are three possibilities for the object type of the targeted object in the host vehicle lane: the straight arrow, the straight/right-turn arrow, and the straight/left-turn arrow.
  • from among the object types Ft1 to Ft4 specified in the obtained object information F, the lane determining unit 9 extracts those that match the three object types of straight arrow, straight/right-turn arrow, and straight/left-turn arrow. Then, the one, two, or more lanes for which the obtained object information F specifies object types matching these three object types are determined as the host vehicle lanes. In the present example, a straight/right-turn arrow does not exist among the object types Ft1 to Ft4 of the targeted objects in the lanes.
  • the host vehicle lanes are determined as the first lane L1, for which the object type Ft1 is a straight/left-turn arrow, as well as the second lane L2 and the third lane L3, for which the object types Ft2 and Ft3 are straight arrows. Also, the lane determining unit 9 generates information specifying that the first to third lanes L1 to L3 are the host vehicle lanes as the host vehicle lane information S.
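  • the lane matching in this example can be sketched as follows (lane keys and type names are illustrative assumptions):

    def determine_host_lanes(candidate_types, lane_object_types):
        """Return the lanes whose targeted-object type matches any of the
        candidate object types from the object type determination."""
        return [lane for lane, obj_type in lane_object_types.items()
                if obj_type in candidate_types]

    lanes = {"L1": "straight_left_turn", "L2": "straight",
             "L3": "straight", "L4": "right_turn"}
    candidates = {"straight", "straight_right_turn", "straight_left_turn"}
    assert determine_host_lanes(candidates, lanes) == ["L1", "L2", "L3"]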
  • the object information F obtained by the data extracting unit 7 is the same as that used for the example shown in FIG. 6.
  • a straight/left-turn arrow is specified as the recognized object type fa specified in the image recognition result of the targeted object in the host vehicle lane according to the image recognizing unit 5.
• the straight/left-turn arrow is a first object type fc1 as prescribed in the false recognition table T.
  • the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type fa itself, namely, the straight/left-turn arrow.
• the lane determining unit 9 determines the host vehicle lane as the first lane L1, i.e., the lane for which the obtained object information F specifies an object type matching the object type of straight/left-turn arrow, as specified in the determination result made by the object type determining unit 8. Also, the lane determining unit 9 generates information specifying that the first lane L1 is the host vehicle lane as the host vehicle lane information S. [0052]
• the navigation computing unit 10 is a computation processing unit that operates in accordance with an application program AP in order to mainly execute guide functions of the navigation system 1.
  • the application program AP operates with reference to information including the host vehicle position information P obtained by the host vehicle position information obtaining unit 6, the map information M extracted by the data extracting unit 7, and the host vehicle lane information S generated by the lane determining unit 9.
• the navigation computing unit 10, in accordance with the application program AP, obtains map information M around the host vehicle 30 from the map database 22 via the data extracting unit 7 to display a map image on a display input device 26, and also performs processing to display a host vehicle position mark superimposed over the map image, based on the host vehicle position information P.
  • the navigation computing unit 10 performs a route search from a place of departure to a destination, and based on the host vehicle position information P and the route thus searched, further performs route guidance using either or both the display input device 26 and an audio output device 27.
• the application program AP refers to the host vehicle lane information S determined by the lane determining unit 9, and performs navigation operations such as displaying the host vehicle position, searching for a route, and route guidance. More specifically, for example, the application program AP performs operations such as displaying the determined host vehicle lane on the display input device 26, or canceling route guidance that requires an impossible lane change depending on the determined host vehicle lane.
  • the navigation computing unit 10 is connected with the display input device 26 and the audio output device 27.
  • the display input device 26 integrates a display device such as a liquid crystal display device with an input device such as a touch panel.
• the audio output device 27 is structured with a speaker and the like. In the present embodiment, the navigation computing unit 10, the display input device 26, and the audio output device 27 function as guidance information output unit 28 of the present invention.
• FIG. 8 is a flowchart showing an entire processing sequence of the lane determination method that includes the object type determination method according to the present embodiment.
• FIG. 9 is a flowchart showing a detailed processing sequence of the object type determination method according to the present embodiment.
• at step #02, it is determined whether the traveled road has multiple lanes by using the data extracting unit 7 to refer to either or both the map information M and the object information F in the vicinity of the host vehicle position, which are stored in the map database 22. If the traveled road does not have multiple lanes (No at step #02), that is, if the traveled road has only one lane going in each direction, then the host vehicle lane determination is unnecessary and the processing is ended. [0055] However, if the traveled road has multiple lanes (Yes at step #02), then the data extracting unit 7 extracts and obtains from the map database 22 the object information F for the targeted objects present in the lanes on the traveled road in the traveling direction of the host vehicle 30 (step #03).
  • the image information obtaining unit 4 obtains the image information G taken by the imaging device 21 installed in the host vehicle 30 (step #04).
  • the image recognizing unit 5 then performs image recognition processing for the object type of the targeted object in the host vehicle lane, that is, the lane in which the host vehicle 30 is traveling (step #05).
  • the object type determining unit 8 subsequently performs processing to determine the object type of the targeted object in the host vehicle lane included in the image information G (step #06).
  • the object type determination processing method of the targeted object in the host vehicle lane will be described in detail later based on a flowchart in FIG. 9.
• the lane determining unit 9 determines the lane in which the host vehicle is traveling, i.e., the host vehicle lane, from the multiple lanes of the road on which the host vehicle 30 is traveling, and generates the host vehicle lane information S (step #07).
  • the lane determining unit 9 determines the host vehicle lane as one, two, or more lanes for which the obtained object information F specifies an object type matching the object type specified in the determination result made by the object type determining unit 8. With this, the entire processing of the lane determination method is ended. [0056]
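Read as pseudocode, the FIG. 8 sequence can be summarized as follows. Every name in this Python sketch is an assumption standing in for the corresponding unit or step, not an interface defined by the patent; `recognize_object_type`, `determine_object_type`, and `determine_host_lanes` are sketched at the corresponding points of this description:

```python
def lane_determination(host_position_p, map_db, camera, templates):
    # Steps #01-#02: obtain the host position; skip single-lane roads.
    road = map_db.road_at(host_position_p)           # hypothetical accessor
    if road.lane_count <= 1:
        return None                                  # determination unnecessary
    # Step #03: object information F for the targeted objects in each lane.
    lane_types = map_db.object_types_ahead(host_position_p)
    # Steps #04-#05: obtain image information G and recognize the object type.
    recognized = recognize_object_type(camera.capture(), templates)
    # Step #06: expand the recognized type via the false recognition table.
    candidates = determine_object_type(recognized, FALSE_RECOGNITION_TABLE)
    # Step #07: the lanes whose object information matches a candidate type.
    return determine_host_lanes(candidates, lane_types)
```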
• the object type determining unit 8 obtains information regarding the recognized object type specified in the image recognition result from the image recognition processing performed at step #05 (step #11). The object type determining unit 8 then refers to the false recognition table T (step #12). Next, the object type determining unit 8 determines whether the recognized object type obtained at step #11 is the first object type fc1 (step #13). If the recognized object type obtained at step #11 is the first object type fc1 (Yes at step #13), then the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type (step #14).
• however, if the recognized object type obtained at step #11 is not the first object type fc1 (No at step #13), the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type, or the first object type fc1 prescribed as related with the recognized object type in the false recognition table T (step #15). Note that since the specific method employed by the object type determining unit 8 for determining the object type of the targeted object in the host vehicle lane has already been described in detail using FIGS. 6 and 7, such specifics are omitted here. With this, the processing of the object type determination method at step #06 is ended.
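A minimal sketch of steps #11 to #15 follows, under the assumption that the false recognition table T is held as a mapping from each second object type fc2 to the first object types fc1 that may be falsely recognized as it. FIG. 4 presents the same relations as a matrix; the left-turn type 2 entry below is inferred by symmetry with FIG. 5E and is therefore an assumption:

```python
# Hypothetical encoding of the FIG. 4 relations: second object type fc2
# -> first object types fc1 that may have been falsely recognized as it.
FALSE_RECOGNITION_TABLE = {
    "straight": {"straight/right-turn", "straight/left-turn"},
    "right-turn": {"right/left-turn", "right-turn type 2"},
    "left-turn": {"right/left-turn", "left-turn type 2"},
}

def determine_object_type(recognized, table):
    """Steps #13-#15: a first object type is taken as-is; a second object
    type additionally admits its related first object types."""
    if recognized not in table:      # step #13: recognized is a first type
        return {recognized}          # step #14: accept the recognition result
    return {recognized} | table[recognized]  # step #15: include alternatives

print(determine_object_type("straight", FALSE_RECOGNITION_TABLE))
# FIG. 6 case -> {'straight', 'straight/right-turn', 'straight/left-turn'}
```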
• the relation between the first object type fc1 and the second object type fc2 as prescribed in the false recognition table T shown in FIG. 4 is only an example, and many other object type relations are naturally possible.
• the object type relations prescribed in the false recognition table T are preferably ones where the form of the object of the first object type fc1 has at least two or more characteristic parts, and the form of the object of the second object type fc2 resembles the object of the first object type except for some of the two or more characteristic parts.
• the object type relations prescribed in the false recognition table T are not limited to the above, provided that each prescribed relation is one where the image recognition result for one object type is potentially falsely recognizable as another object type.
• the object type determining unit 8 was described as having a structure where the object type of the targeted object included in the image information G is determined based on the false recognition table T and the recognized object type specified in the image recognition result.
  • the object type determining unit 8 is not limited to such a structure.
  • another preferred embodiment of the present invention is one in which the object type determining unit 8 further utilizes the object information F obtained by the data extracting unit 7 based on the host vehicle position information P to determine the object type of the targeted object included in the image information G from the object types of one, two, or more targeted objects specified in the object information F.
• for an object type not specified in the object information F, the object type determining unit 8 does not determine that there is a possibility of that object type being the object type of the targeted object in the host vehicle lane.
• the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially any one of the straight arrow, the straight/right-turn arrow, and the straight/left-turn arrow. However, as shown in FIG. 6, a straight/right-turn arrow does not exist among the object types specified in the object information F obtained by the data extracting unit 7.
• the object type determining unit 8 then determines the object type of the targeted object in the host vehicle lane from among the object types specified in the object information F. Accordingly, in this case, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the straight arrow or the straight/left-turn arrow. Thus, it is possible to prevent the object type determining unit 8 from determining an object type result that is inapplicable to the targeted object in the host vehicle lane.
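Continuing the earlier sketches, this narrowing reduces to a set intersection; `present_types`, standing for the set of object types actually listed in the obtained object information F, is an illustrative name:

```python
candidates = determine_object_type("straight", FALSE_RECOGNITION_TABLE)
present_types = {"straight/left-turn", "straight", "right-turn"}  # from F
candidates &= present_types  # drops "straight/right-turn": no lane carries it
print(candidates)            # -> {'straight', 'straight/left-turn'}
```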
  • arrow-shaped road indicators that represent traffic sections indicating the travel directions of lanes are objects applicable as targeted objects.
• the objects applicable as targeted objects are not limited to these, and objects of various object types that are provided on the surface of the road, such as other road indicators, may also be targeted objects.
  • objects arranged in the lanes of a road having multiple lanes are particularly preferred as targeted objects.
• another lane determination method may be based on the object information F of division lines around the host vehicle position obtained from the map database 22, and on the positional relationship among the host vehicle and the division lines, together with the types of the division lines (line types including a solid line, a broken line, and double lines) around the host vehicle as specified in the image recognition result.
• Such lane determination may also use a method for determining the host vehicle lane based on information from VICS, namely, information from a light beacon or the like generated by a transmitter provided in each lane of the road.
  • the above embodiments describe examples in which the entire structure of the navigation system 1, including the object recognition device 2 and the lane determination device 3, is installed in the host vehicle.
  • the scope of the present invention is not limited to such a structure.
• another preferred embodiment of the present invention is, for example, one in which a portion of the structure excluding the imaging device 21 is disposed outside of the host vehicle in a connected state via a communications network such as the Internet, and the lane determination device 3 and the navigation system 1 are structured by the sending and receiving of signals and information via the network.
  • the object recognition device 2 may naturally also be used for purposes other than determination of the host vehicle lane, such as for correcting the host vehicle position or the like using the image recognition result of the object type and the object information F obtained by the data extracting unit 7.
• the present invention can be utilized preferably as: an object recognition device that determines an object type of a targeted object included in image information that is taken by an imaging device; a lane determination device installed in a vehicle; and a navigation system using the object recognition device and the lane determination device.

Abstract

The present invention appropriately determines an object type in consideration of the possibility of false recognition of the object type, when determining the object type of a targeted object included in image information taken by an imaging device. The present invention includes: unit for obtaining image information; unit for performing image recognition processing of an object type of a targeted object included in the image information; a false recognition table that is related to a plurality of object types applicable as the target object and, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image, prescribes a relation of the first object type with an object type designated as a second object type different from the first object type that may be falsely recognized; and object type determining unit for determining, based on a recognized object type specified in an image recognition result and the false recognition table, that the object type of the targeted object in the image information is potentially either the recognized object type, or the object type prescribed as related with the recognized object type in the false recognition table.

Description

DESCRIPTION
OBJECT RECOGNITION DEVICE AND OBJECT RECOGNITION METHOD, AND
LANE DETERMINATION DEVICE AND LANE DETERMINATION METHOD USING THEM
TECHNICAL FIELD
[0001] The present invention relates to an object recognition device and an object recognition method capable of determining an object type of a targeted object included in image information that is taken by an imaging device, and also relates to a lane determination device and a lane determination method using them.
BACKGROUND ART
[0002]
For the purpose of appropriate route guidance by a navigation system, a lane determination device has been known in recent years that determines a host vehicle lane in a road on which the host vehicle is traveling based on various information obtained from inside and outside the host vehicle. Such a lane determination device includes that in Japanese Patent Application Publication No. JP-A-2006-162409, for example, where a structure is described that specifies a lane position of the host vehicle and outputs a determination result. The specification is based on information including the following: light beacon information from a vehicle information processing system such as a vehicle information and communication system (VICS); estimated information from a current location management unit; an event such as steering information or turning signal information from a driver input information management unit; a number of recognized lanes from an image recognition device; a host lane position among the number of recognized lanes; a lane internal position (whether the host vehicle is positioned more leftward or rightward within the lane); increases and decreases in the number of lanes; increases and decreases in the number of lane directions; road shoulder information (whether a road shoulder exists and so on); crossing condition (whether the lane or white line is being crossed and so on); and road indicator (paint) information. Regarding specification of the lane position of the host vehicle using the road indicator information, a structure is described in which the lane position of the host vehicle is specified by collation of an image recognition result for road indicators such as a crosswalk or arrows indicating traffic sections by the travel direction of lanes, e.g. straight travel and right and left turns, with information obtained from a database regarding an object type, object position, and the like of the applicable object.
DISCLOSURE OF THE INVENTION
Problem to be Solved by the Invention [0003] As described above, image recognition processing is performed for road indicators such as arrows indicating traffic sections by the travel directions of lanes. The image recognition result is then collated with information obtained from a database regarding an object type and an object position of the applicable road indicator, thus enabling a determination of a host vehicle lane. In order to appropriately perform such a host vehicle lane determination, the object type of road indicators such as arrows present on the host vehicle lane must be accurately recognized by the image recognition processing. However, there are cases when the image recognition result is actually false due to reasons such as partial fading of the road indicator, or a portion of the road indicator not being included in the image information taken. When the host vehicle lane determination is performed in such cases based on the false image recognition result, a lane other than the actual host vehicle lane may be determined as the host vehicle lane.
[0004]
The present invention was devised in light of the foregoing problem, and it is an object of the present invention to provide an object recognition device that, when determining an object type of a targeted object included in image information taken by an imaging device, is capable of appropriately determining the object type in consideration of the possibility of false recognition of the object type.
Means for Solving the Problem [0005]
In order to achieve the above object, a characteristic configuration of an object recognition device according to the present invention includes: image information obtaining unit that obtains image information taken by an imaging device; image recognizing unit that performs image recognition processing of an object type of a targeted object included in the image information; a false recognition table that, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image with regard to a plurality of object types applicable as the target object, prescribes a relation of the first object type with an object type designated as a second object type different from the first object type that may be falsely recognized; and object type determining unit that determines the object type of the targeted object included in the image information, wherein the object type determining unit, based on a recognized object type indicated in an image recognition result from the image recognizing unit and the false recognition table, determines that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table. [0006]
According to this characteristic configuration, determination of the object type of the targeted object included in the image information taken by the imaging device is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like. Thus, the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object included in the image information. As a consequence, the object type can be appropriately determined in consideration of the possibility of false recognition of the object type. [0007]
Here, the object type determining unit preferably has a configuration wherein the object type determining unit determines that if the recognized object type is the first object type, then the object type of the targeted object included in the image information is the recognized object type, and determines that if the recognized object type is the second object type, then the object type of the targeted object included in the image information is potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table. [0008]
As explained above, the relation between the first object type and the second object type prescribed in the false recognition table is one where, if a portion of the form of the targeted object of the first object type cannot be recognized in an image, then the first object type may be falsely recognized as the second object type. Therefore, if the recognized object type specified in the image recognition result is the second object type, then it is difficult to determine if the image recognition result is correct, or if the second object type has been falsely recognized due to a portion of the form of the targeted object of the first object type being faded and unrecognizable in an image. According to this configuration, if it is difficult to determine the correctness of the image recognition result specifying that the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined including the possibility of false recognition of the first object type.
Meanwhile, if the recognized object type is the first object type, then the possibility of false recognition is low and the object type of the targeted object included in the image information is determined as the recognized object type. As a consequence, the object type can be appropriately determined depending on how possible false recognition is of the recognized object type specified in the image recognition result.
[0009]
In addition, the relation between the object types prescribed in the false recognition table is preferably a relation where the form of the object of the first object type has at least two or more characteristic parts, and the form of the object of the second object type resembles the object of the first object type except for a portion of the two or more characteristic parts. [0010]
According to this configuration, in cases where the image recognizing unit is structured to perform image recognition processing of the object type by recognizing the characteristic form of the targeted object in an image, it is possible to appropriately prescribe in the false recognition table the relation between the first and second object types, wherein the first object type may be falsely recognized as the second object type due to a portion of the form of the targeted object of the first object type being unrecognizable in an image. [0011]
The structure of the object recognition device according to the present invention is particularly well suited to the determination of object types related to arrow-shaped road indicators that represent traffic sections by travel directions and for which there is a plurality of object types that are prone to being falsely recognized as one another. Thus, the object type of the arrow-shaped road indicator provided in the lanes of the road on which the host vehicle is traveling can be appropriately determined. Furthermore, the lane in which the host vehicle is traveling, i.e., the host vehicle lane, can also be appropriately determined. [0012] ,
Here, for the object type relations prescribed in the false recognition table, a straight/right-turn arrow and a straight/left-turn arrow designated as the first object type are preferably related with a straight arrow designated as the second object type, and a right/left-turn arrow designated as the first object type is preferably related with a right-turn arrow and a left-turn arrow designated as the second object type. [0013]
The object recognition device is also preferably structured further including: host vehicle position information obtaining unit that obtain host vehicle position information that indicates a current position of the host vehicle; and object information obtaining unit that obtain object information regarding one, two, or more targeted objects present in a traveling direction of the host vehicle, based on the host vehicle position information, wherein the imaging device is installed in the host vehicle, and the object type determining unit determines the object type of the targeted object included in the image information from among the object types of the one, two, or more targeted objects indicated in the object information obtained by the object information obtaining unit.
[0014]
According to this configuration, an object type other than the object types specified in the object information for one, two, or more targeted objects in the traveling direction of the host vehicle is not determined as the object type of the targeted object included in the image information. As a consequence, an object type incapable of existing in the traveling direction of the host vehicle in view of the object information obtained based on the host vehicle position information can be prevented from being determined as the object type of the targeted object included in the image information. Moreover, the object type can be determined with greater accuracy by the object type determining unit. [0015]
A characteristic configuration of a lane determination device according to the present invention includes: the object recognition device having the above structures; host vehicle position information obtaining unit that obtain host vehicle position information that indicates a current position of a host vehicle; object information obtaining unit that obtain object information of targeted objects present in lanes in a traveling direction of the host vehicle based on the host vehicle position information when a road on which the host vehicle is traveling has a plurality of lanes; and lane determining unit that determine a host vehicle lane, which is a lane where the host vehicle is traveling, from among the plurality of lanes, wherein the image recognizing unit of the object recognition device performs image recognition processing of the object type of the targeted object in the host vehicle lane included in the image information, and the lane determining unit determines that the host vehicle lane is one, two, or more lanes for which the obtained object information indicates an object type that matches with the object type indicated in the determination result made by the object type determining unit of the object recognition device. [0016]
According to this characteristic configuration, the host vehicle lane is determined as the lane with a matching object type, based on the determination result made by the object type determining unit of the object recognition device and based on the object information of the targeted objects in the lanes in the traveling direction of the host vehicle as obtained by the object information obtaining unit. As a consequence, the host vehicle lane can be appropriately determined using the determination result made by the object type determining unit of the object recognition device. [0017]
A characteristic configuration of a navigation system according to the present invention includes: the above lane determination device; a map database that stores map information including the object information; an application program that operates in reference to the map information and information regarding the host vehicle lane determined by the lane determination device; and guidance information output unit that operate in accordance with the application program and output guidance information. [0018]
According to this characteristic configuration, based on the host vehicle lane in which the host vehicle is traveling as determined by the lane determination device, it is possible to appropriately perform operations for guide functions such as displaying the host vehicle lane, searching for a route, and route guidance. [0019]
A characteristic configuration of an object recognition method according to the present invention includes the steps of: obtaining image information taken by an imaging device; performing image recognition processing of an object type of a targeted object included in the image information; and with regard to a plurality of object types applicable as the target object, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image, and based on a false recognition table that prescribes a relation of the first object type with an object type that may be falsely recognized designated as a second object type different from the first object type and also based on a recognized object type indicated in an image recognition result from the image recognizing step, determining that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table. [0020]
According to this characteristic configuration, determination of the object type of the targeted object included in the image information taken by the imaging device is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like. Thus, the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object included in the image information. As a consequence, the object type can be appropriately determined in consideration of the possibility of false recognition of the object type. [0021]
Here, the step for determining the object type is preferably configured such that if the recognized object type is the first object type, then the object type of the targeted object included in the image information is determined as the recognized object type, and if the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined as potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table. [0022] According to this configuration, if it is difficult to determine the correctness of the image recognition result specifying that the recognized object type is the second type, then the object type of the targeted object included in the image information is determined including the possibility of false recognition of the first object type. Meanwhile, if the recognized object type is the first object type, then the possibility of false recognition is low and the object type of the targeted object included in the image information is determined as the recognized object type. As a consequence, the object type can be appropriately determined depending on how possible false recognition is of the recognized object type specified in the image recognition result. [0023] A characteristic configuration of a lane determination method according to the present invention includes the steps of: obtaining image information taken by an imaging device installed in a host vehicle; obtaining host vehicle position information indicating a current position of the host vehicle; when a road on which the host vehicle is traveling has a plurality of lanes based on the host vehicle position information, obtaining object information of targeted objects present in the lanes in a traveling direction of the host vehicle; performing image recognition processing of an object type of the targeted object in a host vehicle lane, which is a lane where the host vehicle is traveling; with regard to a plurality of object types applicable as the target object, when a portion of a form of the targeted object of a predetermined first object type cannot be recognized in an image, and based on a false recognition table that prescribes a relation of the first object type with an object type that may be falsely recognized designated as a second object type different from the first object type and also based on a recognized object type indicated in an image recognition result from the image recognizing step, determining that the object type of the targeted object in the host vehicle lane is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table; and determining the host vehicle lane from among the plurality of lanes as one, two, or more lanes for which the obtained object information indicates an object type that matches with the object type indicated in an object type determination result at the object type determining step. [0024] According to this characteristic configuration, determination of the object type of the targeted object in the host vehicle lane is based on the false recognition table that prescribes in advance related object types which may be falsely recognized if a portion of the targeted object cannot be recognized in an image due to fading or the like. 
Thus, the recognized object specified in the image recognition result and object types that may be falsely recognized are included when determining the object type of the targeted object in the host vehicle lane. As a consequence, the object type can be appropriately determined in consideration of the possibility of false recognition of the object type. Moreover, the host vehicle lane can be appropriately determined based on such a determination result of the object type and based on the object information of the targeted objects present in the lanes in the traveling direction of the host vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025]
FIG. 1 is a block diagram showing a general configuration of a navigation system that includes an object recognition device and a lane determination device according to an embodiment of the present invention; FIG. 2 is an explanatory drawing showing an example of the structure of map information and object information stored in a map database;
FIG. 3 is a drawing showing an example of a layout configuration of an imaging device in a host vehicle; FIG. 4 is a drawing showing an example of a false recognition table T according to the embodiment of the present invention;
FIG. 5 shows drawings of examples of relations between object types with the potential for false recognition;
FIG. 6 is an explanatory drawing showing a specific example of host vehicle lane determination processing according to the embodiment of the present invention;
FIG. 7 is an explanatory drawing showing a specific example of host vehicle lane determination processing according to the embodiment of the present invention;
FIG. 8 is a flowchart showing an entire processing sequence of a lane determination method that includes an object type determination method according to the embodiment of the present invention; and
FIG. 9 is a flowchart showing a detailed processing sequence of the object type determination method according to the embodiment of the present invention.
BEST MODES FOR CARRYING OUT THE INVENTION
[0026]
An embodiment of the present invention will be described based on the accompanying drawings. The embodiment is an example wherein an object recognition device 2 according to the present invention is applied to a lane determination device 3 for a vehicle traveling on a road and a navigation system 1 that includes the lane determination device 3. FIG. 1 is a block diagram showing a general configuration of the navigation system 1 that includes the object recognition device 2 and the lane determination device 3 according to the present embodiment. The object recognition device 2 uses a false recognition table T to determine an object type of a targeted object within a host vehicle lane, which is included in image information G taken by an imaging device 21 installed in a host vehicle 30 (see FIG. 3). The lane determination device 3 determines the host vehicle lane from multiple lanes on a road on which the host vehicle 30 is traveling (hereinafter referred to as the "traveled road"), based on a determination result made by an object type determining unit 8 that structures the object recognition device 2, and based on object information F obtained from a map database 22. The navigation system 1 refers to a determination result made by a lane determining unit 9 that structures the lane determination device 3, and performs a predetermined navigation operation. [0027]
Referring to FIG. 1, functional parts of the navigation system 1, i.e., an image information obtaining unit 4, an image recognizing unit 5, a host vehicle position information obtaining unit 6, a data extracting unit 7, the object type determining unit 8, the lane determining unit 9, and a navigation computing unit 10, are structured with a computation processing device such as a CPU acting as a core member. The functional parts for performing various processing with respect to data input are also structured with hardware, software (a program), or both. As a hardware structure, the map database 22 is provided with a device that has a storage medium capable of storing information and driving unit therefor, such as a hard disk drive, a DVD drive equipped with a DVD-ROM, or a CD drive equipped with a CD-ROM, for example. The structures of respective portions of the navigation system 1 according to the present embodiment will be described in detail below.
[0028] 1. Map Database
The map database 22 is a database that stores map information M divided into predetermined areas, and a plurality of object information F associated with the map information M. FIG. 2 is an explanatory drawing showing an example of the structure of the map information M and the object information F stored in the map database 22. As illustrated in the figure, the map database 22 holds a road network layer m1, a road configuration layer m2, and an object layer m3. [0029] The road network layer m1 specifies information about connections among roads. More specifically, the road network layer m1 is structured with information regarding a plurality of nodes n that have positional information on a map expressed by longitude and latitude, and information regarding a plurality of links k that structure roads connecting two nodes n. The links k also have information regarding the road class (classes such as expressway, toll road, national road, and prefectural road), link length, and the like as link information. The road configuration layer m2 is stored associated with the road network layer m1 and specifies the configurations of roads. More specifically, the road configuration layer m2 is structured with information regarding the road width and the like, as well as with information regarding a plurality of road configuration supplemental points s that have positional information on a map expressed by longitude and latitude and are arranged between two nodes n (on the link k). The map information M is structured by information stored in the road network layer m1 and the road configuration layer m2. [0030]
The object layer m3 is structured associated with the road network layer m1 and the road configuration layer m2, and stores information regarding various objects found on and around roads, namely, the object information F. The objects of the object information F stored in the object layer m3 include road indicators provided on the surfaces of roads. Objects pertaining to such road indicators include arrow-shaped road indicators (hereinafter simply referred to as "arrow indicators") that represent traffic sections by the travel direction of lanes. More specifically, a straight arrow, a straight/right-turn arrow, a straight/left-turn arrow, a right-turn arrow, a left-turn arrow, and a right/left-turn arrow are included. As described later, according to the present embodiment, the arrow indicator is an object that can be targeted. In addition to this, other objects pertaining to road indicators include various painted indicators. For example: crosswalks, stop lines, indicators of intersection configurations (such as four-way intersections and three-way intersections), division lines that divide vehicle lanes along roads (such as solid lines, broken lines, and double lines), speed indications, and zebra zones. Note that objects stored in the object information F can include various objects in addition to the above road indicators, such as traffic signals, signs, overpasses, and tunnels. [0031]
The contents of the object information F also include positional information, object type information, form information, and attribute information for each object. Here, the positional information possesses information regarding a position (longitude and latitude) on a map with representative points for each object and regarding an orientation for each object. The representative point of the object is set to a center position in a width direction and a length direction of the object, for example. The object type information represents an object type for each object. Here, one object type is prescribed for objects with the same shape as a general rule. Accordingly, the information regarding the object type represents a specific type of road indicator, such as the straight arrow, the right-turn arrow, the stop line, and the crosswalk. In addition, the form information possesses information such as the shape, size, color, and the like for each object. The attribute information includes lane information that expresses which road lane an object is disposed on, when the road on which the object is provided has multiple lanes. This lane information is represented as "2/3" for example, when the applicable object is provided in the center lane of a road with three lanes going in one direction of traffic. [0032]
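To make the structure of the object information F concrete, here is a hedged Python dataclass; the field names are illustrative only and do not come from the patent:

```python
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """One record of object information F (illustrative field names)."""
    longitude: float        # positional information: representative point
    latitude: float
    orientation_deg: float  # orientation of the object
    object_type: str        # e.g. "straight arrow", "stop line", "crosswalk"
    form: dict              # form information: shape, size, color, ...
    lane: str               # attribute information, e.g. "2/3" = center of 3

# A right-turn arrow painted in the center lane of a three-lane roadway.
arrow = ObjectInfo(135.0, 35.0, 90.0, "right-turn arrow",
                   {"size": "standard", "color": "white"}, lane="2/3")
```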
2. Image Information Obtaining Unit
The image information obtaining unit 4 functions as image information obtaining unit for obtaining the image information G around the host vehicle position taken by the imaging device 21. Here, the imaging device 21 is an onboard camera or the like equipped with an image sensor, and is provided at a position where at least the road surface of a road around the host vehicle 30 can be imaged. Such an imaging device 21 preferably uses a back camera that images the road surface of the rear of the host vehicle 30, for example, as illustrated in FIG. 3. The image information obtaining unit 4 imports analog imaging information taken by the imaging device 21 at a predetermined time interval, which is then converted to obtain the digital signal image information G. In this case, the time interval at which the image information G is imported can be set to approximately 10 to 50 ms, for example. Thus, the image information obtaining unit 4 can continuously obtain the image information G from multiple frames taken by the imaging device 21. The image information G obtained here is output to the image recognizing unit 5. [0033]
3. Host Vehicle Position Information Obtaining Unit The host vehicle position information obtaining unit 6 functions as host vehicle position information obtaining unit for obtaining the host vehicle position information P specifying the current position of the host vehicle 30. In this case, the host vehicle position information obtaining unit 6 is connected with a GPS receiver 23, an orientation sensor 24, and a distance sensor 25. Here, the GPS receiver 23 is a device that receives a GPS signal from a GPS (Global Positioning System) satellite. The GPS signal is normally received every second, and output to the host vehicle position information obtaining unit 6. In the host vehicle position information obtaining unit 6, the signal received by the GPS receiver 23 from the GPS satellite can be analyzed to obtain the current position (latitude and longitude), direction of travel, speed of movement, and the like of the host vehicle 30. The orientation sensor 24 detects the traveling direction of the host vehicle 30 and changes in the direction of travel. The orientation sensor 24 is structured from a gyro sensor, a geomagnetic sensor, an optical rotation sensor or rotation type resistance volume attached to a rotating portion of a steering wheel, and an angular sensor attached to a vehicle wheel portion, for example. Also, the orientation sensor 24 outputs a detection result thereof to the host vehicle position information obtaining unit 6. The distance sensor 25 detects a vehicle speed and a movement distance of the host vehicle 30. The distance sensor 25 is structured from a vehicle speed pulse sensor that outputs a pulse signal every time a vehicle drive shaft, wheel, or the like rotates a certain amount, a yaw/G sensor that detects an acceleration of the host vehicle 30, and a circuit that integrates the detected acceleration, for example. Also, the distance sensor 25 outputs information regarding the vehicle speed and the movement distance as a detection result thereof to the host vehicle position information obtaining unit 6. [0034] Based on the output from the GPS receiver 23, the orientation sensor 24, and the distance sensor 25, the host vehicle position information obtaining unit 6 performs a computation according to a known method to specify the host vehicle position. In addition, the host vehicle position information obtaining unit 6 obtains the map information M around the host vehicle position, which is extracted from the map database 22 by the data extracting unit 7. By performing known map matching based thereon, the host vehicle position information obtaining unit 6 also corrects the host vehicle position to match the road specified in the map information M. In this manner, the host vehicle position information obtaining unit 6 obtains the host vehicle position information P, which includes information regarding the current position of the host vehicle 30 expressed in latitude and longitude, and information regarding the traveling direction of the host vehicle 30. However, the host vehicle position information P thus obtained is not information capable of specifying the lane in which the host vehicle 30 is traveling, i.e., the host vehicle lane, when the road the host vehicle 30 is traveling on has multiple lanes. Hence, the navigation system 1 according to the present embodiment is structured such that determination of the host vehicle lane is made in the lane determining unit 9 described later.
Furthermore, the host vehicle position information P obtained by the host vehicle position information obtaining unit 6 is output to the data extracting unit 7, the lane determining unit 9, and the navigation computing unit 10. [0035] 4. Data Extracting Unit
The data extracting unit 7 extracts the required map information M and the object information F from the map database 22, based on the host vehicle position information P obtained by the host vehicle position information obtaining unit 6 and so on. According to the present embodiment, the data extracting unit 7 extracts and obtains the object information F for one or more targeted objects present in the traveling direction of the host vehicle 30, based on the host vehicle position information P. More specifically, when the traveled road has multiple lanes, the data extracting unit 7 extracts and obtains the object information F for the targeted objects present in the lanes on the traveled road in the traveling direction of the host vehicle 30. Also, the obtained object information F is output to the image recognizing unit 5 and the lane determining unit 9. Here, the object, that is, the targeted object, is subject to image recognition processing by the image recognizing unit 5, and is also an object of an object type subject to object type determination processing by the object type determining unit 8. In the present embodiment, various arrow indicators to be described later are applicable as such objects. Thus according to the present embodiment, the data extracting unit 7 functions as object information obtaining unit in the present invention. Additionally, the data extracting unit 7 extracts the map information M around the host vehicle position to be used in the map matching performed by the host vehicle position information obtaining unit 6, and outputs the map information M to the host vehicle position information obtaining unit 6. For use in navigation processing performed by the navigation computing unit 10, the data extracting unit 7 further extracts the map information M from the map database 22 for an area requested by the navigation computing unit 10, and outputs such map information M to the navigation computing unit 10. [0036] 5. Image Recognizing Unit
The image recognizing unit 5 functions as image recognizing unit for performing image recognition processing of a targeted object included in the image information G obtained by the image information obtaining unit 4. According to the present embodiment, the image recognizing unit 5 uses the object information F of the targeted object extracted by the data extracting unit 7 to perform image recognition processing for the object type of the targeted object in the lane in which the host vehicle 30 is traveling, i.e., the host vehicle lane. The object information F of the targeted object to be used here is a plurality of object information F pertaining to targeted objects present in the lanes in the traveling direction of the host vehicle 30 when the traveled road has multiple lanes. More specifically, the image recognizing unit 5 performs binarization processing, edge detection processing, and the like with respect to the obtained image information G, and extracts outline information for the object (road indicator) included in the image information G. The image recognizing unit 5 subsequently extracts outline information that matches any of the forms respectively specified in the form information for a plurality of targeted objects, which are included in the object information F for the plurality of targeted objects extracted by the data extracting unit 7. When such outline information is extracted, the object type specified in the object information F pertaining to the form information that matches with the outline information is recognized as the object type of the targeted object in the host vehicle lane included in the image information G. In this manner, the object type of the targeted object in the host vehicle lane recognized by the image recognizing unit 5 becomes a recognized object type specified in an image recognition result. Note that in the present embodiment, as described above, image recognition processing is performed using the object information F of the targeted object extracted from the map database 22 by the data extracting unit 7, based on the host vehicle position information P. Accordingly, the image recognition result is prevented from becoming an object type that is inapplicable to the targeted object in the host vehicle lane.
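The patent specifies this processing only at the level of binarization, edge detection, and outline matching against form information. Purely as a rough illustration of such a pipeline, here is a Python/OpenCV sketch under the assumption that the form information of each candidate targeted object is available as a template contour (`templates` maps an object type to a contour); the threshold and matching method are arbitrary choices, not the patent's:

```python
import cv2

def recognize_object_type(image_g, templates, max_distance=0.3):
    """Binarize image information G, extract outlines, and match them
    against the template contours of the candidate targeted objects."""
    gray = cv2.cvtColor(image_g, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4 return signature: (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best_type, best_score = None, max_distance
    for contour in contours:
        for object_type, template in templates.items():
            score = cv2.matchShapes(contour, template,
                                    cv2.CONTOURS_MATCH_I1, 0.0)
            if score < best_score:   # lower score = closer outline match
                best_type, best_score = object_type, score
    return best_type                 # recognized object type fa, or None
```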
[0037] 6. False Recognition Table
FIG. 4 is a drawing showing an example of the false recognition table T according to the present embodiment. As the figure shows, the false recognition table T pertains to a plurality of object types that are applicable as targeted objects. If a portion of the form of the targeted object of a predetermined first object type fc1 cannot be recognized in an image, then there is a possibility that a second object type fc2 different from the first object type fc1 may be falsely recognized instead. This table prescribes such relations between object types. Schematically speaking, the false recognition table T classifies the plurality of object types applicable as targeted objects into the first object type fc1 or the second object type fc2, and prescribes the first object type fc1 as having the potential to be falsely recognized as the second object type fc2. The examples shown in the figure are eight types of arrow indicators, which are object types applicable as targeted objects, and consist of a straight arrow, a straight/right-turn arrow, a straight/left-turn arrow, a right-turn arrow, a left-turn arrow, a right/left-turn arrow, a right-turn type 2 arrow, and a left-turn type 2 arrow. Among these eight types of arrow indicators, the five types of arrow indicators consisting of the straight/right-turn arrow, the straight/left-turn arrow, the right/left-turn arrow, the right-turn type 2 arrow, and the left-turn type 2 arrow are classified as first object types fc1, while the three types of arrow indicators consisting of the straight arrow, the right-turn arrow, and the left-turn arrow are classified as second object types fc2. Furthermore, a two-dimensional matrix is used in the false recognition table T to prescribe the relations among the five first object types fc1 arranged in the horizontal direction on the top side of the figure and the three second object types fc2 arranged in the vertical direction on the left side of the figure. The relation indicated by a dot in the table between the first object type fc1 and the second object type fc2 is defined as one where the first object type fc1 may be falsely recognized as the corresponding second object type fc2.
[0038] The relations prescribed by the false recognition table T for object types that may be falsely recognized are object type relations where, if a portion of the form of the targeted object of the first object type fc1 cannot be recognized in an image, there is a possibility that the second object type fc2 different from the first object type fc1 may be falsely recognized instead. According to the present embodiment, such object type relations are ones where the form of the object of the first object type fc1 has at least two or more characteristic parts, and the form of the object of the second object type fc2 resembles the object of the first object type fc1 except for a portion of the two or more characteristic parts. FIGS. 5A to 5E pertain to the eight types of arrow indicators that are object types applicable as targeted objects, and are drawings showing examples of relations between object types with the potential for false recognition. Square frames C1 to C4 in the figures indicate characteristic parts of the arrow indicators. Here, the characteristic part is a part that is relatively easily recognizable in image recognition processing performed by the image recognizing unit 5. The characteristic parts correspond to triangular portions indicating the arrow direction or the like, which are enclosed by the frames C1 to C3 in the figures. [0039]
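The characteristic-part relation just described can be pictured in code: represent each arrow indicator by its set of characteristic parts and relate types whose part sets nest. The part sets and the subset rule below are illustrative assumptions, and the rule over-generates slightly relative to FIG. 4 (for example, it would also relate the straight/right-turn arrow to the right-turn arrow), since FIG. 4 additionally weighs which parts are prone to fading:

```python
# Illustrative part sets from FIG. 5 (C1: straight arrow part, C2: right-turn
# part, C3: left-turn part, C4: straight stem of the type 2 arrows).
PARTS = {
    "straight": {"C1"}, "right-turn": {"C2"}, "left-turn": {"C3"},
    "straight/right-turn": {"C1", "C2"}, "straight/left-turn": {"C1", "C3"},
    "right/left-turn": {"C2", "C3"},
    "right-turn type 2": {"C2", "C4"}, "left-turn type 2": {"C3", "C4"},
}

def candidate_confusions(parts):
    """Relate fc1 -> fc2 where fc2's parts are a proper subset of fc1's,
    i.e. fc2 looks like fc1 with some characteristic parts missing."""
    return {fc1: {fc2 for fc2, p2 in parts.items() if p2 < p1}
            for fc1, p1 in parts.items()
            if any(p2 < p1 for p2 in parts.values())}

print(candidate_confusions(PARTS)["right/left-turn"])
# -> {'right-turn', 'left-turn'}  (matches FIGS. 5C and 5D)
```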
The relation shown in FIG. 5A is one between the straight/right-turn arrow, which is classified as a first object type fc1, and the straight arrow, which is classified as a second object type fc2. In this case, the straight/right-turn arrow has two characteristic parts, the straight arrow part C1 and the right-turn arrow part C2. Meanwhile, the straight arrow only has the straight arrow part C1 as a characteristic part. Thus in the relation shown in FIG. 5A, if the right-turn arrow part C2, which is one characteristic part of the straight/right-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the straight arrow part C1 is recognized in the image. As a consequence, the straight/right-turn arrow may be falsely recognized as a straight arrow classified as the second object type fc2. In addition, on an actual road, the right-turn arrow part C2 of the straight/right-turn arrow is arranged on a right side with respect to a center of the lane in the width direction. Therefore, the right-turn arrow part C2 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the right-turn arrow part C2 will be unrecognizable in an image. Accordingly, there is a possibility that the straight/right-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the straight arrow, which is classified as the second object type fc2. [0040]
The relation shown in FIG. 5B is one between the straight/left-turn arrow, which is classified as a first object type fc1, and the straight arrow, which is classified as a second object type fc2. In this case, the straight/left-turn arrow has two characteristic parts, the straight arrow part C1 and the left-turn arrow part C3. Meanwhile, the straight arrow only has the straight arrow part C1 as a characteristic part. Thus in the relation shown in FIG. 5B, if the left-turn arrow part C3, which is one characteristic part of the straight/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the straight arrow part C1 is recognized in the image. As a consequence, the straight/left-turn arrow may be falsely recognized as a straight arrow classified as the second object type fc2. In addition, on an actual road, the left-turn arrow part C3 of the straight/left-turn arrow is arranged on a left side with respect to a center of the lane in the width direction. Therefore, the left-turn arrow part C3 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the left-turn arrow part C3 will be unrecognizable in an image. Accordingly, there is a possibility that the straight/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the straight arrow, which is classified as the second object type fc2. [0041]
The relation shown in FIG. 5C is one between the right/left-turn arrow, which is classified as a first object type fc1, and the right-turn arrow, which is classified as a second object type fc2. In this case, the right/left-turn arrow has two characteristic parts, the right-turn arrow part C2 and the left-turn arrow part C3. Meanwhile, the right-turn arrow only has the right-turn arrow part C2 as a characteristic part. Thus in the relation shown in FIG. 5C, if the left-turn arrow part C3, which is one characteristic part of the right/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the right-turn arrow part C2 is recognized in the image. As a consequence, the right/left-turn arrow may be falsely recognized as a right-turn arrow classified as the second object type fc2. In addition, on an actual road, the left-turn arrow part C3 of the right/left-turn arrow is arranged on a left side with respect to a center of the lane in the width direction. Therefore, the left-turn arrow part C3 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the left-turn arrow part C3 will be unrecognizable in an image. Accordingly, there is a possibility that the right/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the right-turn arrow, which is classified as the second object type fc2. [0042]
The relation shown in FIG. 5D is one between the right/left-turn arrow, which is classified as a first object type fc1, and the left-turn arrow, which is classified as a second object type fc2. In this case, the right/left-turn arrow has two characteristic parts, the right-turn arrow part C2 and the left-turn arrow part C3. Meanwhile, the left-turn arrow only has the left-turn arrow part C3 as a characteristic part. Thus in the relation shown in FIG. 5D, if the right-turn arrow part C2, which is one characteristic part of the right/left-turn arrow classified as the first object type fc1, cannot be recognized in an image, then only the left-turn arrow part C3 is recognized in the image. As a consequence, the right/left-turn arrow may be falsely recognized as a left-turn arrow classified as the second object type fc2. In addition, on an actual road, the right-turn arrow part C2 of the right/left-turn arrow is arranged on a right side with respect to a center of the lane in the width direction. Therefore, the right-turn arrow part C2 is repeatedly run over by vehicle wheels and can fade easily as a result. For this reason, there is a relatively high possibility that the right-turn arrow part C2 will be unrecognizable in an image. Accordingly, there is a possibility that the right/left-turn arrow, which is classified as the first object type fc1, will be falsely recognized as the left-turn arrow, which is classified as the second object type fc2. [0043]
The relation shown in FIG. 5E is one between the right-turn type 2 arrow, which is classified as a first object type fc1, and the right-turn arrow, which is classified as a second object type fc2. In this case, the right-turn type 2 arrow has two characteristic parts, the right-turn arrow part C2 and a straight part C4. Meanwhile, the right-turn arrow only has the right-turn arrow part C2 as a characteristic part. Thus in the relation shown in FIG. 5E, if the straight part C4, which is one characteristic part of the right-turn type 2 arrow classified as the first object type fc1, cannot be recognized in an image, then only the right-turn arrow part C2 is recognized in the image. As a consequence, the right-turn type 2 arrow may be falsely recognized as a right-turn arrow classified as the second object type fc2. Furthermore, compared with the triangular parts C1 to C3 indicating the arrow direction, the straight part C4 of the right-turn type 2 arrow is less easily recognizable in image recognition processing. For this reason, there is a relatively high possibility that the straight part C4 will be unrecognizable in an image. Accordingly, there is a possibility that the right-turn type 2 arrow, which is classified as the first object type fc1, will be falsely recognized as the right-turn arrow, which is classified as the second object type fc2. [0044]
The relation shown in FIG. 5F is one between the left-turn type 2 arrow, which is classified as a first object type fc1, and the left-turn arrow, which is classified as a second object type fc2. In this case, the left-turn type 2 arrow has two characteristic parts, the left-turn arrow part C3 and the straight part C4. Meanwhile, the left-turn arrow only has the left-turn arrow part C3 as a characteristic part. Thus in the relation shown in FIG. 5F, if the straight part C4, which is one characteristic part of the left-turn type 2 arrow classified as the first object type fc1, cannot be recognized in an image, then only the left-turn arrow part C3 is recognized in the image. As a consequence, the left-turn type 2 arrow may be falsely recognized as a left-turn arrow classified as the second object type fc2. Furthermore, compared with the triangular parts C1 to C3 indicating the arrow direction, the straight part C4 of the left-turn type 2 arrow is less easily recognizable in image recognition processing. For this reason, there is a relatively high possibility that the straight part C4 will be unrecognizable in an image. Accordingly, there is a possibility that the left-turn type 2 arrow, which is classified as the first object type fc1, will be falsely recognized as the left-turn arrow, which is classified as the second object type fc2. [0045] Note that, as described later, in cases where the recognized object type specified in the image recognition result is the first object type fc1 based on the false recognition table T, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type. Accordingly, object types other than the above eight types of arrow indicators that have no risk of being falsely recognized as another object type, such as the second object type fc2, are preferably also prescribed in the false recognition table T as first object types fc1. In this way, for object types with no possibility of false recognition as another object type, the recognized object type specified in the image recognition result is determined directly as the object type of the targeted object in the host vehicle lane.
[0046] 7. Object Type Determining Unit
The object type determining unit 8 functions as object type determining unit for determining the object type of the targeted object included in the image information G. According to the present embodiment, the object type determining unit 8 determines the object type of the targeted object in the host vehicle lane included in the image information G based on the false recognition table T and the recognized object type specified in the image recognition result from the image recognizing unit 5. At this time, when the false recognition table T prescribes another object type as related with the recognized object type, i.e., when there is a risk of false recognition, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type itself specified in the image recognition result, or the other object type prescribed as related with the recognized object type in the false recognition table T. On the other hand, when the false recognition table T prescribes no such related object type, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type itself specified in the image recognition result. [0047] More specifically, in cases where the recognized object type specified in the image recognition result is the first object type fc1 based on the false recognition table T, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type. On the other hand, in cases where the recognized object type specified in the image recognition result is the second object type fc2, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type, or the first object type fc1 prescribed as related with the recognized object type in the false recognition table T. In other words, if the recognized object type specified in the image recognition result is the first object type fc1, which has many characteristic parts and no possibility of false recognition as another object type, then the object type of the targeted object in the host vehicle lane is determined as the recognized object type. However, if the recognized object type specified in the image recognition result is the second object type fc2, which has only a few characteristic parts and may be the result of falsely recognizing the other first object type fc1, then the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type itself, or the first object type fc1 that may have been falsely recognized as the recognized object type. For example, if the recognized object type specified in the image recognition result is the straight/right-turn arrow, i.e., the first object type fc1, then the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the straight/right-turn arrow, i.e., the recognized object type itself.
However, for example, if the recognized object type specified in the image recognition result is the straight arrow, i.e., the second object type fc2, then the object type determining unit 8 refers to the false recognition table T, namely the first row from the top in FIG. 4, and finds that the straight/right-turn arrow and the straight/left-turn arrow classified as first object types fc1 are prescribed as related with the recognized object type. Accordingly, in cases where the recognized object type is the straight arrow, the object type determining unit 8 determines that there is a possibility of the object type of the targeted object in the host vehicle lane being the straight arrow, i.e., the recognized object type itself, or the straight/right-turn arrow or the straight/left-turn arrow, i.e., the first object types fc1 prescribed as related with the recognized object type in the false recognition table T.
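Continuing the hypothetical sketch above, the decision rule of paragraphs [0046] and [0047] reduces to a small lookup; the function name is again illustrative, not the patent's.

```python
def determine_object_type_candidates(recognized: str) -> list[str]:
    """Sketch of the object type determining unit 8: a first object type fc1
    is taken as-is, while a second object type fc2 also yields every first
    object type fc1 related with it in the false recognition table T."""
    if is_first_object_type(recognized):
        return [recognized]
    return [recognized] + FALSE_RECOGNITION_TABLE[recognized]

# The example from the text: a recognized straight arrow yields three candidates.
print(determine_object_type_candidates("straight"))
# ['straight', 'straight/right-turn', 'straight/left-turn']
```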
[0048] 8. Lane Determining Unit
The lane determining unit 9 functions as lane determining unit for determining the lane in which the host vehicle is traveling, i.e., the host vehicle lane, from multiple lanes in cases where the road on which the host vehicle 30 is traveling has multiple lanes. According to the present embodiment, based on the object type specified in the determination result made by the object type determining unit 8 and the object information F of the targeted object extracted by the data extracting unit 7, the lane determining unit 9 determines the host vehicle lane as the one, two, or more lanes for which the obtained object information F specifies an object type matching the object type specified in the determination result made by the object type determining unit 8. It should be noted that the lane determining unit 9 performs processing to determine the host vehicle lane only when such determination is required, namely, only when the traveled road has multiple lanes in the traveling direction (on one side) based on the host vehicle position information P. Also, the lane determining unit 9 outputs host vehicle lane information S as a determination result to the navigation computing unit 10. Thus, the navigation computing unit 10 can perform operations for guide functions such as route guidance and route searching with reference to the host vehicle lane information S. [0049] FIGS. 6 and 7 are explanatory drawings for describing specific examples of the host vehicle lane determination processing performed by the lane determining unit 9. In these figures, the object type in the rectangular box on the left side is the recognized object type fa specified in the image recognition result made by the image recognizing unit 5 regarding the targeted object in the host vehicle lane. Also, in these figures, the object types Ft1 to Ft4 of the targeted objects in the lanes of the traveled road specified in the object information F, which was obtained by the data extracting unit 7, are shown to the right of the recognized object type fa. In addition, the framed information at the bottom of the figures that specifies the host vehicle lane corresponds to the host vehicle lane information S.
[0050] In the example shown in FIG. 6, the object information F obtained by the data extracting unit 7 regarding the object types of the targeted objects in the lanes of the traveled road, which has four lanes, specifies the following: the object type Ft1 of a targeted object in a first lane L1 is a straight/left-turn arrow, the object type Ft2 of a targeted object in a second lane L2 is a straight arrow, the object type Ft3 of a targeted object in a third lane L3 is a straight arrow, and the object type Ft4 of a targeted object in a fourth lane L4 is a right-turn arrow. Furthermore, a straight arrow is specified as the recognized object type fa in the image recognition result of the targeted object in the host vehicle lane according to the image recognizing unit 5. The straight arrow is a second object type fc2 as prescribed in the false recognition table T. In this case, as explained above, based on the false recognition table T, the object type determining unit 8 determines that there are three possibilities for the object type of the targeted object in the host vehicle lane: the straight arrow, the straight/right-turn arrow, and the straight/left-turn arrow. Accordingly, from the object types Ft1 to Ft4 of the targeted objects in the lanes of the traveled road, which are specified in the object information F obtained by the data extracting unit 7, the lane determining unit 9 extracts those that match the three object types of straight arrow, straight/right-turn arrow, and straight/left-turn arrow. Then, the one, two, or more lanes for which the obtained object information F specifies object types matching these three object types are determined as the host vehicle lanes. In the present example, a straight/right-turn arrow does not exist among the object types Ft1 to Ft4 of the targeted objects in the lanes. Thus, the host vehicle lanes are determined as the first lane L1, for which the object type Ft1 is a straight/left-turn arrow, as well as the second lane L2 and the third lane L3, for which the object types Ft2, Ft3 are straight arrows. Also, the lane determining unit 9 generates information specifying that the first to third lanes L1 to L3 are the host vehicle lanes as the host vehicle lane information S.
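The lane matching described here can be sketched as a simple filter over the per-lane object types, again reusing the hypothetical functions above; the FIG. 6 example is reproduced as a usage check.

```python
def determine_host_vehicle_lanes(candidates: list[str],
                                 lane_types: list[str]) -> list[int]:
    """Sketch of the lane determining unit 9: return the indices of every lane
    whose object information F specifies an object type matching a candidate."""
    return [i for i, t in enumerate(lane_types) if t in candidates]

# FIG. 6 example: lanes L1 to L4 carry these targeted object types.
lanes = ["straight/left-turn", "straight", "straight", "right-turn"]
candidates = determine_object_type_candidates("straight")
print(determine_host_vehicle_lanes(candidates, lanes))  # [0, 1, 2], i.e. L1 to L3
```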
[0051] In the example shown in FIG. 7, the object information F obtained by the data extracting unit 7 is the same as that used for the example shown in FIG. 6. A straight/left-turn arrow is specified as the recognized object type fa in the image recognition result of the targeted object in the host vehicle lane according to the image recognizing unit 5. The straight/left-turn arrow is a first object type fc1 as prescribed in the false recognition table T. In this case, as explained above, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type fa itself, namely, the straight/left-turn arrow. Then, the lane determining unit 9 determines the host vehicle lane as the first lane L1, i.e., the lane for which the obtained object information F specifies an object type matching the straight/left-turn arrow, the object type specified in the determination result made by the object type determining unit 8. Also, the lane determining unit 9 generates information specifying that the first lane L1 is the host vehicle lane as the host vehicle lane information S. [0052]
9. Navigation Computing Unit
The navigation computing unit 10 is computation processing unit for operating in accordance with an application program AP in order to mainly execute guide functions as the navigation system 1. Here, the application program AP operates with reference to information including the host vehicle position information P obtained by the host vehicle position information obtaining unit 6, the map information M extracted by the data extracting unit 7, and the host vehicle lane information S generated by the lane determining unit 9. The navigation computing unit 10, in accordance with the application program AP, obtains map information M around the host vehicle 30 from the map database 22 via the data extracting unit 7 to display a map image on a display input device 26, and also performs processing to display a host vehicle position mark superimposed over the map image, based on the host vehicle position information P. Additionally, the navigation computing unit 10, in accordance with the application program AP, performs a route search from a place of departure to a destination, and based on the host vehicle position information P and the route thus searched, further performs route guidance using either or both the display input device 26 and an audio output device 27. In this case, the application program AP refers to the host vehicle lane information S determined by the lane determining unit 9, and performs navigation operations such as displaying the host vehicle position, searching for a route, and route guidance. More specifically, for example, the application program AP performs operations such as displaying the determined host vehicle lane on the display input device 26, or canceling route guidance that requires an impossible lane change depending on the determined host vehicle lane. Moreover, according to the present embodiment, the navigation computing unit 10 is connected with the display input device 26 and the audio output device 27. The display input device 26 integrates a display device such as a liquid crystal display device with an input device such as a touch panel. The audio output device 27 is structured with a speaker and the like. In the present embodiment, the navigation computing unit 10, the display input device 26, and the audio output device 27 function as guidance information output unit 28 of the present invention. [0053]
10. Lane Determination Method
Descriptions are given below of an object type determination method and a lane determination method executed in the navigation system 1 that includes the object recognition device 2 and the lane determination device 3 according to the present embodiment. FIG. 8 is a flowchart showing the entire processing sequence of the lane determination method that includes the object type determination method according to the present embodiment. FIG. 9 is a flowchart showing the detailed processing sequence of the object type determination method according to the present embodiment. [0054] In the navigation system 1, as FIG. 8 shows, to determine the host vehicle lane, first the host vehicle position information P is obtained by the host vehicle position information obtaining unit 6 (step #01). Next, based on the host vehicle position information P at step #01, it is determined whether the traveled road has multiple lanes by using the data extracting unit 7 to refer to either or both the map information M and the object information F in the vicinity of the host vehicle position, which are stored in the map database 22 (step #02). If the traveled road does not have multiple lanes (No at step #02), that is, if the traveled road has only one lane going in each direction, then the host vehicle lane determination is unnecessary and the processing is ended. [0055] However, if the traveled road has multiple lanes (Yes at step #02), then the data extracting unit 7 extracts and obtains from the map database 22 the object information F for the targeted objects present in the lanes of the traveled road in the traveling direction of the host vehicle 30 (step #03). Also, the image information obtaining unit 4 obtains the image information G taken by the imaging device 21 installed in the host vehicle 30 (step #04). The image recognizing unit 5 then performs image recognition processing for the object type of the targeted object in the host vehicle lane, that is, the lane in which the host vehicle 30 is traveling (step #05). The object type determining unit 8 subsequently performs processing to determine the object type of the targeted object in the host vehicle lane included in the image information G (step #06). The object type determination processing for the targeted object in the host vehicle lane will be described in detail later based on the flowchart in FIG. 9. Next, the lane determining unit 9 determines the lane in which the host vehicle is traveling, i.e., the host vehicle lane, from the multiple lanes of the road on which the host vehicle 30 is traveling, and generates the host vehicle lane information S (step #07). At this time, as explained above, based on the object type specified in the determination result made by the object type determining unit 8 and the object information F of the targeted object extracted by the data extracting unit 7, the lane determining unit 9 determines the host vehicle lane as the one, two, or more lanes for which the obtained object information F specifies an object type matching the object type specified in the determination result made by the object type determining unit 8. With this, the entire processing of the lane determination method is ended.
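As an orientation aid only, the step sequence #01 to #07 of FIG. 8 can be condensed into the following sketch; the `nav` object and all of its methods are purely hypothetical stand-ins for the units described in this document.

```python
def lane_determination(nav):
    """Hypothetical end-to-end sketch of FIG. 8 (steps #01 to #07)."""
    position = nav.obtain_host_vehicle_position()                # step #01
    if not nav.traveled_road_has_multiple_lanes(position):       # step #02
        return None                           # determination unnecessary
    lane_types = nav.extract_object_information(position)        # step #03
    image = nav.obtain_image_information()                       # step #04
    recognized = nav.recognize_object_type(image)                # step #05
    candidates = determine_object_type_candidates(recognized)    # step #06
    return determine_host_vehicle_lanes(candidates, lane_types)  # step #07
```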
[0056] Next, the details of the object type determination method at step #06 will be described. As FIG. 9 shows, first, the object type determining unit 8 obtains information regarding the recognized object type specified in the image recognition result from the image recognition processing performed at step #05 (step #11). The object type determining unit 8 then refers to the false recognition table T (step #12). Next, the object type determining unit 8 determines whether the recognized object type obtained at step #11 is the first object type fc1 (step #13). If the recognized object type obtained at step #11 is the first object type fc1 (Yes at step #13), then the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is the recognized object type (step #14). However, if the recognized object type obtained at step #11 is not the first object type fc1 (No at step #13), then the recognized object type can be judged as the second object type fc2. Accordingly, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the recognized object type, or the first object type fc1 prescribed as related with the recognized object type in the false recognition table T (step #15). Note that since the specific method employed by the object type determining unit 8 for determining the object type of the targeted object in the host vehicle lane has already been described in detail using FIGS. 6 and 7, such specifics are omitted here. With this, the processing of the object type determination method at step #06 is ended.
[0057] 11. Other Embodiments
(1) In the above embodiment, the relation between the first object type fc1 and the second object type fc2 as prescribed in the false recognition table T shown in FIG. 4 is only an example, and many other object type relations are naturally possible. In such cases as well, the object type relations prescribed in the false recognition table T are preferably ones where the form of the object of the first object type fc1 has at least two or more characteristic parts, and the form of the object of the second object type fc2 resembles the object of the first object type except for some of the two or more characteristic parts. However, the object type relations prescribed in the false recognition table T are not limited to the above, provided that each prescribed relation is one where one object type is potentially falsely recognized as another object type in the image recognition result. [0058]
(2) In the above embodiment, the object type determining unit 8 was described as having a structure where the object type of the targeted object in the host vehicle lane included in the image information G is determined based on the false recognition table T and the recognized object type specified in the image recognition result. However, the object type determining unit 8 is not limited to such a structure. Thus, another preferred embodiment of the present invention is one in which the object type determining unit 8 further utilizes the object information F obtained by the data extracting unit 7 based on the host vehicle position information P to determine the object type of the targeted object included in the image information G from among the object types of the one, two, or more targeted objects specified in the object information F. In the case of such a structure, even if a first object type fc1 prescribed in the false recognition table T is related with the recognized object type specified in the image recognition result, if that object type is not included in the object types specified in the object information F obtained by the data extracting unit 7, then the object type determining unit 8 does not determine that there is a possibility of that object type being the object type of the targeted object in the host vehicle lane. For example, if the recognized object type specified in the image recognition result is the straight arrow, i.e., the second object type fc2, then as described in the above embodiment, based on the false recognition table T, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially any one of the straight arrow, the straight/right-turn arrow, and the straight/left-turn arrow. However, as shown in FIG. 6 for example, if the object types Ft1 to Ft4 of the targeted objects in the lanes of the traveled road, which are specified in the object information F obtained by the data extracting unit 7, are a straight/left-turn arrow, straight arrows, and a right-turn arrow, then the object type determining unit 8 determines the object type of the targeted object in the host vehicle lane from among these object types. Accordingly, in this case, the object type determining unit 8 determines that the object type of the targeted object in the host vehicle lane is potentially either the straight arrow or the straight/left-turn arrow. Thus, it is possible to prevent the object type determining unit 8 from determining an object type that is inapplicable to the targeted object in the host vehicle lane.
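The narrowing by object information F described in (2) amounts to intersecting the candidate set with the object types actually present on the traveled road; a hypothetical sketch, reusing the names introduced above:

```python
def determine_with_object_information(recognized: str,
                                      lane_types: list[str]) -> list[str]:
    """Sketch of variant (2): keep a candidate object type only if it appears
    among the object types specified in the object information F."""
    return [t for t in determine_object_type_candidates(recognized)
            if t in lane_types]

# With the FIG. 6 lanes, a recognized straight arrow keeps only two candidates,
# since no straight/right-turn arrow exists on the traveled road.
lanes = ["straight/left-turn", "straight", "straight", "right-turn"]
print(determine_with_object_information("straight", lanes))
# ['straight', 'straight/left-turn']
```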
[0059] (3) In the above embodiment, an example was described in which arrow-shaped road indicators (arrow indicators) that represent traffic sections indicating the travel directions of lanes are the objects applicable as targeted objects. However, the objects applicable as targeted objects are not limited to these, and objects of various object types that are provided on the road surface, such as other road indicators, may also be targeted objects. Also, in cases where the object recognition device 2 is applied to the lane determination device 3, objects arranged in the lanes of a road having multiple lanes are particularly preferred as targeted objects. [0060] (4) Another preferred embodiment of the present invention employs the determination of the host vehicle lane according to the above embodiment in combination with another lane determination method. For example, another lane determination method may be based on the object information F of division lines around the host vehicle position obtained from the map database 22, together with the positional relationship between the host vehicle and the division lines and the types of the division lines (line types including a solid line, a broken line, and double lines) around the host vehicle as specified in the image recognition result. Such a combined approach may also use a method for determining the host vehicle lane based on information from VICS (Vehicle Information and Communication System), namely, information from a light beacon or the like generated by a transmitter provided in each lane of the road. [0061]
(5) The above embodiments describe examples in which the entire structure of the navigation system 1, including the object recognition device 2 and the lane determination device 3, is installed in the host vehicle. However, the scope of the present invention is not limited to such a structure. Namely, another preferred embodiment of the present invention is, for example, one in which a portion of the structure excluding the imaging device 21 is disposed outside of the host vehicle and connected via a communications network such as the Internet, and the lane determination device 3 and the navigation system 1 are structured through the sending and receiving of signals and information via the network. [0062]
(6) In the above embodiment, an example was described in which the object recognition device 2 according to the present invention was applied to the lane determination device 3. However, the object recognition device 2 may naturally also be used for purposes other than determination of the host vehicle lane, such as for correcting the host vehicle position or the like using the image recognition result of the object type and the object information F obtained by the data extracting unit 7. [0063]
(7) In the above embodiments, examples were described in which the lane determination device 3 is utilized by the navigation system 1. However, the scope of the present invention is not limited to this, and the lane determination device 3 may naturally be utilized for other purposes, such as a running control device of the vehicle.
INDUSTRIAL APPLICABILITY
[0064]
The present invention can be preferably utilized as: an object recognition device that determines an object type of a targeted object included in image information taken by an imaging device, a lane determination device installed in a vehicle, and a navigation system using the object recognition device and the lane determination device.

Claims

1. An object recognition device comprising: image information obtaining unit that obtains image information taken by an imaging device; image recognizing unit that performs image recognition processing of an object type of a targeted object included in the image information; a false recognition table that, with regard to a plurality of object types applicable as the targeted object, prescribes a relation of a predetermined first object type with an object type that may be falsely recognized when a portion of a form of the targeted object of the first object type cannot be recognized in an image, the latter object type being designated as a second object type different from the first object type; and object type determining unit that determines the object type of the targeted object included in the image information, wherein the object type determining unit, based on a recognized object type specified in an image recognition result from the image recognizing unit and the false recognition table, determines that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table.
2. The object recognition device according to claim 1, wherein the object type determining unit determines that if the recognized object type is the first object type, then the object type of the targeted object included in the image information is the recognized object type, and determines that if the recognized object type is the second object type, then the object type of the targeted object included in the image information is potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table.
3. The object recognition device according to claim 1 or 2, wherein the relation between the object types prescribed in the false recognition table is a relation where the form of the object of the first object type has at least two or more characteristic parts, and the form of the object of the second object type resembles the object of the first object type except for a portion of the two or more characteristic parts.
4. The object recognition device according to any one of claims 1 to 3, wherein the imaging device is installed in a vehicle and provided so as to image at least a surface of a road, and objects applicable as the targeted object are provided on the surface of the road and are arrow-shaped road indicators that represent traffic sections by travel directions of lanes.
5. The object recognition device according to claim 4, wherein regarding various object types including the arrow-shaped road indicator, the false recognition table prescribes a straight/right-turn arrow and a straight/left-turn arrow designated as the first object type as related with a straight arrow designated as the second object type.
6. The object recognition device according to claim 4 or 5, wherein regarding relations for various object types including the arrow-shaped road indicator, the false recognition table prescribes a right/left-turn arrow designated as the first object type as related with a right-turn arrow and a left-turn arrow designated as the second object type.
7. The object recognition device according to any one of claims 1 to 6, further comprising: host vehicle position information obtaining unit that obtains host vehicle position information specifying a current position of the host vehicle; and object information obtaining unit that obtains object information regarding one, two, or more targeted objects present in a traveling direction of the host vehicle, based on the host vehicle position information, wherein the imaging device is installed in the host vehicle, and the object type determining unit determines the object type of the targeted object included in the image information from among the object types of the one, two, or more targeted objects specified in the object information obtained by the object information obtaining unit.
8. A lane determination device comprising: the object recognition device according to any one of claims 1 to 6; host vehicle position information obtaining unit that obtains host vehicle position information specifying a current position of a host vehicle; object information obtaining unit that obtains object information of targeted objects present in lanes in a traveling direction of the host vehicle based on the host vehicle position information when a road on which the host vehicle is traveling has a plurality of lanes; and lane determining unit that determines a host vehicle lane, which is a lane where the host vehicle is traveling, from among the plurality of lanes, wherein the image recognizing unit of the object recognition device performs image recognition processing of the object type of the targeted object in the host vehicle lane included in the image information, and the lane determining unit determines that the host vehicle lane is one, two, or more lanes for which the obtained object information specifies an object type that matches the object type specified in the determination result made by the object type determining unit of the object recognition device.
9. A navigation system comprising: the lane determination device according to claim 8; a map database that stores map information including the object information; an application program that operates in reference to the map information and information regarding the host vehicle lane determined by the lane determination device; and guidance information output unit that operates in accordance with the application program and outputs guidance information.
10. An object recognition method comprising the steps of: obtaining image information taken by an imaging device; performing image recognition processing of an object type of a targeted object included in the image information; and determining the object type of the targeted object in the image information based on a recognized object type specified in an image recognition result from the image recognizing step and on a false recognition table that, with regard to a plurality of object types applicable as the targeted object, prescribes a relation of a predetermined first object type with an object type that may be falsely recognized when a portion of a form of the targeted object of the first object type cannot be recognized in an image, the latter object type being designated as a second object type different from the first object type, the determination being that the object type of the targeted object in the image information is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table.
11. The object recognition method according to claim 10, wherein, at the object type determining step, if the recognized object type is the first object type, then the object type of the targeted object included in the image information is determined as the recognized object type, and if the recognized object type is the second object type, then the object type of the targeted object included in the image information is determined as potentially one of the recognized object type and the first object type prescribed as related with the recognized object type in the false recognition table.
12. A lane determination method comprising the steps of: obtaining image information taken by an imaging device installed in a host vehicle; obtaining host vehicle position information that specifies a current position of the host vehicle; when a road on which the host vehicle is traveling has a plurality of lanes based on the host vehicle position information, obtaining object information of targeted objects present in the lanes in a traveling direction of the host vehicle; performing image recognition processing of an object type of the targeted object in a host vehicle lane, which is a lane where the host vehicle is traveling; determining the object type of the targeted object in the host vehicle lane based on a recognized object type specified in an image recognition result from the image recognizing step and on a false recognition table that, with regard to a plurality of object types applicable as the targeted object, prescribes a relation of a predetermined first object type with an object type that may be falsely recognized when a portion of a form of the targeted object of the first object type cannot be recognized in an image, the latter object type being designated as a second object type different from the first object type, the determination being that the object type of the targeted object in the host vehicle lane is potentially one of the recognized object type and the object type prescribed as related with the recognized object type in the false recognition table; and determining the host vehicle lane from among the plurality of lanes as one, two, or more lanes for which the obtained object information specifies an object type that matches the object type specified in an object type determination result at the object type determining step.
PCT/JP2008/060411 2007-05-31 2008-05-30 Object recognition device and object recognition method, and lane determination device and lane determination method using them WO2008146951A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007-145747 2007-05-31
JP2007145747A JP4497424B2 (en) 2007-05-31 2007-05-31 Feature recognition device, feature recognition method, lane determination device and lane determination method using the same

Publications (1)

Publication Number Publication Date
WO2008146951A1 true WO2008146951A1 (en) 2008-12-04

Family

ID=39679359

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/060411 WO2008146951A1 (en) 2007-05-31 2008-05-30 Object recognition device and object recognition method, and lane determination device and lane determination method using them

Country Status (2)

Country Link
JP (1) JP4497424B2 (en)
WO (1) WO2008146951A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2583050C1 (en) * 2015-04-08 2016-05-10 Акционерное общество "НИИ измерительных приборов-Новосибирский завод имени Коминтерна" /АО "НПО НИИИП-НЗиК"/ Method of identifying false path formed by synchronous repeater jamming
KR20190056775A (en) * 2017-11-17 2019-05-27 현대자동차주식회사 Apparatus and method for recognizing object of vehicle

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023145740A1 (en) * 2022-01-26 2023-08-03 株式会社デンソー Map information system, in-vehicle device, and management server

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004111574A1 (en) * 2003-06-18 2004-12-23 Siemens Aktiengesellschaft Navigation system with traffic lane instructions

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4321821B2 (en) * 2005-01-28 2009-08-26 アイシン・エィ・ダブリュ株式会社 Image recognition apparatus and image recognition method
JP4596566B2 (en) * 2005-03-31 2010-12-08 アイシン・エィ・ダブリュ株式会社 Self-vehicle information recognition device and self-vehicle information recognition method
JP4820712B2 (en) * 2005-08-05 2011-11-24 アイシン・エィ・ダブリュ株式会社 Road marking recognition system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004111574A1 (en) * 2003-06-18 2004-12-23 Siemens Aktiengesellschaft Navigation system with traffic lane instructions

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
FRANKE U ET AL: "Steps towards an intelligent vision system for driver assistance in urban traffic", INTELLIGENT TRANSPORTATION SYSTEM, 1997. ITSC '97., IEEE CONFERENCE ON BOSTON, MA, USA 9-12 NOV. 1997, NEW YORK, NY, USA,IEEE, US, 9 November 1997 (1997-11-09), pages 601 - 606, XP010270892, ISBN: 978-0-7803-4269-9 *
PRIESE L ET AL: "New Results on Traffic Sign Recognition", 19941024; 19941024 - 19941026, 24 October 1994 (1994-10-24), pages 249 - 254, XP010258342 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2583050C1 (en) * 2015-04-08 2016-05-10 Акционерное общество "НИИ измерительных приборов-Новосибирский завод имени Коминтерна" /АО "НПО НИИИП-НЗиК"/ Method of identifying false path formed by synchronous repeater jamming
KR20190056775A (en) * 2017-11-17 2019-05-27 현대자동차주식회사 Apparatus and method for recognizing object of vehicle
KR102371617B1 (en) 2017-11-17 2022-03-07 현대자동차주식회사 Apparatus and method for recognizing object of vehicle

Also Published As

Publication number Publication date
JP2008298622A (en) 2008-12-11
JP4497424B2 (en) 2010-07-07

Similar Documents

Publication Publication Date Title
US8346473B2 (en) Lane determining device, lane determining method and navigation apparatus using the same
JP4446204B2 (en) Vehicle navigation apparatus and vehicle navigation program
EP1959236B1 (en) Lane determining device and lane determining method
EP2113746B1 (en) Feature information collecting device and feature information collecting program, and vehicle position recognizing device and navigation device
US7747385B2 (en) Traveling condition determination device
US7948397B2 (en) Image recognition apparatuses, methods and programs
EP2009400B1 (en) Vehicle position recognition device, navigation device, vehicle position recognition method
US7463974B2 (en) Systems, methods, and programs for determining whether a vehicle is on-road or off-road
US20070021912A1 (en) Current position information management systems, methods, and programs
JP4577827B2 (en) Next road prediction device for traveling vehicle
US20110172913A1 (en) Road learning apparatus
JP4953012B2 (en) Image recognition device, program for image recognition device, navigation device using the same, and program for navigation device
JP4738360B2 (en) Lane determination device, lane determination method, and navigation device using the same
JP2008196968A (en) Lane determining device, lane determining method, and navigation device using it
JP4875509B2 (en) Navigation device and navigation method
JP4936070B2 (en) Navigation device and navigation program
WO2008146951A1 (en) Object recognition device and object recognition method, and lane determination device and lane determination method using them
JP3786047B2 (en) Car navigation system
JP5013214B2 (en) Lane determination device, lane determination program, and navigation device using the same
JP5071737B2 (en) Lane determination device, lane determination program, and navigation device using the same
JP2004317390A (en) Vehicle navigation device and its program, and recording medium
JP4943246B2 (en) Lane determination device, lane determination program, and navigation device using the same
JP2005321360A (en) Vehicle-mounted navigation device
US20230194295A1 (en) Method, apparatus and computer program product for lane level toll plaza congestion modeling
JP2008224676A (en) Navigator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08765224

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08765224

Country of ref document: EP

Kind code of ref document: A1