US8594377B2 - Image recognition apparatus - Google Patents
- Publication number
- US8594377B2
- Authority
- US
- United States
- Prior art keywords
- image
- vehicle
- optical axis
- camera
- displacement
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the invention relates to technologies for recognizing an object based on an image.
- an image recognition apparatus for recognizing an object based on an outside-vehicle image, which is obtained by a camera installed in a vehicle and shows a periphery of the vehicle. This image recognition apparatus is considered to be used in a variety of applications in a vehicle.
- the image recognition apparatus for example, recognizes an object based on an outside-vehicle image and in a case where there is an object having a possibility of collision, the image recognition apparatus warns a user of the possibility of collision. Thereby, the user can drive in consideration of the object.
- the image recognition apparatus recognizes an object based on an outside-vehicle image and transmits a signal that indicates existence of the recognized object to a vehicle controller for controlling a behavior of the vehicle. Thereby, the vehicle controller can control the behavior of the vehicle in consideration of the object.
- an optical axis of a camera needs to point in a predetermined direction relative to the vehicle.
- the optical axis of the camera which is installed so as to point in the predetermined direction, may be displaced with age.
- an image recognition apparatus has been proposed that informs a user in a case where the optical axis of the camera is changed largely from the predetermined direction. Thereby, being so informed, the user can adjust the optical axis of the camera back to the predetermined direction.
- an image recognition apparatus that recognizes an object based on an image, includes: a recognition unit that recognizes the object based on a target area in an image obtained by a camera installed in a vehicle; an identifying unit that identifies an optical axis position of the camera relative to the vehicle based on the image; and a changing unit that changes a position of the target area in the image according to the optical axis position identified by the identifying unit.
- the position of the target area in the image is changed according to the optical axis position of the camera, and therefore, the object can be recognized properly based on the target area even though the optical axis position of the camera is displaced.
- the image recognition apparatus further includes: a derivation unit that derives an amount of displacement of the optical axis position from an initial position based on the image; and an informing unit that informs a user that the amount of displacement is more than a threshold value in a case where the amount of displacement is more than a threshold value.
- in a case where the amount of displacement of the optical axis position of the camera from the initial position is more than the threshold value, a user is informed of the displacement. Thereby, the user can recognize that the optical axis position of the camera is displaced largely.
- the image recognition apparatus further includes: a derivation unit that derives an amount of displacement of the optical axis position from an initial position based on the image; and an adjustment unit that adjusts the optical axis position in a case where the amount of displacement is more than a threshold value.
- in this case, the optical axis position is adjusted. Thereby, a user can skip the step of adjusting the optical axis position of the camera even though the optical axis position of the camera is displaced largely.
- an object of the invention is to recognize an object properly based on a target area even though an optical axis position of a camera is displaced.
- FIG. 1 shows an appearance of an image recognition apparatus
- FIG. 2 shows condition in which a camera is installed in a vehicle
- FIG. 3 shows a direction in which an optical axis of a camera is pointed
- FIG. 4 mainly shows a configuration of a body part of a first embodiment
- FIG. 5 shows an exemplary outside-vehicle image
- FIG. 6 is a diagram to explain a method to derive a vanishing point position
- FIG. 7 shows a change of a vanishing point position
- FIG. 8 shows a flow of a collision-warning process
- FIG. 9 shows a flow of an optical-axis-related process in a first embodiment
- FIG. 10 shows a configuration of a camera in a second embodiment
- FIG. 11 mainly shows a configuration of a body part in a second embodiment
- FIG. 12 shows a flow of an optical-axis-related process in a second embodiment
- FIG. 13 shows an exemplary outside-vehicle image
- FIG. 14 shows an exemplary outside-vehicle image
- FIG. 15 shows an exemplary outside-vehicle image
- FIG. 16 shows an exemplary outside-vehicle image.
- FIG. 1 shows an appearance of an image recognition apparatus 2 in the first embodiment.
- This image recognition apparatus 2 is installed in a vehicle (hereinafter, referred to as an “own vehicle”) and mainly includes a body part 20 and a camera 3 .
- the body part 20 and the camera 3 are electrically connected through a cable 19 , and they can transmit/receive a signal mutually.
- the camera 3 captures a surrounding area of an own vehicle and obtains an outside-vehicle image that shows surroundings of the own vehicle.
- the body part 20 recognizes another vehicle running around the own vehicle based on a target area in an outside-vehicle image obtained by the camera 3 .
- in a case where the body part 20 recognizes another vehicle and there is a high possibility of collision between the own vehicle and the other vehicle, the body part 20 warns a user of that possibility. Furthermore, the body part 20 identifies an optical axis position of the camera 3 and changes a position of the target area in an outside-vehicle image according to the optical axis position.
- the image recognition apparatus 2 having such functions is explained in detail below.
- the camera 3 includes an optical lens 10 , a chassis 11 and a holding part 12 .
- the optical lens 10 is an optical device that focuses light.
- An angle of view of the optical lens 10 is, for example, 45 degrees.
- the chassis 11 is a chassis that stores parts such as the optical lens 10 and an imaging device (illustration is omitted).
- the holding part 12 is a holding member that holds the chassis 11 .
- the chassis 11 is fixed to the holding part 12 with a screw 14 . The angle of the chassis 11 relative to the holding part 12 can be adjusted by loosening the screw 14 .
- the holding part 12 includes a mounting surface 13 that makes plane contact with an inner surface of a windshield of the own vehicle.
- FIG. 2 shows condition in which the camera 3 is installed in an own vehicle 1 .
- the camera 3 is installed in a cabin of the own vehicle 1 with the mounting surface 13 of the holding part 12 making plane contact with an inner surface of a windshield 15 .
- the mounting surface 13 has a double-sided tape and the mounting surface 13 is fixed to the windshield 15 with this double-sided tape. Thereby, the holding part 12 is adhered near a rearview mirror 16 installed on the inner surface of the windshield 15 .
- an optical axis of the camera 3 (an optical axis of the optical lens 10 ) is pointed in a predetermined direction with reference to the own vehicle 1 .
- an optical axis of the camera 3 is pointed to the front of the own vehicle 1 . Therefore, the camera 3 captures a forward area of the own vehicle 1 and obtains an outside-vehicle image showing the forward area of the own vehicle 1 .
- the chassis 11 may move relative to the holding part 12 due to loosening of the screw 14 caused by aging and the like. In this way, if the chassis 11 moves relative to the holding part 12 , an optical axis position of the camera 3 relative to the own vehicle 1 is displaced from a predetermined direction. Normally, the optical axis of the camera 3 is displaced downward relative to the own vehicle 1 with age.
- FIG. 4 mainly shows the configuration of the body part 20 .
- the body part 20 includes a rectangular solid chassis and this chassis stores a variety of electronic parts.
- the body part 20 mainly includes an image processing part 21 , a controller 22 , a signal receiver 30 and a signal output part 31 .
- the image processing part 21 is, for example, a hardware circuit such as an ASIC and has functions to perform a variety of image processing. As a part of the functions, the image processing part 21 includes an area setting part 23 and an image recognition part 24 .
- the area setting part 23 sets a target area that is an area intended for a process of recognizing an object, in an outside-vehicle image obtained by the camera 3 .
- the image recognition part 24 recognizes an object based on the target area in the outside-vehicle image. In this embodiment, the image recognition part 24 recognizes a preceding vehicle that is another vehicle running in front of the own vehicle 1 , as an object.
- the controller 22 includes a microcomputer and controls the image recognition apparatus 2 overall.
- the microcomputer includes, for example, a CPU, ROM, RAM and the like.
- a variety of functions of the controller 22 are performed by the CPU performing arithmetic processing in accordance with a program stored in the ROM.
- the controller 22 includes a warning controller 25 , a position identifying part 26 , an area changing part 27 , a displacement-amount deriving part 28 and an informing controller 29 as a part of functions performed by arithmetic processing.
- the warning controller 25 takes control to warn a user to avoid a collision between the own vehicle 1 and a preceding vehicle in a case where the image recognition part 24 recognizes the preceding vehicle.
- the position identifying part 26 identifies an optical axis position of the camera 3 relative to the own vehicle 1 based on an outside-vehicle image.
- the area changing part 27 changes a position of the target area in the outside-vehicle image according to the optical axis position of the camera 3 .
- the displacement-amount deriving part 28 derives an amount of displacement of the optical axis position of the camera 3 from an initial position.
- the initial position is the optical axis position of the camera 3 at the time the camera 3 is first installed in the own vehicle 1 at a factory or a sales company of the own vehicle 1 .
- the optical axis position of the camera 3 which is pointed in a predetermined direction relative to the own vehicle 1 so that the image recognition apparatus 2 fulfills recognition functions properly, is the initial position.
- the informing controller 29 takes control to inform a user that the amount of displacement of the optical axis position of the camera 3 from the initial position is more than the predetermined threshold value.
- the signal receiver 30 is connected with the camera 3 electrically and is an interface to receive a signal from the camera 3 .
- the signal receiver 30 receives an outside-vehicle image obtained by the camera 3 and inputs the outside-vehicle image to the image processing part 21 and the controller 22 .
- the signal output part 31 is connected with a display 4 and a speaker 5 electrically and is an interface to transmit a signal to the display 4 and the speaker 5 .
- the signal output part 31 outputs the outside-vehicle image and the like processed by the image processing part 21 to the display 4 so that the display 4 displays the outside-vehicle image and the like.
- the signal output part 31 transmits a signal to the speaker 5 so that the speaker 5 outputs sound.
- the area setting part 23 sets a target area in an outside-vehicle image.
- the image processing part 21 obtains the outside-vehicle image from the camera 3 in a predetermined cycle.
- the area setting part 23 sets the target area on respective outside-vehicle images obtained in such a predetermined cycle.
- the area setting part 23 sets a part of an area of the outside-vehicle image as the target area.
- the image recognition part 24 recognizes a preceding vehicle based on such a target area set in the outside-vehicle image.
- the image recognition part 24 recognizes the preceding vehicle using a pattern matching method. Concretely, the image recognition part 24 extracts a subject image included in the target area, and then recognizes the subject image as an image of the preceding vehicle in a case where a shape of the subject image approximates a standard pattern indicating a vehicle.
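The pattern-matching step above can be sketched as a toy example; the 5×5 binary silhouette, the standard pattern, and the 0.9 acceptance threshold are all illustrative assumptions, not values from the patent:

```python
# Toy sketch of the pattern-matching idea: a candidate silhouette extracted
# from the target area is compared against a standard vehicle pattern, and it
# is accepted when the fraction of matching cells exceeds a threshold.

STANDARD_PATTERN = [
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 1, 0],
]

def match_score(candidate, pattern=STANDARD_PATTERN):
    """Fraction of cells on which the candidate agrees with the pattern."""
    total = sum(len(row) for row in pattern)
    hits = sum(c == p
               for crow, prow in zip(candidate, pattern)
               for c, p in zip(crow, prow))
    return hits / total

def is_preceding_vehicle(candidate, threshold=0.9):
    """Accept the subject image as a preceding vehicle if it approximates
    the standard pattern closely enough."""
    return match_score(candidate, STANDARD_PATTERN) >= threshold

print(is_preceding_vehicle(STANDARD_PATTERN))  # True
```

A production implementation would work on grayscale pixel data and a bank of vehicle templates, but the accept-if-shape-approximates-pattern logic is the same.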
- if the image recognition part 24 recognized a preceding vehicle by covering the whole area of the outside-vehicle image, the processing load for the recognition would be extremely high. Therefore, the image recognition part 24 recognizes the preceding vehicle by covering only a target area that is a part of the outside-vehicle image, so that the processing load is reduced.
- the target area is set at a position where there is a high possibility that an image of the preceding vehicle appears in the outside-vehicle image.
- the target area is set so that the target area includes the vanishing point corresponding to a theoretical infinite distance in the outside-vehicle image.
- the vanishing point is a point where images of a plurality of lines which are parallel to one another (e.g. a lane mark, etc.) are extended and intersect in the outside-vehicle image on which a subject image is represented in perspective.
- FIG. 5 shows an example of an outside-vehicle image G that is obtained by the camera 3 and shows a forward view of the own vehicle 1 .
- a target area D is set so that the target area D includes a vanishing point V in the outside-vehicle image G.
- a preceding vehicle is recognized by covering only the target area D set in the outside-vehicle image G, and therefore a processing load for the recognition of the preceding vehicle can be reduced.
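As a minimal sketch of this placement, a target area can be positioned so that it contains the vanishing point; the image size, area size, and function name are illustrative assumptions:

```python
# Hypothetical sketch: setting a target area D centered on the vanishing
# point V, clamped so the area stays inside the image.

def set_target_area(vanishing_point, area_width=200, area_height=120,
                    image_width=640, image_height=480):
    """Return (left, top, right, bottom) of a target area that contains
    the vanishing point."""
    vx, vy = vanishing_point
    left = max(0, min(image_width - area_width, vx - area_width // 2))
    top = max(0, min(image_height - area_height, vy - area_height // 2))
    return (left, top, left + area_width, top + area_height)

# With the vanishing point at the image center, the area is centered on it.
print(set_target_area((320, 240)))  # (220, 180, 420, 300)
```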
- next, functions of the controller 22 are explained: the warning controller 25 , the position identifying part 26 , the area changing part 27 , the displacement-amount deriving part 28 and the informing controller 29 .
- the warning controller 25 examines a possibility of collision between the own vehicle 1 and a preceding vehicle in a case where the preceding vehicle is recognized by the image recognition part 24 .
- the warning controller 25 examines the possibility of collision between the own vehicle 1 and the preceding vehicle based on a relative distance between the own vehicle 1 and the preceding vehicle, and relative acceleration between the own vehicle 1 and the preceding vehicle.
- the warning controller 25 derives the relative distance based on a size of an image of the preceding vehicle in an outside-vehicle image.
- the warning controller 25 derives the relative acceleration based on a change in the size of the image of the same preceding vehicle respectively included in a plurality of consecutive outside-vehicle images. Then the warning controller 25 examines the possibility of collision between the own vehicle 1 and the preceding vehicle with reference to a table stored in ROM of the controller 22 based on the derived relative distance and relative acceleration. In this table, a criterion value indicating the possibility of collision (possibility of contact) is stored, corresponding to the relative distance and relative acceleration.
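The derivation above can be sketched as follows; the pinhole-model distance estimate, focal length, vehicle width, table bins, and criterion values are illustrative assumptions — the patent only states that a criterion value is read from a table indexed by relative distance and relative acceleration:

```python
# Hedged sketch of the collision-criterion lookup.

FOCAL_LENGTH_PX = 800     # assumed camera focal length in pixels
VEHICLE_WIDTH_M = 1.8     # assumed real width of a preceding vehicle

def relative_distance(image_width_px):
    """Pinhole-model distance estimate from the on-image vehicle width:
    a larger image of the vehicle means a shorter relative distance."""
    return FOCAL_LENGTH_PX * VEHICLE_WIDTH_M / image_width_px

# criterion value per (distance band, closing-acceleration band); higher = riskier
CRITERION_TABLE = {
    ("near", "fast"): 0.9, ("near", "slow"): 0.6,
    ("far", "fast"): 0.5, ("far", "slow"): 0.1,
}

def criterion(distance_m, rel_accel):
    """Look up the criterion value indicating the possibility of collision."""
    band_d = "near" if distance_m < 20.0 else "far"
    band_a = "fast" if rel_accel > 1.0 else "slow"
    return CRITERION_TABLE[(band_d, band_a)]

d = relative_distance(144)             # 800 * 1.8 / 144 = 10.0 m
print(d, criterion(d, rel_accel=2.0))  # 10.0 0.9
```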
- the warning controller 25 warns a user to avoid the collision in a case where the criterion value indicating the possibility of collision is higher than a predetermined threshold value. Concretely, the warning controller 25 transmits a signal to the speaker 5 so that the speaker 5 outputs warning sound inside a cabin of the own vehicle 1 . The warning controller 25 changes volume level of warning sound output by the speaker 5 according to the criterion value indicating the possibility of collision.
- the position identifying part 26 identifies an optical axis position of the camera 3 relative to the own vehicle 1 based on an outside-vehicle image. In a case where the optical axis position of the camera 3 relative to the own vehicle 1 is changed, a vanishing point position in the outside-vehicle image is also changed. Therefore, the position identifying part 26 substantively identifies the optical axis position of the camera 3 relative to the own vehicle 1 by deriving the vanishing point position in the outside-vehicle image.
- to derive the vanishing point position in the outside-vehicle image, first, an image of an object commonly included in a plurality of consecutive outside-vehicle images is identified. Then the position of the image of that object in each of the plurality of consecutive images is determined.
- an object used at this time is, for example, a telephone pole or the like existing along a road on which the vehicle is running.
- the position identifying part 26 derives a plurality of lines that are actually parallel to one another in the real world, from the positions of the images of such objects. Then the position identifying part 26 derives the position of the point where these plural lines intersect, as the vanishing point position.
- the outside-vehicle image G shown in FIG. 5 includes telephone pole images X and Y along a road on which the vehicle is running.
- the position identifying part 26 derives a plurality of lines XL and YL, which are actually parallel to one another in the real world, based on the positions of the telephone pole images X and Y in each of a plurality of consecutive outside-vehicle images G.
- in these consecutive outside-vehicle images, the telephone pole image X exists in positions X 1 , X 2 , X 3 , X 4 and X 5 respectively.
- the line XL is derived by connecting the approximate middles of these telephone pole images X.
- likewise, the telephone pole image Y exists in positions Y 1 , Y 2 , Y 3 , Y 4 and Y 5 respectively.
- the line YL is derived by connecting the approximate middles of these telephone pole images Y. Then, a position of a point where the line XL and YL intersect by extending those lines is the vanishing point position V.
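The line-fit-and-intersect derivation can be sketched with a least-squares fit; the sample pole positions below are illustrative:

```python
# Sketch: fit a line through each tracked object's image positions across
# consecutive frames, then intersect the two lines to get the vanishing point.

def fit_line(points):
    """Least-squares fit y = a*x + b through (x, y) points."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def intersect(line1, line2):
    """Intersection of y = a1*x + b1 and y = a2*x + b2."""
    (a1, b1), (a2, b2) = line1, line2
    x = (b2 - b1) / (a1 - a2)
    return x, a1 * x + b1

# Assumed positions X1..X5 and Y1..Y5 of the two pole images over five frames
xl = fit_line([(100, 400), (150, 350), (200, 300), (250, 250), (300, 200)])
yl = fit_line([(540, 400), (490, 350), (440, 300), (390, 250), (340, 200)])
print(intersect(xl, yl))  # (320.0, 180.0)
```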
- the position identifying part 26 derives the amount of change of the vanishing point position from the previous position.
- if the optical axis of the camera 3 is displaced, the vanishing point position in the outside-vehicle image is also displaced.
- the vanishing point V exists in approximately the middle in a vertical direction of an outside-vehicle image G 2 .
- the vanishing point V exists above the middle in a vertical direction of an outside-vehicle image G 3 obtained after time has passed, and the vanishing point V exists further above in an outside-vehicle image G 4 obtained after further time has passed.
- the position identifying part 26 derives the amount of change of the vanishing point position from the previous position. For example, in a case where the outside-vehicle image G 3 shown in FIG. 7 is obtained, if the previous position is the same as the position of the vanishing point V in the outside-vehicle image G 2 , the amount of change of the position of the vanishing point V in the outside-vehicle image G 3 relative to the position of the vanishing point V in the outside-vehicle image G 2 is derived. Also, in a case where the outside-vehicle image G 4 shown in FIG. 7 is obtained, the amount of change of the position of the vanishing point V in the outside-vehicle image G 4 relative to the position of the vanishing point V in the outside-vehicle image G 3 is derived.
- the area changing part 27 changes a position of a target area in an outside-vehicle image according to an optical axis position of the camera 3 .
- the area changing part 27 changes the position of the target area in the outside-vehicle image according to the vanishing point position in the outside-vehicle image.
- the area changing part 27 changes the position of the target area to be set by the area setting part 23 , based on the amount of change of the vanishing point position derived by the position identifying part 26 . For example, if the position of the vanishing point V in the outside-vehicle image G 3 is changed by 25 pixels to an upward direction (plus side on Y-axis) relative to the position (the previous position) of the vanishing point V in the outside-vehicle image G 2 shown in FIG. 7 , the area changing part 27 changes the position of the target area D in the outside-vehicle image by 25 pixels to the upward direction (plus side on Y-axis).
- similarly, if the vanishing point position is changed by 30 pixels to an upward direction from the previous position, the area changing part 27 changes the position of the target area D in the outside-vehicle image by 30 pixels to the upward direction (plus side on Y-axis).
- if the optical axis position of the camera 3 is changed to a downward direction relative to the own vehicle 1 , the vanishing point position in the outside-vehicle image is changed to an upward direction in this way. In this case, the area changing part 27 changes the position of the target area in the outside-vehicle image to the opposite direction, an upward direction.
- conversely, if the optical axis position of the camera 3 is changed to an upward direction, the area changing part 27 changes the position of the target area in the outside-vehicle image to a downward direction. Likewise, if the optical axis position is changed to the left, the area changing part 27 changes the position of the target area in the outside-vehicle image to the opposite direction, the right; if it is changed to the right, the area changing part 27 changes the position of the target area to the opposite direction, the left.
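The area shift itself is a simple rectangle translation. A sketch, assuming a (left, top, right, bottom) rectangle and the patent's Y-axis convention in which upward is the plus side:

```python
# Sketch: shifting the target area by the same offset as the vanishing-point
# change, so the area keeps containing the vanishing point.

def shift_target_area(area, delta):
    """Move a (left, top, right, bottom) area by (dx, dy) pixels."""
    left, top, right, bottom = area
    dx, dy = delta
    return (left + dx, top + dy, right + dx, bottom + dy)

area = (220, 180, 420, 300)
# vanishing point moved 25 pixels upward (plus side on Y-axis)
print(shift_target_area(area, (0, 25)))  # (220, 205, 420, 325)
```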
- the displacement-amount deriving part 28 derives an amount of displacement of the optical axis position of the camera 3 from an initial position. Concretely, the displacement-amount deriving part 28 derives the amount of displacement of the vanishing point position from the vanishing point position (hereinafter, referred to as “reference position”) in the outside-vehicle image in a case where the optical axis position of the camera 3 is in the initial position. In this way, the displacement-amount deriving part 28 substantively derives the amount of displacement of the optical axis position of the camera 3 from the initial position by deriving the amount of displacement of the vanishing point position from the reference position.
- the displacement-amount deriving part 28 derives the amount of displacement of the vanishing point position by accumulating the amount of change of the vanishing point position derived in the past. For example, if the area changing part 27 has changed the position of the target area twice and the amount of change of the vanishing point position at that time was 25 pixels and 30 pixels respectively, 55 pixels by accumulation of 25 pixels and 30 pixels is equal to the amount of displacement of the vanishing point position from the reference position.
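The accumulation can be sketched as a small tracker; the 50-pixel threshold is an illustrative assumption, and the 25- and 30-pixel changes follow the example above:

```python
# Sketch of displacement accumulation: each vanishing-point change is
# accumulated, and the total is compared against a threshold.

class DisplacementTracker:
    def __init__(self, threshold_px=50):
        self.total = 0
        self.threshold = threshold_px

    def record_change(self, change_px):
        """Accumulate one vanishing-point change."""
        self.total += change_px
        return self.total

    def needs_adjustment(self):
        """True when the accumulated displacement exceeds the threshold."""
        return self.total > self.threshold

tracker = DisplacementTracker()
tracker.record_change(25)
tracker.record_change(30)
print(tracker.total, tracker.needs_adjustment())  # 55 True
```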
- alternatively, the displacement-amount deriving part 28 may directly derive the amount of displacement of the vanishing point position from the reference position, based on the reference position and the vanishing point position derived by the position identifying part 26 .
- the informing controller 29 takes control to inform a user that the amount of displacement of the optical axis position of the camera 3 from the initial position is more than the predetermined threshold value.
- the informing controller 29 judges whether the amount of displacement of the vanishing point from the reference position, derived by the displacement-amount deriving part 28 , is more than a predetermined threshold value or not. In the case where the amount of displacement of the vanishing point is more than the threshold value, the informing controller 29 judges that the amount of displacement of the optical axis position of the camera 3 is more than the predetermined threshold value. In this case, the informing controller 29 outputs a predetermined signal to the display 4 and the speaker 5 so that the display 4 and the speaker 5 inform a user that the optical axis position of the camera 3 needs to be adjusted.
- a threshold value is set to a value equivalent to the case where the optical axis of the camera 3 is moved to the degree that a preceding vehicle cannot be recognized.
- the area changing part 27 changes the position of the target area according to the displacement of the optical axis position of the camera 3 .
- the amount of displacement of the vanishing point equivalent to the case where one end of the target area contacts with an edge of the outside-vehicle image is set as the threshold value. It is acceptable to inform a user that the optical axis position of the camera 3 should be adjusted in the case where one end of the target area contacts with an edge of the outside-vehicle image.
- Such a process makes it possible to inform a user that the optical axis of the camera 3 is displaced to the degree that a preceding vehicle cannot be recognized. Also a user can understand that an adjustment of the optical axis of the camera 3 is required.
- FIG. 8 shows a flowchart of a collision-warning process that is a main process to be performed by the image recognition apparatus 2 .
- a preceding vehicle is recognized and a warning is given to a user in a case where there is a high possibility of collision between the own vehicle 1 and the preceding vehicle.
- the collision-warning process shown in FIG. 8 is repeated in a predetermined cycle (e.g. 1/30 seconds) after the image recognition apparatus 2 is powered on and a predetermined operation is begun by a user.
- FIG. 9 shows a flowchart of an optical-axis-related process that is one of the processes performed by the image recognition apparatus 2 .
- a response is made for the case where an optical axis of the camera 3 is displaced from an initial position.
- This optical-axis-related process is repeated in a relatively-long period cycle (e.g. one month).
- the camera 3 captures a forward view of the own vehicle 1 and obtains an outside-vehicle image showing a forward view of the own vehicle 1 .
- This outside-vehicle image is input to the image processing part 21 through the signal receiver 30 (a step SA 1 ).
- the area setting part 23 sets a target area in an outside-vehicle image (a step SA 2 ).
- a position of the target area to be set is stored in RAM and the like in advance.
- the image recognition part 24 performs a recognition process by covering the target area in the outside-vehicle image. Thereby, the image recognition part 24 recognizes a preceding vehicle in a case where the preceding vehicle exists in front of the own vehicle 1 (a step SA 3 ).
- the warning controller 25 derives a relative distance between the own vehicle 1 and the preceding vehicle and relative acceleration between the own vehicle 1 and the preceding vehicle based on the image of the recognized preceding vehicle. Then, the warning controller 25 derives a criterion value indicating a possibility of collision between the own vehicle 1 and the preceding vehicle based on the derived relative distance and the relative acceleration (a step SA 4 ).
- the warning controller 25 judges whether the possibility of collision between the own vehicle 1 and the preceding vehicle is high or not. In other words, the warning controller 25 judges whether the criterion value indicating the possibility of collision is more than the predetermined threshold value or not (a step SA 5 ).
- the warning controller 25 transmits a signal to the speaker 5 so that the speaker 5 outputs warning sound inside a cabin of the own vehicle 1 . Thereby, a warning is given to a user to avoid the collision (a step SA 6 ). Such a warning makes it possible for the user to avoid the collision between the own vehicle 1 and the preceding vehicle.
- otherwise, the process is terminated without giving a warning.
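The flow of steps SA 1 to SA 6 can be sketched as one loop body with stub callbacks standing in for the parts described above; all helper names and the 0.5 threshold are illustrative assumptions:

```python
# Sketch of one collision-warning cycle (steps SA2-SA6). SA1, obtaining the
# outside-vehicle image, is represented by the `image` argument.

def collision_warning_cycle(image, set_target_area, recognize, derive_criterion,
                            warn, threshold=0.5):
    area = set_target_area(image)                 # SA2: set the target area
    vehicle = recognize(image, area)              # SA3: recognize a preceding vehicle
    if vehicle is None:
        return False                              # no vehicle: no warning
    criterion = derive_criterion(image, vehicle)  # SA4: criterion value
    if criterion > threshold:                     # SA5: high possibility?
        warn()                                    # SA6: warn the user
        return True
    return False

# Trivial run with stub callbacks:
fired = collision_warning_cycle(
    image=None,
    set_target_area=lambda img: (0, 0, 10, 10),
    recognize=lambda img, area: "vehicle",
    derive_criterion=lambda img, v: 0.9,
    warn=lambda: None,
)
print(fired)  # True
```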
- the position identifying part 26 derives a vanishing point position in an outside-vehicle image. Thereby, the position identifying part 26 substantively identifies the optical axis position of the camera 3 relative to the own vehicle 1 (a step SB 1 ).
- the position identifying part 26 derives an amount of change of the vanishing point position from a previous position in a case where the vanishing point position has changed from the previous position.
- the vanishing point position in an outside-vehicle image (or the reference position) in a case where the optical axis position of the camera 3 is in the initial position can be used as the previous position.
- the displacement-amount deriving part 28 derives an amount of displacement of the vanishing point position from the reference position. Concretely, the displacement-amount deriving part 28 derives the amount of change of the vanishing point position from the reference position by accumulating the amounts of change of the vanishing point position derived in the past and this time. Thereby, the displacement-amount deriving part 28 substantively derives the amount of displacement of an optical axis position of the camera 3 relative to the own vehicle 1 from an initial position (a step SB 2 ).
- the informing controller 29 judges whether the amount of displacement of the vanishing point is more than the predetermined threshold value or not (a step SB 3 ). Thereby, the informing controller 29 substantively judges whether the amount of displacement of the optical axis position of the camera 3 from the initial position is more than the predetermined threshold value or not.
- the area changing part 27 changes a position of a target area in an outside-vehicle image. Concretely, based on the amount of change of the vanishing point position derived by the position identifying part 26 , the area changing part 27 changes the position of the target area to be set by the area setting part 23 in the outside-vehicle image (a step SB 4 ). Thereby, it is possible to recognize a preceding vehicle properly even though the optical axis of the camera 3 is displaced from the initial position. In this case, a user does not have to adjust the optical axis position of the camera 3 .
- the informing controller 29 informs a user that the optical axis position of the camera 3 is displaced largely from the initial position (a step SB 5 ).
- the informing controller 29 outputs a predetermined signal to the display 4 and the speaker 5 so that they inform the user that the optical axis position of the camera 3 needs to be adjusted.
- the image recognition part 24 recognizes a preceding vehicle based on the target area in the outside-vehicle image obtained by the camera 3 installed in the own vehicle 1 . The position identifying part 26 then identifies the optical axis position of the camera 3 relative to the own vehicle 1 based on the outside-vehicle image, and the area changing part 27 changes the position of the target area in the outside-vehicle image according to that optical axis position. Thereby, the preceding vehicle can be recognized properly based on the target area even if the optical axis position of the camera 3 is displaced.
- the image recognition apparatus 2 in the first embodiment informs a user when the optical axis position of the camera 3 is significantly displaced from the initial position.
- the image recognition apparatus 2 in the second embodiment automatically adjusts the optical axis position of the camera 3 in a case where the optical axis position is significantly displaced from the initial position.
- the configuration and process flow of the image recognition apparatus 2 in the second embodiment are approximately the same as those in the first embodiment; therefore, only the differences from the first embodiment are explained hereinbelow.
- FIG. 10 shows a configuration of the camera 3 in the second embodiment.
- the camera 3 in the second embodiment includes a rotation axis 17 and a motor 18 instead of the screw 14 in the first embodiment. When the motor 18 is driven, the rotation axis 17 rotates, which changes the angle of the chassis 11 , including the optical lens 10 , relative to the holding part 12 . Therefore, the direction of the optical axis of the camera 3 relative to the own vehicle 1 can be changed by driving the motor 18 .
- FIG. 11 mainly shows a configuration of the body part 20 of the image recognition apparatus 2 in the second embodiment.
- the controller 22 in the second embodiment includes an optical axis adjustment part 39 instead of the informing controller 29 as a function performed by arithmetic processing.
- the optical axis adjustment part 39 changes the direction of the optical axis of the camera 3 relative to the own vehicle 1 by transmitting a signal to the motor 18 of the camera 3 . In a case where the amount of displacement of the optical axis position of the camera 3 from the initial position exceeds the threshold value, the optical axis adjustment part 39 adjusts the optical axis position of the camera 3 .
- FIG. 12 shows a flow of the optical-axis-related process in the second embodiment.
- the flow of the optical-axis-related process in the second embodiment is explained hereinbelow with reference to this figure.
- the process of steps SC1 to SC4 shown in FIG. 12 is the same as the process of steps SB1 to SB4.
- the position identifying part 26 derives the vanishing point position in the outside-vehicle image and, in a case where the vanishing point position has changed from the previous position, derives the amount of change from that previous position (step SC1). Next, the displacement-amount deriving part 28 derives the amount of displacement of the vanishing point position from the reference position (step SC2). Then, the optical axis adjustment part 39 judges whether the amount of displacement of the vanishing point exceeds the predetermined threshold value (step SC3).
- the area changing part 27 changes the position of the target area in the outside-vehicle image (step SC4).
- the optical axis adjustment part 39 adjusts the optical axis position of the camera 3 (step SC5).
- the optical axis adjustment part 39 transmits a signal to the motor 18 of the camera 3 so that the motor 18 moves the optical axis of the camera 3 .
- the optical axis adjustment part 39 moves the optical axis of the camera 3 by exactly the amount corresponding to the amount of displacement of the vanishing point.
- the optical axis adjustment part 39 moves the optical axis of the camera 3 in the direction opposite to the direction in which the optical axis has been displaced relative to the own vehicle 1 .
- for example, in a case where the optical axis of the camera 3 has been displaced downward, the optical axis adjustment part 39 moves the optical axis position in the opposite, upward direction. Thereby, the optical axis position of the camera 3 is returned to the initial position.
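The opposite-direction correction can be sketched as follows, under the assumption that the drift is expressed as a signed angle (negative meaning downward); the conversion factor, signs, and names here are illustrative assumptions, not values from the patent.

```python
def correction_angle(axis_drift_deg):
    """Motor command that cancels the drift: same magnitude, opposite sign."""
    return -axis_drift_deg

def vp_displacement_to_drift(vp_dy_px, px_per_degree=12.0):
    """Convert a vertical vanishing-point displacement (pixels) to an
    approximate optical-axis drift angle (degrees). px_per_degree depends
    on the lens and image resolution; 12.0 is an assumed value."""
    return vp_dy_px / px_per_degree

# Axis drifted 1.5 degrees downward -> drive the motor 1.5 degrees upward
cmd = correction_angle(-1.5)
```

The key property is that the command is the exact negation of the measured drift, so after the motor moves, the optical axis (and hence the vanishing point) returns to the reference position.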
- the area changing part 27 also adjusts the position of the target area in the outside-vehicle image according to the optical axis position of the camera 3 after the adjustment (step SC6). Concretely, the area changing part 27 returns the target area to its initial position, which includes the reference position of the vanishing point. Thereby, the preceding vehicle can be recognized properly.
- the optical axis of the camera 3 is automatically adjusted even if it is displaced so greatly that the preceding vehicle could not otherwise be recognized properly. Thereby, a user can skip the step of adjusting the optical axis position of the camera 3 .
- a preceding vehicle C running in front of the own vehicle is recognized as an object based on the outside-vehicle image G as shown in FIG. 13 .
- a vehicle H such as a bicycle or a motorcycle may be recognized as the object.
- a thing other than a vehicle, such as a pedestrian, may be recognized as the object.
- lane marks LL and RL that define the traffic lane in which the own vehicle 1 is running may be recognized as the object.
- in this case, the target area D is set in the position where there is a high probability that the lane marks LL and RL appear in the outside-vehicle image G.
- the warning controller 25 examines, based on the lane marks, the possibility that the own vehicle 1 departs from the traffic lane in which it is running, and if lane departure is likely, the warning controller 25 may warn a user that there is a high possibility of lane departure.
- the warning controller 25 derives the distance between the own vehicle 1 and each of the left and right lane marks LL and RL, and derives the acceleration at which the own vehicle 1 approaches one of the two lane marks LL and RL.
- the warning controller 25 derives, based on the derived values, a criterion value indicating the possibility that the own vehicle 1 departs from the traffic lane. In a case where the criterion value is higher than the predetermined threshold value, the warning controller 25 warns a user so that the lane departure can be avoided.
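One way such a criterion value could be formed from the derived distance and approach motion is as the predicted lateral encroachment past the nearer lane mark. The formula and the one-second prediction horizon below are assumptions for illustration, not the patent's definition.

```python
def departure_criterion(dist_to_mark_m, approach_speed_mps,
                        approach_accel_mps2, horizon_s=1.0):
    """Predicted lateral travel toward the nearer lane mark over horizon_s,
    minus the current distance to that mark. Positive values predict that
    the mark will be crossed within the horizon."""
    predicted_travel = (approach_speed_mps * horizon_s
                        + 0.5 * approach_accel_mps2 * horizon_s ** 2)
    return predicted_travel - dist_to_mark_m

def should_warn(criterion, threshold=0.0):
    """Warn when the criterion value exceeds the threshold."""
    return criterion > threshold

# 0.3 m from the mark, approaching at 0.4 m/s with 0.2 m/s^2 -> warn
risk = departure_criterion(0.3, 0.4, 0.2)
```

Including the approach acceleration, as the description above specifies, lets the criterion rise earlier when the vehicle is drifting toward the mark at an increasing rate.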
- in the above explanation, the angle of view of the optical lens 10 of the camera 3 is 45 degrees.
- as the optical lens of the camera 3 , a wide-angle lens whose angle of view is relatively wide may be adopted. The angle of view of such a wide-angle lens is preferably more than 100 degrees, more preferably more than 120 degrees, and still more preferably more than 180 degrees.
- FIG. 15 shows an example of the outside-vehicle image G obtained by the camera 3 with a wide-angle lens.
- the outside-vehicle image G includes the wide area spreading right and left in front of the own vehicle 1 as a subject.
- the target areas D may be set at the right side and the left side of the outside-vehicle image G respectively.
- by setting the target areas D on both the right and left sides of the outside-vehicle image G, it is possible to recognize an object existing on the right or left side of the own vehicle 1 , which easily becomes a blind area for a driver.
- a user can thus recognize an object such as a vehicle or a pedestrian approaching from a blind area when, for example, the own vehicle 1 enters an intersection with poor visibility. As a result, a collision with such a vehicle or pedestrian can be avoided.
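Setting the two side target areas for a wide-angle image can be sketched as follows; the 30% width fraction and the function name are illustrative choices, not values from the patent.

```python
def side_target_areas(img_w, img_h, frac=0.3):
    """Return left and right target areas (x, y, w, h) covering the outer
    fractions of a wide-angle outside-vehicle image, where objects
    approaching from the sides first appear."""
    w = int(img_w * frac)
    left = (0, 0, w, img_h)
    right = (img_w - w, 0, w, img_h)
    return left, right

left, right = side_target_areas(1280, 720)
```

Restricting recognition to these side strips keeps the processing load bounded while still covering the driver's blind areas at an intersection.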
- in the above explanation, the camera 3 is installed on the inner surface of the windshield 15 and the optical axis of the camera 3 points ahead of the own vehicle 1 . Alternatively, the camera 3 may be placed on a sideview mirror of the own vehicle 1 so that the optical axis of the camera 3 points in the left-hand or right-hand direction of the own vehicle 1 .
- FIG. 16 shows an example of the outside-vehicle image G obtained in a case where the optical axis of the camera 3 with a wide-angle lens is pointed in the right-hand direction of the own vehicle 1 .
- the outside-vehicle image G includes, as a subject, the area from front to back on the right side of the own vehicle 1 . Also in this case, as shown in FIG. 16 , it is preferable to set the target areas D on the right and left sides of the outside-vehicle image G respectively.
- the optical axis position of the camera 3 relative to the own vehicle 1 is substantively identified by recognizing, based on the outside-vehicle image G, the lane mark L defining the traffic lane in which the own vehicle 1 is running, and by deriving the position of the lane mark L in the outside-vehicle image G.
- the camera 3 may also be installed on the rear window of the own vehicle 1 with the optical axis pointed in the backward direction of the own vehicle 1 . In this case, a vehicle approaching from behind the own vehicle 1 and a pedestrian existing behind the own vehicle 1 can be recognized as objects.
- the controller 22 transmits a predetermined signal to a vehicle controller that controls the behavior of the own vehicle 1 so that this vehicle controller controls the engine and the brake of the own vehicle 1 in order to avoid a collision between the own vehicle 1 and the object.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010239346A JP2012093872A (en) | 2010-10-26 | 2010-10-26 | Image recognition device and image recognition method |
| JP2010-239346 | 2010-10-26 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20120099763A1 US20120099763A1 (en) | 2012-04-26 |
| US8594377B2 true US8594377B2 (en) | 2013-11-26 |
Family
ID=45973058
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/227,746 Expired - Fee Related US8594377B2 (en) | 2010-10-26 | 2011-09-08 | Image recognition apparatus |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US8594377B2 (en) |
| JP (1) | JP2012093872A (en) |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103365063B (en) * | 2012-03-31 | 2018-05-22 | 北京三星通信技术研究有限公司 | 3-D view image pickup method and equipment |
| JP6299373B2 (en) * | 2014-04-18 | 2018-03-28 | 富士通株式会社 | Imaging direction normality determination method, imaging direction normality determination program, and imaging direction normality determination apparatus |
| WO2015174208A1 (en) | 2014-05-12 | 2015-11-19 | ボッシュ株式会社 | Image-recognition device and method for controlling same |
| JP6543050B2 (en) * | 2015-03-06 | 2019-07-10 | 株式会社デンソーテン | Obstacle detection device and obstacle detection method |
| US10650252B2 (en) * | 2015-05-07 | 2020-05-12 | Hitachi, Ltd. | Lane detection device and lane detection method |
| EP3107068A1 (en) * | 2015-06-15 | 2016-12-21 | Continental Automotive GmbH | Vehicle diagnosis and camera adjustment using a detection of camera inclination angles |
| JP7167891B2 (en) * | 2019-09-24 | 2022-11-09 | トヨタ自動車株式会社 | Image processing device |
| JP7132271B2 (en) * | 2020-03-31 | 2022-09-06 | 本田技研工業株式会社 | behavior control system |
| US12219428B2 (en) * | 2021-07-31 | 2025-02-04 | Qualcomm Incorporated | Satellite signal environment determination and/or position estimate selection |
| DE102023114862A1 (en) * | 2023-06-06 | 2024-12-12 | Claas E-Systems Gmbh | Camera system for an agricultural machine |
Citations (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08276787A (en) | 1995-04-03 | 1996-10-22 | Suzuki Motor Corp | In-vehicle image processing device and image display system |
| US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
| US6990397B2 (en) * | 2002-12-09 | 2006-01-24 | Valeo Vision | System for controlling the in situ orientation of a vehicle headlamp, and method of use |
| US20070165909A1 (en) * | 2006-01-19 | 2007-07-19 | Valeo Vision | Method for adjusting the orientation of a camera installed in a vehicle and system for carrying out this method |
| US7599521B2 (en) * | 2004-11-30 | 2009-10-06 | Honda Motor Co., Ltd. | Vehicle vicinity monitoring apparatus |
| US20110115912A1 (en) * | 2007-08-31 | 2011-05-19 | Valeo Schalter Und Sensoren Gmbh | Method and system for online calibration of a video system |
| US8050458B2 (en) * | 2007-06-18 | 2011-11-01 | Honda Elesys Co., Ltd. | Frontal view imaging and control device installed on movable object |
| US8212878B2 (en) * | 2006-06-29 | 2012-07-03 | Hitachi, Ltd. | Calibration apparatus of on-vehicle camera, program, and car navigation system |
| US8355539B2 (en) * | 2007-09-07 | 2013-01-15 | Sri International | Radar guided vision system for vehicle validation and vehicle motion characterization |
| US8391556B2 (en) * | 2007-01-23 | 2013-03-05 | Valeo Schalter Und Sensoren Gmbh | Method and system for video-based road lane curvature measurement |
| US8396299B2 (en) * | 2006-11-08 | 2013-03-12 | Nec Corporation | Vanishing point detecting system, vanishing point detecting method, and vanishing point detecting program |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3287117B2 (en) * | 1994-07-05 | 2002-05-27 | 株式会社日立製作所 | Environment recognition device for vehicles using imaging device |
| JP3293441B2 (en) * | 1996-01-09 | 2002-06-17 | トヨタ自動車株式会社 | Imaging device |
| JP3820342B2 (en) * | 2000-08-31 | 2006-09-13 | 株式会社日立製作所 | In-vehicle imaging device |
| JP2004247949A (en) * | 2003-02-13 | 2004-09-02 | Sony Corp | Display board for prompter, method of manufacturing the same, and prompter apparatus using the same |
- 2010-10-26: JP JP2010239346A patent/JP2012093872A/en, active Pending
- 2011-09-08: US US13/227,746 patent/US8594377B2/en, not_active Expired - Fee Related
Patent Citations (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5638116A (en) * | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
| US6285393B1 (en) * | 1993-09-08 | 2001-09-04 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
| JPH08276787A (en) | 1995-04-03 | 1996-10-22 | Suzuki Motor Corp | In-vehicle image processing device and image display system |
| US6990397B2 (en) * | 2002-12-09 | 2006-01-24 | Valeo Vision | System for controlling the in situ orientation of a vehicle headlamp, and method of use |
| US7599521B2 (en) * | 2004-11-30 | 2009-10-06 | Honda Motor Co., Ltd. | Vehicle vicinity monitoring apparatus |
| JP2007215169A (en) | 2006-01-19 | 2007-08-23 | Valeo Vision | Method of adjusting orientation of a camera installed on a vehicle and system for implementing this method |
| US20070165909A1 (en) * | 2006-01-19 | 2007-07-19 | Valeo Vision | Method for adjusting the orientation of a camera installed in a vehicle and system for carrying out this method |
| US8212878B2 (en) * | 2006-06-29 | 2012-07-03 | Hitachi, Ltd. | Calibration apparatus of on-vehicle camera, program, and car navigation system |
| US8396299B2 (en) * | 2006-11-08 | 2013-03-12 | Nec Corporation | Vanishing point detecting system, vanishing point detecting method, and vanishing point detecting program |
| US8391556B2 (en) * | 2007-01-23 | 2013-03-05 | Valeo Schalter Und Sensoren Gmbh | Method and system for video-based road lane curvature measurement |
| US8050458B2 (en) * | 2007-06-18 | 2011-11-01 | Honda Elesys Co., Ltd. | Frontal view imaging and control device installed on movable object |
| US20110115912A1 (en) * | 2007-08-31 | 2011-05-19 | Valeo Schalter Und Sensoren Gmbh | Method and system for online calibration of a video system |
| US8355539B2 (en) * | 2007-09-07 | 2013-01-15 | Sri International | Radar guided vision system for vehicle validation and vehicle motion characterization |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012093872A (en) | 2012-05-17 |
| US20120099763A1 (en) | 2012-04-26 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: FUJITSU TEN LIMITED, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KATOH, TETSUHIRO;MURASHITA, KIMITAKA;REEL/FRAME:026876/0572 Effective date: 20110831 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| FPAY | Fee payment |
Year of fee payment: 4 |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
| FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
| FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20251126 |