EP2487648B1 - Vehicle periphery monitoring apparatus - Google Patents
- Publication number
- EP2487648B1 (application EP12154336.7A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- mobile object
- vehicle
- image
- display
- detection frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Not-in-force
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to a vehicle periphery monitoring apparatus for displaying a mobile object, encircled by a display frame, which approaches a vehicle, in an image that is captured by an image capturing unit on the vehicle and displayed on a display unit in the vehicle.
- the vehicle periphery monitoring apparatus disclosed in Japanese Laid-Open Patent Publication No. 2001-216520 operates in the following manner.
- an image captured behind a vehicle with a camera installed on the vehicle is displayed on a display unit in the vehicle
- the profile of another vehicle approaching from behind is generated as a differential image
- a display frame, the size of which grows in synchronism with the size of the other approaching vehicle, is superposed on the image of the other vehicle around a center of gravity or central point of the differential image.
- the vehicle periphery monitoring apparatus disclosed in Japanese Laid-Open Patent Publication No. 2006-252389 operates in the following manner.
- a three-dimensional object which approaches relatively toward a vehicle that carries a first camera and a second camera, is detected as an obstacle by the first camera and the second camera.
- a frame which surrounds the detected obstacle, is generated and displayed as a combined image on a display unit in the vehicle that carries the first camera and the second camera.
- US 2007/053551 displays such a frame only if a moving object is occluded by an obstacle.
- in each of these apparatus, an approaching object or a detected obstacle is necessarily displayed while being surrounded by a display frame.
- the displayed frame may potentially annoy and bother the driver of the vehicle when the driver sees the displayed image.
- the mobile object detector judges (analyzes) a travel path of the mobile object
- the display processor changes an area within which the detection frame is not displayed depending on the direction in which the judged (analyzed) travel path extends. Since the detection frame is displayed only when necessary in order to indicate the presence of the mobile object to the driver of the vehicle using the detection frame, the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame on the display unit.
- the image capturing device acquires a captured image behind the vehicle. Therefore, the image capturing device is useful as a rearward visual assistance tool at a time when the vehicle moves backwards or reverses direction.
- the display processor produces a non-display area for the detection frame, which includes the rear area behind the vehicle and a rear lateral area located opposite, across the rear area, to the rear lateral area from which the mobile object approaches.
- the display processor decides that it is necessary to notify the driver concerning the presence of the mobile object, and displays the detection frame that encircles the mobile object.
- the display processor decides that it is not necessary to notify the driver concerning the presence of the mobile object, and does not display the detection frame in the non-display area. Consequently, the detection frame is displayed only when necessary in order to indicate the presence of the mobile object to the driver using the detection frame, and hence the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame on the display unit.
- the display processor does not produce a non-display area for the detection frame.
- the mobile object is displayed as moving from a central back area toward a central near area of the displayed image on the display unit, and the detection frame encircling the mobile object does not move laterally, but grows progressively larger in size.
- the detection frame which is displayed in this manner, is useful to draw attention of the driver to the mobile object, and is unlikely to make the driver feel annoyed or bothered.
- FIG. 1 shows in block form a vehicle 11, which incorporates therein a vehicle periphery monitoring apparatus 10 according to an embodiment of the present invention.
- the vehicle periphery monitoring apparatus 10 includes a camera (rear camera) 12 as an image capturing device for capturing an image of a mobile object or the like, and a sonar (back sonar) array 14, which serves as an ultrasonic sensor (sonic detector) for detecting an obstacle or the like.
- the camera 12 is disposed centrally or substantially centrally on the outer side of either the tail gate of the vehicle 11, if the vehicle 11 is a minivan or the like, or the trunk lid of the vehicle 11, if the vehicle 11 is a sedan or the like.
- the sonar array 14 includes two corner sonars 14c disposed on respective corners of the rear bumper of the vehicle 11, and two central sonars 14m disposed on a central area of the rear bumper.
- the corner sonars 14c have respective detection ranges 102 each of which extends in a dotted-line sectorial shape, having a central angle of about 90° and a radius of about 60 cm rearwardly and laterally of the vehicle 11.
- the central sonars 14m have a joint detection range 104, which extends in a dotted-line trapezoidal shape having a length of about 1.5 m rearwardly of the vehicle 11.
- the camera 12 has an imaging field range, which will be described later.
- the vehicle periphery monitoring apparatus 10 includes an image processing ECU (Electronic Control Unit) 16, a navigation ECU 18, a meter ECU 20, and a sonar ECU 22, each provided in the form of a microcomputer.
- Each of the ECUs 16, 18, 20, 22 has a CPU (Central Processing Unit), a ROM (Read Only Memory) including an EEPROM (Electrically Erasable Programmable Read Only Memory), a RAM (Random Access Memory), input/output devices including an A/D converter and a D/A converter, a timer, and the like.
- Each of the ECUs 16, 18, 20, 22 operates to perform various functions (also referred to as function performing means), e.g., a controller, an arithmetic unit, and a processor, etc., when the CPU reads programs stored in the ROM and executes the programs.
- the image processing ECU 16 functions as a mobile object detector 52, a detection frame generator 54, and a display processor 56.
- the detection frame generator 54 also functions as a detection arrow generator, as will be described later.
- the ECUs 16, 18, 20, 22 are connected to each other and perform mutual communications therebetween through a communication line 24 of a communication network such as a CAN (Controller Area Network) or the like.
- a sonar switch 26 for turning on and off the detecting capability of the sonar array 14 is connected to the sonar ECU 22.
- a gear position switch 28, meters or instruments, not shown, and a speaker 34, which functions as an alarm unit, are connected to the meter ECU 20.
- the meter ECU 20 primarily controls the display of information that is indicated on the meters or instruments, which are disposed on the dashboard of the vehicle 11.
- the camera 12 and the image processing ECU 16 are connected to each other by a connecting line 33. Images, which are captured by the camera 12, are transmitted as a video signal Sv1 such as an NTSC signal or the like through the connecting line 33 to the image processing ECU 16.
- a display unit 30 such as a liquid crystal display panel or the like is connected to the image processing ECU 16.
- the image processing ECU 16 also is connected to a display mode selection switch 32, which serves as a selector for selecting image display modes (display field ranges) for images captured by the camera 12.
- the navigation ECU 18, which includes a GPS receiver as a position detecting device for detecting the position of the vehicle 11, supplies an image signal Sq representative of a map image or the like to the image processing ECU 16.
- the meter ECU 20 detects a reverse position signal from the gear position switch 28, and sends the reverse position signal through the communication line 24 to the image processing ECU 16.
- the image processing ECU 16 then supplies a video signal Sv2, which is generated by processing the video signal Sv1 from the camera 12, to the display unit 30 in preference to the image signal Sq from the navigation ECU 18.
- the display unit 30 preferentially displays a real-time vehicle rear image, which is captured by the camera 12.
- each time the display mode selection switch 32 is actuated, the display field range shown on the display unit 30 switches sequentially to the wide view (rear wide view) image display mode, the normal view (rear normal view) image display mode, or the top down view (rear top down view) image display mode. Accordingly, the display mode selection switch 32 functions as a selector for selecting, one at a time, the display field ranges represented by the above image display modes.
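The cycling behavior of the selector can be sketched as follows. This is an illustrative model, not code from the patent; the class and mode names are assumptions.

```python
# Illustrative sketch (not from the patent): each press of the display mode
# selection switch advances the display field range to the next image
# display mode, wrapping around cyclically. Names are assumptions.
DISPLAY_MODES = ["wide", "normal", "top_down"]  # rear wide / normal / top down

class DisplayModeSelector:
    def __init__(self, start="wide"):
        self._index = DISPLAY_MODES.index(start)

    def press(self):
        """Advance to the next display field range and return its name."""
        self._index = (self._index + 1) % len(DISPLAY_MODES)
        return DISPLAY_MODES[self._index]
```

A selector starting in the wide view thus cycles wide, normal, top down, and back to wide on successive presses.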
- the display processor 56 processes the video signal Sv1 from the camera 12 in order to generate a video signal Sv2, and supplies the generated video signal Sv2 to the display unit 30.
- the display unit 30 displays a captured image represented by the video signal Sv2
- the image processing ECU 16 generates images representing, respectively, a detection frame, a detection arrow, guide lines, etc., to be described later, combines the respective generated images together with the captured image, and displays the combined images on the display unit 30.
- the camera 12 is fitted with an ultra wide angle lens, such as a fisheye lens or the like, having a horizontal visual angle of about 180°, which is capable of capturing images within a wide field range. Therefore, the video signal Sv1 output from the camera 12 is representative of an imaging field range, which corresponds to the wide view image display mode.
- FIG. 2 schematically shows display field ranges of the display unit 30 in respective image display modes (views).
- the wide view image display mode includes a display field range 44, which corresponds to the imaging field range of the camera 12.
- the display field range 44 represents a range behind a rear end face 40 of the vehicle 11, which covers lateral horizontal lines on both sides of the vehicle 11, including the ground, and a rear horizontal line including the ground.
- the normal view image display mode includes a display field range 42, which corresponds to a portion of the imaging field range of the camera 12.
- the display field range 42 represents a solid-line range covering up to 0.25 m or more laterally outward from side surfaces 41 of the vehicle 11, and from 0.3 m to 4.2 m behind the rear end face 40 of the vehicle 11.
- the display field range 42 also covers a vertical extent from the ground up to a height ranging from 2 m to 4 m.
- the display unit 30 displays an image within a range that also covers the rear horizontal line.
- the top down view image display mode has a display field range 46, which represents a dotted-line range covering up to about 0.25 m laterally outward from the side surfaces 41 of the vehicle 11, and from 0.3 m to 1.2 m behind the rear end face 40 of the vehicle 11.
- the display field range 46 also covers a vertical extent from the ground up to a height of about 1 m, at which the camera 12 is installed.
- the display unit 30 displays an image, which represents only the ground, provided there is no object other than the ground in the display field range 46.
- a basic image processing sequence of the vehicle periphery monitoring apparatus 10 will be described below with reference to FIGS. 3 and 4 .
- FIG. 3 is a view, which is illustrative of an image processing sequence of the vehicle periphery monitoring apparatus 10.
- the camera 12 captures an image behind the vehicle 11, and supplies a video signal Sv1, representing the captured image every 1/30 second at an NTSC signal rate, through the connecting line 33 to the image processing ECU 16.
- An image ia captured by the camera 12 (hereinafter referred to as a "camera image ia"), which is shown in the left-hand section of FIG. 3 , is an image represented by the video signal Sv1 (imaging field range ≈ display field range).
- the mobile object detector 52 of the image processing ECU 16 performs a detecting process for detecting a mobile object 60 within the camera image ia, thereby generating a differential image (second differential image) ic, which is shown in the lower middle section of FIG. 3 .
- using the differential image ic, the detection frame generator 54 generates a detection frame 62, which encircles the detected mobile object 60.
- the camera image ia shown in FIG. 3 includes an image of a rear bumper 80 of the vehicle 11, which is displayed at all times, an image of the mobile object 60, and an image of a background 86 including a horizontal line 84 therein.
- the display processor 56 performs, on the image ia, a vertical and horizontal magnification correcting process corresponding to the image display mode (view, i.e., display field range) selected by the display mode selection switch 32. More specifically, according to the vertical and horizontal magnification correcting process, the display processor 56 converts the camera image ia into an image ib.
- the image ib which has been processed according to the vertical and horizontal magnification correcting process in accordance with the image display mode selected by the display mode selection switch 32, is plotted as an image in the wide view image display mode, which has substantially the same field range as the camera image ia.
- the detecting process performed by the mobile object detector 52 in order to detect the mobile object 60 is a known process, e.g., an interframe differential process, which is performed on the camera image ia.
- FIG. 4 is a view, which is illustrative of an interframe differential process as the detecting process for detecting the mobile object 60.
- a scene image is illustrated in plan in the left-hand section of FIG. 4 .
- the camera 12 successively produces three camera images, i.e. i0, i1, i2, respectively at time t0, time t0+1 (where 1 represents a minute time interval Δt of 1/30 second), and time t0+2.
- a mobile object 90 apart from the vehicle 11 does not exist (is not imaged) at time t0.
- at time t0+1, the mobile object 90 is imaged by the camera 12.
- at time t0+2, the same mobile object 90' is imaged at an enlarged scale as it approaches the vehicle 11.
- the mobile object detector 52 performs the detecting process when the vehicle 11 travels in a reverse direction at a speed of 5 km/h or lower. Consequently, the background in the three camera images i0, i1, i2 can be regarded as a still object with respect to the mobile object 90.
- the mobile object detector 52 detects, as a second differential image ic, an image only of the mobile object 60, which is shown in the lower middle section of FIG. 3 .
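A minimal sketch of such an interframe differential process is given below, using tiny grayscale frames as nested lists. Differencing consecutive frame pairs and intersecting the two binary masks isolates the moving object at the middle frame (a second differential image). The threshold value and frame contents are illustrative assumptions, not values from the patent.

```python
# Hypothetical interframe (double) differencing: a pixel belongs to the
# moving object if it changed between i0 and i1 AND between i1 and i2.
THRESHOLD = 30  # illustrative intensity-change threshold

def abs_diff_mask(a, b, thresh=THRESHOLD):
    """Binary mask of pixels whose absolute difference exceeds the threshold."""
    return [[1 if abs(pa - pb) > thresh else 0 for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def double_difference(i0, i1, i2):
    """Second differential image: motion present in both frame pairs."""
    d01 = abs_diff_mask(i0, i1)
    d12 = abs_diff_mask(i1, i2)
    return [[p and q for p, q in zip(r1, r2)] for r1, r2 in zip(d01, d12)]
```

Because the vehicle is moving at 5 km/h or lower during detection, the background is nearly still across the three frames, so only the mobile object survives the intersection of the two difference masks.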
- the detection frame generator 54 produces a profile image (differential image) of the second differential image ic according to a known differential process, for example, the process disclosed in Japanese Laid-Open Patent Publication No. 2001-216520 .
- the detection frame generator 54 also generates a rectangular frame, the sides of which extend parallel to the horizontal and vertical directions, such that the rectangular frame passes through vertices of a quadrangle circumscribing the profile image.
- the detection frame generator 54 superposes the generated rectangular frame on the image position of the mobile object 60, to thereby serve as the detection frame 62.
- a rectangular frame which is slightly enlarged from the generated rectangular frame, is displayed as the detection frame 62 for better visibility.
- the path along which the mobile object 60 travels (travel path) and the direction of the path along which the mobile object 60 travels (direction of travel path) can be obtained as a result of the mobile object detector 52 or the detection frame generator 54 plotting and storing the center of gravity of the image of the mobile object 60 as the mobile object 60 travels.
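The frame construction and travel-path estimation above can be sketched as follows; this is an illustrative implementation under assumed names, with a hypothetical enlargement margin.

```python
# Illustrative sketch: fit an axis-aligned rectangle around the detected
# object's pixels, enlarge it slightly for better visibility (the detection
# frame), and estimate the travel direction from successive stored centers
# of gravity. The margin value is an assumption for illustration.

def detection_frame(mask, margin=1):
    """Axis-aligned bounding box (x0, y0, x1, y1) enlarged by `margin`."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    xs = [x for x, _ in pts]
    ys = [y for _, y in pts]
    return (min(xs) - margin, min(ys) - margin, max(xs) + margin, max(ys) + margin)

def centroid(mask):
    """Center of gravity of the object's pixels."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def travel_direction(centroids):
    """Net displacement between the first and last stored centers of gravity."""
    (x0, y0), (x1, y1) = centroids[0], centroids[-1]
    return (x1 - x0, y1 - y0)
```

Plotting the centroid of each successive differential image and taking the displacement between stored centroids yields the travel path and its direction, as described above.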
- the display processor 56 then superposes the detection frame 62 on the mobile object 60 in the wide view image display mode, for example, as indicated in the image id, which is shown in the right-hand section of FIG. 3 . If the sonar array 14 detects obstacles within the detection ranges 102, 104, then the display processor 56 displays the detected obstacles on the display unit 30.
- When the sonar array 14 detects obstacles that exist within the detection ranges 102, 104, the meter ECU 20 produces an alarm (e.g., a series of beeps) through the speaker 34.
- the meter ECU 20 may also produce an alarm through the speaker 34 if the travel path of the mobile object 60 or the direction thereof is oriented toward the vehicle 11.
- When the gear position switch 28 is in a selected position other than the reverse position, the display unit 30 displays a map image from the navigation ECU 18, or a title image of an audio-visual (AV) system located in the vehicle 11. When the gear position switch 28 is shifted into the reverse position, the display unit 30 displays an image behind the vehicle 11, which is captured by the camera 12.
- the display unit 30 cyclically switches to an image 110w (upper left) that is displayed in the display field range 44 (see FIG. 2 ) of the wide view image display mode (wide view image), to an image 110n (central left) that is displayed in the display field range 42 (see FIG. 2 ) of the normal view image display mode (normal view image), to an image 110t (lower left) that is displayed in the display field range 46 (see FIG. 2 ) of the top down view image display mode (top down view image), and to the wide view image 110w (upper left), and so on.
- When the gear position switch 28 is shifted into the reverse position and the sonar switch 26 is turned on, if the vehicle 11 is traveling in reverse at a speed lower than 5 km/h, then, as shown in the middle section of FIG. 5 , each time the display mode selection switch 32 is pressed, the display unit 30 cyclically switches to a wide view image 112w (upper middle), to a normal view image 112n (central middle), to a top down view image 112t (lower middle), back to the wide view image 112w, and so on.
- each of the images 112w, 112n, 112t includes a detection-enabled icon 120 positioned in the lower right corner thereof, which indicates that the camera 12 and the sonar array 14 are capable of detecting a mobile object 60 or the like.
- the display unit 30 displays strip-like obstacle icons 122, 124 corresponding to the detection ranges 102, 104 in a given color and at a given blinking interval. For example, if the sonar array 14 detects an obstacle only within the central joint detection range 104, then the display unit 30 displays only the obstacle icon 124.
- the color of the obstacle icon 124 changes depending on the distance of the vehicle 11 from the detected obstacle, and the blinking interval of the obstacle icon 124 similarly changes depending on the distance from the detected obstacle.
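A distance-dependent icon style of this kind can be sketched as below. The specific distance bands, colors, and blink intervals are illustrative assumptions; the patent only states that both properties change with the distance to the detected obstacle.

```python
# Hypothetical mapping from obstacle distance to icon color and blink
# interval: closer obstacles yield a more urgent color and faster blinking.
# All numeric bands and values here are illustrative assumptions.
def obstacle_icon_style(distance_m):
    """Return display attributes for the strip-like obstacle icon."""
    if distance_m < 0.5:
        return {"color": "red", "blink_interval_s": 0.2}
    if distance_m < 1.0:
        return {"color": "orange", "blink_interval_s": 0.5}
    return {"color": "yellow", "blink_interval_s": 1.0}
```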
- the detection frame 62 encircling the mobile object 60 is displayed within a predetermined area, i.e., outside of the display field range 42 (see FIG. 2 ) of the normal view image 112n.
- when a portion of the mobile object 60 begins to enter the display field range 42 of the normal view image 112n while the wide view image 112w is being displayed, the detection frame 62 disappears from view.
- the reason the detection frame 62 disappears at this time is that the driver, who drives the vehicle 11 while directly confirming the area behind it visually, is highly likely to notice the mobile object 60 with his or her own eyes; hiding the frame thus prevents the driver from becoming annoyed or bothered with unwanted images of the detection frame 62 on the display unit 30.
- the detection frame 62 is not displayed, but the mobile object 60 is displayed.
- the display processor 56 displays detection arrows 114, 116 generated by the detection frame generator 54 in a predetermined position, and more specifically at a predetermined position in the normal view image 112n, which will be described in detail later.
- the wide view image 112w and the normal view image 112n also include guide lines superimposed thereon.
- the guide lines include transverse guide lines 202, 204, which are slightly longer than the width of the vehicle 11, an open-trunk-lid guide line 206, which has a length of 0.5 m (although the actual length depends on the type of vehicle 11), and guide lines 208, 210, 212, which have lengths of about 1 m, about 2 m, and about 3 m, respectively.
- the guide lines 202, 204, 206, 208, 210, 212 are displayed as semitransparent yellow lines.
- When the gear position switch 28 is shifted into the reverse position and the sonar switch 26 is turned on, if the vehicle 11 travels in reverse at a speed equal to or higher than 5 km/h, then, as shown in the right-hand section of FIG. 5 , each time the display mode selection switch 32 is pressed, the display unit 30 cyclically switches to a wide view image 114w (upper right), to a normal view image 114n (central right), to a top down view image 114t (lower right), back to the wide view image 114w (upper right), and so on.
- each of the images 114w, 114n, 114t includes a detection-disabled icon 121 positioned in the lower right corner thereof, indicating that the camera 12 and the sonar array 14 are incapable of detecting a mobile object 60 or the like. More specifically, no detection frame 62 is displayed over the mobile object 60 in the wide view image 114w, and no detection arrows 114, 116 are disposed in the normal view image 114n and the top down view image 114t.
- When the display mode selection switch 32 is pressed to change the wide view image display mode to the normal view image display mode, in which the title "Normal View" and the normal view image 112n are displayed as shown in FIG. 8 , the mobile object 60 and the detection frame 62 are not displayed in the normal view image 112n, since the mobile object 60 does not exist in the normal view display field range, even though the mobile object 60 is present in the imaging field range of the camera 12, i.e., even though the image of the mobile object 60 is represented by the video signal Sv1.
- a detection arrow 114, which is directed toward the left-hand guide line 202, is displayed as a semitransparent arrow in a given position on the left-hand side (i.e., outside) of the left-hand guide line 202.
- the detection arrow 114 will be described in detail below.
- along the left-hand guide line 202, which is displayed as inclined toward the center (right) in a direction away from the vehicle 11, a detection arrow 114L is displayed outside of the guide line 202 as a semitransparent arrow, extending substantially perpendicularly to the guide line 202 and having a pointed end directed toward the guide line 202.
- along the right-hand guide line 204, which is displayed as inclined toward the center (left) in a direction away from the vehicle 11, a detection arrow 114R is displayed outside of the guide line 204 as a semitransparent arrow, extending substantially perpendicularly to the guide line 204 and having a pointed end directed toward the guide line 204.
- the display processor 56 does not display the detection frame 62. More specifically, when the mobile object 60 is not displayed in the normal view image display mode, but is approaching the vehicle 11, the detection arrow 114 is displayed, and when the mobile object 60 is displayed in the normal view image display mode, the detection frame 62 and the detection arrow 114 are not displayed.
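The normal-view decision just described can be sketched as a small rule: show a detection arrow only while the approaching object is still outside the displayed field range, and show neither frame nor arrow once the object itself is visible. The function and parameter names are illustrative assumptions.

```python
# Hypothetical sketch of the normal-view overlay decision: a detection
# arrow (e.g. 114L / 114R) is shown only when the object approaches but is
# not yet inside the displayed field range; once visible, no overlay at all.
def normal_view_overlay(object_in_view, approaching, approach_side):
    """Return the overlay to draw in the normal view, or None."""
    if approaching and not object_in_view:
        return f"arrow_{approach_side}"
    return None  # object visible (or not approaching): no frame, no arrow
```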
- a detection arrow 116L, which is oriented horizontally toward the center of the top down view image 112t, is displayed in a predetermined position in a left-hand side section of the top down view image 112t, i.e., in a central position at the left-hand edge of the top down view image 112t.
- the display processor 56 displays a detection arrow 116L oriented toward the center of the top down view image 112t, in a predetermined position in a left-hand side section of the top down view image 112t. Also, when the mobile object 60 approaches the vehicle 11 from the right in the top down view image 112t, the display processor 56 displays a detection arrow 116R oriented toward the center of the top down view image 112t, in a predetermined (central) position in a right-hand side section of the top down view image 112t.
- the vehicle periphery monitoring apparatus 10 includes the camera 12 mounted on the vehicle 11 as an image capturing device for acquiring a captured image of a peripheral area of the vehicle 11, the mobile object detector 52 for detecting the mobile object 60 based on the captured image, the detection frame generator 54 for generating the detection frame 62, which encircles the mobile object 60 detected by the mobile object detector 52, the display unit 30, and the display processor 56 for displaying the detection frame 62 in a superposed relation to the captured image on the display unit 30.
- the vehicle periphery monitoring apparatus 10 also includes the display mode selection switch 32, which serves as a selector for selecting display field ranges for the captured image.
- the detection frame generator 54 selectively displays or does not display the detection frame 62, depending on the display field range selected by the display mode selection switch 32. Accordingly, the detection frame 62 can optimally be displayed or not, depending on the selected display field range. In other words, a display field range can be selected in optimum combination with a displayed detection frame or a non-displayed detection frame.
- the display processor 56 may display the detection arrows 114, 116, which indicate the direction of travel of the mobile object 60, so that the driver or user (image viewer) can recognize the direction in which the mobile object 60 travels at present, even though the mobile object 60 is not displayed in the display field range.
- the camera 12 is useful as a rearward visual assistance tool at a time that the vehicle 11 moves backwards or in reverse.
- the driver will basically confirm the area behind the vehicle with his or her own vision while driving in reverse, as noted in the warning at the bottom of each of FIGS. 7, 8 , 10 and the like.
- the display processor 56 switches between the normal view image display mode or the top down image display mode, which provides a first display field range for displaying a predetermined area behind the vehicle 11, and the wide view image display mode, which provides a second display field range for displaying an area greater than the predetermined area behind the vehicle 11, depending on the display field range selected by the display mode selection switch 32.
- in the first display field range, the display processor 56 does not display the detection frame 62, whereas in the second display field range, the display processor 56 displays the detection frame 62.
- therefore, in the second display field range (the wide view image display mode), the displayed detection frame 62 allows the driver to easily identify the mobile object 60.
- in the first display field range (the normal view image display mode shown in FIG. 8 , for example), the display processor 56 does not display the detection frame 62, thereby enabling the driver to visually recognize the mobile object 60 appropriately.
- the vehicle periphery monitoring apparatus 10 includes the camera 12, which is mounted on the vehicle 11 as an image capturing device for acquiring a captured image of a peripheral area of the vehicle 11, the mobile object detector 52 for detecting the mobile object 60 based on the captured image, the detection frame generator 54 for generating the detection frame 62 that encircles the mobile object 60 detected by the mobile object detector 52, the display unit 30, and the display processor 56 for displaying the detection frame 62 in a superposed relation to the captured image on the display unit 30.
- the mobile object detector 52 judges (analyzes) the travel path of the mobile object 60, and the display processor 56 changes an area within which the detection frame 62 is not displayed, depending on the direction in which the judged (analyzed) travel path extends. Since the detection frame 62 is displayed only when necessary in order to indicate the presence of the mobile object 60 to the driver using the detection frame 62, the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame 62 on the display unit 30.
- inasmuch as the camera 12 is installed so as to be capable of acquiring a captured image behind the vehicle 11, the camera 12 also is useful as a rearward visual assistance tool when the vehicle 11 moves backwards or in reverse.
- the display processor 56 produces a non-display area 158, shown in hatching, for the detection frame 62, which includes the rear area 154 and a rear lateral area 156 that is located opposite to the rear lateral area 152 across the rear area 154.
- Detection ranges 102, 104 of the sonar array 14 are included within the non-display area.
- the manner in which the detection frame 62 is selectively displayed or not displayed in the wide view image display mode, as shown in FIG. 12, will be described in detail below. Since the travel path 150 extends from the rear lateral area 152 and traverses the rear area 154, as long as the mobile object 60 is moving from the rear lateral area 152 toward the rear area 154, the display processor 56 decides that it is necessary to notify the driver concerning the presence of the mobile object 60, and displays the detection frame 62 that encircles the mobile object 60.
- the display processor 56 decides that it is not necessary to notify the driver concerning the presence of the mobile object 60, and does not display the detection frame 62 in the non-display area 158, which is shown in hatching. Consequently, the detection frame 62 is displayed only when necessary to notify the driver concerning the presence of the mobile object 60 using the detection frame 62, and hence, the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame 62 on the display unit 30.
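The display decision just described can be sketched as a small region test. This is only one illustrative reading of the rule, not the patented implementation: the horizontal extent of the wide view image is split into an entry-side rear lateral area, the rear area, and the opposite rear lateral area, and the detection frame 62 is suppressed once the mobile object lies in the non-display area 158 (the rear area plus the opposite lateral area). All boundary values are assumptions.

```python
# Illustrative sketch of the detection-frame display decision in the
# wide view image. The 0.33/0.67 region boundaries are assumptions;
# the text only defines the regions, not their pixel extents.
def show_detection_frame(x: float, entered_from: str,
                         left_edge: float = 0.33,
                         right_edge: float = 0.67) -> bool:
    """x: normalised horizontal object position (0 = left, 1 = right)."""
    if entered_from == "right":
        # non-display area 158: rear area plus the opposite (left) lateral area
        return x > right_edge
    if entered_from == "left":
        return x < left_edge
    # approaching from straight behind: no non-display area is produced
    return True
```

For example, an object that entered from the right is framed while still in the right-hand lateral area (x = 0.8) and unframed once it has traversed into the rear area (x = 0.5 or less).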
- the display processor 56 does not produce a non-display area for the mobile object 160.
- the mobile object 160 is displayed as moving from a central back area toward a central near area of the wide view image, and the detection frame 162 encircling the mobile object 160 does not move laterally, but grows progressively larger in size.
- the detection frame 162 which is displayed in this manner, is useful to draw the attention of the driver to the mobile object 160, and is unlikely to make the driver feel annoyed or bothered.
- the detection ranges 102, 104 of the sonar array 14 may be included in a non-display area for the detection frame 162.
- the principles of the present invention are not limited to the detection of a mobile object behind the vehicle 11, as has been described in the above embodiment, but also may be applied to the detection of a mobile object in front of the vehicle 11.
- a vehicle periphery monitoring apparatus displays a mobile object on a display unit (30), encircled by a detection frame, which notifies the driver of the vehicle concerning the presence of the mobile object.
- a mobile object detector judges a travel path of the mobile object, and a display processor (56) changes an area within which the detection frame is not displayed, depending on the direction in which the judged travel path extends. The detection frame is displayed only when necessary, so as to indicate the presence of the mobile object to the driver using the detection frame.
Description
- The present invention relates to a vehicle periphery monitoring apparatus for displaying a mobile object, encircled by a display frame, which approaches a vehicle, in an image that is captured by an image capturing unit on the vehicle and displayed on a display unit in the vehicle.
- Description of the Related Art:
- Heretofore, there have been proposed vehicle periphery monitoring apparatus for displaying an object, encircled by a display frame, which approaches a vehicle, in an image that is captured by an image capturing unit on the vehicle and displayed in a display unit on the vehicle (see Japanese Laid-Open Patent Publication Nos. 2001-216520 and 2006-252389).
- US 2007/053551 displays such a frame only if a moving object is occluded by an obstacle.
- It is an object of the present invention to provide a vehicle periphery monitoring apparatus, which displays a detection frame that makes the user, such as the driver of the vehicle or the like, feel less annoyed and bothered when the user sees the detection frame.
- According to the present invention, there is provided a vehicle periphery monitoring apparatus according to
claim 1. - The mobile object detector judges (analyzes) a travel path of the mobile object, and the display processor changes an area within which the detection frame is not displayed depending on the direction in which the judged (analyzed) travel path extends. Since the detection frame is displayed only when necessary in order to indicate the presence of the mobile object to the driver of the vehicle using the detection frame, the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame on the display unit.
- Preferably, the image capturing device acquires a captured image behind the vehicle. Therefore, the image capturing device is useful as a rearward visual assistance tool at a time when the vehicle moves backwards or reverses direction.
- If the travel path judged by the mobile object detector represents a direction that extends from a rear lateral area behind the vehicle and transversely across a rear area behind the vehicle, the display processor produces a non-display area for the detection frame, which includes the rear area behind the vehicle and a rear lateral area opposite to the aforementioned rear lateral area behind the vehicle.
- Since the travel path extends from the rear lateral area and traverses the rear area, as long as the mobile object is moving from the rear lateral area toward the rear area, the display processor decides that it is necessary to notify the driver concerning the presence of the mobile object, and displays the detection frame that encircles the mobile object. On the other hand, when the mobile object traverses the rear area and then enters the rear lateral area, the display processor decides that it is not necessary to notify the driver concerning the presence of the mobile object, and does not display the detection frame in the non-display area. Consequently, the detection frame is displayed only when necessary in order to indicate the presence of the mobile object to the driver using the detection frame, and hence the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame on the display unit.
- However, if the travel path judged by the mobile object detector represents a direction along which the mobile object approaches the vehicle from behind, the display processor does not produce a non-display area for the detection frame.
- In this case, the mobile object is displayed as moving from a central back area toward a central near area of the displayed image on the display unit, and the detection frame encircling the mobile object does not move laterally, but grows progressively larger in size. The detection frame, which is displayed in this manner, is useful to draw the attention of the driver to the mobile object, and is unlikely to make the driver feel annoyed or bothered.
- According to the present invention, consequently, the mobile object detector judges (analyzes) a travel path of the mobile object, and the display processor changes an area within which the detection frame is not displayed depending on the direction in which the judged (analyzed) travel path extends. Since the detection frame is displayed only when necessary in order to indicate the presence of the mobile object to the driver of the vehicle using the detection frame, the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame on the display unit.
- The above and other objects, features, and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings in which preferred embodiments of the present invention are shown by way of illustrative example.
-
-
FIG. 1 is a block diagram of a vehicle, which incorporates therein a vehicle periphery monitoring apparatus according to an embodiment of the present invention; -
FIG. 2 is a diagram showing, by way of example, a corresponding relationship between the imaging field range of a camera and display field ranges of a display unit; -
FIG. 3 is a view, which illustrates an image processing sequence of the vehicle periphery monitoring apparatus shown in FIG. 1; -
FIG. 4 is a view, which illustrates an interframe differential process for detecting a mobile object; -
FIG. 5 is a view showing displayed images corresponding to respective selected positions of a gear position switch, a display mode selection switch, and a sonar switch; -
FIG. 6A is a view showing guide lines displayed on a display screen; -
FIG. 6B is a view showing an example of how positions of the guide lines and the position of a vehicle are related to each other; -
FIG. 7 is a view showing an image displayed in a wide view image display mode, with the detection frame of a mobile object being displayed therein; -
FIG. 8 is a view showing an image displayed in a normal view image display mode, with a detection arrow being displayed therein; -
FIG. 9 is a view showing detection arrows displayed in left-hand and right-hand areas of an image in the normal view image display mode; -
FIG. 10 is a view showing an image displayed in a top down view image display mode, with a detection arrow being displayed therein; -
FIG. 11 is a view showing detection arrows displayed in left-hand and right-hand areas of an image in the top down view image display mode; -
FIG. 12 is a view showing an example of a display area and a non-display area for a detection frame in the wide view image display mode; -
FIG. 13 is a view showing another example of a display area and a non-display area for a detection frame in the wide view image display mode; and -
FIG. 14 is a view showing the manner in which a sonar detection range in the wide view image display mode is converted into a non-display area for a detection frame. -
FIG. 1 shows in block form a vehicle 11, which incorporates therein a vehicle periphery monitoring apparatus 10 according to an embodiment of the present invention. - As shown in
FIG. 1, the vehicle periphery monitoring apparatus 10 includes a camera (rear camera) 12 as an image capturing device for capturing an image of a mobile object or the like, and a sonar (back sonar) array 14, which serves as an ultrasonic sensor (sonic detector) for detecting an obstacle or the like. - The
camera 12 is disposed centrally or substantially centrally on the outer side of either the tail gate of the vehicle 11, if the vehicle 11 is a minivan or the like, or the trunk lid of the vehicle 11, if the vehicle 11 is a sedan or the like. The sonar array 14 includes two corner sonars 14c disposed on respective corners of the rear bumper of the vehicle 11, and two central sonars 14m disposed on a central area of the rear bumper. - The
corner sonars 14c have respective detection ranges 102, each of which extends in a dotted-line sectorial shape having a central angle of about 90° and a radius of about 60 cm rearwardly and laterally of the vehicle 11. The central sonars 14m have a joint detection range 104, which extends in a dotted-line trapezoidal shape having a length of about 1.5 m rearwardly of the vehicle 11. The camera 12 has an imaging field range, which will be described later. - The vehicle
periphery monitoring apparatus 10 includes an image processing ECU (Electric Control Unit) 16, a navigation ECU 18, a meter ECU 20, and a sonar ECU 22, each of which is provided in the form of a microcomputer. - The
image processing ECU 16 functions as a mobile object detector 52, a detection frame generator 54, and a display processor 56. The detection frame generator 54 also functions as a detection arrow generator, as will be described later. - The
ECUs are connected to one another through a communication line 24 of a communication network such as a CAN (Controller Area Network) or the like. - A
sonar switch 26 for turning on and off the detecting capability of the sonar array 14 is connected to the sonar ECU 22. - A gear position switch 28, meters or instruments, not shown, and a
speaker 34, which functions as an alarm unit, are connected to the meter ECU 20. The meter ECU 20 primarily controls the display of information that is indicated on the meters or instruments, which are disposed on the dashboard of the vehicle 11. - The
camera 12 and the image processing ECU 16 are connected to each other by a connecting line 33. Images, which are captured by the camera 12, are transmitted as a video signal Sv1 such as an NTSC signal or the like through the connecting line 33 to the image processing ECU 16. - A
display unit 30 such as a liquid crystal display panel or the like is connected to the image processing ECU 16. The image processing ECU 16 also is connected to a display mode selection switch 32, which serves as a selector for selecting image display modes (display field ranges) for images captured by the camera 12. - The
navigation ECU 18, which includes a GPS receiver as a position detecting device for detecting the position of the vehicle 11, supplies an image signal Sq representative of a map image or the like to the image processing ECU 16. - When the
gear position switch 28 is shifted into a reverse position for moving the vehicle 11 in a rearward direction, the meter ECU 20 detects a reverse position signal from the gear position switch 28, and sends the reverse position signal through the communication line 24 to the image processing ECU 16. The image processing ECU 16 then supplies a video signal Sv2, which is generated by processing the video signal Sv1 from the camera 12, to the display unit 30 in preference to the image signal Sq from the navigation ECU 18. - In other words, when the
gear position switch 28 is shifted into the reverse position, the display unit 30 preferentially displays a real-time vehicle rear image, which is captured by the camera 12. - According to the present embodiment, each time that the display
mode selection switch 32 is actuated, the display mode selection switch 32 sequentially switches to a wide view (rear wide view) image display mode, a normal view (rear normal view) image display mode, or a top down view (rear top down view) image display mode, as a display field range to be displayed on the display unit 30. Accordingly, the display mode selection switch 32 functions as a selector for selecting, one at a time, the display field ranges represented by the above image display modes. - Depending on the view (display field range) selected by the display
mode selection switch 32, the display processor 56 processes the video signal Sv1 from the camera 12 in order to generate a video signal Sv2, and supplies the generated video signal Sv2 to the display unit 30. When the display unit 30 displays a captured image represented by the video signal Sv2, the image processing ECU 16 generates images representing, respectively, a detection frame, a detection arrow, guide lines, etc., to be described later, combines the respective generated images together with the captured image, and displays the combined images on the display unit 30. - According to the present embodiment, the
camera 12 is fitted with an ultra wide angle lens, such as a fisheye lens or the like, having a horizontal visual angle of about 180°, which is capable of capturing images within a wide field range. Therefore, the video signal Sv1 output from the camera 12 is representative of an imaging field range, which corresponds to the wide view image display mode. -
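The cyclic mode switching described above can be sketched as a simple ring over the three views; the mode names below are illustrative, not taken from the patent:

```python
# Sketch of the cyclic view selection performed by the display mode
# selection switch 32: each actuation advances to the next display
# field range and wraps around after the top down view.
MODES = ("wide", "normal", "top_down")

def next_mode(current: str) -> str:
    return MODES[(MODES.index(current) + 1) % len(MODES)]
```

Pressing the switch repeatedly thus cycles wide view, normal view, top down view, and back to wide view, matching the sequence described for the display mode selection switch 32.
-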
FIG. 2 schematically shows display field ranges of the display unit 30 in respective image display modes (views). - As shown in
FIG. 2, the wide view image display mode includes a display field range 44, which corresponds to the imaging field range of the camera 12. The display field range 44 represents a range behind a rear end face 40 of the vehicle 11, which covers lateral horizontal lines on both sides of the vehicle 11, including the ground, and a rear horizontal line including the ground. - The normal view image display mode includes a
display field range 42, which corresponds to a portion of the imaging field range of the camera 12. The display field range 42 represents a solid-line range covering up to 0.25 m or more laterally outward from side surfaces 41 of the vehicle 11, and from 0.3 m to 4.2 m behind the rear end face 40 of the vehicle 11. The display field range 42 also covers a vertical extent from the ground up to a height ranging from 2 m to 4 m. In the normal view image display mode, the display unit 30 displays an image within a range that also covers the rear horizontal line. - The top down view image display mode has a
display field range 46, which represents a dotted-line range covering up to about 0.25 m laterally outward from the side surfaces 41 of the vehicle 11, and from 0.3 m to 1.2 m behind the rear end face 40 of the vehicle 11. The display field range 46 also covers a vertical extent from the ground up to a height of about 1 m, at which the camera 12 is installed. In the top down view image display mode, the display unit 30 displays an image, which represents only the ground, provided there is no object other than the ground in the display field range 46. - A basic image processing sequence of the vehicle
periphery monitoring apparatus 10 will be described below with reference to FIGS. 3 and 4. -
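Before that sequence is walked through, the three display field ranges quoted above can be collected into a small lookup table. The structure and field names are assumptions made for illustration; the distances (in metres, rearward from the rear end face 40 and laterally outward from the side surfaces 41) come from the text:

```python
# Display field ranges 44 (wide), 42 (normal), and 46 (top down) as
# quoted in the text. None marks the wide view, which simply uses the
# whole imaging field range of the camera 12.
DISPLAY_FIELD_RANGES = {
    "wide":     {"range_id": 44, "rear_m": None,       "lateral_margin_m": None},
    "normal":   {"range_id": 42, "rear_m": (0.3, 4.2), "lateral_margin_m": 0.25},
    "top_down": {"range_id": 46, "rear_m": (0.3, 1.2), "lateral_margin_m": 0.25},
}
```
-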
FIG. 3 is a view, which is illustrative of an image processing sequence of the vehicle periphery monitoring apparatus 10. - When the
gear position switch 28 is shifted into the reverse position, the camera 12 captures an image behind the vehicle 11, and supplies a video signal Sv1, representing the captured image every 1/30 second at an NTSC signal rate, through the connecting line 33 to the image processing ECU 16. - An image ia captured by the camera 12 (hereinafter referred to as a "camera image ia"), which is shown in the left-hand section of
FIG. 3, is an image represented by the video signal Sv1 (imaging field range ≈ display field range). The mobile object detector 52 of the image processing ECU 16 performs a detecting process for detecting a mobile object 60 within the camera image ia, thereby generating a differential image (second differential image) ic, which is shown in the lower middle section of FIG. 3. Then, using the differential image ic, the detection frame generator 54 generates a detection frame 62, which encircles the detected mobile object 60. - The camera image ia shown in
FIG. 3 includes an image of a rear bumper 80 of the vehicle 11, which is displayed at all times, an image of the mobile object 60, and an image of a background 86 including a horizontal line 84 therein. - The
display processor 56 performs, on the image ia, a vertical and horizontal magnification correcting process corresponding to an image display mode {view (display field range)}, which is selected by the display mode selection switch 32. More specifically, according to the vertical and horizontal magnification correcting process, the display processor 56 converts the camera image ia into an image ib. - In
FIG. 3, the image ib, which has been processed according to the vertical and horizontal magnification correcting process in accordance with the image display mode selected by the display mode selection switch 32, is plotted as an image in the wide view image display mode, which has substantially the same field range as the camera image ia. - The detecting process performed by the
mobile object detector 52 in order to detect the mobile object 60 is a known process, e.g., an interframe differential process, which is performed on the camera image ia. -
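A toy sketch of such an interframe differential process is given below, anticipating the two differencing steps described with FIG. 4: signed 1-D arrays stand in for grayscale frames, with a constant background and a mobile object that appears and then moves. All values are illustrative.

```python
import numpy as np

# Interframe differential process: i01 = i1 - i0 isolates the object at
# time t0+1, and i02 = (i2 - i1) + i01 isolates the object at time t0+2
# (the negative copy of the t0+1 object in i2 - i1 is cancelled by i01).
def interframe_differential(i0, i1, i2):
    i0, i1, i2 = (np.asarray(f, dtype=np.int32) for f in (i0, i1, i2))
    i01 = i1 - i0            # first differential image
    i02 = (i2 - i1) + i01    # second differential image
    return i01, i02

# Toy frames: constant background 5; the object (value 9) is absent at
# t0, at index 1 at t0+1, and enlarged at indices 2-3 at t0+2.
f0 = [5, 5, 5, 5, 5]
f1 = [5, 9, 5, 5, 5]
f2 = [5, 5, 9, 9, 5]
i01, i02 = interframe_differential(f0, f1, f2)
# i01 -> [0, 4, 0, 0, 0]; i02 -> [0, 0, 4, 4, 0]: only the t0+2 object survives
```
-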
FIG. 4 is a view, which is illustrative of an interframe differential process as the detecting process for detecting the mobile object 60. A scene image is illustrated in plan in the left-hand section of FIG. 4. As shown in the middle section of FIG. 4, the camera 12 successively produces three camera images, i.e., i0, i1, i2, respectively at time t0, time t0+1 (1 represents a minute time Δt, which is 1/30 second), and time t0+2. - Within an
imaging field range 92 of the camera 12, a mobile object 90 apart from the vehicle 11 does not exist (is not imaged) at time t0. At time t0+1, the mobile object 90 is imaged by the camera 12. At time t0+2, the same mobile object 90' is imaged at an enlarged scale as the mobile object 90' approaches the vehicle 11. - For facilitating understanding of the present invention, it is assumed that the background remains unchanged in the three camera images i0, i1, i2. Practically, the
mobile object detector 52 performs the detecting process when the vehicle 11 travels in a reverse direction at a speed of 5 km/h or lower. Consequently, the background in the three camera images i0, i1, i2 can be regarded as a still object with respect to the mobile object 90. - First, the
mobile object detector 52 performs subtraction on the pixel data of the camera images i0, i1 (i01 = i1 - i0) in order to extract an image of the mobile object 90, i.e., a first differential image i01, from which the background image has been removed. - Then, the
mobile object detector 52 performs an arithmetic operation (i02 = i2 - i1 + i01) in order to extract a second differential image i02. More specifically, the mobile object detector 52 subtracts the camera image i1 from the camera image i2 (i2 - i1) in order to delete the background, thereby leaving images of the mobile objects 90 and 90'. Then, the mobile object detector 52 removes the image i01 of the mobile object 90 according to (i2 - i1) + i01, so as to extract only an image of the mobile object 90' at time t0+2, in the second differential image i02. - Therefore, as shown in
FIG. 4, when the interframe differential process is performed on the camera image ia represented by the video signal Sv1 shown in FIG. 3 as a detecting process for detecting the mobile object 60, the mobile object detector 52 detects, as a second differential image ic, an image only of the mobile object 60, which is shown in the lower middle section of FIG. 3. - As shown in
FIG. 3, the detection frame generator 54 produces a profile image (differential image) of the second differential image ic according to a known differential process, for example, the process disclosed in Japanese Laid-Open Patent Publication No. 2001-216520. The detection frame generator 54 also generates a rectangular frame, the sides of which extend parallel to the horizontal and vertical directions, such that the rectangular frame passes through vertices of a quadrangle circumscribing the profile image. The detection frame generator 54 superposes the generated rectangular frame on the image position of the mobile object 60, to thereby serve as the detection frame 62. According to the present embodiment, a rectangular frame, which is slightly enlarged from the generated rectangular frame, is displayed as the detection frame 62 for better visibility. - The path along which the
mobile object 60 travels (travel path) and the direction of the path along which the mobile object 60 travels (direction of travel path) can be obtained as a result of the mobile object detector 52 or the detection frame generator 54 plotting and storing the center of gravity of the image of the mobile object 60 as the mobile object 60 travels. - The
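rectangular-frame construction and the centroid bookkeeping just described can be sketched as follows. This is an illustrative numpy sketch rather than the patented implementation, and the enlargement margin is an assumption:

```python
import numpy as np

# Detection frame 62: circumscribe the nonzero pixels of the second
# differential image with an axis-aligned rectangle, slightly enlarged
# for visibility (the margin value is an assumption).
def detection_frame(diff_image, margin: int = 2):
    ys, xs = np.nonzero(np.asarray(diff_image))
    if ys.size == 0:
        return None                                   # no mobile object
    return (int(xs.min()) - margin, int(ys.min()) - margin,
            int(xs.max()) + margin, int(ys.max()) + margin)

# Travel path: store the centre of gravity of the differential image
# frame by frame; the net displacement of the stored track gives the
# direction of the travel path.
def centroid(diff_image):
    ys, xs = np.nonzero(np.asarray(diff_image))
    return float(xs.mean()), float(ys.mean())

def travel_direction(track):
    (x0, y0), (x1, y1) = track[0], track[-1]
    return x1 - x0, y1 - y0    # dx > 0: moving rightward in the image
```

For example, a stored track of centroids [(10.0, 5.0), (14.0, 5.5), (18.0, 6.0)] yields a travel direction of (8.0, 1.0), i.e., rightward across the image.
- The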
display processor 56 then superposes the detection frame 62 on the mobile object 60 in the wide view image display mode, for example, as indicated in the image id, which is shown in the right-hand section of FIG. 3. If the sonar array 14 detects obstacles within the detection ranges 102, 104, then the display processor 56 displays the detected obstacles on the display unit 30. - When the
sonar array 14 detects obstacles that exist within the detection ranges 102, 104, the meter ECU 20 produces an alarm (e.g., a series of beeps) through the speaker 34. The meter ECU 20 may also produce an alarm through the speaker 34 if the travel path of the mobile object 60 or the direction thereof is oriented toward the vehicle 11. - The basic image processing sequence of the vehicle
periphery monitoring apparatus 10 has been described above. - Relationships between selected positions of the
gear position switch 28, the display mode selection switch 32, the sonar switch 26, and the image display modes will be described in detail below with reference to FIGS. 5, 6A, and 6B. - When the
gear position switch 28 is in a selected position other than the reverse position, the display unit 30 displays a map image from the navigation ECU 18, or a title image of an audio-visual (AV) system located in the vehicle 11. When the gear position switch 28 is shifted into the reverse position, the display unit 30 displays an image behind the vehicle 11, which is captured by the camera 12. - According to a comparative example in which the
detection frame 62 is not displayed, as shown in the left-hand section of FIG. 5, each time that the display mode selection switch 32 is pressed, the display unit 30 cyclically switches to an image 110w (upper left) that is displayed in the display field range 44 (see FIG. 2) of the wide view image display mode (wide view image), to an image 110n (central left) that is displayed in the display field range 42 (see FIG. 2) of the normal view image display mode (normal view image), to an image 110t (lower left) that is displayed in the display field range 46 (see FIG. 2) of the top down view image display mode (top down view image), and to the wide view image 110w (upper left), and so on. - When the
gear position switch 28 is shifted into the reverse position and the sonar switch 26 is turned on, if the vehicle 11 is traveling in reverse at a speed lower than 5 km/h, then, as shown in a middle section of FIG. 5, each time that the display mode selection switch 32 is pressed, the display unit 30 cyclically switches to a wide view image 112w (upper middle), to a normal view image 112n (central middle), to a top down view image 112t (lower middle), and to the wide view image 112w (upper middle), and so on. At this time, each of the images 112w, 112n, 112t includes an icon 120 positioned in the lower right corner thereof, which indicates that the camera 12 and the sonar array 14 are capable of detecting a mobile object 60 or the like. If the sonar array 14 detects obstacles within the detection ranges 102, 104 (see FIG. 1), then the display unit 30 displays strip-like obstacle icons. If the sonar array 14 detects an obstacle only within the central joint detection range 104, then the display unit 30 displays only the obstacle icon 124. The color of the obstacle icon 124 changes depending on the distance of the vehicle 11 from the detected obstacle, and the blinking interval of the obstacle icon 124 similarly changes depending on the distance from the detected obstacle. - In the
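sketch below, the distance-dependent icon styling just described is illustrated. The thresholds, colours, and blink intervals are pure assumptions; the text states only that the colour and the blinking interval of the obstacle icon both vary with the distance to the detected obstacle:

```python
# Illustrative obstacle-icon styling: colour and blink interval change
# with the distance to the detected obstacle. The thresholds and values
# below are assumptions, not taken from the patent.
def obstacle_icon_style(distance_m: float) -> dict:
    if distance_m < 0.5:
        return {"color": "red", "blink_interval_s": 0.25}
    if distance_m < 1.0:
        return {"color": "orange", "blink_interval_s": 0.5}
    return {"color": "yellow", "blink_interval_s": 1.0}
```
- In the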
wide view image 112w shown in the middle section of FIG. 5, the detection frame 62 encircling the mobile object 60 is displayed within a predetermined area, i.e., outside of the display field range 42 (see FIG. 2) of the normal view image 112n. When a portion of the mobile object 60 begins to enter into the display field range 42 of the normal view image 112n while the wide view image 112w is being displayed, the detection frame 62 disappears from view. The reason why the detection frame 62 disappears from view at this time is that, since the driver of the vehicle 11 drives the vehicle 11 while visually confirming the area behind the vehicle 11 directly, the driver is highly likely to notice the mobile object 60 by means of the driver's own vision, and the driver is prevented from becoming annoyed or bothered with unwanted images of the detection frame 62 on the display unit 30. - In the
normal view image 112n or the top down view image 112t shown in the middle section of FIG. 5, the detection frame 62 is not displayed, but the mobile object 60 is displayed. When the mobile object 60 is outside of the display field range 42, if the mobile object 60 is detected within the range of the wide view image 112w, i.e., the imaging field range of the camera 12, and further is approaching the vehicle 11 (i.e., the camera 12), then the display processor 56 displays detection arrows, which are generated by the detection frame generator 54, in a predetermined position, and more specifically at a predetermined position in the normal view image 112n, which will be described in detail later. - The
wide view image 112w and the normal view image 112n also include guide lines superimposed thereon. As shown in FIGS. 6A and 6B, the guide lines include transverse guide lines behind the vehicle 11, an open-trunk-lid guide line 206, which has a length of 0.5 m although the actual length depends on the type of vehicle 11, and left-hand and right-hand guide lines 202, 204. - When the
gear position switch 28 is shifted into the reverse position and the sonar switch 26 is turned on, if the vehicle 11 travels in reverse at a speed equal to or higher than 5 km/h, then, as shown in a right-hand section of FIG. 5, each time that the display mode selection switch 32 is pressed, the display unit 30 cyclically switches to a wide view image 114w (upper right), to a normal view image 114n (central right), to a top down view image 114t (lower right), to the wide view image 114w (upper right), and so on. At this time, each of the images 114w, 114n, 114t includes a disabled icon 121 positioned in the lower right corner thereof, indicating that the camera 12 and the sonar array 14 are incapable of detecting a mobile object 60 or the like. More specifically, no detection frame 62 is displayed over the mobile object 60 in the wide view image 114w, and no detection arrows are displayed in the normal view image 114n and the top down view image 114t. - The
gear position switch 28, the displaymode selection switch 32, thesonar switch 26, and the image display modes has been described above. - Operations of the vehicle
periphery monitoring apparatus 10 in a first inventive example (manner of displaying a detection frame depending on switching between display field ranges) and in a second inventive example (manner of displaying a detection arrow depending on the travel path of a mobile object) will be described below. - As shown in
FIG. 7 , while the title "Wide View" and also thewide view image 112w are displayed in the wide view image display mode, if amobile object 60 is detected outside (i.e., on the right-hand or left-hand side) of the normal view display field range indicated by the two-dot-and-dash lines as imaginary lines, then adetection frame 62 is added to themobile object 60. - When the display
mode selection switch 32 is pressed to change the wide view image display mode to the normal view image display mode, in which the title "Normal View" and also thenormal view image 112n are displayed as shown inFIG. 8 , since amobile object 60 does not exist in the normal view display field range, although themobile object 60 is present in the imaging field range of thecamera 12, i.e., although the image of themobile object 60 is represented by the video signal Sv1, themobile object 60 and thedetection frame 62 are not displayed in thenormal view image 112n. - In the normal view image display mode, if the direction of the travel path of a
mobile object 60 is detected as being oriented from the left-hand side toward thevehicle 11, then adetection arrow 114, which is directed toward the left-hand guide line 202, is displayed as a semitransparent guide line in a given position on the left-hand side (i.e., outside) of the left-hand guide line 202. - The
detection arrow 114 will be described in detail below. As shown in FIG. 8, when the mobile object 60 is detected as approaching, from a rear left side, the left-hand guide line 202, which is displayed as inclined toward the center (right) in a direction away from the vehicle 11, a detection arrow 114L (see FIG. 9) is displayed outside of the guide line 202 as a semitransparent arrow, extending substantially perpendicularly to the guide line 202 and having a pointed end directed toward the guide line 202. When the mobile object 60 is detected as approaching, from a rear right side, the right-hand guide line 204, which is displayed as inclined toward the center (left) in a direction away from the vehicle 11, a detection arrow 114R (see FIG. 9) is displayed outside of the guide line 204 as a semitransparent arrow, extending substantially perpendicularly to the guide line 204 and having a pointed end directed toward the guide line 204. - As mentioned above, when the display
mode selection switch 32 is pressed in order to change the wide view image display mode to the normal view image display mode, as shown in FIG. 8, since the mobile object 60 does not exist in the normal view display field range, although the mobile object 60 is present in the imaging field range of the camera 12, i.e., although the image of the mobile object 60 is represented by the video signal Sv1, the mobile object 60 and the detection frame 62 are not displayed in the normal view image 112n. - According to the first inventive example, even when a portion of the
mobile object 60 is displayed in the normal view image display mode, as shown in FIG. 8, the display processor 56 does not display the detection frame 62. More specifically, when the mobile object 60 is not displayed in the normal view image display mode but is approaching the vehicle 11, the detection arrow 114 is displayed, and when the mobile object 60 is displayed in the normal view image display mode, the detection frame 62 and the detection arrow 114 are not displayed. - When the display
mode selection switch 32 is pressed in order to change the wide view image display mode, as shown in FIG. 7, to the top down view image display mode, in which the title "Top Down View" and the top down view image 112t are displayed as shown in FIG. 10, if a mobile object 60 is detected as approaching the vehicle 11 from the left in the top down view image 112t, then a detection arrow 116L, which is oriented horizontally toward the center of the top down view image 112t, is displayed in a predetermined position in a left-hand side section of the top down view image 112t, i.e., in a central position at the left-hand edge of the top down view image 112t. - In the top down view image display mode, when the
mobile object 60 approaches the vehicle 11 from the left in the top down view image 112t, the display processor 56 displays a detection arrow 116L oriented toward the center of the top down view image 112t, in a predetermined position in a left-hand side section of the top down view image 112t. Also, when the mobile object 60 approaches the vehicle 11 from the right in the top down view image 112t, the display processor 56 displays a detection arrow 116R oriented toward the center of the top down view image 112t, in a predetermined (central) position in a right-hand side section of the top down view image 112t. - As described above, in the first inventive example, the vehicle
periphery monitoring apparatus 10 includes the camera 12 mounted on the vehicle 11 as an image capturing device for acquiring a captured image of a peripheral area of the vehicle 11, the mobile object detector 52 for detecting the mobile object 60 based on the captured image, the detection frame generator 54 for generating the detection frame 62, which encircles the mobile object 60 detected by the mobile object detector 52, the display unit 30, and the display processor 56 for displaying the detection frame 62 in a superposed relation to the captured image on the display unit 30. - The vehicle
periphery monitoring apparatus 10 also includes the display mode selection switch 32, which serves as a selector for selecting display field ranges for the captured image. The detection frame generator 54 selectively displays or does not display the detection frame 62, depending on the display field range selected by the display mode selection switch 32. Accordingly, the detection frame 62 can optimally be displayed or not, depending on the selected display field range. In other words, a display field range can be selected in an optimum combination with a displayed or non-displayed detection frame. - When the
detection frame 62 is not displayed, the display processor 56 may display the detection arrows for the mobile object 60, so that the driver or user (image viewer) can recognize the direction in which the mobile object 60 is currently traveling, even though the mobile object 60 is not displayed in the display field range. - Inasmuch as the
camera 12 is installed so as to acquire a captured image behind the vehicle 11, the camera 12 is useful as a rearward visual assistance tool when the vehicle 11 moves backwards or in reverse. The driver, however, should basically confirm the area behind the vehicle with his or her own eyes while driving in reverse, as indicated by the warning at the bottom of each of FIGS. 7, 8, 10, and the like. - When the
camera 12 is used as a rearward visual assistance tool, the display processor 56 switches between the normal view image display mode or the top down view image display mode, which provides a first display field range for displaying a predetermined area behind the vehicle 11, and the wide view image display mode, which provides a second display field range for displaying an area greater than the predetermined area behind the vehicle 11, depending on the display field range selected by the display mode selection switch 32. When the first display field range (the normal view image display mode or the top down view image display mode) is selected, the display processor 56 does not display the detection frame 62. When the second display field range (the wide view image display mode) is selected, the display processor 56 displays the detection frame 62. Therefore, in the second display field range (the wide view image display mode shown in FIG. 7), in which the mobile object 60 is displayed at a relatively small scale, the displayed detection frame 62 allows the driver to easily identify the mobile object 60. On the other hand, in the first display field range (the normal view image display mode shown in FIG. 8, for example), in which the mobile object 60 is displayed at a relatively large scale, the driver can visually recognize the mobile object 60 appropriately. - Stated otherwise, when the
mobile object 60, which approaches the vehicle 11 from behind, comes within a predetermined distance from the vehicle 11, the display processor 56 does not display the detection frame 62, thereby enabling the driver to visually recognize the mobile object 60 appropriately. - In the second inventive example, the vehicle
periphery monitoring apparatus 10 includes the camera 12, which is mounted on the vehicle 11 as an image capturing device for acquiring a captured image of a peripheral area of the vehicle 11, the mobile object detector 52 for detecting the mobile object 60 based on the captured image, the detection frame generator 54 for generating the detection frame 62 that encircles the mobile object 60 detected by the mobile object detector 52, the display unit 30, and the display processor 56 for displaying the detection frame 62 in a superposed relation to the captured image on the display unit 30. - The
mobile object detector 52 judges (analyzes) the travel path of the mobile object 60, and the display processor 56 changes an area within which the detection frame 62 is not displayed, depending on the direction in which the judged (analyzed) travel path extends. Since the detection frame 62 is displayed only when necessary in order to indicate the presence of the mobile object 60 to the driver, the driver is prevented from becoming annoyed or bothered by unwanted images of the detection frame 62 on the display unit 30. - In the second inventive example, inasmuch as the
camera 12 is installed so as to be capable of acquiring a captured image behind the vehicle 11, the camera 12 also is useful as a rearward visual assistance tool when the vehicle 11 moves backwards or in reverse. - More specifically, in the wide view image display mode, as shown in
FIG. 12, if the mobile object 60 travels along a travel path 150, i.e., along the direction in which the travel path 150 extends, which is judged (analyzed) by the mobile object detector 52, and the travel path 150 represents a direction that extends from a rear lateral area 152 behind the vehicle 11 and transversely across a rear area 154 behind the vehicle 11, then the display processor 56 produces a non-display area 158, shown in hatching, for the detection frame 62, which includes the rear area 154 and a rear lateral area 156 that is located opposite to the rear lateral area 152 across the rear area 154. Accordingly, the driver does not visually recognize, and hence is prevented from becoming annoyed or bothered by, the detection frame 62 for the mobile object 60, which does not actually approach the vehicle 11. Detection ranges 102, 104 of the sonar array 14 are included within the non-display area. - The
detection frame 62, which is selectively displayed and not displayed in the wide view image display mode as shown in FIG. 12, will be described in detail below. Since the travel path 150 extends from the rear lateral area 152 and traverses the rear area 154, as long as the mobile object 60 is moving from the rear lateral area 152 toward the rear area 154, the display processor 56 decides that it is necessary to notify the driver of the presence of the mobile object 60, and displays the detection frame 62 that encircles the mobile object 60. When the mobile object 60 traverses the rear area 154 and then enters the rear lateral area 156, the display processor 56 decides that it is not necessary to notify the driver of the presence of the mobile object 60, and does not display the detection frame 62 in the non-display area 158, which is shown in hatching. Consequently, the detection frame 62 is displayed only when necessary to notify the driver of the presence of the mobile object 60, and hence the driver is prevented from becoming annoyed or bothered by unwanted images of the detection frame 62 on the display unit 30. - However, in the wide view image display mode shown in
FIG. 13, if a mobile object 160 travels along a travel path 159, i.e., along the direction in which the travel path 159 extends, which is judged (analyzed) by the mobile object detector 52, and the travel path 159 represents a direction along which the mobile object 160 approaches the vehicle 11 from straight behind, the display processor 56 does not produce a non-display area for the mobile object 160. In this case, the mobile object 160 is displayed as moving from a central back area toward a central near area of the wide view image, and the detection frame 162 encircling the mobile object 160 does not move laterally, but grows progressively larger in size. The detection frame 162, which is displayed in this manner, is useful for drawing the attention of the driver to the mobile object 160, and is unlikely to make the driver feel annoyed or bothered. - At this time, as shown in
FIG. 14, the detection ranges 102, 104 of the sonar array 14 may be included in a non-display area for the detection frame 162. - The principles of the present invention are not limited to the detection of a mobile object behind the
vehicle 11, as has been described in the above embodiment, but also may be applied to the detection of a mobile object in front of the vehicle 11. - A vehicle periphery monitoring apparatus (10) displays a mobile object on a display unit (30), encircled by a detection frame, which notifies the driver of the vehicle of the presence of the mobile object. A mobile object detector (52) judges a travel path of the mobile object, and a display processor (56) changes an area within which the detection frame is not displayed, depending on the direction in which the judged travel path extends. The detection frame is displayed only when necessary, so as to indicate the presence of the mobile object to the driver using the detection frame.
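The cyclic switching of display modes by the display mode selection switch, described for FIG. 5 above, can be sketched as follows. This is an illustrative reading of the description, not code from the patent; the function name and mode labels are assumptions.

```python
# Minimal sketch of the cyclic display-mode switching driven by the display
# mode selection switch: wide view -> normal view -> top down view -> wide
# view, and so on. Mode labels and the function name are illustrative.

MODES = ("wide", "normal", "top_down")

def press_mode_switch(current: str) -> str:
    """Return the display mode selected by one press of the switch."""
    return MODES[(MODES.index(current) + 1) % len(MODES)]
```

Each press advances one step through the cycle, so three presses return the display to its starting mode.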
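The display rules of the first inventive example (frame in the wide view, arrow in the normal or top down view when the object is outside the displayed range) can be summarized in a short sketch. All names here are invented for illustration and do not come from the patent.

```python
# Illustrative sketch of the first inventive example's display rules,
# assuming a detected mobile object and three display field ranges.
# Function and parameter names are assumptions, not the patent's API.

def overlay_for(display_mode: str, object_in_field: bool, approaching: bool) -> str:
    """Decide which overlay (if any) accompanies a detected mobile object.

    display_mode: "wide" (second display field range) or
                  "normal"/"top_down" (first display field range).
    object_in_field: whether the object appears in the selected field range.
    approaching: whether its travel path is oriented toward the vehicle.
    """
    if display_mode == "wide":
        # The wide view shows the object at a small scale, so the
        # detection frame is displayed to help the driver identify it.
        return "detection_frame" if object_in_field else "none"
    # Normal/top down views show the object at a large scale; the frame
    # is suppressed. An arrow is shown instead when the object approaches
    # from outside the displayed range.
    if not object_in_field and approaching:
        return "detection_arrow"
    return "none"
```

The sketch reflects the rule that the frame and arrow are never shown together: the frame appears only in the wide view, and the arrow only when the approaching object falls outside the narrower field range.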
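The non-display-area decision of the second inventive example (FIGS. 12 and 13) can likewise be sketched, following the narrative above: a frame accompanies an object crossing from one rear lateral area until it reaches the opposite rear lateral area, while an object approaching from straight behind gets no non-display area at all. Area labels and names are assumptions for illustration.

```python
# Sketch, under assumed area labels, of the second inventive example's
# decision whether to display the detection frame, based on the judged
# travel path of the mobile object. Names are illustrative only.

def frame_displayed(path_kind, entry_side, current_area):
    """path_kind: "crossing" (from a rear lateral area across the rear area)
    or "straight_behind". entry_side: "left" or "right" for a crossing path.
    current_area: "rear_lateral_left", "rear", or "rear_lateral_right"."""
    if path_kind == "straight_behind":
        # No non-display area is produced; the frame simply grows in size
        # as the object nears the vehicle.
        return True
    # For a crossing path, the opposite rear lateral area forms the
    # non-display area: once the object has traversed the rear area and
    # entered it, the driver no longer needs to be notified.
    opposite = "rear_lateral_right" if entry_side == "left" else "rear_lateral_left"
    return current_area != opposite
```

Under this sketch, the frame follows the object across the rear area and disappears only once the object has moved past the vehicle, matching the intent of suppressing frames for objects that no longer approach it.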
Claims (2)
- A vehicle periphery monitoring apparatus comprising: an image capturing device (12) mounted on a vehicle (11), for acquiring a captured image of a peripheral area of the vehicle (11); a mobile object detector (52) for detecting a mobile object (60) based on the captured image; a detection frame generator (54) for generating a detection frame (62) that encircles the mobile object (60) detected by the mobile object detector (52); a display unit (30); and a display processor (56) for displaying the detection frame (62) in a superposed relation to the captured image on the display unit (30), wherein the mobile object detector (52) judges a travel path of the mobile object (60), and the display processor (56) changes an area in which the detection frame (62) is not displayed, depending on the travel path judged by the mobile object detector (52), the image capturing device (12) acquires a captured image behind the vehicle (11), and characterized in that, if the travel path (150) judged by the mobile object detector (52) represents a direction that extends from a rear lateral area (152) behind the vehicle (11) and transversely across a rear area (154) behind the vehicle (11), the display processor (56) produces a non-display area (158) for the detection frame (62), which includes the rear area (154) behind the vehicle (11) and an opposite rear lateral area (156) opposite to the rear lateral area (152) behind the vehicle (11).
- The vehicle periphery monitoring apparatus according to claim 1, wherein if the travel path (159) judged by the mobile object detector (52) represents a direction along which the mobile object (160) approaches the vehicle (11) from behind, the display processor (56) does not produce a non-display area for the detection frame (162).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13180354.6A EP2665040B1 (en) | 2011-02-09 | 2012-02-07 | Vehicle periphery monitoring apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011025614A JP5329582B2 (en) | 2011-02-09 | 2011-02-09 | Vehicle periphery monitoring device |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13180354.6A Division EP2665040B1 (en) | 2011-02-09 | 2012-02-07 | Vehicle periphery monitoring apparatus |
EP13180354.6 Division-Into | 2013-08-14 |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2487648A1 EP2487648A1 (en) | 2012-08-15 |
EP2487648B1 true EP2487648B1 (en) | 2013-10-02 |
Family
ID=45606994
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12154336.7A Not-in-force EP2487648B1 (en) | 2011-02-09 | 2012-02-07 | Vehicle periphery monitoring apparatus |
EP13180354.6A Not-in-force EP2665040B1 (en) | 2011-02-09 | 2012-02-07 | Vehicle periphery monitoring apparatus |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13180354.6A Not-in-force EP2665040B1 (en) | 2011-02-09 | 2012-02-07 | Vehicle periphery monitoring apparatus |
Country Status (4)
Country | Link |
---|---|
US (1) | US8848056B2 (en) |
EP (2) | EP2487648B1 (en) |
JP (1) | JP5329582B2 (en) |
CN (1) | CN102632840B (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015008453A (en) * | 2013-05-29 | 2015-01-15 | 京セラ株式会社 | Camera device and warning method |
JP5842110B2 (en) * | 2013-10-10 | 2016-01-13 | パナソニックIpマネジメント株式会社 | Display control device, display control program, and recording medium |
JP6231345B2 (en) * | 2013-10-18 | 2017-11-15 | クラリオン株式会社 | Vehicle start support device |
DE102014212061B4 (en) * | 2014-06-24 | 2023-12-28 | Volkswagen Ag | Method and device for displaying a vehicle-specific parameter in a vehicle as well as a combination instrument and vehicle |
CN108028892B (en) * | 2015-09-30 | 2021-02-09 | 索尼公司 | Information acquisition apparatus and information acquisition method |
JP6805716B2 (en) * | 2016-01-25 | 2020-12-23 | 株式会社Jvcケンウッド | Display device, display method, program |
WO2017159082A1 (en) | 2016-03-14 | 2017-09-21 | 株式会社リコー | Image processing device, apparatus control system, image pickup device, image processing method, and program |
JP6609513B2 (en) * | 2016-05-19 | 2019-11-20 | 株式会社東海理化電機製作所 | Vehicle visual recognition device |
US11027652B2 (en) | 2017-05-19 | 2021-06-08 | Georgios Zafeirakis | Vehicle collision avoidance system |
US10434947B2 (en) * | 2017-05-19 | 2019-10-08 | Georgios Zafeirakis | Driver sitting position controlled vehicle collision avoidance |
USD947893S1 (en) * | 2020-03-05 | 2022-04-05 | Jaguar Land Rover Limited | Display screen or portion thereof with icon |
CN113496601B (en) * | 2020-03-20 | 2022-05-24 | 宇通客车股份有限公司 | Vehicle driving assisting method, device and system |
USD1036453S1 (en) * | 2020-08-27 | 2024-07-23 | Mobileye Vision Technologies Ltd. | Display screen with graphical user interface |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100466458B1 (en) * | 1999-09-20 | 2005-01-14 | 마츠시타 덴끼 산교 가부시키가이샤 | Device for assisting automobile driver |
JP2001216520A (en) | 2000-01-31 | 2001-08-10 | Yazaki Corp | Surroundings monitor device for vehicle |
JP4687160B2 (en) * | 2005-03-14 | 2011-05-25 | アイシン精機株式会社 | Perimeter monitoring device |
JP4720386B2 (en) * | 2005-09-07 | 2011-07-13 | 株式会社日立製作所 | Driving assistance device |
JP4744995B2 (en) * | 2005-09-08 | 2011-08-10 | クラリオン株式会社 | Obstacle detection device for vehicle |
JP4707067B2 (en) * | 2006-06-30 | 2011-06-22 | 本田技研工業株式会社 | Obstacle discrimination device |
JP2008242544A (en) | 2007-03-26 | 2008-10-09 | Hitachi Ltd | Collision avoidance device and method |
JP5316805B2 (en) | 2009-03-16 | 2013-10-16 | 株式会社リコー | In-vehicle camera device image adjustment device and in-vehicle camera device |
JP5718080B2 (en) * | 2011-02-09 | 2015-05-13 | 本田技研工業株式会社 | Vehicle periphery monitoring device |
- 2011
- 2011-02-09 JP JP2011025614A patent/JP5329582B2/en active Active
- 2012
- 2012-02-02 CN CN201210025771.7A patent/CN102632840B/en active Active
- 2012-02-06 US US13/367,135 patent/US8848056B2/en active Active
- 2012-02-07 EP EP12154336.7A patent/EP2487648B1/en not_active Not-in-force
- 2012-02-07 EP EP13180354.6A patent/EP2665040B1/en not_active Not-in-force
Also Published As
Publication number | Publication date |
---|---|
EP2487648A1 (en) | 2012-08-15 |
US20120200705A1 (en) | 2012-08-09 |
CN102632840B (en) | 2015-05-06 |
CN102632840A (en) | 2012-08-15 |
JP5329582B2 (en) | 2013-10-30 |
EP2665040A3 (en) | 2013-12-11 |
EP2665040A2 (en) | 2013-11-20 |
JP2012162211A (en) | 2012-08-30 |
EP2665040B1 (en) | 2015-07-15 |
US8848056B2 (en) | 2014-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2487648B1 (en) | Vehicle periphery monitoring apparatus | |
US10589680B2 (en) | Method for providing at least one information from an environmental region of a motor vehicle, display system for a motor vehicle driver assistance system for a motor vehicle as well as motor vehicle | |
US9156402B2 (en) | Wide view vehicle periphery image generation apparatus | |
JP3947375B2 (en) | Parking assistance device | |
EP2723069B1 (en) | Vehicle periphery monitoring device | |
US8781731B2 (en) | Adjusting method and system of intelligent vehicle imaging device | |
US9142129B2 (en) | Vehicle surroundings monitoring device | |
JP3894322B2 (en) | Vehicle visibility monitoring system | |
CN111201558B (en) | Method for representing the surroundings of a vehicle | |
CN108259879B (en) | Image generation device and image generation method | |
US20100245573A1 (en) | Image processing method and image processing apparatus | |
JP5408198B2 (en) | Video display device and video display method | |
US20120249794A1 (en) | Image display system | |
CN201402413Y (en) | Vehicle control assistant device | |
CN103946066A (en) | Vehicle surroundings monitoring apparatus and vehicle surroundings monitoring method | |
JP5718080B2 (en) | Vehicle periphery monitoring device | |
CN103477634A (en) | Birds-eye-view image generation device, birds-eye-view image generation method, and birds-eye-view image generation program | |
JP2013168063A (en) | Image processing device, image display system, and image processing method | |
US20210289169A1 (en) | Periphery monitoring apparatus | |
JP2012001126A (en) | Vehicle surroundings monitoring device | |
JP2013161440A (en) | Vehicle surroundings monitoring device | |
CN108422932A (en) | driving assistance system, method and vehicle | |
JP2006224927A (en) | Device for visual recognition around vehicle | |
CN104670089A (en) | Panoramic driving monitoring and alarming system | |
CN108016354A (en) | A kind of visible panoramic parking system in picture blind area and its method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20120207 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20130422 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 634953 Country of ref document: AT Kind code of ref document: T Effective date: 20131015 Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602012000344 Country of ref document: DE Effective date: 20131128 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 634953 Country of ref document: AT Kind code of ref document: T Effective date: 20131002 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: VDEP Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140102 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140202 Ref country code: BE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140203 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012000344 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
26N | No opposition filed |
Effective date: 20140703 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: LU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140207 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602012000344 Country of ref document: DE Effective date: 20140703 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: ST Effective date: 20141031 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20140207 Ref country code: FR Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20140228 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150228 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150228 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140101 Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20140103 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20120207 Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20160207 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: GB Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160207 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20131002 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R084 Ref document number: 602012000344 Country of ref document: DE |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20210126 Year of fee payment: 10 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602012000344 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20220901 |