WO2013065341A1 - Obstacle alarm device - Google Patents
Obstacle alarm device
- Publication number
- WO2013065341A1 (PCT/JP2012/060397)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- captured image
- image
- vehicle
- displayed
- index
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/26—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/168—Driving aids for parking, e.g. acoustic or visual feedback on parking space
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
Definitions
- the present invention relates to an obstacle alarm device that clearly indicates to an occupant the presence of an obstacle approaching a vehicle.
- There are blind spots around the vehicle that cannot be seen from the driver's position, and the driver needs to pay close attention to the surroundings of the vehicle when driving. In particular, when the vehicle is moved backward to park, many drivers are not good at parking itself, and the mental fatigue involved is considerable. Thus, techniques for monitoring obstacles around the vehicle have conventionally been used (for example, Patent Documents 1 and 2).
- the vehicle obstacle alarm device described in Patent Document 1 includes a lateral movement obstacle detection means, a lateral movement direction detection means, and a lateral movement information provision means.
- the laterally moving obstacle detection means detects an obstacle that moves in the direction crossing the traveling direction in front of the vehicle.
- the lateral movement direction detection means detects the lateral movement direction of the obstacle detected by the lateral movement obstacle detection means.
- the lateral movement information providing means provides the driver with information regarding the lateral movement direction of the obstacle detected by the lateral movement direction detecting means. At this time, the lateral movement information providing means displays an arrow indicating the lateral movement direction detected by the lateral movement direction detecting means on the display unit.
- the vehicle periphery monitoring device described in Patent Document 2 includes an imaging unit, an obstacle detection unit, and a display unit.
- the imaging means images the surroundings of the vehicle including a part of the host vehicle.
- the obstacle detection means detects an obstacle located around the vehicle and calculates a distance between the detected obstacle and the host vehicle.
- the display unit displays the captured image captured by the imaging unit and the obstacle display image indicating the distance calculated by the obstacle detection unit on one screen.
- As in the techniques described in Patent Document 1 and Patent Document 2, occupants can be notified of obstacles around the vehicle by detecting those obstacles and displaying information (arrows, etc.) that clearly indicates them on the screen.
- the screen size of the display (display means) mounted on a vehicle is not large. For this reason, if an arrow or the like is superimposed on the image showing the situation around the vehicle displayed on the display, that situation may become difficult to see, or an obstacle may not be grasped.
- an object of the present invention is to provide an obstacle alarm device that can clearly indicate to the driver the presence of an obstacle approaching the vehicle without making the situation around the vehicle difficult to see.
- the characteristic configuration of the obstacle alarm device resides in comprising: a captured image acquisition unit that acquires a captured image of a scene around the vehicle; an attention captured image generation unit that generates an attention captured image based on the captured image; an object presence determination unit that determines whether an object exists around the vehicle; a moving direction determination unit that determines the moving direction of the object; and an explicit image output unit that, when the moving direction determination unit determines that the object moves toward the center of the attention captured image, displays a frame index, which has an outer shape smaller than the contour of the attention captured image and is turned off after being turned on for a certain time, sequentially at different positions from the outer edge toward the center of the attention captured image, and repeats this display.
- the attention photographed image generation unit generates a central portion of the photographed image as the attention photographed image, It is preferable that the object presence determination unit determines whether or not the object exists in an outer region outside the target captured image.
- the noticeable photographed image generation unit may generate the entire photographed image as the noticeable photographed image.
- the object presence determination unit may be configured to determine whether or not the object exists in a region corresponding to the target captured image.
- the frame index to be displayed later may be configured to be smaller than the frame index displayed immediately before.
- the frame index can be displayed so as to approach the center of the screen. Accordingly, the driver can easily recognize the approach of the obstacle.
- the frame index to be displayed later may be configured to have lower transparency than the frame index displayed immediately before.
- the frame index can be displayed so as to approach the center of the screen. Accordingly, the driver can easily recognize the approach of the obstacle. In addition, since the scene displayed at the end of the screen is not hidden, it is possible to properly grasp the obstacle even when the obstacle suddenly pops out.
- the explicit image output unit is configured to stop the output of the frame index when the object enters an area corresponding to the target captured image.
- the obstacle alarm device 100 has a function of clearly indicating to the driver of the vehicle that an object is approaching when there is an object approaching the vehicle. Hereinafter, this will be described with reference to the drawings.
- FIG. 1 is a block diagram schematically showing the configuration of the obstacle alarm device 100.
- the obstacle alarm device 100 includes a captured image acquisition unit 11, a focused captured image generation unit 12, an outer region generation unit 13, an object presence determination unit 14, a movement direction determination unit 15, an explicit image output unit 16, an explicit image storage unit 17, a composite image generation unit 18, a mask region setting unit 19, a mask region emphasis display unit 20, a motion image output unit 30, and a motion image storage unit 31.
- Each of the above-described functional units is built with a CPU as its core member and performs various processes for clearly indicating the approach of the object 7 to the driver of the vehicle 1.
- the photographed image acquisition unit 11 acquires a photographed image G obtained by photographing a scene around the vehicle 1.
- the vehicle 1 is provided with a camera 5.
- the camera 5 in the present embodiment includes a digital camera that incorporates an image sensor such as a charge coupled device (CCD) or a CMOS image sensor (CIS) and outputs captured information as moving image information.
- As shown in FIG. 2(a), such a camera 5 is provided near the license plate at the outer rear portion of the vehicle 1, or near the emblem provided at the outer rear portion of the vehicle 1, and is arranged facing rearward with a slight depression angle.
- the camera 5 includes a wide-angle lens (not shown). Thereby, the scene behind the vehicle 1 can be photographed over approximately 180 degrees. Such an imaging range is shown as the "wide viewing angle" in FIG. 2(a).
- the camera 5 has a performance of outputting a moving image as a captured image G in real time. Such a captured image G is acquired by the captured image acquisition unit 11.
- An example of such a captured image G is shown in FIG. 2(b).
- the full width of FIG. 2 (b) corresponds to the wide viewing angle of FIG. 2 (a).
- the object 7 on the left side as viewed from the rear of the vehicle 1, as shown in FIG. 2(a), appears on the right side in the captured image G, as shown in FIG. 2(b). This is because mirror image processing is performed: when the scene behind the vehicle 1 is displayed on the monitor 50, mirror imaging makes it easy for the driver of the vehicle 1 to intuitively grasp whether the object 7 included in the captured image G is on the left side or the right side of the vehicle 1.
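The mirror image processing described above can be sketched as a horizontal flip of each pixel row; a minimal illustration in plain Python on a toy frame (an actual device would flip camera frames, e.g. with an image library):

```python
def mirror_image(frame):
    """Horizontally flip a frame given as a list of pixel rows, so that
    left/right on the monitor matches the driver's rear-view intuition."""
    return [list(reversed(row)) for row in frame]

# Toy 2x4 frame with an object (1) at the left edge of the raw image:
raw = [[1, 0, 0, 0],
       [1, 0, 0, 0]]
mirrored = mirror_image(raw)   # object now appears at the right edge
```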
- the noticeable photographed image generation unit 12 generates a noticeable photographed image based on the photographed image G.
- the shooting range of the shot image G is a wide viewing angle.
- the noticeable photographed image generation unit 12 generates a narrow-field region N that is a central portion of the photographed image G as the noticeable photographed image.
- the captured image G is transmitted from the captured image acquisition unit 11 described above.
- the noticeable photographed image corresponds to the central portion in the horizontal direction of the photographed image G shown in FIG. 2(b).
- Such a narrow field region N is preferably a region of about 120 to 130 degrees behind the vehicle 1, like the "narrow viewing angle" shown in FIG. 2(a).
- since the narrow field region N is close to the range into which the vehicle 1 can advance when moving backward, it is a region that should be particularly noted in the captured image G and is therefore referred to as the "focused captured image".
- a noticeable photographed image corresponds to a display image displayed on the monitor 50 described later (see FIG. 2C).
- the “focused captured image” is described as an image of a “narrow field of view”.
- the outer area generation unit 13 generates an outer area O outside the target photographed image. That is, an outer region O outside the narrow field region N in the captured image G is generated. As described above, the narrow-field region N is generated by the noticeable captured image generation unit 12 at the central portion in the horizontal direction of the captured image G. The outer region generation unit 13 generates an outer region O as shown in FIG. 2B outside the narrow visual field region N in the lateral direction. The outer region O generated by the outer region generation unit 13 is transmitted to an object presence determination unit 14 described later.
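The generation of the narrow field region N and the outer regions O described above amounts to splitting the wide-angle frame by columns; a minimal sketch in Python on a toy frame, where the split ratio is an illustrative assumption rather than a value taken from the patent:

```python
def split_regions(frame, n_ratio=0.6):
    """Split a frame (list of pixel rows) into the left outer region O,
    the central narrow field region N, and the right outer region O.
    n_ratio is the assumed fraction of the width assigned to N
    (roughly 120-130 degrees out of an approximately 180-degree view)."""
    width = len(frame[0])
    margin = int(width * (1 - n_ratio) / 2)   # width of each outer region
    left_o = [row[:margin] for row in frame]
    n = [row[margin:width - margin] for row in frame]
    right_o = [row[width - margin:] for row in frame]
    return left_o, n, right_o

frame = [list(range(10)) for _ in range(2)]   # 2 rows, 10 columns
left_o, n, right_o = split_regions(frame)
# left_o and right_o are 2 columns each; N keeps the central 6 columns.
```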
- the object presence determination unit 14 determines whether or not the object 7 exists around the vehicle 1. In the present embodiment, the object presence determination unit 14 determines whether or not the object 7 exists in the outer region O. The outer region O is transmitted from the outer region generator 13. In the present embodiment, the determination as to whether or not the object 7 exists in the outer region O is performed using a known image recognition process such as pattern matching with reference to the outer region O. Of course, it is possible to determine whether or not the object 7 exists in the outer region O by processing other than pattern matching. The determination result of the object presence determination unit 14 is transmitted to a movement direction determination unit 15 described later.
- the moving direction determination unit 15 determines the moving direction of the object 7 in the outer region O. Such a determination is performed when the object presence determination unit 14 determines that the object 7 exists in the outer region O. In particular, in the present embodiment, the movement direction determination unit 15 determines whether or not the object 7 in the outer region O moves toward the narrow field region N. Movement toward the narrow field region N means that, behind the vehicle 1, the object 7 moves from outside in the vehicle width direction toward the area directly behind the vehicle 1. Such a determination can be made, for example, by comparing the position of the object 7 in the current captured image G with its position in the captured image G a predetermined time earlier, or by using a technique such as optical flow. The determination result of the moving direction is transmitted to the explicit image output unit 16 described later.
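The direction determination described above, comparing the object's current position with its position a predetermined time earlier, can be sketched as follows; the function name and the centroid-based test are illustrative assumptions (the optical flow alternative mentioned above is not shown):

```python
def moves_toward_center(prev_x, curr_x, side):
    """Return True if an object in an outer region O is moving toward the
    narrow field region N, judging from two horizontal positions taken a
    predetermined time apart ('side' says which outer region it is in)."""
    if side == 'left':       # N lies to the right of the left outer region
        return curr_x > prev_x
    return curr_x < prev_x   # right outer region: N lies to the left

# An object in the right outer region drifting left is heading for N:
heading_for_n = moves_toward_center(prev_x=100, curr_x=90, side='right')
```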
- when the moving direction determination unit 15 determines that the object 7 moves toward the narrow field region N, the explicit image output unit 16 sequentially displays indices S, each turned off after being turned on for a certain time, at different positions from the side of the target captured image adjacent to the outer region O where the object 7 exists toward the center, and repeats this display.
- the attention captured image corresponds to the image of the narrow field region N. Therefore, when the object 7 in the outer region O moves toward the narrow field region N, the explicit image output unit 16 sequentially displays, at different positions from the side of the narrow field region N where the object 7 exists toward the center, indices S that are each turned off after being turned on for a certain time, and repeats this display. Whether or not the object 7 in the outer region O moves toward the narrow field region N is determined by the moving direction determination unit 15 described above.
- the side of the narrow field region N adjacent to the outer region O where the object 7 exists corresponds to the left region of the narrow field region N when the object 7 is in the left outer region O, and to the right region of the narrow field region N when the object 7 is in the right outer region O.
- the index S that is turned off after being turned on for a certain period of time refers not to a state in which the index S is displayed continuously, but to a state in which it is displayed blinking.
- the index S moves while blinking between two predetermined positions on the screen.
- the explicit image output unit 16 repeatedly performs such movement of the index S while blinking.
- the index S will be described.
- the index S according to the present embodiment is configured in an arrow shape having a convex portion protruding toward the center side of the noticeable photographed image (narrow visual field region N).
- Such an index S is stored in the explicit image storage unit 17 as shown in FIG.
- FIG. 2(c) shows the image displayed on the monitor 50 of the vehicle 1 when the captured image G is as shown in FIG. 2(b), that is, when the object 7 is in the right outer region O.
- a plurality of indices S may be displayed on the monitor 50 in this embodiment. In such a case, the plurality of indices S are displayed at positions where they partially overlap.
- here, "partially overlapping" means that the arrow-shaped convex portion of one index S overlaps a portion of another index S other than its arrow-shaped convex portion. When there are a plurality of indices S, the overlapping portion between the index S displayed later and the index S displayed immediately before is overwritten with the index S displayed later; that is, the index S displayed later is arranged in a layer above the index S displayed immediately before. In the present embodiment, the index S displayed immediately before is displayed with higher transparency than the index S displayed later; in other words, the index S displayed later is displayed with lower transparency. Therefore, when a plurality of indices S are displayed, the uppermost index S has the lowest transparency, and the lowermost index S has the highest transparency.
- the index S is configured so that the index S displayed later is larger than the index S displayed immediately before. Therefore, when a plurality of indices S are displayed, the index S in the uppermost layer has the largest size and the index S in the lowermost layer has the smallest size.
- The sizes of the indices S may be set so that the indices S are similar figures scaled relative to one another, or by changing only the vertical or horizontal length of each index S.
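The layering rules for the indices S (the index displayed later sits in a higher layer, is larger, and has lower transparency) can be sketched as a list of draw operations; the concrete sizes and alpha values are illustrative assumptions:

```python
def index_sequence(count, base_size=20, base_alpha=0.3):
    """Build draw operations for `count` indices S.  The index displayed
    later gets a higher layer, a larger size, and a higher alpha
    (i.e. lower transparency) than the one displayed immediately before."""
    ops = []
    for i in range(count):
        ops.append({
            'layer': i,                                 # later index on top
            'size': base_size + 10 * i,                 # later index larger
            'alpha': min(1.0, base_alpha + 0.35 * i),   # later index more opaque
        })
    return ops

seq = index_sequence(3)
# The uppermost (last) index S is the largest and the least transparent.
```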
- the explicit image output unit 16 repeatedly performs such display.
- the index S is combined with the captured image of interest which is the narrow visual field N and displayed on the monitor 50. Therefore, the composite image generation unit 18 generates a composite image in which the index S is combined with the target captured image. As a result, an image as shown in FIG. 2C is generated.
- By displaying the index S in this way, the index S can be shown growing gradually larger. As a result, it is possible to visually indicate to the occupant of the vehicle 1 that the object 7 is approaching the vehicle 1.
- the explicit image output unit 16 also sequentially displays, at different positions from the outer edge of the target captured image toward its center, a frame index W that has an outer shape smaller than the contour of the target captured image and is turned off after being turned on for a certain time, and repeats this display.
- the noticeable captured image corresponds to the narrow field region N. Therefore, when the object 7 in the outer region O moves toward the narrow field region N, the explicit image output unit 16 displays a frame index W that has an outer shape smaller than the contour of the narrow field region N and is turned off after lighting up for a certain period of time.
- the frame index W having an outer shape smaller than the outline of the narrow visual field region N means that the frame index W is smaller than the screen size of the monitor 50.
- the frame index W that is turned off after being turned on for a certain period of time refers not to a state in which the frame index W is displayed continuously, but to a state in which it is displayed blinking.
- a frame index W is displayed and then turned off, and the next time it is displayed, it appears at a different position closer to the center. Therefore, the frame index W is displayed so as to become gradually smaller.
- the explicit image output unit 16 repeatedly performs such movement of the frame index W while blinking.
- the frame index W will be described.
- the frame index W according to the present embodiment is configured with an outer shape smaller than the contour of the noticeable photographed image (narrow field of view region N).
- a plurality of frame indexes W may be displayed on the monitor 50.
- the frame index W is configured so that the frame index W displayed later is smaller in size than the frame index W displayed immediately before.
- the frame index W is configured so that the frame index W displayed later has lower transparency than the frame index W displayed immediately before.
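The sequence of frame indices W, each smaller and less transparent than the one displayed immediately before, can be sketched as a series of inset rectangles; the screen size, inset step, and alpha schedule are illustrative assumptions:

```python
def frame_index_sequence(screen_w, screen_h, count, step=0.1):
    """Build `count` frame indices W for a screen_w x screen_h display.
    Each later frame is inset further from the screen edge (a smaller
    outer shape than the frame shown just before) and is drawn with a
    higher alpha, i.e. lower transparency."""
    frames = []
    for i in range(count):
        inset_x = int(screen_w * step * (i + 1) / 2)
        inset_y = int(screen_h * step * (i + 1) / 2)
        frames.append({
            'rect': (inset_x, inset_y,
                     screen_w - 2 * inset_x, screen_h - 2 * inset_y),
            'alpha': (i + 1) / count,
        })
    return frames

frames = frame_index_sequence(640, 360, 3)
# Each successive frame index W is smaller, suggesting approach to the center.
```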
- Such a frame index W is stored in the explicit image storage unit 17 as shown in FIG.
- the explicit image output unit 16 repeatedly performs such display.
- the frame index W is combined with the target captured image that is the narrow visual field N and displayed on the monitor 50. Therefore, the composite image generation unit 18 generates a composite image in which the frame index W is combined with the target captured image. As a result, an image as shown in FIG. 2C is generated.
- the mask area setting unit 19 sets a mask area M in which at least a part of a scene around the vehicle 1 is not displayed in the target photographed image.
- the mask region M is set at the upper part of the screen, that is, the upper part of the target captured image, and is formed extending across the full width of the target captured image.
- the mask area M is colored, for example, in black so that the scene above the vehicle 1 cannot be seen. Of course, other colors may be used.
- when the object 7 in the outer region O enters the narrow field region N, the motion image output unit 30 outputs an absorbed image in which the explicit index is absorbed into the mask region M from the side where the object 7 exists. Whether or not the object 7 in the outer region O has entered the narrow field region N is determined by the movement direction determination unit 15.
- the explicit index corresponds to the index S displayed on the monitor 50 when the object 7 enters the narrow field area N.
- the side where the object 7 exists in the mask area M is the right side of the mask area M when the object 7 exists in the right outer area O, and the object 7 exists in the left outer area O. If it is, it is the left side of the mask area M.
- the absorbed image is an image in which the index S is absorbed in the mask area M and disappears. Such an image is stored in the motion image storage unit 31 in advance.
- when the object 7 enters the narrow field region N, the object 7 is displayed at the edge of the noticeable captured image. For this reason, by having the index S absorbed into the mask region M at that moment, the object 7 displayed at the edge of the narrow field region N can be shown without being hidden by the explicit index. Therefore, the presence of the object 7 can be appropriately indicated to the driver of the vehicle 1.
- the motion image output unit 30 is configured to change the display color of the mask region M, starting from the position where the explicit index is absorbed, in accordance with the absorption of the explicit index into the mask region M. That is, when the explicit index is absorbed from the right side of the mask region M, an image that colors the mask region M from the right side toward the left side is output, and when the explicit index is absorbed from the left side, an image that colors the mask region M from the left side toward the right side is output. By coloring the mask region M in this way, the side from which the object 7 has entered can be clearly indicated to the driver of the vehicle 1.
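The progressive coloring of the mask region M from the side where the explicit index was absorbed can be sketched as filling mask columns over successive animation steps; the column-based model is an illustrative assumption:

```python
def color_mask(width, step, total_steps, from_side):
    """Return booleans marking which columns of the mask region M are
    colored after `step` of `total_steps` animation steps, filling from
    the side where the explicit index was absorbed."""
    filled = int(width * step / total_steps)
    if from_side == 'right':
        return [x >= width - filled for x in range(width)]
    return [x < filled for x in range(width)]

# Absorbed from the right: the coloring advances from right to left.
halfway = color_mask(8, 2, 4, 'right')
```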
- FIG. 3 and FIG. 4 show an example of a series of images in which such an explicit index enters the mask area M and the mask area M is colored.
- FIG. 3 shows an example in which the index S and the frame index W are displayed superimposed on the narrow field region N when the object 7 in the outer region O moves to the narrow field region N side. .
- As shown in FIG. 4(a), when the object 7 enters the narrow field region N from the right outer region O, the superimposition of the frame index W is ended.
- As shown in FIGS. 4(b) to 4(e), the index S then enters the mask region M so as to be sucked in from the right side of the mask region M.
- the mask area M is sequentially colored from the right side, and finally, the entire area of the mask area M is colored (FIG. 4F).
- the mask region highlighting display unit 20 highlights the mask region M when the object 7 in the outer region O enters the region corresponding to the target captured image, that is, the narrow field region N.
- the emphasis display is a blinking display. Whether or not the object 7 in the outer region O has entered the narrow field region N is determined by the movement direction determination unit 15.
- the mask area highlighting display unit 20 blinks the mask area M in accordance with the determination result of the movement direction determination unit 15. Thereby, it is possible to visually indicate to the driver of the vehicle 1 that the object 7 exists in the narrow visual field region N.
- the mask area highlighting display unit 20 stops highlighting of the mask area M when the object 7 leaves the area corresponding to the target photographed image, that is, the narrow field area N. Whether or not the object 7 has left the narrow field area N can be determined by the moving direction determination unit 15. That is, if there is an object 7 that enters the outer region O from the narrow field region N side of the outer region O, the moving direction determination unit 15 can determine that the object 7 has left the narrow field region N. Such a determination result is also transmitted to the mask region emphasis display unit 20.
- the highlighted display is a blinking display. Therefore, the mask area highlighting display unit 20 stops blinking of the mask area M when such a determination result is transmitted. Thereby, it is possible to visually indicate to the driver of the vehicle 1 that the object 7 is not present in the narrow visual field region N.
- the captured image acquisition unit 11 acquires a captured image G captured by the camera 5 of the vehicle 1 (step # 1).
- the noticeable photographed image generation unit 12 generates the central portion of the acquired photographed image G as the noticeable photographed image (step # 2).
- the outer area generation unit 13 generates both lateral portions of the acquired captured image G as the outer area O (step # 3). Whether or not the object 7 is present in the outer region O generated in this way is determined by the object presence determination unit 14 (step # 4).
- the movement direction determination unit 15 determines the movement direction of the object 7 (step # 5).
- an explicit image is output by the explicit image output unit 16 (step # 6). This explicit image is output with reference to the explicit image stored in the explicit image storage unit 17.
- the composite image generation unit 18 generates a composite image by superimposing the explicit image output in step # 6 on the noticeable captured image generated in step # 2 (step # 7).
- the generated composite image is displayed on the monitor 50 (step # 8).
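Steps #1 to #8 above can be sketched as one per-frame pipeline iteration; everything here (toy frames of 0/1 pixels, a right-side-only detector, an alarm flag standing in for the explicit image) is an illustrative stand-in for the functional units of FIG. 1:

```python
def run_frame(frame, prev_x):
    """One iteration of the alarm pipeline (steps #1-#8) on a toy frame
    given as a list of pixel rows (1 = object pixel, 0 = background)."""
    w = len(frame[0])
    margin = 2
    # Step #2: attention captured image = central portion of G
    n = [row[margin:w - margin] for row in frame]
    # Step #3: outer region O = lateral portion of G (right side only, for brevity)
    right_o = [row[w - margin:] for row in frame]
    # Step #4: does an object exist in the outer region?
    xs = [x for row in right_o for x, v in enumerate(row) if v]
    obj_x = min(xs) if xs else None
    # Step #5: moving toward N means the object's position drifts toward the center
    toward_n = obj_x is not None and prev_x is not None and obj_x < prev_x
    # Steps #6-#7: superimpose the explicit image on N (modelled here as a flag)
    composite = {'image': n, 'alarm': toward_n}
    # Step #8: the composite would now be shown on the monitor 50
    return composite, obj_x

frame1 = [[0, 0, 0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 0, 0, 1]]   # object at the right edge
out1, x1 = run_frame(frame1, prev_x=None)
frame2 = [[0, 0, 0, 0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 0, 1, 0]]   # object has moved toward the center
out2, x2 = run_frame(frame2, prev_x=x1)
# out2['alarm'] is True: the index would now be displayed.
```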
- with the above configuration, the driver can be informed that an object 7 approaching the vehicle 1 has entered the imaging range even if the object 7 is not yet shown on the screen of the monitor 50 provided in the vehicle 1.
- the presence and direction of the object 7 approaching the vehicle 1 can be clearly shown to the driver while displaying the situation around the vehicle 1. Therefore, even when the screen size of the monitor 50 is small, the object 7 approaching the vehicle 1 is not missed.
- since the frame index W is displayed at the side edge of the screen, it does not make it difficult for the driver to see the situation around the vehicle 1. Therefore, the presence of an obstacle (object 7) approaching the vehicle 1 can be clearly shown to the driver without making the situation around the vehicle 1 difficult to see.
- the target captured image is the central portion of the captured image G, and the central portion of the captured image G is displayed on the monitor 50.
- the second embodiment differs from the first embodiment in that the target captured image is the entire captured image G, and the entire captured image G is displayed on the monitor 50.
- aspects other than the generation of the noticeable photographed image and the display on the monitor 50 are the same as in the first embodiment.
- hereinafter, the differences will be mainly described.
- FIG. 6 is a block diagram schematically showing the configuration of the obstacle alarm device 100 according to the present embodiment.
- the obstacle alarm device 100 includes a captured image acquisition unit 11, a focused captured image generation unit 12, an object presence determination unit 14, a movement direction determination unit 15, and an explicit image output unit 16.
- Each of the above-described functional units is built with a CPU as its core member and performs various processes for clearly indicating the approach of the object 7 to the driver of the vehicle 1.
- the photographed image acquisition unit 11 acquires a photographed image G obtained by photographing a scene around the vehicle 1. As in the first embodiment, the captured image acquisition unit 11 acquires the scene around the vehicle 1 captured over approximately 180 degrees behind the vehicle 1 by the camera 5 provided in the vehicle 1. Such an imaging range is indicated as the "wide viewing angle" in FIG. 7(a).
- the camera 5 has a performance of outputting a moving image as a captured image G in real time.
- FIG. 7 An example of such a photographed image G is shown in FIG.
- the full width of FIG. 7 (b) corresponds to the wide viewing angle of FIG. 7 (a).
- FIG. 7B mirror image processing is performed and displayed as in FIG.
- The target captured image generation unit 12 generates a target captured image based on the captured image G. In the present embodiment, the target captured image generation unit 12 generates the wide-field region B, which is the entire captured image G, as the target captured image.
- The imaging range of the captured image G has a viewing angle of approximately 180 degrees; the wide-field region B is therefore an image with a viewing angle of approximately 180 degrees. The target captured image consisting of this wide-field region B is shown in FIG. 7(b), and it corresponds to the display image shown on the monitor 50 (see FIG. 7(c)).
- In FIG. 7(b) and FIG. 7(c), the image sandwiched between the two broken lines, corresponding to the "narrow viewing angle" shown in FIG. 7(a), is referred to as the "narrow-field region N" (see FIG. 7(b)).
- The object presence determination unit 14 determines whether or not the object 7 exists around the vehicle 1. In the present embodiment, it determines whether the object 7 exists in the region corresponding to the target captured image, which is transmitted from the target captured image generation unit 12; this region is the area in real space that corresponds to the target captured image. Concretely, the object presence determination unit 14 determines whether the object 7 exists in the wide-field region B. This determination is made with reference to the target captured image using a known image-recognition process such as pattern matching, although processes other than pattern matching may of course be used. The determination result is transmitted to the movement direction determination unit 15 described later.
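The pattern-matching determination mentioned above can be illustrated with a small sketch. This is not the patented implementation, only a minimal normalized cross-correlation search written in NumPy, assuming the image and template are grayscale 2-D arrays; the function names and threshold are our own assumptions.

```python
import numpy as np

def best_match_score(image, template):
    """Slide the template over the image and return the highest
    normalized cross-correlation score (1.0 = perfect match)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best = -1.0
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch: correlation undefined, skip
            best = max(best, float((p * t).sum() / denom))
    return best

def object_present(image, template, threshold=0.9):
    """Crude presence test: is any window similar enough to the template?"""
    return best_match_score(image, template) >= threshold
```

In practice a library routine such as OpenCV's template matching would be used instead of this brute-force loop; the sketch only shows the principle of the presence test.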
- the moving direction determination unit 15 determines the moving direction of the object 7.
- Here, the object 7 is the one determined by the object presence determination unit 14 described above to be present in the wide-field region B, and the moving direction is determined when that determination is made.
- The movement direction determination unit 15 determines whether or not the object 7 in the region corresponding to the target captured image moves toward the center of the target captured image. Moving toward the center of the target captured image means moving toward the center of the wide-field region B, that is, moving behind the vehicle 1 from the outer side in the vehicle's width direction toward the area directly behind the vehicle 1.
- This determination can be made, for example, by comparing the position of the object 7 in the current captured image G with its position in the captured image G a predetermined time earlier, or by using optical flow. The result of the determination is transmitted to the explicit image output unit 16 described later.
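The position-comparison approach described above can be sketched as follows. This is a minimal illustration, not the claimed method, and it assumes the object's horizontal centroid has already been extracted from each frame (the function name is ours):

```python
def moves_toward_center(x_prev: float, x_now: float, image_width: int) -> bool:
    """True if the object's horizontal position is closer to the image
    center now than it was a predetermined time earlier."""
    center = image_width / 2.0
    return abs(x_now - center) < abs(x_prev - center)
```

An optical-flow based judgment would replace the two sampled positions with a per-pixel motion field, but the decision rule, approaching the image center or not, is the same.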
- When the object 7 moves toward the center of the target captured image, the explicit image output unit 16 sequentially displays, at different positions from the side where the object 7 exists toward the center, indicators S that are lit for a certain period and then extinguished, and it repeats this display.
- In the present embodiment, the indicators S are displayed from the outer edge of the narrow-field region N (the region divided by the broken lines in FIG. 7(c)) toward the center of the monitor 50 screen. Thus, in the present embodiment as well, when the object 7 moves toward the center of the target captured image, the indicators S, each lit for a certain period, are sequentially displayed at different positions within the narrow-field region N, from the outer edge on the side where the object 7 exists toward the center.
- Furthermore, when the movement direction determination unit 15 determines that the object 7 moves toward the center of the target captured image, the explicit image output unit 16 displays a frame index W, which has an outer shape smaller than the contour of the target captured image and which is lit for a certain period and then extinguished, sequentially at different positions from the outer edge of the target captured image toward the center, and it repeats this display.
- In the present embodiment, the target captured image corresponds to the wide-field region B. Therefore, when the object 7 moves toward the center of the wide-field region B, the explicit image output unit 16 displays the frame index W, whose outer shape is smaller than the outline of the wide-field region B and which is lit for a certain period, sequentially at different positions from the outer edge of the wide-field region B toward the center, repeating the display. Since the target captured image that is the wide-field region B is displayed on the monitor 50, an outer shape smaller than the contour of the wide-field region B means that the frame index W is smaller than the screen size of the monitor 50.
- Like the index S, the frame index W is displayed sequentially at different positions from the outer edge of the target captured image toward the center. Here, "displayed sequentially at different positions" refers not to a state in which the frame index W is displayed continuously, but to a state in which it is moved while blinking; the frame index W is therefore displayed so as to become gradually smaller. The explicit image output unit 16 repeats this blinking movement of the frame index W.
- Here, the frame index W will be described. As shown in FIG. 7(c), in the present embodiment a plurality of frame indexes W may be displayed on the monitor 50. In such a case, each frame index W displayed later is smaller in size, and lower in transparency, than the frame index W displayed immediately before. This makes it possible to display the frame indexes W so that they appear to pop out from the center side of the screen. The frame index W is stored, together with the arrow-shaped index S, in the explicit image storage unit 17. The explicit image output unit 16 repeats this display.
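The shrinking, increasingly opaque sequence of frame indexes W can be sketched numerically. The step count, shrink factor, and alpha step below are our illustrative assumptions, not values from the disclosure:

```python
def frame_index_sequence(screen_w, screen_h, steps=4, shrink=0.85, alpha_step=0.2):
    """Generate (width, height, alpha) for successive frame indexes W.
    Each later frame is smaller than the one displayed just before it
    and less transparent (higher alpha), so the sequence appears to
    pop out from the center of the screen toward the viewer."""
    frames = []
    w, h, alpha = float(screen_w), float(screen_h), 0.2
    for _ in range(steps):
        w *= shrink                            # smaller than the previous frame
        h *= shrink
        alpha = min(1.0, alpha + alpha_step)   # less transparent than before
        frames.append((w, h, alpha))
    return frames
```

Every generated frame is smaller than the screen, matching the requirement that the frame index W has an outer shape smaller than the contour of the target captured image.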
- The frame index W, together with the arrow-shaped index S, is combined with the target captured image and displayed on the monitor 50. The composite image generation unit 18 therefore generates a composite image in which the frame index W and the arrow-shaped index S are superimposed on the target captured image, producing an image such as that shown in FIG. 7(c).
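The superimposition performed by the composite image generation unit 18 amounts to alpha blending. A minimal NumPy sketch of the idea (ours, not the disclosed implementation; pixel values are assumed normalized to [0, 1]):

```python
import numpy as np

def composite(base: np.ndarray, overlay: np.ndarray, alpha: float) -> np.ndarray:
    """Blend an overlay (e.g. a rendered frame index W) onto the target
    captured image; alpha=1.0 draws the overlay fully opaque."""
    return (1.0 - alpha) * base + alpha * overlay
```

A per-pixel alpha mask would be used in practice so that only the frame-shaped region of the overlay affects the captured image; a scalar alpha keeps the sketch short.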
- FIG. 8 shows an arrow-shaped index S and a frame index W superimposed on the target captured image when the object 7 in the region corresponding to the target captured image has moved to the center side of the wide-field region B.
- Until the object 7 enters the narrow-field region N, the arrow-shaped index S is repeatedly displayed at the outer edge of the image corresponding to the narrow-field region N, and the frame index W is likewise repeatedly displayed. That is, the displays of FIGS. 8(b) to 8(e) are performed repeatedly.
- the captured image acquisition unit 11 acquires a captured image G captured by the camera 5 of the vehicle 1 (step # 21).
- the noticeable photographed image generation unit 12 generates the entire acquired photographed image G as a noticeable photographed image (step # 22).
- the object presence determination unit 14 determines whether or not the object 7 is present in the wide visual field region B corresponding to the noticeable captured image generated in this way (step # 23).
- the movement direction determination unit 15 determines the movement direction of the object 7 (step # 24). If the moving direction of the object 7 is toward the center of the wide viewing area B, the explicit image output unit 16 outputs an explicit image (step # 25). This explicit image is output with reference to the explicit image stored in the explicit image storage unit 17.
- the composite image generation unit 18 generates a composite image by superimposing the explicit image output in step # 25 on the noticeable captured image generated in step # 22 (step # 26).
- the generated composite image is displayed on the monitor 50 (step # 27).
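Steps #21 to #27 above can be summarized as one processing cycle. The sketch below only mirrors the control flow of the flowchart; the callable parameters stand in for the functional units 11, 12, 14, 15, 16, and 18 and are our own abstraction:

```python
def alarm_cycle(capture, make_target, find_object, moves_to_center,
                render_explicit, blend, show):
    """One cycle: acquire (#21), generate target image (#22), detect (#23),
    judge direction (#24), output explicit image (#25), composite (#26),
    and display (#27)."""
    g = capture()                                   # step #21
    target = make_target(g)                         # step #22
    obj = find_object(target)                       # step #23
    if obj is not None and moves_to_center(obj):    # step #24
        explicit = render_explicit(obj)             # step #25
        target = blend(target, explicit)            # step #26
    show(target)                                    # step #27
    return target
```

When no object is found, or the object is not moving toward the center, the cycle simply displays the unmodified target captured image.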
- As described above, the frame index W is displayed in the target captured image in accordance with the object 7 approaching the vehicle 1, so that the object 7 approaching the vehicle 1 is not missed even when the screen size of the monitor 50 is small.
- Moreover, since the frame index W is not displayed at the very edge of the screen, it does not become difficult for the driver to see the situation around the vehicle 1. That is, because the scene displayed at the edge of the screen is not hidden by the frame index W, the object 7 can be grasped properly even when it suddenly appears. The presence of an obstacle (the object 7) approaching the vehicle 1 can therefore be clearly shown to the driver without making the situation around the vehicle 1 difficult to see.
- In the above embodiment, the explicit image output unit 16 was described as displaying the blinking index S so that it gradually becomes larger as it moves. However, the scope of application of the present invention is not limited to this. It is naturally possible to display the index S at a constant size, or so that it gradually becomes smaller. Even with such a configuration, the object 7 approaching the vehicle 1 can be appropriately and clearly shown to the driver of the vehicle 1.
- In the above embodiment, the index S displayed immediately before was described as having higher transparency than the index S displayed later. However, the scope of application of the present invention is not limited to this. The index S displayed immediately before can be displayed with lower transparency than, or with the same transparency as, the index S displayed later.
- The scope of application of the present invention is also not limited regarding overlap: even when a plurality of indices S are displayed, they can be configured so as not to overlap one another.
- In the above embodiment, the index S was described as having the shape of an arrow with a convex portion protruding toward the center of the narrow-field region N. However, the scope of application of the present invention is not limited to this, and the index S can be formed in other shapes.
- In the first embodiment, it was described that when the object 7 in the outer region O enters the region corresponding to the target captured image (the narrow-field region N), an image in which the object 7 is absorbed from the mask region M side is output as the explicit index. However, the scope of application of the present invention is not limited to this.
- For example, the explicit image output unit 16 can be configured to stop the output of the index S when the object 7 enters the region corresponding to the target captured image (the narrow-field region N), and of course the output of the frame index W can also be stopped in that case. Whether the object 7 has entered the narrow-field region N is determined by the movement direction determination unit 15 described above.
- When the object 7 enters the narrow-field region N, the object 7 is displayed at the edge of the target captured image. Therefore, even when the output of the explicit image is stopped in this way, the driver of the vehicle 1 can visually recognize the object 7 displayed at the edge of the target captured image without it being hidden by the explicit image.
- In the above embodiments, it was described that a plurality of indices S and frame indexes W may be displayed on the screen. However, the index S and the frame index W may each be displayed one at a time, or only the index S may be displayed.
- In the above embodiment, the frame index W displayed later was described as being smaller in size than the frame index W displayed immediately before. However, the scope of application of the present invention is not limited to this. It is naturally possible to display the frame indexes W at the same size, or so that they gradually become larger. Even with such a configuration, the object 7 approaching the vehicle 1 can be appropriately and clearly shown to the driver of the vehicle 1.
- In the above embodiment, the frame index W displayed later was also described as being less transparent than the frame index W displayed immediately before. However, the scope of application of the present invention is not limited to this. The frame index W displayed immediately before can be displayed with lower transparency than, or with the same transparency as, the frame index W displayed later.
- In the first embodiment, the target captured image generation unit 12 was described as generating the central portion of the captured image G as the target captured image. However, the scope of application of the present invention is not limited to this. The target captured image generation unit 12 may generate a portion other than the central portion of the captured image G, that is, a portion shifted from the center of the captured image G, as the target captured image.
- In the above embodiment, it was described that the determination of whether the object 7 exists can be made using a known image-recognition process such as pattern matching. However, the scope of application of the present invention is not limited to this; detection with sonar, for example, is naturally possible.
- In the first embodiment, the target captured image generation unit 12 was described as generating the narrow-field region N, which is the central portion of the captured image G, as the target captured image. However, the scope of application of the present invention is not limited to this. If the captured image G is acquired by a camera 5 with a narrow viewing angle corresponding to the narrow-field region N, it is naturally possible to use the captured image G as it is as the target captured image. In such a case, it is preferable to determine whether the object 7 exists in the outer region O using sonar, for example, as described above.
- The configurations of the first embodiment and the second embodiment may be provided together in one apparatus, in which case they may be switched between manually or automatically.
- the present invention can be used for an obstacle alarm device that clearly shows the presence of an obstacle approaching a vehicle to an occupant.
- Reference signs: 1: Vehicle, 7: Object, 11: Captured image acquisition unit, 12: Target captured image generation unit, 14: Object presence determination unit, 15: Movement direction determination unit, 16: Explicit image output unit, 100: Obstacle alarm device, G: Captured image, O: Outer region, W: Frame index
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Image Processing (AREA)
Abstract
Description
A captured image acquisition unit that acquires a captured image of the scene around a vehicle;
a target captured image generation unit that generates a target captured image based on the captured image;
an object presence determination unit that determines whether or not an object exists around the vehicle;
a movement direction determination unit that determines the moving direction of the object; and
an explicit image output unit that, when the movement direction determination unit determines that the object moves toward the center of the target captured image, displays a frame index, which has an outer shape smaller than the contour of the target captured image and which is lit for a certain period and then extinguished, sequentially at different positions from the outer edge of the target captured image toward the center, and repeats this display.
The characteristic feature lies in comprising these units.
It is preferable that the object presence determination unit determines whether or not the object exists in an outer region outside the target captured image.
The object presence determination unit may also be configured to determine whether or not the object exists in a region corresponding to the target captured image.
Embodiments of the present invention will now be described in detail. The obstacle alarm device 100 according to the present invention has a function of clearly indicating to the driver of a vehicle that an object is approaching when such an object exists. This is described below with reference to the drawings.
Next, a second embodiment of the present invention will be described. In the first embodiment, the target captured image was the central portion of the captured image G, and that central portion was displayed on the monitor 50. The second embodiment differs from the first in that the target captured image is the entire captured image G, which is displayed on the monitor 50 in full. Aspects other than the generation of the target captured image and its display on the monitor 50 are the same as in the first embodiment, so the following description focuses on the differences.
In the above embodiment, the explicit image output unit 16 was described as displaying the blinking index S so that it gradually becomes larger as it moves. However, the scope of application of the present invention is not limited to this. It is naturally possible to display the index S at a constant size, or so that it gradually becomes smaller. Even with such a configuration, the object 7 approaching the vehicle 1 can be appropriately and clearly shown to the driver of the vehicle 1.
7: Object
11: Captured image acquisition unit
12: Target captured image generation unit
14: Object presence determination unit
15: Movement direction determination unit
16: Explicit image output unit
100: Obstacle alarm device
G: Captured image
O: Outer region
W: Frame index
Claims (6)
- An obstacle alarm device comprising:
a captured image acquisition unit that acquires a captured image of the scene around a vehicle;
a target captured image generation unit that generates a target captured image based on the captured image;
an object presence determination unit that determines whether or not an object exists around the vehicle;
a movement direction determination unit that determines the moving direction of the object; and
an explicit image output unit that, when the movement direction determination unit determines that the object moves toward the center of the target captured image, displays a frame index, which has an outer shape smaller than the contour of the target captured image and which is lit for a certain period and then extinguished, sequentially at different positions from the outer edge of the target captured image toward the center, and repeats this display.
- The obstacle alarm device according to claim 1, wherein the target captured image generation unit generates the central portion of the captured image as the target captured image, and the object presence determination unit determines whether or not the object exists in an outer region outside the target captured image.
- The obstacle alarm device according to claim 1, wherein the target captured image generation unit generates the entire captured image as the target captured image, and the object presence determination unit determines whether or not the object exists in a region corresponding to the target captured image.
- The obstacle alarm device according to any one of claims 1 to 3, wherein, when there are a plurality of frame indexes, a frame index displayed later is smaller in size than the frame index displayed immediately before.
- The obstacle alarm device according to any one of claims 1 to 4, wherein, when there are a plurality of frame indexes, a frame index displayed later is lower in transparency than the frame index displayed immediately before.
- The obstacle alarm device according to claim 1 or 2, wherein the explicit image output unit is configured to stop the output of the frame index when the object enters the region corresponding to the target captured image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013541647A JP5704417B2 (ja) | 2011-11-01 | 2012-04-18 | 障害物警報装置 |
CN201280042223.3A CN103765489B (zh) | 2011-11-01 | 2012-04-18 | 障碍物警报装置 |
EP12845726.4A EP2775468B1 (en) | 2011-11-01 | 2012-04-18 | Obstacle alert device |
US13/483,828 US9393908B2 (en) | 2011-11-01 | 2012-05-30 | Obstacle alarm device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2011/075125 WO2013065120A1 (ja) | 2011-11-01 | 2011-11-01 | 障害物警報装置 |
JPPCT/JP2011/075125 | 2011-11-01 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/075125 Continuation-In-Part WO2013065120A1 (ja) | 2011-11-01 | 2011-11-01 | 障害物警報装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/483,828 Continuation-In-Part US9393908B2 (en) | 2011-11-01 | 2012-05-30 | Obstacle alarm device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013065341A1 true WO2013065341A1 (ja) | 2013-05-10 |
Family
ID=48191516
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/075125 WO2013065120A1 (ja) | 2011-11-01 | 2011-11-01 | 障害物警報装置 |
PCT/JP2012/060397 WO2013065341A1 (ja) | 2011-11-01 | 2012-04-18 | 障害物警報装置 |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/075125 WO2013065120A1 (ja) | 2011-11-01 | 2011-11-01 | 障害物警報装置 |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2775468B1 (ja) |
CN (1) | CN103765489B (ja) |
WO (2) | WO2013065120A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015004784A1 (ja) * | 2013-07-11 | 2015-01-15 | トヨタ自動車株式会社 | 車両用情報表示装置及び車両用情報表示方法 |
JP2016062368A (ja) * | 2014-09-18 | 2016-04-25 | 日本精機株式会社 | 車両用周辺情報表示システム及び表示装置 |
CN116968638A (zh) * | 2023-07-31 | 2023-10-31 | 广州博太电子有限公司 | 一种车辆行驶预警方法、系统、介质及计算机 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015178303A1 (ja) * | 2014-05-21 | 2015-11-26 | 矢崎総業株式会社 | 安全確認支援装置 |
CN105469535A (zh) * | 2014-08-25 | 2016-04-06 | 中兴通讯股份有限公司 | 一种周围环境的反馈方法及终端 |
JP6439913B2 (ja) * | 2014-08-29 | 2018-12-19 | 株式会社富士通ゼネラル | 移動体検出装置 |
JP6299651B2 (ja) * | 2015-04-02 | 2018-03-28 | 株式会社デンソー | 画像処理装置 |
JP6699230B2 (ja) * | 2016-02-25 | 2020-05-27 | 住友電気工業株式会社 | 道路異常警告システム及び車載機 |
CN109979237A (zh) * | 2017-12-26 | 2019-07-05 | 奥迪股份公司 | 车辆驾驶辅助系统和方法 |
CN112668371B (zh) * | 2019-10-16 | 2024-04-09 | 北京京东乾石科技有限公司 | 用于输出信息的方法和装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11115660A (ja) | 1997-10-21 | 1999-04-27 | Mazda Motor Corp | 車両の障害物警報装置 |
JP2005266899A (ja) * | 2004-03-16 | 2005-09-29 | Denso Corp | 移動物体危険判定装置 |
JP2008009843A (ja) * | 2006-06-30 | 2008-01-17 | Honda Motor Co Ltd | 障害物判別装置 |
JP2009217740A (ja) | 2008-03-12 | 2009-09-24 | Panasonic Corp | 車両周囲監視装置および車両周囲監視方法 |
JP2010202010A (ja) * | 2009-03-02 | 2010-09-16 | Aisin Seiki Co Ltd | 駐車支援装置 |
JP2010210486A (ja) * | 2009-03-11 | 2010-09-24 | Omron Corp | 画像処理装置および方法、並びに、プログラム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002316602A (ja) * | 2001-04-24 | 2002-10-29 | Matsushita Electric Ind Co Ltd | 車載カメラの撮像画像表示方法及びその装置 |
JP2004312638A (ja) * | 2003-04-10 | 2004-11-04 | Mitsubishi Electric Corp | 障害物検知装置 |
JP2006040008A (ja) * | 2004-07-28 | 2006-02-09 | Auto Network Gijutsu Kenkyusho:Kk | 運転支援装置 |
JP4744995B2 (ja) * | 2005-09-08 | 2011-08-10 | クラリオン株式会社 | 車両用障害物検出装置 |
JP2008219063A (ja) * | 2007-02-28 | 2008-09-18 | Sanyo Electric Co Ltd | 車両周辺監視装置及び方法 |
JP5410730B2 (ja) * | 2008-10-09 | 2014-02-05 | 日立オートモティブシステムズ株式会社 | 自動車の外界認識装置 |
JP5099451B2 (ja) * | 2008-12-01 | 2012-12-19 | アイシン精機株式会社 | 車両周辺確認装置 |
CN201380816Y (zh) * | 2008-12-29 | 2010-01-13 | 广东铁将军防盗设备有限公司 | 自动导航汽车和车辆避障告警系统及其相应的电路 |
JP2011118482A (ja) * | 2009-11-30 | 2011-06-16 | Fujitsu Ten Ltd | 車載装置および認知支援システム |
-
2011
- 2011-11-01 WO PCT/JP2011/075125 patent/WO2013065120A1/ja active Application Filing
-
2012
- 2012-04-18 EP EP12845726.4A patent/EP2775468B1/en not_active Not-in-force
- 2012-04-18 WO PCT/JP2012/060397 patent/WO2013065341A1/ja active Application Filing
- 2012-04-18 CN CN201280042223.3A patent/CN103765489B/zh not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11115660A (ja) | 1997-10-21 | 1999-04-27 | Mazda Motor Corp | 車両の障害物警報装置 |
JP2005266899A (ja) * | 2004-03-16 | 2005-09-29 | Denso Corp | 移動物体危険判定装置 |
JP2008009843A (ja) * | 2006-06-30 | 2008-01-17 | Honda Motor Co Ltd | 障害物判別装置 |
JP2009217740A (ja) | 2008-03-12 | 2009-09-24 | Panasonic Corp | 車両周囲監視装置および車両周囲監視方法 |
JP2010202010A (ja) * | 2009-03-02 | 2010-09-16 | Aisin Seiki Co Ltd | 駐車支援装置 |
JP2010210486A (ja) * | 2009-03-11 | 2010-09-24 | Omron Corp | 画像処理装置および方法、並びに、プログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2775468A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015004784A1 (ja) * | 2013-07-11 | 2015-01-15 | トヨタ自動車株式会社 | 車両用情報表示装置及び車両用情報表示方法 |
JP2016062368A (ja) * | 2014-09-18 | 2016-04-25 | 日本精機株式会社 | 車両用周辺情報表示システム及び表示装置 |
CN116968638A (zh) * | 2023-07-31 | 2023-10-31 | 广州博太电子有限公司 | 一种车辆行驶预警方法、系统、介质及计算机 |
CN116968638B (zh) * | 2023-07-31 | 2024-04-09 | 广州博太电子有限公司 | 一种车辆行驶预警方法、系统、介质及计算机 |
Also Published As
Publication number | Publication date |
---|---|
EP2775468A4 (en) | 2015-03-25 |
EP2775468A1 (en) | 2014-09-10 |
CN103765489B (zh) | 2016-03-30 |
EP2775468B1 (en) | 2017-10-04 |
WO2013065120A1 (ja) | 2013-05-10 |
CN103765489A (zh) | 2014-04-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5692403B2 (ja) | 障害物警報装置 | |
WO2013065341A1 (ja) | 障害物警報装置 | |
JP6148887B2 (ja) | 画像処理装置、画像処理方法、及び、画像処理システム | |
US9396401B2 (en) | Obstacle alarm device | |
JP5861449B2 (ja) | 障害物警報装置 | |
JP5660395B2 (ja) | 障害物警報装置 | |
WO2013065325A1 (ja) | 障害物警報装置 | |
JP5974476B2 (ja) | 障害物警報装置 | |
JP2011087006A (ja) | 車両用表示装置 | |
JP5845909B2 (ja) | 障害物警報装置 | |
JP5704416B2 (ja) | 障害物警報装置 | |
JP5787168B2 (ja) | 障害物警報装置 | |
JP5754605B2 (ja) | 障害物警報装置 | |
JP5765576B2 (ja) | 障害物警報装置 | |
JP5704417B2 (ja) | 障害物警報装置 | |
JP5674071B2 (ja) | 障害物警報装置 | |
WO2013094496A1 (ja) | 障害物警報装置 | |
JP5821622B2 (ja) | 障害物警報装置 | |
JP2013131178A (ja) | 障害物警報装置 | |
JP5765575B2 (ja) | 撮影領域明示装置 | |
JP5825091B2 (ja) | 撮影領域明示装置 | |
WO2013094345A1 (ja) | 撮影領域明示装置 | |
WO2013038509A1 (ja) | 車両周辺監視装置および車両周辺監視システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280042223.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12845726 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013541647 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2012845726 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012845726 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |