JP6590538B2 - Image processing apparatus and image processing method - Google Patents

Info

Publication number
JP6590538B2
Authority
JP
Japan
Prior art keywords
notification
image
image processing
target
divided
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2015116829A
Other languages
Japanese (ja)
Other versions
JP2017005468A (en)
Inventor
明英 藤尾
周樹 村角
裕生 松本
茂人 竹内
Original Assignee
株式会社デンソーテン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソーテン filed Critical 株式会社デンソーテン
Priority to JP2015116829A priority Critical patent/JP6590538B2/en
Publication of JP2017005468A publication Critical patent/JP2017005468A/en
Application granted granted Critical
Publication of JP6590538B2 publication Critical patent/JP6590538B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Description

  The present invention relates to an image processing apparatus and an image processing method.

  Conventionally, there has been known an apparatus that, when a predetermined object existing around the vehicle is detected, presents the detected object to the driver by displaying a warning mark on a display unit. In such an apparatus, the displayed warning mark is moved so as to gradually approach the object (see, for example, Patent Document 1).

JP 2010-160606 A

  However, in the conventional apparatus, the warning mark is first displayed on the display unit and then moved toward the vicinity of the object. It may therefore take time for the driver to recognize the object, and it cannot be said that the driver can easily recognize the object.

  The present invention has been made in view of the above, and an object of the present invention is to provide an image processing apparatus and an image processing method capable of notifying a driver of a notification object so that the object can be recognized more easily.

  In order to solve the above problems and achieve the object, an image processing apparatus of the present invention includes a detection unit, a determination unit, and an image processing unit. The detection unit detects a notification object from a captured image input from an imaging device mounted on the moving body. The determination unit determines a target area including the notification target object among divided areas obtained by dividing the captured image into a plurality of areas. The image processing unit causes the display unit to display a notification image for notifying the notification target object on a position corresponding to the target region of the captured image.

  According to the present invention, it is possible to provide an image processing apparatus and an image processing method capable of notifying a driver of a notification object so that the object can be recognized more easily.

FIG. 1 is an explanatory diagram illustrating an image processing method according to the embodiment.
FIG. 2 is a diagram illustrating a configuration example of the periphery monitoring system according to the embodiment.
FIG. 3 is a diagram illustrating an arrangement example of the imaging device according to the embodiment.
FIG. 4 is a diagram for explaining a plurality of divided regions according to the embodiment.
FIG. 5 is a diagram illustrating determination of a target region by the determination unit according to the embodiment.
FIG. 6 is a diagram illustrating an example of a plurality of divided regions according to the embodiment.
FIG. 7 is a diagram illustrating an example of a plurality of divided regions according to the embodiment.
FIG. 8 is a diagram illustrating a notification image selection table according to the embodiment.
FIG. 9 is a diagram illustrating an example of a notification image selection table according to the embodiment.
FIG. 10 is a diagram illustrating an example of a notification image selection table according to the embodiment.
FIG. 11 is a diagram illustrating the relationship between the distance to the notification object and the blinking speed according to the embodiment.
FIG. 12 is a diagram illustrating the relationship between the distance to the notification object and the blinking speed according to the embodiment.
FIG. 13 is a diagram illustrating an example of a composite image generated by the image processing unit according to the embodiment.
FIG. 14 is a diagram illustrating an example of a composite image generated by the image processing unit according to the embodiment.
FIG. 15 is a flowchart illustrating image display processing performed by the image processing apparatus according to the embodiment.
FIG. 16 is a diagram illustrating a modification of the image processing apparatus according to the embodiment.

  Hereinafter, embodiments of the image processing apparatus and the image processing method disclosed in the present application will be described in detail with reference to the accompanying drawings. The present invention is not limited to the embodiments described below.

<1. Image processing method>
FIG. 1 is an explanatory diagram showing an image processing method according to an embodiment of the present invention. Such an image processing method is executed by, for example, an image processing apparatus 10 (not shown) mounted on a moving body.

  In the following, a case where the moving body is a vehicle C (not shown) will be described as an example, but the moving body is not limited to an automobile. The moving body may be anything that the user rides in, boards, or operates, such as a train, a ship, or an aircraft.

  In the image processing method according to the present embodiment, first, a notification target is detected from a captured image P1 input from an imaging device (not shown) mounted on the vehicle C. In the example of FIG. 1, the image processing apparatus 10 detects a pedestrian H as a notification object from the captured image P1.

  Hereinafter, the case where the pedestrian H is detected will be described as an example of the notification target, but the notification target is not limited to the pedestrian H. Examples of the notification target include moving objects such as traveling vehicles, bicycles, and motorcycles, as well as stationary vehicles and stationary objects such as utility poles and trees.

  For example, the image processing apparatus 10 detects a notification target object based on temporal changes of a plurality of captured images P1 captured continuously at a predetermined interval.

  The image processing apparatus 10 determines a target region R4 including the pedestrian H from among the divided regions obtained by dividing the captured image P1 into a plurality of regions. Specifically, for example, the image processing apparatus 10 stores information on the divided regions obtained by dividing the captured image P1 into a plurality of regions, and determines the target region including the pedestrian H by determining which of the plurality of divided regions contains the position of the pedestrian H in the captured image P1.

  Note that the image P2 in FIG. 1 is an image showing an example of division of the captured image P1. In the example of FIG. 1, the captured image P1 is divided into nine divided areas. That is, the image P2 in FIG. 1 shows an example in which rectangular divided areas are arranged in a matrix of 3 rows and 3 columns.
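As a rough illustration of this region lookup (not part of the patent; the grid layout and function names are assumptions), a 3-row by 3-column division such as the image P2 can be indexed directly from a detected position:

```python
# Minimal sketch, assuming nine equal rectangular divided regions arranged in
# 3 rows x 3 columns and numbered R1..R9 in row-major order (top to bottom).
def divided_region_of(point, image_width, image_height, rows=3, cols=3):
    """Return the 1-based index of the divided region containing `point`."""
    x, y = point
    col = min(int(x * cols / image_width), cols - 1)
    row = min(int(y * rows / image_height), rows - 1)
    return row * cols + col + 1  # e.g. 1 -> R1, 9 -> R9

# Example: a pedestrian detected at pixel (120, 500) in a 1280x720 image
# falls into the lower-left region (R7 under this numbering).
print(divided_region_of((120, 500), 1280, 720))
```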

  The image processing apparatus 10 generates a composite image P3 by superimposing a notification image that notifies the pedestrian H on a position corresponding to the target region R4 of the captured image P1. The image processing apparatus 10 displays the generated composite image P3 on a display unit (not shown). In the example of FIG. 1, the image processing apparatus 10 displays a caution mark M1 as a notification image that notifies the pedestrian H.

  Note that the position corresponding to the target area R4 of the captured image P1 is a predetermined position in the target area R4 of the captured image P1. Therefore, even if the pedestrian H moves, when the pedestrian H is included in the target region R4, the caution mark M1 that is a notification image is displayed at the same position in the captured image P1.

  In FIG. 1, the image processing apparatus 10 displays a frame line M2 surrounding the pedestrian H in addition to the caution mark M1. When the pedestrian H moves, the frame line M2 also moves according to the movement of the pedestrian H. In this way, the attention mark M1 indicates the rough area in which the pedestrian H exists while the frame line M2 indicates the pedestrian H itself, so the driver can be notified in a way that makes the notification object easier to recognize.
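The split between a region-anchored mark and an object-anchored frame can be sketched as follows (an illustrative sketch using OpenCV; the drawing style, colors, and function names are assumptions, not the patent's implementation):

```python
import cv2

def draw_notification(frame, region_rect, pedestrian_rect):
    """Draw the caution mark M1 at a fixed position in the target region and
    the frame line M2 around the detected pedestrian."""
    rx, ry, rw, rh = region_rect            # target region (e.g. R4) in pixels
    px, py, pw, ph = pedestrian_rect        # detected pedestrian H
    # Frame line M2: follows the pedestrian as it moves.
    cv2.rectangle(frame, (px, py), (px + pw, py + ph), (0, 0, 255), 2)
    # Caution mark M1: stays at the region's fixed display position (here, its center).
    cx, cy = rx + rw // 2, ry + rh // 2
    cv2.circle(frame, (cx, cy), 18, (0, 255, 255), 2)
    cv2.putText(frame, "!", (cx - 6, cy + 8),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 255), 2)
    return frame
```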

  Further, in FIG. 1, in addition to the notification image, the image processing apparatus 10 superimposes a prediction line L1 indicating the predicted traveling direction of the vehicle C, a vehicle width line L2 indicating extensions of the vehicle width W2 of the vehicle C, and fixed lines L21 and L22 indicating the distances D1 and D2 from the vehicle C, thereby generating the composite image P3. In this way, images conveying information other than the notification object can also be superimposed on the composite image P3.

  As described above, the image processing apparatus 10 determines the target region including the pedestrian H from among the divided regions obtained by dividing the captured image P1 into a plurality of regions, and displays the notification image at a position corresponding to the target region. Thereby, the image processing apparatus 10 can notify the driver so that the notification object can be recognized more easily. More specifically, instead of displaying the notification image so that it follows the notification object, the notification image is displayed in association with the region that includes the notification object, so the driver can intuitively recognize in which region of the screen the notification object exists. Hereinafter, the periphery monitoring system 1 including the image processing apparatus 10 will be further described.

<2. Perimeter monitoring system 1>
FIG. 2 is a diagram illustrating a configuration example of the periphery monitoring system 1 according to the embodiment of the present invention. As shown in FIG. 2, the periphery monitoring system 1 includes an image processing device 10, an imaging device 20, a navigation device 30, a shift sensor 40, and a steering sensor 50.

<2.1. Imaging Device 20>
FIG. 3 is a diagram illustrating an arrangement example of the imaging device 20. As illustrated in FIG. 3, the imaging device 20 is disposed behind the vehicle C and performs imaging with the rear of the vehicle C as an imaging direction. A region R illustrated in FIG. 3 is a region indicating an imaging range of the imaging device 20.

  The imaging device 20 includes an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, and outputs an image of the area behind the vehicle C captured by the imaging element (hereinafter referred to as a captured image) to the image processing apparatus 10.

  Further, by adopting a wide-angle lens such as a fisheye lens as the lens of the imaging device 20, the angle of view of the imaging device 20 may be 180 degrees or more. As a result, it is possible to photograph the rear of the vehicle C widely.

  Although an example in which the imaging device 20 is disposed at the rear has been described here, the arrangement is not limited to this. For example, the imaging device 20 may include a front imaging unit disposed at the front of the vehicle C, a rear imaging unit disposed at the rear of the vehicle C, a right imaging unit disposed on the right side of the vehicle C, and a left imaging unit disposed on the left side of the vehicle C, and may perform imaging with the front, rear, right, and left of the vehicle C as imaging directions. As a result, the entire periphery of the vehicle C can be imaged.

<2.2. Navigation Device 30>
The navigation device 30 of FIG. 2 includes a display unit 31, an audio output unit 32, an input operation unit 33, and a control unit 34. The navigation device 30 has a navigation function and an audio function when the driver of the vehicle C drives.

  The display unit 31 includes an LCD (Liquid Crystal Display), and displays, for example, a navigation image or an image based on a television broadcast signal in accordance with an instruction from the control unit 34. The display unit 31 displays the captured image P1 and the composite image P3 input from the image processing apparatus 10.

  The audio output unit 32 includes a speaker, and outputs audio based on, for example, audio guidance for a navigation function or a television broadcast signal in accordance with an instruction from the control unit 34. The input operation unit 33 receives an input operation on the navigation device 30 from the operator. When the display unit 31 is a touch panel display, for example, the display unit 31 may have the function of the input operation unit 33.

  The control unit 34 controls each unit included in the navigation device 30. For example, based on the input operation received by the input operation unit 33, the control unit 34 displays a predetermined image on the display unit 31 or causes the audio output unit 32 to output audio data.

<2.3. Shift sensor 40 and steering sensor 50>
The periphery monitoring system 1 includes various sensors that detect the state of the vehicle C, such as a shift sensor 40 and a steering sensor 50.

  The shift sensor 40 detects the position of a shift lever (not shown). Examples of the position of the shift lever include “parking” where the vehicle C is completely stopped, “rear” where the vehicle C is moving backward, and the like. The shift sensor 40 outputs the detected position of the shift lever to the image processing apparatus 10.

  The steering sensor 50 detects the steering angle of the steering wheel of the vehicle C. Alternatively, the steering angle of the wheel of the vehicle C may be detected. The steering sensor 50 outputs the detected steering angle to the image processing apparatus 10.

<2.4. Image Processing Device 10>
The image processing apparatus 10 includes a progress prediction unit 11, a detection unit 12, a determination unit 13, an image processing unit 14, and a storage unit 15.

<2.4.1. Progression Prediction Unit 11>
The progress prediction unit 11 predicts the traveling direction of the vehicle C based on the steering angle of the wheels input from the steering sensor 50. For example, the progress prediction unit 11 calculates, as the prediction line L1 indicating the traveling direction, the trajectory line along which the vehicle C would move while maintaining the current steering angle of the wheels. The progress prediction unit 11 outputs the calculated prediction line L1 to the determination unit 13 and the image processing unit 14.

  The prediction of the traveling direction by the progress prediction unit 11 is not limited to the method based on the steering angle of the wheels. For example, the progress prediction unit 11 may predict the traveling direction based on the captured image P1 from the imaging device 20. As a method of predicting the traveling direction based on the captured image P1, there is, for example, a method of predicting the traveling direction from a plurality of temporally consecutive captured images P1 using an optical flow.
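One common way to compute such a prediction line from a steering angle is a kinematic bicycle model, in which the vehicle follows a circular arc of radius wheelbase / tan(steering angle). The sketch below assumes this model and illustrative parameter values; the patent itself does not prescribe a particular calculation:

```python
import math

def prediction_line(steering_angle_rad, wheelbase_m=2.7, length_m=5.0, steps=20):
    """Return (x, y) points of a predicted trajectory in vehicle coordinates."""
    if abs(steering_angle_rad) < 1e-3:                 # effectively straight
        return [(0.0, length_m * i / steps) for i in range(steps + 1)]
    radius = wheelbase_m / math.tan(steering_angle_rad)
    points = []
    for i in range(steps + 1):
        theta = (length_m * i / steps) / radius        # arc length -> turn angle
        points.append((radius * (1.0 - math.cos(theta)),
                       radius * math.sin(theta)))
    return points
```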

<2.4.2. Detection unit 12>
The detection unit 12 detects a notification target object based on temporal changes of a plurality of captured images P1 that are continuously captured at a predetermined interval. The detection unit 12 outputs the detected notification object to the determination unit 13 and the image processing unit 14.

  As a detection method of the notification target object by the detection unit 12, for example, there is a method using the optical flow described above. Specifically, first, the detection unit 12 detects the feature point F1 from the captured image captured at time t1. As a method of detecting feature points from an image, for example, there are a method of detecting a corner which is an intersection of edges and a KLT method.

  Subsequently, the detection unit 12 detects the feature point F2 from the captured image captured at time t2 next to time t1. The detection unit 12 calculates a movement vector (optical flow) connecting the position of the feature point F1 at time t1 and the feature point F2 at time t2, and detects a notification target object based on the calculated optical flow.

  The number of feature points F1 and F2 detected by the detection unit 12 is not limited to one; the detection unit 12 may detect a plurality of feature points F1 and F2. Moreover, the method of detecting the notification object by the detection unit 12 is not limited to the method using an optical flow. For example, the notification object may be detected by pattern matching, in which a captured image is compared with a pattern image.
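A minimal sketch of the optical-flow-based detection outlined above is shown below, using OpenCV's corner detector and pyramidal Lucas-Kanade tracker as one possible realization of the KLT approach (the library choice, thresholds, and the final grouping of points into objects are assumptions; a real system would also compensate for the camera's own motion):

```python
import cv2
import numpy as np

def detect_moving_points(prev_gray, curr_gray, min_motion_px=2.0):
    """Track feature points F1 -> F2 between two frames and return the points
    whose motion exceeds a small threshold (candidate notification objects)."""
    f1 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=7)
    if f1 is None:
        return np.empty((0, 2), np.float32)
    f2, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, f1, None)
    ok = status.ravel() == 1
    flow = (f2[ok] - f1[ok]).reshape(-1, 2)            # movement vectors (optical flow)
    moved = np.linalg.norm(flow, axis=1) > min_motion_px
    return f2[ok].reshape(-1, 2)[moved]
```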

<2.4.3. Determination unit 13>
For example, when the vehicle C is moving backward based on the detection result input from the shift sensor 40, the determination unit 13 determines a target region including the notification object from among the divided regions R1 to R9 obtained by dividing the captured image P1 into a plurality of regions. Here, the case where the detection unit 12 detects the pedestrian H as the notification object will be described.

  First, the plurality of divided regions R1 to R9 will be described with reference to FIG. 4. FIG. 4 is a diagram illustrating the plurality of divided regions R1 to R9.

  As shown in FIG. 4, the image P2 is divided by two straight lines LH1 and LH2 extending in the vertical direction and two straight lines LW1 and LW2 extending in the horizontal direction. Therefore, nine divided regions R1 to R9 are included in the image P2 shown in FIG.

  Here, the distance W1 between the two straight lines LH1 and LH2 is, for example, the vehicle width W2, that is, the same length as the widest distance between the two lines constituting the vehicle width line L2 (see FIG. 1). The straight line LW1 extending in the lateral direction is a line indicating a position away from the vehicle C by the first distance D1, obtained by extending the fixed line L21 to the left and right (see FIG. 1). The straight line LW2 extending in the lateral direction is a line indicating a position away from the vehicle C by the second distance D2, obtained by extending the fixed line L22 to the left and right (see FIG. 1).

  The determination unit 13 determines, as the target region, the divided region including the pedestrian H among the nine divided regions R1 to R9 divided by the four straight lines. When the pedestrian H extends over a plurality of divided regions, the determination unit 13 determines the target region according to the area of the pedestrian H in each divided region. For example, the determination unit 13 counts, for each of the divided regions R1 to R9, the number of pixels included in the image representing the pedestrian H, and determines the divided region with the largest number of pixels as the target region.
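A compact sketch of this pixel-count rule is given below (the mask-based representation of the pedestrian and of the divided regions is an assumed implementation detail):

```python
import numpy as np

def target_region_by_area(pedestrian_mask, region_masks):
    """Count pedestrian pixels per divided region and return the 1-based index
    of the region with the largest count, plus all counts for tie handling."""
    counts = [int(np.count_nonzero(pedestrian_mask & m)) for m in region_masks]
    return int(np.argmax(counts)) + 1, counts
```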

  In this way, the determination unit 13 determines, as the target region, the divided region containing the largest proportion of the pedestrian H among the divided regions R1 to R9. When the proportion of the pedestrian H is the same in a plurality of divided regions, that is, when the area of the pedestrian H is equal in each of those divided regions, the determination unit 13 determines the target region according to the traveling direction of the vehicle C. This point will be described with reference to FIG. 5.

  FIG. 5 is a diagram illustrating determination of the target area by the determination unit 13. As shown in FIG. 5, depending on the position of the pedestrian H in the captured image P1, the pedestrian H may exist across a plurality of divided regions (divided regions R5 and R6 in FIG. 5).

  As described above, when the pedestrian H exists in only one divided region, the determination unit 13 determines that divided region as the target region. On the other hand, when the proportion of the pedestrian H included in each of the plurality of divided regions is the same, as illustrated in FIG. 5, the determination unit 13 determines the target region according to the traveling direction predicted by the progress prediction unit 11.

  When the traveling direction of the vehicle C is straight, the determination unit 13 determines, as the target region, the divided region R5 close to the center of the captured image P1, that is, the image P2, among the plurality of divided regions R5 and R6 in which the pedestrian H exists.

  Here, the divided region closest to the center side of the image P2 indicates, for example, a divided region in which the distance between the central pixel of the divided region and the central pixel of the image P2 is the shortest among a plurality of divided regions. Therefore, for example, when the pedestrian H straddles a plurality of divided regions R5 and R8 at the same rate, the determination unit 13 determines the divided region R5 as the target region.

  The central region is a region that the driver notices more easily than the peripheral region of the captured image P1. Therefore, by setting the divided region close to the center of the image P2 among the plurality of divided regions R1 to R9 as the target region, the driver can be notified so that the notification image can be recognized more easily.

  Next, when the traveling direction of the vehicle C is turning, the determination unit 13 determines a divided region on the traveling direction side as a target region among the plurality of divided regions R5 and R6 where the pedestrian H exists. For example, when the traveling direction of the vehicle C is right, the determination unit 13 determines the right divided region R6 among the plurality of divided regions R5 and R6 as the target region.

  Note that, when the pedestrian H straddles a plurality of vertically aligned regions, such as the divided regions R5 and R8, at the same proportion, the divided region R5 close to the center of the image P2 may be determined as the target region, as described above.

  Here, the case where the determination unit 13 determines the target region according to the traveling direction of the vehicle C has been described, but the vehicle C does not necessarily have to be moving in this case. Even when the vehicle C is stopped, for example, the progress prediction unit 11 may predict the traveling direction of the vehicle C according to the steering angle of the wheels, and the determination unit 13 may determine the target region according to the traveling direction predicted by the progress prediction unit 11.

  Note that the case where the traveling direction of the vehicle C is straight is not limited to the case where the prediction line L1 predicted by the progress prediction unit 11 is a straight line. For example, the traveling direction of the vehicle C may be regarded as straight when the radius of curvature of the prediction line L1 is equal to or greater than a predetermined threshold, and as turning when it is less than the threshold.

  In FIG. 5, the determination unit 13 determines the target region according to the traveling direction of the vehicle C when the proportions of the pedestrian H included in the divided regions R5 and R6 are the same, that is, when the areas of the pedestrian H in the divided regions R5 and R6 are equal, but the present invention is not limited to this. For example, the determination unit 13 may determine the target region according to the traveling direction of the vehicle C when the difference between the areas of the pedestrian H in the divided regions R5 and R6 is equal to or less than a predetermined threshold Sth.
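The tie-breaking behavior described above could be sketched roughly as follows (the helper names, the `center_order` ranking, and the assumption that the tied regions are laterally adjacent are all illustrative; only the threshold Sth and the straight/turning rule come from the description):

```python
def break_tie(candidates, heading, center_order, sth=100):
    """candidates: {region_index: pedestrian_pixel_count}, e.g. {5: 4120, 6: 4098}.
    heading: "straight", "left", or "right" (from the progress prediction)."""
    best = max(candidates.values())
    near = [r for r, area in candidates.items() if best - area <= sth]
    if len(near) == 1:
        return near[0]                       # no tie within the threshold Sth
    if heading == "straight":
        # prefer the region whose center is closest to the image center
        return min(near, key=center_order.index)
    # turning: prefer the region on the travel-direction side
    # (assumes the tied regions sit side by side, numbered left to right)
    return max(near) if heading == "right" else min(near)
```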

  Note that the plurality of divided regions R1 to R9 are not limited to the example illustrated in FIG. 4. For example, as shown in FIG. 6, among the plurality of divided regions R1 to R9, the divided regions R2, R5, and R8 located at the center in the horizontal direction may be further divided into a plurality of sub-regions R2_L, R2_R, R5_L, R5_R, R8_L, and R8_R. FIG. 6 is a diagram illustrating an example of a plurality of divided regions.

  In this case, the determination unit 13 determines the target region including the pedestrian H from among the divided regions R1, R3, R4, R6, R7, and R9 located on the left and right of the captured image P1, that is, the image P2, and the plurality of sub-regions R2_L, R2_R, R5_L, R5_R, R8_L, and R8_R located approximately at the center of the captured image P1.

  Although FIG. 6 shows an example in which each of the divided areas R2, R5, and R8 is divided into two sub-areas, the number of divisions may be two or more. Further, the other divided regions R1, R3, R4, R6, R7, and R9 may be divided into a plurality of subregions. The number of the plurality of divided regions is not limited to nine, and may be more or less than nine.

  Another example of the divided regions will be described with reference to FIG. 7. FIG. 7 is a diagram illustrating an example of a plurality of divided regions. FIG. 7 shows an example in which the captured image P1, that is, the image P2, is divided into nine divided regions R21 to R29 along the vehicle width line L2 and the fixed lines L21 and L22.

  As described above, the two vertical straight lines LH1 and LH2 and the two horizontal straight lines LW1 and LW2 that divide the image P2 do not have to be parallel to each other.

  In FIG. 7, the captured image P1, that is, the image P2, is divided into the plurality of divided regions R21 to R29 along the vehicle width line L2 and the fixed lines L21 and L22, but it may instead be divided, for example, along the prediction line L1 and the fixed lines L21 and L22. In this way, the image P2 may also be divided by curves.

  The number and shape of the plurality of divided regions may be set arbitrarily according to, for example, the size of the notification object or the size of the notification image.

<2.4.4. Image Processing Unit 14>
The image processing unit 14 in FIG. 2 superimposes a notification image for notifying the pedestrian H on a position corresponding to a target region (for example, the divided region R4 in the example of FIG. 1) of the captured image P1. For example, the image processing unit 14 superimposes the caution mark M1 on the captured image P1 as a notification image for notifying the pedestrian H. Furthermore, the image processing unit 14 superimposes a frame line M2 surrounding the pedestrian H on the captured image P1.

  The image processing unit 14 further includes a prediction line L1 indicating the predicted traveling direction of the vehicle C, a vehicle width line L2 indicating an extension line of the vehicle width W2 of the vehicle C, and fixed lines indicating distances D1 and D2 from the vehicle C. A composite image P3 is generated by superimposing L21 and L22. The image processing unit 14 causes the display unit 31 to display the generated composite image P3. For example, when the vehicle C is moving backward, the image processing unit 14 generates a composite image P3 and outputs it to the display unit 31.

  Here, the image processing unit 14 selects the notification image that notifies of the pedestrian H according to a notification image selection table, and superimposes the selected notification image on the captured image. The notification image selection table will be described with reference to FIG. 8.

<2.4.4.1. Notification image selection table>
FIG. 8 is a diagram illustrating a notification image selection table according to the present embodiment. The notification image selection table shown in FIG. 8 stores each of the divided regions R1 to R9 in association with the type of notification image to be displayed (display image), the display location, and the display method. Note that "area No." in the notification image selection table shown in FIG. 8 corresponds to, for example, the reference signs of the divided regions R1 to R9 shown in FIG. 4.

  As shown in FIG. 8, the display images of the divided regions R1 to R3 are “no display”. In this case, the image processing unit 14 does not superimpose the notification image on the captured image P1. The divided areas R1 to R3 are areas that are separated from the vehicle C by a second distance D2 or more.

  As described above, in the example of FIG. 8, the image processing unit 14 determines that a notification object existing at a position separated from the vehicle C by the second distance D2 or more has a low degree of risk and a low notification priority, and does not perform notification using a notification image. However, the notification object itself may still be indicated, for example, by surrounding it with the frame line M2.

  In the notification image selection table of FIG. 8, the divided regions R4 to R9 are associated with the display image "caution mark", and the display location, that is, the position within the divided region corresponding to the target region (hereinafter, the divided region in which the image is displayed is also referred to as the display area, and the position within the display area as the display position), is "center". The display method for the divided regions R4 to R6 is "blinking", and the display method for the divided regions R7 to R9 is "lighting".

  As described above, for a notification object existing between the vehicle C and the second distance D2, the image processing unit 14 displays the caution mark M1 at the display position in the target region determined by the determination unit 13. Here, the caution mark M1 is assumed to be, for example, a mark in which an exclamation mark is displayed within a triangular yellow frame. At this time, the image processing unit 14 blinks the caution mark M1 for a notification object existing between the first distance D1 and the second distance D2, and lights the caution mark M1 for a notification object existing between the vehicle C and the first distance D1.

  Accordingly, the notification object can be notified to the driver according to the distance from the vehicle C, and the driver can more easily recognize not only the notification object but also the distance between the notification object and the vehicle C.
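Expressed as data, the FIG. 8 table could look like the following lookup (the dictionary structure is an assumption; the entries follow the description above):

```python
# region index: (display image, display position, display method)
NOTIFICATION_TABLE = {
    1: (None, None, None), 2: (None, None, None), 3: (None, None, None),   # no display
    4: ("caution_mark", "center", "blink"),
    5: ("caution_mark", "center", "blink"),
    6: ("caution_mark", "center", "blink"),
    7: ("caution_mark", "center", "light"),
    8: ("caution_mark", "center", "light"),
    9: ("caution_mark", "center", "light"),
}

def select_notification(target_region):
    return NOTIFICATION_TABLE[target_region]
```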

  The notification image selection table is not limited to the example shown in FIG. 8. For example, the image processing unit 14 may superimpose the notification image on the captured image P1 according to the notification image selection tables shown in FIGS. 9 and 10. FIGS. 9 and 10 are diagrams illustrating examples of the notification image selection table according to the present embodiment.

  The notification image selection table shown in FIG. 9 includes items of “flashing speed” and “display size” in addition to the items of the notification image selection table of FIG. In addition, the display images of the divided areas R1 to R6 are “caution marks”, and the display images of the divided areas R7 to R9 are “warning marks”. Here, the warning mark is, for example, a mark in which an exclamation mark is displayed within a triangular red frame.

  Further, in the notification image selection table shown in FIG. 9, the display position of the display image is changed according to the horizontal positions of the divided regions R1 to R9, such as the "left side", "center", or "right side" of each region. In this way, the display position of the display image may be changed according to the positions of the divided regions R1 to R9 in the captured image P1.

  Further, the display method is “flashing” in any of the divided regions R1 to R9, and the flashing speed is changed according to the distance from the vehicle C. Specifically, the blinking speed is “slow” in the divided areas R1 to R6, and “fast” in the divided areas R7 to R9. The display size is “small” in the divided areas R1 to R3, “medium” in the divided areas R4 to R6, and “large” in the divided areas R7 to R9.

  Next, in the notification image selection table shown in FIG. 10, the display positions of the divided areas R4 to R9 are set as predetermined positions of the divided areas adjacent to the target area. Specifically, the divided areas R1 to R6 located above the divided areas R4 to R9 in the display unit 31 are set as the display areas of the divided areas R4 to R9.

  Thus, the notification image does not necessarily have to be displayed within the target region. By displaying it in an adjacent divided region as described above, the notification image can be displayed on the display unit 31 without obstructing the display of the pedestrian H, for example.

  In the above-described example, the case where the blinking speed and the display size are changed has been described, but the present invention is not limited to this. For example, the notification image may be displayed in blue when the notification object exists in a region far from the vehicle, such as the divided regions R1 to R3, in yellow when it exists in a region at an intermediate distance, such as the divided regions R4 to R6, and in red when it exists in a nearby region, such as the divided regions R7 to R9. In this way, the display color of the notification image may be changed according to the distance to the notification object.

  In this way, the display format of the notification image may be determined individually for each of the divided regions R1 to R9, and the number and contents of such items are not limited to the examples shown in the notification image selection tables of FIGS. 8 to 10. Further, for example, the driver may be allowed to change the contents of each item.

  In the example shown in FIG. 9, the blinking speed is set for each divided area, but the present invention is not limited to this. For example, as shown in FIG. 11, the blinking speed may be changed according to the distance between the notification object and the vehicle C.

  FIG. 11 is a diagram illustrating the relationship between the distance to the notification object and the blinking speed. The vertical axis in FIG. 11 indicates the distance from the vehicle C to the notification object, and the horizontal axis indicates the blinking speed of the notification image.

  As shown in FIG. 11, the blinking speed is slow when the distance from the vehicle C to the notification object is long, and becomes faster as the notification object approaches. As described above, in this embodiment, the notification image is displayed at a predetermined position in each of the divided regions R1 to R9; the position of the notification image that notifies of the notification object therefore changes only when the notification object moves and the target region changes.

  Therefore, in the example shown in FIG. 11, the blinking speed of the notification image is changed according to the distance between the notification object and the vehicle C even in the case where the notification object moves but the target region does not change. Thereby, it is possible to notify the driver that the notification object is approaching the vehicle C.

  In the example illustrated in FIG. 11, the blinking speed of the notification image changes continuously according to the distance, but the present invention is not limited to this. For example, as shown in FIG. 12, the blinking speed may be increased stepwise. FIG. 12 is a diagram illustrating the relationship between the distance to the notification object and the blinking speed.
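The two distance-to-blink-rate mappings of FIGS. 11 and 12 could be sketched as follows (the specific distances and rates are illustrative assumptions):

```python
def blink_hz_continuous(distance_m, d_near=1.0, d_far=6.0, hz_near=4.0, hz_far=1.0):
    """Blink faster as the object approaches, varying continuously (cf. FIG. 11)."""
    d = min(max(distance_m, d_near), d_far)
    t = (d_far - d) / (d_far - d_near)        # 0 at d_far, 1 at d_near
    return hz_far + t * (hz_near - hz_far)

def blink_hz_stepwise(distance_m, d1=2.0, d2=4.0):
    """Blink rate increased in steps as the object approaches (cf. FIG. 12)."""
    if distance_m <= d1:
        return 4.0
    if distance_m <= d2:
        return 2.0
    return 1.0
```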

  When receiving the notification of the target area from the determination unit 13, the image processing unit 14 determines the display content of the notification image based on, for example, the notification image selection table illustrated in FIG. 8. For example, when the target region is the divided region R4, the image processing unit 14 causes the “caution mark” to blink in the center of the divided region R4 for display.

  In addition, although the case where the image processing unit 14 displays a caution mark or a warning mark in the divided area as a notification image has been described here, the notification image to be displayed is not limited thereto. For example, as shown in FIGS. 13 and 14, a notification image may be displayed on the upper part of the captured image P1.

  FIGS. 13 and 14 are diagrams illustrating examples of the composite image P31 generated by the image processing unit 14. In the composite image P31 shown in FIG. 13, a caution mark M1 is displayed as a notification image, and a substantially rectangular image M3 is also displayed.

  The image M3 is a band-like image having the same length as the horizontal width of the composite image P31 and having a predetermined width. Hereinafter, the image M3 is also referred to as a notification bar M3.

  For example, when strong light such as sunlight strikes the display unit 31, it may be difficult for the driver to visually recognize the composite image P31 displayed on the display unit 31 due to reflection. Even in such a case, the image processing unit 14 can notify the driver of the notification object by generating the composite image P31 with the notification bar M3, which is larger than the caution mark M1, superimposed on it.

  Further, since the upper part of the display unit 31 is less affected by light, the driver can recognize the notification bar M3 more easily when it is superimposed so as to be displayed at the upper part of the display unit 31.

  In addition, the driver may be notified of the distance to the notification object by, for example, changing the display color of the notification bar M3 according to the distance from the vehicle C to the notification object. Further, character information (not shown) may be superimposed on the notification bar M3 so that the driver is also notified of the notification object by the character information.

  Alternatively, as shown in FIG. 14, gradation display may be performed by increasing the transmittance of the notification bar M3 toward its lower side. This makes it easier for the driver to visually recognize the image displayed behind the notification bar M3.
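Such a gradation can be produced by alpha-blending a band whose opacity falls off toward its lower edge, as in the sketch below (the bar height, color, and blending weights are illustrative assumptions):

```python
import numpy as np

def overlay_notification_bar(frame, height=60, color=(0, 0, 255)):
    """Blend a band-shaped notification bar M3 onto the top of the image,
    opaque at its upper edge and fully transparent at its lower edge."""
    h, w = frame.shape[:2]
    bar = np.zeros((height, w, 3), np.uint8)
    bar[:] = color
    alpha = np.linspace(1.0, 0.0, height).reshape(height, 1, 1)   # per-row opacity
    top = frame[:height].astype(np.float32)
    frame[:height] = (alpha * bar + (1.0 - alpha) * top).astype(np.uint8)
    return frame
```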

  Here, the case where the imaging device 20 captures an image of the area behind the vehicle C has been described. However, when the entire periphery of the vehicle C is imaged, for example, the image processing unit 14 may generate a virtual viewpoint image based on the captured images of the entire periphery.

  In this case, for example, a plurality of captured images with the front, rear, right side, and left side of the vehicle C as imaging directions are input from the imaging device 20. The image processing unit 14 performs coordinate conversion processing on the plurality of captured images and generates a virtual viewpoint image, which is an image viewed from a virtual viewpoint. As the coordinate conversion processing, the image processing unit 14, for example, projects (maps) the captured images onto a predetermined projection surface and sets, as the virtual viewpoint image, the image of the region that is included within a predetermined viewing angle as seen from the virtual viewpoint among the captured images projected onto the predetermined projection surface.

  The image processing unit 14 stores, for example, a table indicating the correspondence between the positions of data included in the plurality of captured images and positions on the predetermined projection surface, and uses this table to project the data included in the plurality of captured images onto the corresponding positions on the predetermined projection surface.
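In practice such a correspondence table can be applied as a per-pixel remapping, for example with OpenCV's remap (one possible realization; the map arrays stand in for the stored table and are assumptions):

```python
import cv2

def project_with_table(captured, map_x, map_y):
    """map_x / map_y: float32 arrays giving, for each projection-plane pixel,
    the source coordinate in the captured image."""
    return cv2.remap(captured, map_x, map_y, cv2.INTER_LINEAR)
```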

  The predetermined projection surface has, for example, a substantially hemispherical shape (for example, a bowl shape), in which the central region (for example, the bottom of the bowl) corresponds to the position of the vehicle C and the area outside the position of the vehicle C (for example, the outer peripheral region of the substantially hemispherical shape) corresponds to the region around the vehicle C. The predetermined projection surface does not have to be a curved surface, and may be, for example, a flat surface.

  The image processing unit 14 generates a composite image P3 by superimposing a notification image or the like on the generated virtual viewpoint image and displays it on the display unit 31 of the navigation device 30.

  Here, the image processing unit 14 displays the composite image P3 on the display unit 31 when the vehicle C is moving backward, but the present invention is not limited to this. For example, the composite image P3 may be displayed on the display unit 31 even when the vehicle C is moving forward. In this case, the image processing unit 14 displays the notification image and the like superimposed on a captured image of the area ahead of the vehicle C captured by the imaging device 20.

  For example, when the vehicle C passes through a narrow road or passes another vehicle, the composite image P3 can be displayed on the display unit 31 to show a traveling direction in which the vehicle can travel safely.

<2.4.5. Storage unit 15>
The storage unit 15 stores, for example, the notification image selection table shown in FIG. 8. The storage unit 15 also stores information on the plurality of divided regions R1 to R9. In this way, the storage unit 15 stores various information used for processing by each unit of the image processing apparatus 10, as well as the processing results of each unit. The storage unit 15 is configured by a semiconductor memory element such as a RAM (Random Access Memory) or a flash memory, or a storage device such as a hard disk or an optical disk.

<3. Image display processing>
Subsequently, an image display process performed by the image processing apparatus 10 will be described. FIG. 15 is a flowchart illustrating image display processing performed by the image processing apparatus 10. For example, the image processing apparatus 10 starts the image display process when the shift lever position of the vehicle C is changed to “rear”.

  The image processing device 10 acquires the captured image P1 from the imaging device 20 (step S101). The image processing apparatus 10 determines whether or not a notification target exists in the acquired captured image P1 (step S102). When the notification target does not exist (step S102; No), the image processing apparatus 10 ends the image display process.

  On the other hand, when a notification target exists (step S102; Yes), the image processing apparatus 10 determines a divided region where the notification target exists as a target region (step S103). The image processing apparatus 10 causes the display unit 31 to display the notification image by superimposing the notification image on the position corresponding to the target region determined in step S103 of the captured image P1 (step S104).
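The flow of steps S101 to S104 can be summarized in the following sketch (the component interfaces are assumptions, not the patent's actual API):

```python
def image_display_process(imaging_device, detector, determiner, renderer, display):
    captured = imaging_device.capture()                      # S101: acquire image
    target_object = detector.detect(captured)                # S102: notification target?
    if target_object is None:
        return                                               # S102: No -> end
    region = determiner.determine(captured, target_object)   # S103: target region
    composite = renderer.superimpose(captured, region, target_object)
    display.show(composite)                                  # S104: display
```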

  Here, the image display processing is started when, for example, the position of the shift lever of the vehicle C is changed to "rear", but the present invention is not limited to this. For example, the image display processing may be started when the position of the shift lever is changed to "drive", that is, when the vehicle C travels forward, or it may be started by the driver's operation.

  As described above, the image processing apparatus 10 determines the target region including the pedestrian H from among the divided regions obtained by dividing the captured image P1 into a plurality of regions, and displays the notification image at a position corresponding to the target region. Thereby, the image processing apparatus 10 can notify the driver so that the notification object can be recognized more easily.

  Moreover, by displaying the notification image for each divided region, the driver can be notified of the position of the notification object. Further, by changing the display of the notification image, such as the blinking speed or the display size, according to the distance from the vehicle C to the notification object, the image processing apparatus 10 can also notify the driver of the distance from the vehicle C to the notification object.

<4. Modification>
FIG. 16 is a diagram illustrating a modification of the image processing apparatus 10 according to the present embodiment. The image processing apparatus 10 illustrated in FIG. 1 notifies the driver of the notification target object by causing the display unit 31 to display a notification image.

  In this modification, in addition to the notification image, the notification target object is notified using sound. Since other components and operations are the same as those of the image processing apparatus 10 according to the embodiment, description thereof is omitted.

  Specifically, the image processing apparatus 10 further includes a sound processing unit 16. The sound processing unit 16 generates a sound corresponding to the target region determined by the determination unit 13 (hereinafter referred to as a notification sound) and outputs it to, for example, the audio output unit 32 of the navigation device 30.

  The sound processing unit 16 performs various signal processing such as frequency conversion and amplitude amplification on the sound data to generate a notification sound. The notification sound may be, for example, a mechanical sound such as an alarm sound or a warning sound, or may be a voice that reads out character information such as a synthesized voice. The sound data is stored in the storage unit 15, for example.

  For example, the sound processing unit 16 generates the notification sound from a different type of sound data for each of the plurality of divided regions R1 to R9. In this case, different sound data may be used according to the distance from the vehicle C to the divided regions R1 to R9, for example, first sound data for the divided regions R1 to R3, second sound data for the divided regions R4 to R6, and third sound data for the divided regions R7 to R9.

  Alternatively, the pitch of the notification sound, that is, its frequency, or the volume of the notification sound may be varied according to the divided regions R1 to R9. In this case, like the blinking speed shown in FIGS. 11 and 12, the pitch or volume may be changed continuously or stepwise according to the distance from the vehicle C to the notification object, regardless of the divided regions R1 to R9.

  At this time, the frequency of the notification sound is increased so that the notification sound becomes higher as the distance from the vehicle C to the notification object becomes shorter. Alternatively, the volume of the notification sound is increased as the distance from the vehicle C to the notification object becomes shorter. As a result, the shorter the distance from the vehicle C to the notification object, that is, the higher the degree of risk, the more strongly the driver is notified of the notification object.
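A simple mapping from distance to pitch and volume might look like the following sketch (the frequency and volume ranges are illustrative assumptions):

```python
def notification_sound_params(distance_m, d_near=1.0, d_far=6.0):
    """Return (frequency_hz, volume) for the notification sound; both increase
    as the notification object gets closer to the vehicle."""
    d = min(max(distance_m, d_near), d_far)
    t = (d_far - d) / (d_far - d_near)        # 0 when far, 1 when very close
    frequency_hz = 600.0 + t * 1000.0
    volume = 0.2 + t * 0.8
    return frequency_hz, volume
```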

  As described above, since the image processing apparatus 10 notifies the driver of the notification object using both the notification image and the notification sound, the driver can be notified of the notification object even when, for example, the driver is not looking at the display unit 31.

  At this time, by changing the notification sound according to the target region, the driver can be notified of which of the divided regions R1 to R9 includes the notification object, that is, of the position of the notification object, even when the driver is not looking at the display unit 31.

  Further effects and modifications can be easily derived by those skilled in the art. Thus, the broader aspects of the present invention are not limited to the specific details and representative embodiments shown and described above. Accordingly, various modifications can be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.

DESCRIPTION OF SYMBOLS
1 Periphery monitoring system
10 Image processing apparatus
11 Progress prediction unit
12 Detection unit
13 Determination unit
14 Image processing unit
15 Storage unit
16 Sound processing unit
20 Imaging device
30 Navigation device
31 Display unit
32 Audio output unit
33 Input operation unit
34 Control unit
40 Shift sensor
50 Steering sensor

Claims (7)

  1. An image processing apparatus comprising:
    a detection unit that detects a notification object from a captured image input from an imaging device mounted on a moving body;
    a determination unit that determines a target region including the notification object from among divided regions obtained by dividing the captured image; and
    an image processing unit that causes a display unit to display a notification image for notifying of the notification object so as to be superimposed on a position corresponding to the target region of the captured image,
    wherein, when the notification object exists over a plurality of the divided regions and a difference between areas of the notification object in the plurality of divided regions is equal to or less than a predetermined threshold, the determination unit determines the target region according to a traveling direction of the moving body.
  2. The image processing apparatus according to claim 1, wherein, when the moving body is traveling straight, the determination unit determines, as the target region, the region close to the center of the captured image among the plurality of divided regions in which the notification object exists.
  3. The image processing apparatus according to claim 1 or 2, wherein, when the moving body is turning, the determination unit determines, as the target region, the region on the traveling direction side of the moving body among the plurality of divided regions in which the notification object exists.
  4. The image processing apparatus according to claim 1, 2, or 3, wherein the determination unit determines the target region including the notification object from among the divided regions located on the left and right of the captured image and sub-regions obtained by further dividing the divided region located substantially at the center of the captured image.
  5. The image processing apparatus according to any one of claims 1 to 4, wherein the image processing unit causes the display unit to display the notification image superimposed on a predetermined position of a divided region adjacent to the target region.
  6. The image processing apparatus according to any one of claims 1 to 5, wherein the image processing unit changes the display of the notification image according to a distance from the moving body to the notification object.
  7. An image processing method comprising the steps of:
    detecting a notification object from a captured image input from an imaging device mounted on a moving body;
    determining a target region including the notification object from among divided regions obtained by dividing the captured image; and
    causing a display unit to display a notification image for notifying of the notification object so as to be superimposed on a position corresponding to the target region of the captured image,
    wherein the determining step determines the target region according to a traveling direction of the moving body when the notification object exists over a plurality of the divided regions and a difference between areas of the notification object in the plurality of divided regions is equal to or less than a predetermined threshold.
JP2015116829A 2015-06-09 2015-06-09 Image processing apparatus and image processing method Active JP6590538B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015116829A JP6590538B2 (en) 2015-06-09 2015-06-09 Image processing apparatus and image processing method


Publications (2)

Publication Number Publication Date
JP2017005468A JP2017005468A (en) 2017-01-05
JP6590538B2 true JP6590538B2 (en) 2019-10-16

Family

ID=57752859

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2015116829A Active JP6590538B2 (en) 2015-06-09 2015-06-09 Image processing apparatus and image processing method

Country Status (1)

Country Link
JP (1) JP6590538B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA3049935A1 (en) 2017-01-16 2018-07-19 Tomoegawa Co., Ltd. Copper fiber nonwoven fabric for wiring, wiring unit, method for cooling copper fiber nonwoven fabric for wiring, and temperature control method for copper fiber nonwoven fabric for wiring

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3559083B2 (en) * 1994-12-26 2004-08-25 本田技研工業株式会社 Driving support device
JP4961160B2 (en) * 2006-04-18 2012-06-27 パナソニック株式会社 Vehicle surroundings confirmation device
JP2011076214A (en) * 2009-09-29 2011-04-14 Alps Electric Co Ltd Obstacle detection device
JP5603835B2 (en) * 2011-06-27 2014-10-08 クラリオン株式会社 Vehicle perimeter monitoring device

Also Published As

Publication number Publication date
JP2017005468A (en) 2017-01-05

Similar Documents

Publication Publication Date Title
DE102014217437B4 (en) Warning display device and warning display method
US9771022B2 (en) Display apparatus
EP2974909B1 (en) Periphery surveillance apparatus and program
US20180174460A1 (en) Apparatus and method for sensing and notifying pedestrian
US8862389B2 (en) Display system, display method, and display program
US9965957B2 (en) Driving support apparatus and driving support method
JP6346614B2 (en) Information display system
US9318018B2 (en) User interface method for terminal for vehicle and apparatus thereof
JP6440115B2 (en) Display control apparatus, display control method, and display control program
JP6252316B2 (en) Display control device for vehicle
JP6091759B2 (en) Vehicle surround view system
JP5639283B2 (en) Vehicle periphery monitoring device
EP2759999B1 (en) Apparatus for monitoring surroundings of vehicle
JP5620472B2 (en) Camera system for use in vehicle parking
JP4967015B2 (en) Safe driving support device
US9789819B2 (en) Driving assistance device
JP5316698B2 (en) Driving assistance device
KR100956858B1 (en) Sensing method and apparatus of lane departure using vehicle around image
JP4855158B2 (en) Driving assistance device
US8441536B2 (en) Vehicle periphery displaying apparatus
JP5031801B2 (en) In-vehicle image display device
US7190281B2 (en) Vehicle environment monitoring device, vehicle environment monitoring method, control program and computer-readable recording medium
JP2014167676A (en) Inter-vehicle distance calculation device and motion controlling method for the same
JP4134939B2 (en) Vehicle periphery display control device
JP4107605B2 (en) Moving object periphery monitoring device, moving object periphery monitoring method, control program, and readable recording medium

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20180330

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20190129

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20190205

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20190403

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20190827

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20190917

R150 Certificate of patent or registration of utility model

Ref document number: 6590538

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150