JP2013154730A - Apparatus and method for processing image, and parking support system - Google Patents

Apparatus and method for processing image, and parking support system

Info

Publication number
JP2013154730A
Authority
JP
Japan
Prior art keywords
parking
vehicle
image
parking area
image processing
Prior art date
Legal status
Pending
Application number
JP2012016164A
Other languages
Japanese (ja)
Inventor
Shigeyuki Hisai
茂幸 久井
Hiroaki Maruno
浩明 丸野
Yusuke Iguchi
裕介 井口
礼文 髙須賀
Ayafumi Takasuga
Original Assignee
Fujitsu Ten Ltd
富士通テン株式会社
Priority date
Filing date
Publication date
Application filed by Fujitsu Ten Ltd (富士通テン株式会社)
Priority to JP2012016164A
Publication of JP2013154730A
Application status: Pending

Abstract

A technique capable of detecting a parking space without the driver operating the system is provided.
An image processing device that detects, based on an image, a parking area in which a vehicle is to be parked includes: image acquisition means for acquiring an image obtained by a photographing device that photographs the outside of the vehicle; vehicle information acquisition means for acquiring vehicle information obtained by an acquisition device; and detection means for starting detection of the parking area when the vehicle information acquisition means acquires travel information of the vehicle.
[Selected drawing] FIG. 2

Description

  The present invention relates to a technique for detecting a parking area of a vehicle.

  2. Description of the Related Art Conventionally, a technique is known in which a camera mounted on a vehicle photographs the surroundings of the vehicle and a parking space is recognized from the captured images to provide parking assistance (see, for example, Patent Document 1). In the parking assistance system of Patent Document 1, when the driver has switched the parking assistance system on and the vehicle speed is 10 km/h or less, the camera operates and the running images and distance information are continuously stored in a memory. When the driver shifts into reverse gear, the series of images and distance information is read from the memory in reverse order and reproduced, and the parking target position is retrieved and displayed.

JP 2006-96312 A

  However, since the parking assistance system of Patent Document 1 is started only after the driver switches the system on, it cannot be used when the driver forgets to turn on the switch. Moreover, since the running images and distance information must be stored in memory continuously until the driver finds a parking space, the amount of stored data becomes enormous. Furthermore, since the running images and distance information are reproduced in reverse order and the parking target position is searched only after the driver shifts into reverse gear, it takes a long time to specify the parking target position.

  The present invention has been made in view of the above-described problems, and an object thereof is to provide a technique that can detect a parking space without the driver operating the system and can immediately display the parking space when parking is started.

  In order to solve the above-mentioned problem, the invention of claim 1 is an image processing device that detects, based on an image, a parking area in which a vehicle is to be parked, the device comprising: image acquisition means for acquiring an image obtained by a photographing device that photographs the outside of the vehicle; vehicle information acquisition means for acquiring vehicle information obtained by an acquisition device that acquires vehicle information; and detection means for starting detection of the parking area when the vehicle information acquisition means acquires travel information of the vehicle.

  According to a second aspect of the present invention, the image processing device according to the first aspect further comprises image generation means for generating a display image for displaying the parking area on an external display device and outputting the display image to the display device when the vehicle information acquisition means acquires parking start information of the vehicle.

  According to a third aspect of the present invention, in the image processing apparatus according to the second aspect, the parking start information is information indicating that the gear of the vehicle has been switched to a reverse gear.

  According to a fourth aspect of the present invention, in the image processing device according to the second or third aspect, when the display image includes a plurality of parking areas, the image generation means generates a display image in which the parking area having the highest priority based on a preset criterion is emphasized.

  Further, according to a fifth aspect, in the image processing apparatus according to any one of claims 2 to 4, when the display image includes a plurality of parking areas, the image generation means generates a display image in which the parking area selected based on a user operation is emphasized.

  According to a sixth aspect of the present invention, in the image processing device according to any one of the first to fifth aspects, the vehicle information acquisition means acquires vehicle speed information, and the detection means starts detection of the parking area when the vehicle speed is equal to or lower than a predetermined value.

  The invention according to claim 7 is the image processing device according to any one of claims 1 to 6, further comprising position information acquisition means for acquiring position information of the vehicle and parking lot information acquisition means for acquiring parking lot information, wherein the detection means starts detection of the parking area when the traveling position of the vehicle is in a parking lot.

  Further, the invention of claim 8 is a parking support system comprising: a photographing device that photographs the outside of a vehicle; an acquisition device that acquires vehicle information; and an image processing device that, when the acquisition device acquires travel information of the vehicle, detects a parking area in which the vehicle is to be parked based on an image photographed by the photographing device.

  The invention according to claim 9 is the parking support system according to claim 8, further comprising a display device that displays an image, wherein the image processing device generates a display image including the parking area, and the display device displays the display image when the acquisition device acquires parking start information of the vehicle.

  In addition, according to claim 10, in the parking support system according to claim 9, when the display image includes a plurality of parking areas, the image processing device generates a display image in which the parking area having the highest priority based on a preset criterion is emphasized.

  The invention according to claim 11 is the parking support system according to claim 9 or 10, wherein, when the display image includes a plurality of parking areas, the image processing device generates a display image in which the parking area selected based on a user operation is emphasized.

  According to a twelfth aspect of the present invention, in the parking support system according to any one of the eighth to eleventh aspects, photographing devices are provided on each of the left and right sides of the vehicle, and the image processing device detects parking areas on both the left and right sides of the vehicle based on the images photographed by the respective photographing devices.

  The invention of claim 13 is an image processing method for detecting, based on an image, a parking area in which a vehicle is to be parked, the method comprising: (a) a step of acquiring an image obtained by a photographing device that photographs the outside of the vehicle; (b) a step of acquiring vehicle information obtained by an acquisition device that acquires vehicle information; and (c) a step of starting detection of the parking area when travel information of the vehicle is acquired in step (b).

  According to the first to thirteenth aspects of the present invention, detection of the parking area is started when the travel information of the vehicle is acquired, so the parking area detection process is executed while the vehicle is traveling regardless of whether the user performs any operation. Therefore, when the user wants to know the parking area, it can be presented immediately without the time otherwise required for the detection process.

  According to the invention of claim 2 in particular, when the parking start information of the vehicle is acquired, the display image is output to the display device, so that when the vehicle is about to be parked, the detected parking area can be displayed on the display device immediately.

  In particular, according to the third aspect of the invention, it is possible to grasp that the vehicle is about to be parked by acquiring information indicating that the gear of the vehicle has been switched to the reverse gear.

  In particular, according to the invention of claim 4 or 10, it is easy to visually grasp a parking area suitable for parking by emphasizing the parking area having the highest priority.

  In particular, according to the invention of claim 5 or 11, by emphasizing the parking area selected by the user, it is easy to visually grasp the parking area in which the vehicle is to be parked.

  In particular, according to the invention of claim 6, detection of the parking area is started when the vehicle speed is equal to or lower than the predetermined value. For this reason, the parking area detection process is executed when the vehicle decelerates in order to park, and is not executed during normal travel, so the processing load can be reduced.

  In particular, according to the seventh aspect of the invention, detection of the parking area is started when the traveling position of the vehicle is in a parking lot. For this reason, the detection process is performed only when there is a high possibility that the vehicle will be parked, so the processing load can be reduced.

  In particular, according to the ninth aspect of the present invention, when the parking start information of the vehicle is acquired, the display device displays a display image, so that the parking area can be displayed immediately when attempting to park.

  In particular, according to the invention of claim 12, since the parking area detection process is executed on both the left and right sides, the existence of parking areas on both sides can be grasped immediately even in a parking lot having parking areas on both sides.

FIG. 1 is a diagram showing an outline of a parking assistance system. FIG. 2 is a block diagram illustrating a configuration of the parking assistance system. FIG. 3 is a flowchart showing the flow of processing of the parking assistance system. FIG. 4 is a flowchart showing a process flow of the parking support system. FIG. 5 is a flowchart showing a process flow of the parking assistance system. FIG. 6 is a flowchart showing the flow of processing of the parking assistance system. FIG. 7 is a diagram illustrating a display example of a parking area. FIG. 8 is a diagram illustrating a display example of a parking area. FIG. 9 is a diagram illustrating a display example of a parking area. FIG. 10 is a diagram illustrating a display example of a parking area.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

<1. Overview of parking support system>
The parking assistance system according to the present embodiment is a system that assists parking control by presenting a parking area, which is the parking space of the vehicle, to the driver when the vehicle is to be parked. In particular, the parking support system according to the present embodiment automatically detects the parking area while the vehicle is traveling, and immediately displays the detected parking area when the vehicle is about to be parked in the parking lot. For example, while the vehicle is traveling through a parking lot in which parking areas are laid out, the image processing device provided in the vehicle detects the parking areas drawn in the parking lot without any instruction from the driver. When the vehicle is to be moved backward and parked in a parking area, an image of the parking area taken by a camera provided on the vehicle is immediately presented to the driver to assist the driver in the backward maneuver. In the following, an occupant of the vehicle 1 including the driver is referred to as a "user".

  FIG. 1 is a diagram showing an outline of the parking assistance system 100 according to the present embodiment. The parking support system 100 includes a front camera 21, a right side camera 22, a left side camera 23, a rear camera 24, an image processing device 5, a display 6, a parking support device 7, and the like in the vehicle 1, and is a system that automatically detects a parking area PA while the vehicle 1 is traveling. Details of each part will be described later. The image processing device 5 automatically detects spaces between the parking frames 12 in which no other vehicle is parked, and when the vehicle 1 is to be parked, displays on the display 6 the parking area PA partitioned by the detected parking frames 12. When the user driving the vehicle 1 selects, from the parking areas PA displayed on the display 6, the parking area PA in which the vehicle is actually to be parked, the parking assistance device 7 provides support for parking in the selected parking area PA.

  As described above, the parking assistance system 100 according to the present embodiment automatically detects the parking area PA while the vehicle 1 is traveling, and can therefore display the parking area immediately when the user tries to park. Hereinafter, the configuration and processing of the parking assistance system 100 will be described.

<2. Configuration of parking support system>
FIG. 2 is a block diagram illustrating a configuration of the parking support system 100. As shown in FIG. 2, the parking support system 100 includes a photographing unit 2, a sensor unit 3, a GPS receiver 4, an image processing device 5, a display 6, and a parking support device 7.

  The photographing unit 2 includes a front camera 21, a right side camera 22, a left side camera 23, and a rear camera 24 as photographing devices, and photographs the outside of the vehicle. Each of the cameras 21 to 24 includes a wide-angle lens, captures a digital image over an angle of view of 180 degrees or more, and transmits the captured image data to the image processing device 5. By installing cameras at the front, rear, left, and right of the vehicle 1, the photographing unit 2 can photograph the entire circumference of the vehicle 1. Even when an image captured by the cameras 21 to 24 is distorted due to the distortion of the wide-angle lens, it can be corrected to a flat image by image correction processing.

  The front camera 21 is disposed at the tip of the vehicle 1 and photographs the front of the vehicle 1. The front camera 21 is installed on a so-called front grille of the vehicle 1, but it may be installed on the rear surface of the rearview mirror in the vehicle interior and photograph the front of the vehicle 1 through the windshield. The right side camera 22 is installed in the case of the right side mirror of the vehicle 1 and photographs the right side of the vehicle 1. The left side camera 23 is installed in the case of the left side mirror of the vehicle 1 and photographs the left side of the vehicle 1. The rear camera 24 is installed at the rear of the vehicle 1 and photographs the rear of the vehicle 1. The rear camera 24 is installed in the vicinity of the opening lever of the rear gate, but may be installed in the lower part of the rear spoiler. In this case, the shooting viewpoint is raised, and the rear of the vehicle can be shot farther.

  The sensor unit 3 includes a steering angle sensor 31, a vehicle speed sensor 32, a gear position sensor 33, and a gyro sensor 34. The sensor unit 3 also includes sensors that detect the rotation speed of the engine mounted on the vehicle 1, the temperature of the cooling system, and the like, but these are not illustrated. Each of the sensors 31 to 34 functions as an acquisition device that acquires vehicle information. Specifically, the steering angle sensor 31 detects the rotation angle of the steering wheel and transmits the angle information to the image processing device 5 and to a control device (not shown) of the electric power steering. The vehicle speed sensor 32, which is constituted by a magnetic or optical sensor, outputs the rotation of a rotor provided on a wheel shaft of the vehicle 1 as a pulse signal. The gear position sensor 33 detects the gear position from a transmission (not shown) and acquires gear position information. The gyro sensor 34 is a measuring instrument that detects the traveling angle and angular velocity of the vehicle 1.

  The GPS receiver 4 is an antenna that receives signals transmitted from a plurality of GPS (Global Positioning System) satellites orbiting above the vehicle 1. The signals from the GPS satellites include time data and satellite orbit information. The GPS receiver 4 transmits the received signals to the image processing device 5 as GPS data.

  The image processing device 5 is an electronic control device that detects a parking area by automatically recognizing a parking frame while the vehicle 1 is traveling. The image processing apparatus 5 includes a control unit 51, an input / output unit 52, and a storage unit 53.

  The control unit 51 is a computer including a CPU, RAM, and ROM (not shown). The control unit 51 is connected to the input / output unit 52 and the storage unit 53 provided in the image processing apparatus 5, and transmits / receives data to / from each other to control the entire image processing apparatus 5. The control unit 51 includes a detection unit 51a, a display control unit 51b, a movement amount calculation unit 51c, a coordinate value correction unit 51d, and a coordinate value update unit 51e. These units included in the control unit 51 are functions realized by the CPU performing arithmetic processing according to a program.

  The detection unit 51a recognizes a parking frame captured in an image of the outside of the vehicle photographed by the photographing unit 2, and performs a process of detecting a parking area and a parking available area. A parking frame is a frame line that delimits a parking space for a vehicle. A parking area is a parking space for a vehicle surrounded by a parking frame, and a parking available area is a parking area in which there is no obstacle such as another vehicle and in which the host vehicle can therefore be parked.

  The detection unit 51a recognizes the parking frame based on luminance changes in the images of the right side camera 22 and the left side camera 23 among the images taken by the cameras 21 to 24. Such recognition processing is preferably performed on a predetermined area in the image. The recognition process is limited to the predetermined area because, if the entire image were processed, the processing load would become enormous and obtaining the recognition result would be delayed. Another reason is that, in general, when the vehicle 1 is about to park, it is considered to travel while keeping a roughly constant distance from the parking frame or the parking area, so the parking frame or the parking area appears at a predetermined position in the image. That is, a predetermined area including the position where a parking frame or a parking area is expected to appear in the images of the right side camera 22 and the left side camera 23 is set, and the recognition process is performed only on that area, so the processing load can be reduced. Details of the method by which the detection unit 51a recognizes a parking frame are described later.

  The detection unit 51a also calculates a parking area from the recognized parking frame. If the entire extent of the parking area can be calculated from the images taken by the cameras 21 to 24, the area calculated from the parking frame is taken as the parking area. On the other hand, when only part of the extent can be calculated from the images, the parking area can be calculated from the recognized parking frame based on the width and depth of a standard public parking space defined by laws and regulations relating to parking lots. For example, the width and depth of a standard parking space are 2.5 m and 5.0 m, respectively.

  The detection unit 51a also detects a parking available area based on the calculated parking area and the result of obstacle detection within the parking area. In other words, if another vehicle is already parked in the parking area or some object is present there, the host vehicle cannot be parked, so it is determined whether the parking area is an area in which parking is possible. An obstacle can be detected based on the images taken by the photographing unit 2. The detection unit 51a detects a parking area in which no obstacle is present as a parking available area, and stores the coordinate values of its parking frame in the storage unit 53 as relative position information between the detected parking available area and the host vehicle. The process in which the detection unit 51a recognizes the parking frame and detects the parking area and the parking available area may hereinafter be referred to simply as the "detection process".

  The display control unit 51b generates a display image to be displayed on the display 6 and controls its display. The display control unit 51b captures the image information photographed by the photographing unit 2, stores it as a still image or moving image file, converts it into a data format that can be displayed on the display 6 to generate a display image, and outputs the display image to the display 6 via the output unit 52a of the input/output unit 52. For example, the display control unit 51b combines the images of the respective directions taken by the cameras 21 to 24 to synthesize an image showing the surroundings of the vehicle 1 including the parking area, and controls the display 6 to display it.

  The movement amount calculation unit 51c calculates the direction and distance over which the vehicle 1 has moved between successive detection processes. That is, the movement amount calculation unit 51c calculates the movement direction and movement distance of the vehicle 1 from the time the image processing apparatus 5 performed the previous detection process to the time the current detection process is performed. The movement amount calculation unit 51c calculates the movement amount of the vehicle 1 based on the values input from the steering angle sensor 31 and the vehicle speed sensor 32 of the sensor unit 3.

  The coordinate value correction unit 51d corrects the coordinate values of the parking frame based on the movement amount of the vehicle 1. As the vehicle 1 moves, the position of the recognized parking frame also moves relative to the vehicle, so the coordinate values of the parking frame must be corrected. Since the coordinate values of the previously recognized parking frame are stored in the storage unit 53, the coordinate value correction unit 51d corrects the stored coordinate values based on the movement amount of the vehicle.

  The coordinate value update unit 51e updates the coordinate values of the parking frame. The coordinate value update unit 51e compares the corrected coordinate values of the parking frame with the coordinate values of the parking frame newly recognized by the detection unit 51a, and determines whether they indicate the same parking frame or different parking frames. If it determines that they indicate the same parking frame, the coordinate value update unit 51e stores the corrected coordinate values as the new coordinate values of the parking frame. If it determines that they indicate different parking frames, the coordinate value update unit 51e treats this as detection of a new parking frame and adds the coordinate values corresponding to the new parking frame to the storage unit 53.

  The input / output unit 52 includes an output unit 52a, an information acquisition unit 52b, and a position acquisition unit 52c, and controls data input to and output from the image processing apparatus 5.

  When the vehicle 1 starts parking, the output unit 52a outputs a signal requesting the parking support device 7 to start parking support control. The output unit 52a also outputs to the parking assistance device 7 the information on the parking frame recognized by the detection unit 51a and the information on the detected parking area and parking available area. Further, the output unit 52a outputs the display image generated by the display control unit 51b to the display 6 in accordance with an output request from the control unit 51.

  The information acquisition unit 52b acquires the image data transmitted from the cameras 21 to 24 of the photographing unit 2, the data related to the operation of the vehicle 1 transmitted from the sensors 31 to 34 of the sensor unit 3, and the like. That is, the information acquisition unit 52b acquires image information obtained by the cameras 21 to 24 and vehicle information obtained by the sensors 31 to 34 provided in the vehicle 1.

  The position acquisition unit 52c acquires the position information of the vehicle 1 using GPS. The position acquisition unit 52c receives GPS data transmitted from the GPS receiver 4. The GPS data includes the time data received by the GPS receiver 4, satellite orbit information, and the like. The position acquisition unit 52c performs arithmetic processing on the received GPS data and calculates the position of the vehicle 1 in space as a latitude and longitude. The position acquisition unit 52c transmits the calculated latitude and longitude information to the control unit 51. For the calculation of the latitude and longitude by the position acquisition unit 52c, a positioning system using artificial satellites other than GPS, such as the Galileo positioning system, may be used.

  The storage unit 53 stores parking frame information 53a, map information 53b, and parameters 53c. The storage unit 53 is a nonvolatile semiconductor memory that can electrically read and write data and does not erase data even when the power is turned off. For example, the storage unit 53 is configured by an EEPROM (Electrically Erasable Programmable Read-Only Memory) or a flash memory. The storage unit 53 can also be configured by a hard disk drive provided with a magnetic disk.

  The parking frame information 53a includes information indicating the coordinate values of the recognized parking frames and of the detected parking areas and parking available areas. The map information 53b includes information on buildings and parking lots associated with latitude/longitude and addresses, and information on roads on which the vehicle 1 can travel. The parameters 53c are various parameters used in the detection process, such as the vehicle speed value used to determine the start of the detection process described later and the predetermined range used for edge detection.

  The display 6 is a display device, provided with a touch panel 61, that displays images and the like. For example, the display 6 is a liquid crystal display.

  The touch panel 61 senses the touch of a finger on an area indicating a button displayed on the display 6, and outputs the sensed position information on the display 6 to the information acquisition unit 52b. From the position information on the display 6 sensed by the touch panel 61 and the information on the display image generated by the display control unit 51b, the control unit 51 can recognize the content of the user's operation. For example, when the user presses with a finger one of a plurality of parking available areas displayed on the display 6, it is recognized that this parking available area has been selected, and the information corresponding to the selected parking available area is transmitted to the information acquisition unit 52b.

  The parking assist device 7 is an electronic control device that includes a route calculation unit 71 and a guidance unit 72. The parking assistance device 7 assists the vehicle 1 to be parked in the parking available area selected by the user. Specifically, the route calculation unit 71 calculates a route for parking from the position of the vehicle 1 and the position of the parking area. In addition, the guidance unit 72 includes an accelerator control unit, a brake control unit, and a steering control unit (not shown), and each control unit is configured to move the vehicle 1 along the route calculated by the route calculation unit 71. By controlling the movement of the vehicle, the vehicle 1 is automatically parked in the parking area.

  The parking assistance device 7 is not limited to one that automatically parks the vehicle 1. For example, the route calculation unit 71 may calculate a route for parking and display the route on the display 6. In this case, the user steers the vehicle while referring to the route displayed on the display 6.

<3. Processing of parking support system>
Next, the processing in which the image processing device 5 in the parking support system 100 recognizes a parking frame, detects and displays a parking available area, and the parking support device 7 performs parking support will be described. FIGS. 3 to 6 are flowcharts showing the flow of processing of the parking assistance system 100. This processing is started when an ignition switch (not shown) is turned on and power is supplied to the parking assistance system 100.

  First, the image processing device 5 determines whether or not the vehicle 1 is traveling (step S301). This determination is made based on the vehicle information that the information acquisition unit 52b acquires from the vehicle speed sensor 32, the gear position sensor 33, and the like of the sensor unit 3. For example, the image processing device 5 calculates the speed of the vehicle 1 by calculating the number of wheel rotations per unit time from the pulse signal received from the vehicle speed sensor 32, and determines from the speed whether or not the vehicle 1 is traveling. Alternatively, it may be determined that the vehicle is traveling if the gear position is in the D range.

  When it is determined that the vehicle 1 is not traveling (No in step S301), the detection process is not performed and the process proceeds to the subsequent processing (A in FIG. 3). On the other hand, when it is determined that the vehicle 1 is traveling (Yes in step S301), the image processing device 5 determines whether or not the vehicle speed is equal to or lower than a predetermined value (step S302). This determination is made by comparing the vehicle speed calculated from the information that the information acquisition unit 52b acquires from the vehicle speed sensor 32 with the parameter 53c stored in the storage unit 53. The parameter 53c in this case may be set as appropriate to a speed at which the vehicle 1 would travel while searching for a parking space (for example, 20 km/h).

  If it is determined that the vehicle speed is not equal to or lower than the predetermined value (No in step S302), the process proceeds to the subsequent processing without performing the detection process (A in FIG. 3). On the other hand, when it is determined that the vehicle speed is equal to or lower than the predetermined value (Yes in step S302), the travel information is regarded as having been acquired and the detection process of the parking area is started. In the present embodiment, information indicating that the vehicle is traveling corresponds to the travel information. In that case the parking frame recognition process would be performed whenever the vehicle is traveling, so step S302 is not strictly necessary. However, since the vehicle is unlikely to be parked while traveling at high speed and the parking frame recognition process could then be wasted, it is desirable to treat the fact that the vehicle speed is equal to or lower than the predetermined value as the travel information.
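
  As a rough illustration of this start condition (steps S301 and S302), and not part of the patent disclosure, the following Python sketch estimates the vehicle speed from the wheel-speed pulse signal and starts detection only while the vehicle is traveling at or below the threshold; the pulse constants, wheel circumference, and threshold are assumed values.

```python
PULSES_PER_REVOLUTION = 4      # assumed rotor pulse count per wheel revolution
WHEEL_CIRCUMFERENCE_M = 1.9    # assumed tire circumference [m]
SPEED_THRESHOLD_KMH = 20.0     # stands in for parameter 53c: speed at or below which detection starts

def vehicle_speed_kmh(pulse_rate_hz: float) -> float:
    """Estimate vehicle speed from the vehicle speed sensor's pulse signal."""
    revolutions_per_s = pulse_rate_hz / PULSES_PER_REVOLUTION
    return revolutions_per_s * WHEEL_CIRCUMFERENCE_M * 3.6

def should_start_detection(pulse_rate_hz: float, gear_position: str) -> bool:
    """Start parking-area detection only while the vehicle travels slowly enough."""
    speed = vehicle_speed_kmh(pulse_rate_hz)
    traveling = speed > 0.0 or gear_position == "D"       # step S301
    return traveling and speed <= SPEED_THRESHOLD_KMH     # step S302
```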

  Specifically, the parking area detection process in the present embodiment first performs a parking frame recognition process (step S303). That is, the information acquisition unit 52b acquires the image information captured by the cameras 21 to 24 of the photographing unit 2. Then the detection unit 51a recognizes the parking frames existing on the right side and the left side of the host vehicle based on the image information of the right and left side cameras 22 and 23 among the image information of the cameras 21 to 24 acquired by the information acquisition unit 52b. The parking frame is recognized by the detection unit 51a calculating the luminance of each pixel from the image information and detecting edges. An edge is a point at which the luminance between pixels changes by a predetermined magnitude or more.

  Specifically, the detection unit 51a compares the luminance between pixels arranged in the vertical direction (vertical direction) of the image among the calculated luminances. Then, the detection unit 51a detects pixels in which the luminance between the vertical pixels changes with a predetermined magnitude or more. The luminance change point between the pixels arranged in the vertical direction of the image is hereinafter referred to as “horizontal edge”. The case where the change in luminance is large in the vertical direction is a case where an object exists in the horizontal direction with respect to the image, and the change in luminance is caused by the presence of such an object. That is, the detection unit 51a detects a horizontal edge in the image.

  In addition, the detection unit 51a compares the luminance between pixels arranged in the horizontal direction of the image among the calculated luminances, and detects pixels in which the luminance between horizontally adjacent pixels changes by a predetermined magnitude or more. The change point of luminance between pixels arranged in the horizontal direction of the image is hereinafter referred to as a "vertical edge". A large change in luminance in the horizontal direction occurs when an object extends in the vertical direction of the image, because the presence of such an object causes the luminance change. That is, the detection unit 51a detects vertical edges in the image.

  When the horizontal edge and the vertical edge are detected, the detection unit 51a extracts an edge based on the parking frame line from the detected edges. Parking frame lines are generally often given bright colors such as white and yellow, and such colors have high luminance in the image. Also, the parking lot surface is often dark in color and has low brightness in the image. For this reason, the change between the brightness of the parking frame line and the brightness of the parking lot surface often falls within a predetermined range. Therefore, the detection unit 51a can detect an edge based on the parking frame line by detecting a horizontal edge and a vertical edge whose luminance change is within a predetermined range. The predetermined range may be set based on the difference between the luminance of a general parking frame line and the luminance of the parking lot surface, and stored in the storage unit 53 in advance.

  Next, the detection unit 51a calculates the distance between the detected horizontal edges and the distance between the vertical edges, and determines whether the distance corresponds to the width of the parking frame line. When the detected distance corresponds to the width of the parking frame line, the detection unit 51a determines that the region sandwiched between the edges is a parking frame, and recognizes the parking frame. As described above, since the parking frame line is brightly colored in the parking lot, an area in which the luminance change and the distance between edges correspond to the parking frame line can be regarded as a parking frame. On the other hand, when the detected distance does not correspond to the width of the parking frame line, the detection unit 51a determines that the region sandwiched between the edges is not a parking frame.
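
  Purely as an illustration of the kind of processing described here (it is not the code of the patent), the following Python sketch computes horizontal and vertical edges from luminance differences and pairs up luminance steps whose magnitude and spacing are consistent with a painted frame line; all thresholds are assumed values, not values from the patent.

```python
import numpy as np

EDGE_STEP = 40                  # assumed minimum luminance jump treated as an edge
LINE_LUMA_RANGE = (40, 200)     # assumed luminance-step range between frame line and road surface
LINE_WIDTH_PX = (5, 25)         # assumed on-image width of a painted frame line, in pixels

def horizontal_edges(luma: np.ndarray) -> np.ndarray:
    """Points where luminance changes strongly between vertically adjacent pixels."""
    return np.abs(np.diff(luma.astype(np.int32), axis=0)) >= EDGE_STEP

def vertical_edges(luma: np.ndarray) -> np.ndarray:
    """Points where luminance changes strongly between horizontally adjacent pixels."""
    return np.abs(np.diff(luma.astype(np.int32), axis=1)) >= EDGE_STEP

def frame_line_candidates(luma: np.ndarray, row: int) -> list[tuple[int, int]]:
    """On one image row, pair up luminance steps that bracket a bright painted line."""
    diff = np.diff(luma[row].astype(np.int32))
    steps = np.flatnonzero(
        (np.abs(diff) >= LINE_LUMA_RANGE[0]) & (np.abs(diff) <= LINE_LUMA_RANGE[1])
    )
    candidates = []
    for left, right in zip(steps[:-1], steps[1:]):
        if LINE_WIDTH_PX[0] <= right - left <= LINE_WIDTH_PX[1]:
            candidates.append((int(left), int(right)))   # candidate parking frame line
    return candidates
```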

  Next, the detection unit 51a performs a parking area detection process (step S304). A parking area is an area, delimited by parking frames, in which a vehicle is parked. When two or more areas recognized as parking frames are detected, the detection unit 51a determines whether the distance between the two areas is a predetermined distance. This is because, when two or more areas recognized as parking frames cannot be detected, the corners of the parking area cannot be determined and the entire extent of the parking area cannot be recognized.

  The predetermined distance is, for example, the width of a standard public parking space defined by laws and regulations relating to parking lots, in this case 2.5 m. However, the present invention is not limited to this, and the distance may be set as appropriate, for example to 3.0 m or to a value in the range of 2.5 m to 3.0 m. The value indicating the predetermined distance is also included in the parameters 53c stored in the storage unit 53. That is, when the distance between the two areas corresponds to the width of a general parking area, the detection unit 51a can determine that there is a parking area in the region sandwiched between the edges recognized as parking frames in step S303.

  Then, when two or more areas recognized as parking frames are detected and the corners of the parking area can be determined, the detection unit 51a can detect the entire parking area based on the width and depth of the standard public parking space defined by the above-mentioned laws and regulations relating to parking lots. As such a width and depth, for example, a width of 2.5 m and a depth of 5.0 m can be set. When two or more areas recognized as parking frames are detected and all four corners of the parking area can be determined, the entire parking area may be derived from those four corners.
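
  As an illustrative sketch of step S304 (an assumption about data layout, not the patent's implementation), the following pairs adjacent frame lines whose spacing matches a standard space width, completing the area with the standard depth when only part of it is visible; all coordinates are vehicle-relative metres.

```python
from dataclasses import dataclass

FRAME_SPACING_M = (2.5, 3.0)   # acceptable distance between two recognized frame lines
AREA_DEPTH_M = 5.0             # standard parking-space depth used to complete the area

@dataclass
class FrameLine:
    x: float   # lateral offset of the line from the vehicle [m]
    y: float   # position of the line along the vehicle's travel direction [m]

@dataclass
class ParkingArea:
    x: float       # lateral offset of the near corner [m]
    y: float       # longitudinal position of the near corner [m]
    width: float   # spacing between the two frame lines [m]
    depth: float   # completed with the standard depth when only partly visible [m]

def detect_parking_areas(frames: list[FrameLine]) -> list[ParkingArea]:
    """Pair adjacent frame lines whose spacing matches a standard space width."""
    frames = sorted(frames, key=lambda f: f.y)
    areas = []
    for near, far in zip(frames[:-1], frames[1:]):
        spacing = far.y - near.y
        if FRAME_SPACING_M[0] <= spacing <= FRAME_SPACING_M[1]:
            areas.append(ParkingArea(x=near.x, y=near.y, width=spacing, depth=AREA_DEPTH_M))
    return areas
```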

  Next, the detection unit 51a performs the detection process of a parking available area (step S305). Whether an area is a parking available area is determined by checking for the presence of an obstacle in the detected parking area. The obstacle is, for example, another vehicle, since the vehicle 1 cannot be parked in a parking area in which another vehicle is already parked. First, the detection unit 51a sets, above the parking area in the vertical direction of the image, a recognition area corresponding to the general vehicle height of a parked vehicle. That is, this recognition area is an area that would contain a substantial portion of the front or rear surface of a parked vehicle. The general height of a parked vehicle is, for example, 1.8 m.

  Then, the detection unit 51a detects horizontal edges in the recognition area and counts the number of detected horizontal edges. The detection unit 51a determines that a parked vehicle exists in the recognition area when the number of detected horizontal edges is equal to or greater than a predetermined number. When a parked vehicle is photographed by the photographing unit 2, the bumper, body structure, and the like of the vehicle appear as objects extending in the horizontal direction of the image, so when the number of detected horizontal edges is equal to or greater than the predetermined number, it can be determined that a parked vehicle exists in the parking area. When it determines that a vehicle is present in the parking area, the detection unit 51a determines that parking there is not possible, that is, that the area is not a parking available area. On the other hand, when it determines that no vehicle is present in the parking area, the detection unit 51a determines that parking there is possible, that is, it detects the area as a parking available area.
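
  A minimal sketch of this obstacle check in step S305 (the thresholds and the idea of passing the recognition-area bounds as row/column ranges are assumptions, not the patent's values) might look as follows.

```python
import numpy as np

EDGE_STEP = 40              # assumed minimum luminance jump treated as an edge
OCCUPIED_EDGE_COUNT = 500   # assumed threshold above which a parked vehicle is presumed

def is_parking_available(luma: np.ndarray,
                         rows: tuple[int, int],
                         cols: tuple[int, int]) -> bool:
    """Check the recognition area set above a detected parking area: a parked
    vehicle's bumper and body produce many horizontal edges, an empty space few."""
    region = luma[rows[0]:rows[1], cols[0]:cols[1]].astype(np.int32)
    horizontal_edges = np.abs(np.diff(region, axis=0)) >= EDGE_STEP
    return int(horizontal_edges.sum()) < OCCUPIED_EDGE_COUNT
```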

  When the parking available area is detected, the detection unit 51a calculates parking area information (step S306). The parking area information is a coordinate value indicating the position of the parking area relative to the vehicle 1. This coordinate value indicates the position of the parking area in real space with the position of the vehicle 1 as the reference. The coordinates (xi, yi) of each pixel in an image photographed by a camera correspond one-to-one to coordinates relative to the vehicle 1 in real space, so-called world coordinates (Xi, Yi). Therefore, by converting the coordinates of the detected parking area on the image into world coordinates in real space, the position of the parking area can be obtained with the position of the vehicle 1 in real space as the reference. In the following description, a coordinate value referred to without qualification denotes a world coordinate value. In the present embodiment, when the parking area is on the right side of the vehicle 1, its world coordinate value is written as (RX, RY), where RX is the distance from the vehicle 1 in the left-right direction and RY is the distance in the front-rear direction. Similarly, when the parking area is on the left side of the vehicle 1, its world coordinate value is written as (LX, LY).

  Note that the coordinate value to be acquired is not limited as long as it can specify the position of the parking area, and may be the value of any one or more points in the parking area. In the present embodiment, the coordinate values of the parking frame of the parking area are used as the parking area information. For this reason, the parking area information is hereinafter described as parking frame information.
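
  The patent itself only states the one-to-one pixel-to-world correspondence above. Purely as an illustrative assumption, one common way to realise such a mapping for points on the ground plane is a calibrated homography; the matrix below is a placeholder, not a calibrated value from the patent.

```python
import numpy as np

H = np.array([[0.01, 0.0, -3.2],     # placeholder 3x3 ground-plane homography: image px -> metres
              [0.0, 0.02, 1.5],
              [0.0, 0.001, 1.0]])

def pixel_to_world(xi: float, yi: float) -> tuple[float, float]:
    """Map an image pixel onto vehicle-relative world coordinates (Xi, Yi) in metres."""
    p = H @ np.array([xi, yi, 1.0])
    return float(p[0] / p[2]), float(p[1] / p[2])
```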

  When the detection unit 51a calculates the coordinate value of the parking frame, the detection unit 51a stores the coordinate value in the storage unit 53 as the parking frame information 53a. Note that the image processing apparatus 5 proceeds to the next process after performing the processes of steps S303 to S306 described above on both the right side and the left side of the vehicle 1 (B in FIG. 3).

  Next, the control unit 51 determines whether or not there is a result of a previous detection process (step S401). This is because, when the vehicle 1 has traveled between the previous detection process and the current detection process, the coordinate values of the previously detected parking area (the coordinate values of its parking frame) must be corrected by the amount of movement of the vehicle 1. Since the coordinate values of the parking area are stored in the storage unit 53 as the parking frame information 53a, the presence or absence of a previous detection result is determined by the control unit 51 checking whether the parking frame information 53a is stored in the storage unit 53. When there is no previous detection result (when no parking frame information 53a is stored) (No in step S401), the coordinate value correction process is not performed and the process proceeds to the subsequent processing.

  On the other hand, when there is a previous detection result (when there is parking frame information 53a) (Yes in step S401), a movement amount calculation process of the vehicle 1 is performed (step S402). The movement amount calculation process is performed by the movement amount calculation unit 51c calculating the movement amount in the X direction and the Y direction of the vehicle 1 between the previous detection process and the current detection process. The calculation method of the movement amount is as follows.

  First, the movement amount calculation unit 51c calculates the movement distance and movement direction of the vehicle 1 based on the output value of the steering angle sensor 31 and the output value of the vehicle speed sensor 32 acquired by the information acquisition unit 52b. If the movement distance is ΔL, ΔL can be calculated by the following equation.

ΔL = vehicle speed × elapsed time since the previous detection

Note that, as the elapsed time since the previous detection, a time corresponding to the detection cycle may be used when the detection unit 51a performs the detection process at a constant cycle; in a configuration having a clock function, the clock's time information may be used.

  If the moving direction is θi, θi can be calculated by the following equation.

θi = ΣΔL × Ki
Here, Ki = sin(steering angle) / wheelbase.

  Next, the movement amount calculation unit 51c calculates the movement amounts (dX, dY) in the X direction and the Y direction by the following equations.

dX = ΣΔL × cos θi
dY = ΣΔL × sin θi
In this way, it is possible to calculate the amount of movement of the vehicle in the X and Y directions from the previous detection process to the current detection process.
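
  As an illustrative sketch of the movement-amount calculation in step S402 (not the patent's code), the following integrates the formulas above over the sensor samples taken between the previous and the current detection process; the sample layout and the wheelbase value are assumptions.

```python
import math

WHEELBASE_M = 2.7   # assumed wheelbase of vehicle 1

def movement_since_last_detection(samples: list[tuple[float, float, float]]) -> tuple[float, float]:
    """samples: (vehicle_speed_m_per_s, steering_angle_rad, dt_s) readings taken
    between the previous and the current detection process. Returns (dX, dY)."""
    theta = 0.0
    dx = dy = 0.0
    for speed, steering_angle, dt in samples:
        delta_l = speed * dt                                        # ΔL = speed × elapsed time
        theta += delta_l * math.sin(steering_angle) / WHEELBASE_M   # θ accumulated from ΔL × Ki
        dx += delta_l * math.cos(theta)                             # dX = Σ ΔL·cosθ
        dy += delta_l * math.sin(theta)                             # dY = Σ ΔL·sinθ
    return dx, dy
```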

  Next, the coordinate value correction unit 51d corrects the coordinate values of the previous parking frame information 53a (step S403). This correction is performed because the parking area moves relative to the vehicle by the amount the vehicle 1 has moved between the previous detection process and the current detection process.

  When there are n detected parking available areas, the coordinate values of the parking available areas on the right side of the vehicle 1 are denoted (RXn, RYn), and those on the left side of the vehicle 1 are denoted (LXn, LYn). The coordinate value correction unit 51d corrects the coordinate values by the following expressions, using the movement amounts (dX, dY) of the vehicle 1 in the X and Y directions calculated by the movement amount calculation unit 51c.

RXn (current) = RXn (previous) + dX
RYn (current) = RYn (previous) + dY
LXn (current) = LXn (previous) + dX
LYn (current) = LYn (previous) + dY
When there are a plurality of coordinate values of the parking frame information 53a calculated in the previous detection process, the coordinate value correction unit 51d performs the same correction on each coordinate value.

  Next, the coordinate value update unit 51e performs a process of updating the parking frame information 53a (that is, the coordinate values of the parking available areas) stored in the storage unit 53. The coordinate value update process updates the previous coordinate values to the corrected coordinate values when a parking available area detected in the current detection process was also detected in the previous detection process. When a new parking available area is detected in the current detection process, its coordinate values are newly added and stored.

  First, the coordinate value update unit 51e acquires from the storage unit 53 the parking frame information 53a detected by the current detection process (step S404). When both left and right parking frame information 53a is stored in the storage unit 53, the coordinate value update unit 51e acquires both. The coordinate value update unit 51e also acquires from the storage unit 53 the corrected parking frame information 53a of the previous detection result.

  The coordinate value update unit 51e determines whether the parking frame information 53a acquired from the storage unit 53 includes coordinate values corresponding to a right parking available area (hereinafter referred to as "right coordinate values") (step S405). If no right coordinate value is included (No in step S405), the processing related to right coordinate values is not performed, and the process proceeds to the subsequent processing (C in FIG. 4).

  On the other hand, when the right coordinate value is included (Yes in step S405), the display control unit 51b controls the display 6 to display that there is a parking area on the right side (step S406).

  Thereafter, the coordinate value update unit 51e compares the corrected previous coordinate value with the coordinate value acquired in the current detection process (step S407). That is, the coordinate value update unit 51e compares the coordinate value corrected according to the movement amount of the vehicle 1 with the coordinate value actually acquired by the detection process. Specifically, the X-direction value of the previous coordinate value after correction is compared with the X-direction value of the coordinate value acquired in the current detection process, and the difference is calculated. Similarly, a difference is calculated in the Y-direction value.

  Next, the coordinate value update unit 51e determines whether or not the parking available area detected in the current detection process is the same as the parking available area detected in the previous detection process (step S408). This determination is made based on a result of the coordinate value update unit 51e comparing the corrected previous coordinate value with the current coordinate value. Specifically, as a result of the comparison performed in step S407, if the corrected previous coordinate value and the current coordinate value are the same, that is, if the difference is 0, the same parking area is assumed. to decide. Even if the comparison results are not exactly the same (difference is 0), if the difference is within a predetermined range, it is determined that the parking area is the same. The predetermined range may be a range where the difference can be regarded as an error range of the detection process, and may be set as appropriate, for example, 1 m, 2 m, or 3 m. The value indicating the predetermined range is also included in the parameter 53c stored in the storage unit 53. On the other hand, as a result of the comparison, when the corrected previous coordinate value and the current coordinate value are not the same and the difference is not within the predetermined range, it is determined that the areas are different parking areas.

  If the coordinate value update unit 51e determines that the currently detected parking available area is the same as the previously detected parking available area (Yes in step S408), it updates the coordinate values (step S409). That is, the coordinate value update unit 51e updates the parking frame information 53a by overwriting the stored previous coordinate values with the corrected previous coordinate values in the storage unit 53, and the process proceeds to the next step (C in FIG. 4). On the other hand, when the coordinate value update unit 51e determines that the currently detected parking available area is different from the previously detected parking available area (No in step S408), it treats this as a newly detected parking available area, newly stores the coordinate values corresponding to the currently detected parking available area in the storage unit 53 as parking frame information 53a (step S410), and the process proceeds to the next step (C in FIG. 4).
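
  A minimal sketch of this same-area decision and update (steps S407 to S410), assuming a simple dictionary layout for the stored coordinates and an illustrative tolerance standing in for the error-range parameter 53c, might look as follows.

```python
SAME_AREA_TOLERANCE_M = 1.0   # differences within this range are treated as the same area

def update_parking_frame_info(stored: list[dict], detected: list[dict]) -> None:
    """stored: corrected coordinate values from the previous detection process.
    detected: coordinate values obtained in the current detection process."""
    for new_area in detected:
        for old_area in stored:
            dx = abs(old_area["x"] - new_area["x"])   # comparison of step S407
            dy = abs(old_area["y"] - new_area["y"])
            if dx <= SAME_AREA_TOLERANCE_M and dy <= SAME_AREA_TOLERANCE_M:
                # Same area (step S408: Yes) -> keep the corrected value (step S409).
                break
        else:
            # No stored area matched (step S408: No) -> newly detected area (step S410).
            stored.append(dict(new_area))
```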

  Next, the coordinate value update unit 51e determines whether or not the parking frame information 53a acquired from the storage unit 53 includes coordinate values corresponding to a left parking available area (hereinafter referred to as "left coordinate values") (step S501). If no left coordinate value is included (No in step S501), the processing related to left coordinate values is not performed, and the process proceeds to the subsequent processing (A in FIG. 5).

  On the other hand, if the left coordinate value is included (Yes in step S501), the display control unit 51b controls the display 6 to display that there is a parking area on the left side (step S502).

  Thereafter, the coordinate value updating unit 51e compares the corrected previous coordinate value with the coordinate value acquired in the current detection process (step S503). That is, the coordinate value update unit 51e compares the coordinate value corrected according to the movement amount of the vehicle 1 with the coordinate value actually acquired by the detection process. A specific comparison method can be performed in the same manner as the processing using the right coordinate value (step S407).

  Next, the coordinate value update unit 51e determines whether the parking available area detected in the current detection process is the same as the parking available area detected in the previous detection process (step S504). This determination can also be performed in the same manner as the processing using the right coordinate value described above (step S408).

  When the coordinate value update unit 51e determines that the currently detected parking available area is the same as the previously detected parking available area (Yes in step S504), it updates the coordinate values (step S505). On the other hand, when the coordinate value update unit 51e determines that the currently detected parking available area is different from the previously detected parking available area (No in step S504), it treats this as a newly detected parking available area and newly stores the coordinate values corresponding to the currently detected parking available area in the storage unit 53 as parking frame information 53a (step S506). These coordinate value update and addition processes can be performed in the same manner as the update and addition processes for the right coordinate values described above (steps S409 and S410).

  Next, the image processing device 5 determines whether or not a gear position signal has been received (step S507). Specifically, it is determined whether or not the information acquisition unit 52b has received an output from the gear position sensor 33 of the sensor unit 3. If the information acquisition unit 52b determines that the output from the gear position sensor 33 has not been received (No in step S507), the process ends (D in FIG. 5).

  On the other hand, if the information acquisition unit 52b determines that an output from the gear position sensor 33 has been received (Yes in step S507), it determines whether or not the received output is a reverse-on signal, that is, a signal indicating that the gear position is in the R range, which serves as the parking start information (hereinafter referred to as the "reverse-on signal") (step S508). If the received signal is not a reverse-on signal (No in step S508), the process ends (D in FIG. 5). On the other hand, if the received signal is a reverse-on signal (Yes in step S508), the process proceeds to the next step (E in FIG. 5).
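
  As a small illustrative sketch (the signal names are assumptions, not the patent's identifiers), steps S507 and S508 amount to triggering the display step only when a gear position signal indicating the R range, i.e. the reverse-on signal, has been received.

```python
from typing import Optional

def is_parking_start(gear_position: Optional[str]) -> bool:
    """Return True when the display of parking available areas should be triggered."""
    if gear_position is None:      # no output from the gear position sensor (step S507: No)
        return False
    return gear_position == "R"    # reverse-on signal = parking start information (step S508)
```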

  Next, the display control unit 51b performs a process of extracting the parking available areas within the display range (step S601). That is, when reverse is turned on (when the gear position is set to the R range), an image captured by the rear camera 24 is displayed on the display 6, so the parking available areas included in the range displayed on the display 6 are extracted from the detected parking available areas. For the rear camera 24 as well, the coordinates (xi, yi) of each pixel in the rear camera image correspond one-to-one to the world coordinates (Xi, Yi) relative to the vehicle 1 in real space. Therefore, by comparing the world coordinate values of the parking available areas with the world coordinate range covered by the image captured by the rear camera 24, the parking available areas included in the range displayed on the display 6 are extracted.

  When the parkable areas have been extracted, a process of sorting their coordinate values is performed (step S602). The display control unit 51b highlights the parkable area with the highest priority among the parkable areas displayed on the display 6. In the present embodiment, a parkable area closer to the vehicle 1 is given a higher priority. Therefore, the display control unit 51b compares the coordinate values of the parkable areas, assigns higher priority in order of shorter distance from the vehicle 1, and sorts the parkable areas in descending order of priority. Note that the priority order is determined based on a preset criterion; the distance from the vehicle 1 is the criterion in the present embodiment, but the criterion is not limited to this. For example, areas on either the left or the right side may be given a higher priority, or a larger parkable area may be given a higher priority.
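
  The sorting of step S602 amounts to ordering the areas by a priority key. A minimal sketch follows; using the centroid distance as the key is an assumption, since the embodiment only states that areas nearer the vehicle 1 receive higher priority.

import math

def sort_by_priority(parking_areas):
    """Return the parkable areas ordered so the one closest to the vehicle
    (origin of the vehicle coordinate system) comes first."""
    def centroid_distance(area):
        cx = sum(x for x, _ in area) / len(area)
        cy = sum(y for _, y in area) / len(area)
        return math.hypot(cx, cy)
    return sorted(parking_areas, key=centroid_distance)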

  Next, the display control unit 51b performs control to display the parkable areas on the display 6 (step S603). That is, the world coordinate values of each parkable area are converted into camera coordinate values in the rear camera image, and the parkable areas are drawn superimposed on the rear camera image. FIG. 7 is a diagram illustrating an example in which parkable areas PA are displayed on the display 6. As shown in FIG. 7, the parkable area PA closest to the vehicle 1 is displayed surrounded by a solid line, and the second-closest parkable area PA is displayed surrounded by a dotted line. That is, the display control unit 51b performs control to display the parkable area PA closest to the vehicle 1, which has the highest priority, surrounded by a solid line, and to display the parkable areas PA at the second and subsequent distances from the vehicle 1, which have lower priority, surrounded by dotted lines.
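
  The conversion and overlay of step S603 can be sketched as below. The flat-ground homography H and the OpenCV drawing calls are assumptions of this illustration (the embodiment only states that pixel and world coordinates correspond one-to-one for the rear camera 24), and the dotted outline of the lower-priority areas is approximated here by a thinner line.

import numpy as np
import cv2

def world_to_pixel(world_pts, H):
    """Project ground-plane world coordinates (X, Y) into rear-camera pixel
    coordinates with a 3x3 homography H."""
    pts = np.hstack([np.asarray(world_pts, dtype=np.float64),
                     np.ones((len(world_pts), 1))])
    proj = (H @ pts.T).T
    return (proj[:, :2] / proj[:, 2:3]).astype(np.int32)

def draw_parking_areas(image, sorted_areas, H):
    """Draw each parkable area on the rear-camera image; the first (highest
    priority) area gets a thick outline, the others a thin one."""
    for rank, corners in enumerate(sorted_areas):
        pixel_corners = world_to_pixel(corners, H).reshape(-1, 1, 2)
        cv2.polylines(image, [pixel_corners], isClosed=True,
                      color=(0, 255, 0), thickness=3 if rank == 0 else 1)
    return image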

  Returning to FIG. 6, the display control unit 51b determines whether or not the user has selected any of the parkable areas displayed on the display 6 (step S604). The display control unit 51b acquires the coordinate position at which the user operated the touch panel 61 and determines which parkable area has been selected. In this way, the parkable area in which the user actually intends to park the vehicle 1 can be determined. When the display control unit 51b determines that the user has not selected a parkable area (No in step S604), the process ends.

  On the other hand, when the display control unit 51b determines that the user has selected a parkable area (Yes in step S604), the coordinate values are re-sorted (step S605). That is, the display control unit 51b raises the priority of the parkable area selected by the user among the parkable areas displayed on the display 6 and lowers the priority of the other, unselected parkable areas. Then, according to the newly set priorities, the display control unit 51b redisplays the parkable area selected by the user, which now has the highest priority, surrounded by a solid line, and the other parkable areas surrounded by dotted lines (step S606). FIG. 8 is a diagram illustrating an example in which the parkable areas PA are redisplayed. As illustrated in FIG. 8, when the user selects the second-closest parkable area PA, the display control unit 51b performs control so that the second-closest parkable area PA is displayed on the display 6 surrounded by a solid line and the closest parkable area PA is displayed surrounded by a dotted line.
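
  The re-sorting of step S605 can be sketched as a point-in-polygon test against the touch position followed by moving the hit area to the front. Matching the touch panel 61 coordinates to the on-screen outlines this way, and the use of cv2.pointPolygonTest, are assumptions of this illustration.

import numpy as np
import cv2

def resort_on_selection(sorted_areas, pixel_polygons, touch_xy):
    """pixel_polygons[i] is the on-screen outline of sorted_areas[i]; the
    touched area is promoted to the highest priority (step S605)."""
    for idx, poly in enumerate(pixel_polygons):
        contour = np.asarray(poly, dtype=np.float32).reshape(-1, 1, 2)
        if cv2.pointPolygonTest(contour, tuple(map(float, touch_xy)), False) >= 0:
            # Selected area moves to the front; the others keep their relative order.
            return [sorted_areas[idx]] + sorted_areas[:idx] + sorted_areas[idx + 1:]
    return sorted_areas   # touch outside every area: priorities unchanged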

  Returning to FIG. 6 again, the image processing device 5 sets the parkable area selected by the user as the target parking area (step S607), and outputs the coordinate values of the target parking area and a parking assistance request signal to the parking assistance device 7 via the output unit 52a.

  The parking assistance device 7 performs the parking assistance process when the coordinate values of the target parking area and the parking assistance request signal are input (step S608). Specifically, the route calculation unit 71 first calculates a route for parking the vehicle 1 in the target parking area. The route calculation unit 71 calculates the route from the current position of the vehicle 1 to the target parking area by performing a predetermined calculation process based on the coordinate values of the target parking area and on the overall length, vehicle width, turning radius, and the like of the vehicle 1 stored in advance in the parking assistance device 7.
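
  The patent does not disclose how the route calculation unit 71 computes the route, so the following is only a deliberately simplified, single-arc sketch of the kind of geometry involved: the target point (dx, dy) is reached along one circular arc tangent to the current heading, and the path is considered feasible when the required radius is not smaller than an assumed minimum turning radius. A practical planner would use multi-segment paths.

import math

def single_arc_path(dx, dy, min_turn_radius=4.5):
    """Radius, swept angle, and feasibility of a single circular arc from the
    vehicle origin (heading along +x) to the target point (dx, dy)."""
    if abs(dy) < 1e-6:
        return {"feasible": True, "radius": math.inf, "arc_length": abs(dx)}
    radius = (dx * dx + dy * dy) / (2.0 * abs(dy))            # circle tangent to the heading at the origin
    theta = math.atan2(dx / radius, 1.0 - abs(dy) / radius)   # swept angle along the arc
    return {"feasible": radius >= min_turn_radius,
            "radius": radius,
            "arc_length": radius * abs(theta)}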

  When the route to the target parking area has been calculated, the guidance unit 72 controls the accelerator, the brake, the steering, and the like, and executes parking control so that the vehicle 1 travels along the route calculated by the route calculation unit 71. When the guidance unit 72 has parked the vehicle 1 in the parking area defined by the coordinate values of the target parking area, the driving assistance process by the parking assistance device 7 ends.

  The parking assistance device 7 is not limited to a configuration in which the vehicle 1 travels automatically and is parked in the target parking area; it may instead be configured to assist the driver's own driving, for example by displaying the route calculated by the route calculation unit 71 on the display 6 as guide lines.

  As described above, in the parking assistance system 100 according to the present embodiment, the parkable areas are detected continuously while the vehicle 1 is traveling, so information on the parkable areas is always available. For this reason, the parkable areas can be displayed on the display 6 immediately when the gear is shifted into reverse at the start of parking. If detection of the parkable areas were started only after receiving a user instruction, the detection might take time, and the parkable areas might not be displayed immediately even after the gear is shifted into reverse; such an inconvenience is avoided in the present embodiment.

<4. Modification>
Although an embodiment of the present invention has been described above, the present invention is not limited to the above embodiment, and various modifications are possible. Such modifications are described below. All of the forms, including the embodiment described above and the forms described below, can be combined with one another as appropriate.

  In the above embodiment, an example in which the image captured by the rear camera 24 is displayed on the display 6 when reverse is turned on has been described, but the present invention is not limited to this. For example, the images captured by the cameras 22 to 24 may be combined so that an image showing the surroundings of the vehicle 1 from above is displayed on the display 6.

  A display example in this case is shown in FIG. 9. The display control unit 51b performs control to combine the images captured by the cameras 22 to 24 and display on the display 6 an overhead image viewed from a virtual viewpoint set above the vehicle 1. In this case too, as in the embodiment described above, the display control unit 51b performs control so that the parkable area PA closest to the vehicle 1, which has the highest priority, is displayed surrounded by a solid line, and the parkable areas PA at the second and subsequent distances, which have lower priority, are displayed surrounded by dotted lines. In FIG. 9, since the vehicle 1 itself is not captured by any camera, a bitmap image of a vehicle is superimposed on the overhead image to represent the vehicle 1.
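
  The composition of the overhead image can be sketched as warping each camera image onto a common top-down canvas and pasting a vehicle bitmap at the centre. The per-camera homographies are assumed to be pre-calibrated; their calibration, the canvas size, and the dictionary-based interfaces are assumptions of this illustration.

import numpy as np
import cv2

def make_overhead_view(camera_images, topdown_H, vehicle_icon, out_size=(400, 600)):
    """camera_images: dict of camera name -> BGR image; topdown_H: dict of
    camera name -> 3x3 homography mapping that camera's pixels onto the
    top-down canvas; vehicle_icon: bitmap pasted at the canvas centre,
    standing in for the vehicle 1, which no camera can see."""
    w, h = out_size
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    for name, img in camera_images.items():
        warped = cv2.warpPerspective(img, topdown_H[name], (w, h))
        mask = warped.any(axis=2)              # copy only pixels this camera actually covers
        canvas[mask] = warped[mask]
    ih, iw = vehicle_icon.shape[:2]
    top, left = (h - ih) // 2, (w - iw) // 2   # assumes the icon fits inside the canvas
    canvas[top:top + ih, left:left + iw] = vehicle_icon
    return canvas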

  When the user selects any of the parkable areas displayed on the display 6, the priority of the selected parkable area is raised to the highest, and the priorities of the other, unselected parkable areas are lowered. Then, according to the newly set priorities, the display control unit 51b redisplays the parkable area selected by the user, which now has the highest priority, surrounded by a solid line, and the other parkable areas surrounded by dotted lines. FIG. 10 is a diagram illustrating an example in which the parkable areas PA in this modification are redisplayed. As illustrated in FIG. 10, when the user selects the third-closest parkable area PA, the display control unit 51b performs control so that the third-closest parkable area PA is displayed on the display 6 surrounded by a solid line and the closest and second-closest parkable areas PA are displayed surrounded by dotted lines.

  Moreover, in the above embodiment, the parking frame recognition process is performed when the vehicle speed is equal to or lower than a predetermined value (Yes in step S302), but the configuration is not limited to this. For example, the parking frame recognition process may be executed after detecting, based on the position information acquired by the position acquisition unit 52c and the map information stored in the storage unit 53, that the vehicle 1 has entered a parking lot.
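
  The parking-lot trigger of this modification could, for example, be a point-in-polygon test of the vehicle position against parking-lot outlines taken from the map information 53b. Representing parking lots as polygons and the even-odd test below are assumptions of this sketch.

def inside_parking_lot(position, parking_lot_polygons):
    """Return True when the position (x, y in map coordinates) lies inside any
    of the parking-lot polygons."""
    px, py = position
    def contains(poly):
        inside = False
        for i in range(len(poly)):
            x1, y1 = poly[i]
            x2, y2 = poly[(i + 1) % len(poly)]
            if (y1 > py) != (y2 > py) and px < (x2 - x1) * (py - y1) / (y2 - y1) + x1:
                inside = not inside
        return inside
    return any(contains(poly) for poly in parking_lot_polygons)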

  Further, in the above embodiment, the various functions are described as being realized in software by arithmetic processing of the CPU according to a program; however, some of these functions may instead be realized by electrical hardware circuits. Conversely, some of the functions realized by hardware circuits may be realized by software.

DESCRIPTION OF SYMBOLS: 1 vehicle, 2 imaging unit, 3 sensor unit, 4 GPS receiver, 5 image processing device, 51 control unit, 51a detection unit, 51b display control unit, 51c movement amount calculation unit, 51d coordinate value correction unit, 51e coordinate value updating unit, 52 input/output unit, 52a output unit, 52b information acquisition unit, 52c position acquisition unit, 53 storage unit, 53a parking frame information, 53b map information, 53c parameters, 6 display, 7 parking assistance device, 100 parking assistance system

Claims (13)

  1. An image processing device that detects, based on an image, a parking area that is a parking target of a vehicle, the image processing device comprising:
    image acquisition means for acquiring an image obtained by a photographing device that photographs the outside of the vehicle;
    vehicle information acquisition means for acquiring vehicle information obtained by an acquisition device that acquires the vehicle information; and
    detection means for starting detection of the parking area when the vehicle information acquisition means acquires travel information of the vehicle.
  2. The image processing device according to claim 1, further comprising:
    image generation means for generating a display image for displaying the parking area on an external display device, and for outputting the display image to the display device when the vehicle information acquisition means acquires parking start information of the vehicle.
  3. The image processing device according to claim 2,
    wherein the parking start information is information indicating that a gear of the vehicle has been switched to a reverse gear.
  4. The image processing device according to claim 2 or 3,
    wherein, when the display image includes a plurality of parking areas,
    the image generation means generates a display image in which the parking area having the highest priority based on a preset criterion is emphasized.
  5. The image processing device according to any one of claims 2 to 4,
    wherein, when the display image includes a plurality of parking areas,
    the image generation means generates a display image in which a parking area selected based on a user operation is emphasized.
  6. The image processing device according to any one of claims 1 to 5,
    wherein the vehicle information acquisition means acquires speed information of the vehicle, and
    the detection means starts detection of the parking area when the speed of the vehicle is equal to or lower than a predetermined value.
  7. The image processing device according to any one of claims 1 to 6, further comprising:
    position information acquisition means for acquiring position information of the vehicle; and
    parking lot information acquisition means for acquiring parking lot information,
    wherein the detection means starts detection of the parking area when the traveling position of the vehicle is in a parking lot.
  8. A parking assistance system comprising:
    a photographing device that photographs the outside of a vehicle;
    an acquisition device that acquires vehicle information; and
    an image processing device that detects a parking area that is a parking target of the vehicle, based on an image captured by the photographing device, when the acquisition device acquires travel information of the vehicle.
  9. The parking assistance system according to claim 8, further comprising:
    a display device that displays an image,
    wherein the image processing device generates a display image including the parking area, and
    the display device displays the display image when the acquisition device acquires parking start information of the vehicle.
  10. The parking assistance system according to claim 9,
    wherein, when the display image includes a plurality of parking areas,
    the image processing device generates a display image in which the parking area having the highest priority based on a preset criterion is emphasized.
  11. The parking assistance system according to claim 9 or 10,
    wherein, when the display image includes a plurality of parking areas,
    the image processing device generates a display image in which a parking area selected based on a user operation is emphasized.
  12. The parking assistance system according to any one of claims 8 to 11,
    wherein the photographing devices are provided on the left and right sides of the vehicle, and
    the image processing device detects parking areas on both the left and right sides of the vehicle based on the images captured by the respective photographing devices.
  13. An image processing method for detecting, based on an image, a parking area that is a parking target of a vehicle, the method comprising:
    (a) acquiring an image obtained by a photographing device that photographs the outside of the vehicle;
    (b) acquiring vehicle information obtained by an acquisition device that acquires the vehicle information; and
    (c) starting detection of the parking area when travel information of the vehicle is acquired in the step (b).
JP2012016164A 2012-01-30 2012-01-30 Apparatus and method for processing image, and parking support system Pending JP2013154730A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012016164A JP2013154730A (en) 2012-01-30 2012-01-30 Apparatus and method for processing image, and parking support system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012016164A JP2013154730A (en) 2012-01-30 2012-01-30 Apparatus and method for processing image, and parking support system

Publications (1)

Publication Number Publication Date
JP2013154730A true JP2013154730A (en) 2013-08-15

Family

ID=49050369

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012016164A Pending JP2013154730A (en) 2012-01-30 2012-01-30 Apparatus and method for processing image, and parking support system

Country Status (1)

Country Link
JP (1) JP2013154730A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001030936A (en) * 1999-07-21 2001-02-06 Nissan Motor Co Ltd Vehicle operation control device
JP2001199298A (en) * 2000-01-19 2001-07-24 Equos Research Co Ltd Parking aiding device and computer-readable recording medium with parking aiding program recorded
JP2004048295A (en) * 2002-07-10 2004-02-12 Toyota Motor Corp Image processor, parking assist apparatus, and image processing method
JP2006224778A (en) * 2005-02-16 2006-08-31 Denso Corp Parking support apparatus
JP2008221942A (en) * 2007-03-09 2008-09-25 Clarion Co Ltd Parking support device
JP2009083735A (en) * 2007-10-01 2009-04-23 Nissan Motor Co Ltd Parking assistant device and method
JP2009140175A (en) * 2007-12-05 2009-06-25 Toyota Motor Corp White line detection device
WO2009090695A1 (en) * 2008-01-16 2009-07-23 Mitsubishi Electric Corporation Sensor system for vehicle
JP2009205191A (en) * 2008-02-26 2009-09-10 Hitachi Ltd Parking space recognition system
JP2011034297A (en) * 2009-07-31 2011-02-17 Clarion Co Ltd Parking space recognition device
JP2011046335A (en) * 2009-08-28 2011-03-10 Nissan Motor Co Ltd Device and method for parking support
WO2011158713A1 (en) * 2010-06-18 2011-12-22 アイシン精機株式会社 Parking support device
JP2012001081A (en) * 2010-06-16 2012-01-05 Nissan Motor Co Ltd Parking support system

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015064795A (en) * 2013-09-25 2015-04-09 日産自動車株式会社 Determination device and determination method for parking place
JP2015074254A (en) * 2013-10-04 2015-04-20 アイシン精機株式会社 Parking assisting device
JP2015076645A (en) * 2013-10-04 2015-04-20 本田技研工業株式会社 Vehicle periphery display device
US10068483B2 (en) 2013-10-04 2018-09-04 Aisin Seiki Kabushiki Kaisha Parking assistance device
US20150097956A1 (en) * 2013-10-04 2015-04-09 Aisin Seiki Kabushiki Kaisha Parking assistance device
US9983020B2 (en) 2014-03-12 2018-05-29 Nissan Motor Co., Ltd. Vehicle operation device and method
JPWO2015137012A1 (en) * 2014-03-12 2017-04-06 日産自動車株式会社 Vehicle operating device
JP2018138929A (en) * 2014-03-12 2018-09-06 日産自動車株式会社 Vehicle operation device
JP2016126605A (en) * 2015-01-06 2016-07-11 株式会社日立製作所 Travel environment recognition system
US9869966B2 (en) 2015-10-09 2018-01-16 Fuji Xerox Co., Ltd. Image forming apparatus and removable unit
US10319233B2 (en) 2015-10-22 2019-06-11 Nissan Motor Co., Ltd. Parking support method and parking support device
JPWO2017068694A1 (en) * 2015-10-22 2018-09-13 日産自動車株式会社 Parking support method and parking support device
WO2017068694A1 (en) * 2015-10-22 2017-04-27 日産自動車株式会社 Parking support method and parking support device
WO2017179206A1 (en) * 2016-04-15 2017-10-19 三菱電機株式会社 Display control device for parking assistance, and display control method for parking assistance
JPWO2017179205A1 (en) * 2016-04-15 2018-09-27 三菱電機株式会社 Parking assistance display control apparatus and parking assistance display control method
WO2017179205A1 (en) * 2016-04-15 2017-10-19 三菱電機株式会社 Display control device for parking assistance, and display control method for parking assistance

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20150115

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20151127

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20151201

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20160121

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20160517

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20161129