WO2021048584A1 - Parking support method and apparatus - Google Patents

Parking support method and apparatus

Publication number
WO2021048584A1
WO2021048584A1 (application PCT/IB2019/001101)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
parking
parking support
garage
Prior art date
Application number
PCT/IB2019/001101
Other languages
English (en)
Japanese (ja)
Inventor
鈴木康啓
小林隼也
Original Assignee
Nissan Motor Co., Ltd. (日産自動車株式会社)
Renault S.A.S.
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd. (日産自動車株式会社) and Renault S.A.S.
Priority to PCT/IB2019/001101
Publication of WO2021048584A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R1/27: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R99/00: Subject matter not provided for in other groups of this subclass
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a parking support method and a parking support device that assist a vehicle in entering or leaving a garage.
  • A parking assist system is known that measures the contour of the garage and then automatically steers the vehicle and parks it at the final parking position in the garage so as to avoid contact or collision with the garage walls (see Patent Document 1).
  • A problem to be solved by the present invention is to provide a parking support method and a parking support device that can provide a user with information on the entrance/exit of a garage.
  • The present invention is a parking support method in which a controller generates a first parking support image, which is a bird's-eye view image from a virtual viewpoint above the vehicle, and displays it on a display. When it is determined that the vehicle is to be parked in a garage, an obstacle existing beside the entrance/exit of the garage is identified from the data of a surrounding recognition sensor, and a second parking support image including this obstacle is generated and displayed.
  • According to the present invention, it is possible to provide the user with information on the entrance/exit of the garage and, if there is an obstacle beside the entrance/exit, with information on that obstacle.
  • FIG. 1 is a block diagram showing a remote parking system 1 to which the parking support method and the parking support device of the present invention are applied.
  • Here, "autonomous driving control" means that the vehicle is driven by automatic control of an on-board driving control device, without depending on the driving operation of the driver.
  • "Autonomous parking control" is a kind of autonomous driving control and means that the vehicle is parked (enters or leaves a parking space) by automatic control of an on-board driving control device, without depending on the driver's driving operation.
  • "Parking" means keeping the vehicle stationary in a parking space, while "traveling route" includes not only the parking route for entering the parking space but also the exit route for leaving it.
  • The "parking support method and parking support device" cover both parking support when entering the parking space and parking support when leaving it.
  • Putting the vehicle into the garage is also referred to as entry (warehousing), and taking it out of the garage as exit (delivery).
  • The remote parking system 1 of the present embodiment has an assist mode, in which an operator such as the driver rides in the vehicle and autonomous driving control is performed with the operator able to intervene, and a remote control mode, in which the operator gets out of the vehicle and autonomous driving control is performed from outside it using a remote controller.
  • The remote parking system 1 of the present embodiment enters or leaves the garage by autonomous driving control when putting the vehicle into a parking space or taking it out. More specifically, the driver gets out partway through the garage entry and, while confirming safety, keeps the vehicle under autonomous parking control by continuously transmitting an execution command signal to the vehicle with a remote controller. When the vehicle might collide with an obstacle, the autonomous parking control is stopped by stopping transmission of the execution command signal from the remote controller.
  • The autonomous driving control mode in which an operator such as the driver rides in the vehicle and can perform an intervention operation is called the assist mode, and the autonomous driving control mode in which the operator gets out and the vehicle enters or leaves the garage under remote control is called the remote control mode.
  • When entering the garage in the remote control mode or the assist mode, the selected mode is activated, the entry route to the selected parking space is calculated, and autonomous parking control is started.
  • In the remote control mode, the driver carries the remote controller and gets out before autonomous parking control starts. The driver who has gotten out completes the garage entry by continuously transmitting the execution command signal to the vehicle with the remote controller. In the assist mode, autonomous parking control is started while the driver remains in the vehicle.
  • When taking the vehicle out of the parking space in the remote control mode, the driver starts the internal combustion engine or drive motor from outside the vehicle using the remote controller and activates the remote exit mode. Autonomous exit control is then started by calculating the exit route to the selected garage exit position; the driver completes the exit by continuing to transmit the execution command with the remote controller, and then gets into the vehicle.
  • In the assist mode, the driver gets into the vehicle, starts the internal combustion engine or drive motor, and activates the exit function of the assist mode. The vehicle calculates the exit route to the garage exit position selected by the driver, starts autonomous exit control, and stops at the exit position.
  • The remote parking system 1 of the present embodiment thus includes both the remote control mode, in which such remote control is used in combination, and the assist mode.
  • Although backward autonomous parking control is illustrated as an example of autonomous parking control, the present invention can also be applied to garage parking, parallel autonomous parking, and other forms of autonomous parking.
  • The remote parking system 1 of the present embodiment includes a target parking space setting device 11, a vehicle position detector 12, an object detector 13, a parking route generation unit 14, an object deceleration calculation unit 15, a route tracking control unit 16, a target vehicle speed generation unit 17, a steering angle control unit 18, a vehicle speed control unit 19, a master unit 20, a remote controller 21, and a slave unit 22. The components from the target parking space setting device 11 through the vehicle speed control unit 19, together with the master unit 20, are mounted on the vehicle, while the remote controller 21 and the slave unit 22 are carried by an operator such as the driver. Each component is described below.
  • The target parking space setting device 11 searches for parking spaces existing around the own vehicle in the remote control mode and lets the operator select a desired parking space from those available for parking. It also outputs the position information of the selected parking space (relative position coordinates from the current position of the own vehicle, latitude/longitude, etc.) to the parking route generation unit 14.
  • the target parking space setting device 11 corresponds to the controller of the present invention.
  • As shown in FIG. 2, the target parking space setting device 11 includes an input switch 111, a plurality of cameras 112a to 112d, a parking space detection unit 113, a display control unit 114, and a touch panel display (hereinafter referred to as the display) 115.
  • The input switch 111 selects between the remote control mode and the assist mode.
  • the plurality of cameras 112a to 112d correspond to the photographing unit of the present invention and photograph the surroundings of the vehicle.
  • The cameras 112a to 112d are digital cameras, infrared cameras, or other image pickup devices that use an image sensor.
  • the cameras 112a to 112d may also be used as the camera of the object detector 13 described later.
  • The parking space detection unit 113 is a computer on which a software program is installed that detects objects around the vehicle from the images taken by the plurality of cameras 112a to 112d and from the camera, radar (millimeter wave radar, laser radar, ultrasonic radar, etc.) or sonar of the object detector 13, and that detects a parking space available for parking based on the resulting detection data.
  • the display control unit 114 processes the captured images of the cameras 112a to 112d to generate a parking support image including the parking space.
  • the display 115 corresponds to the display unit of the present invention and is used for displaying a parking support image and selecting a parking space.
  • the target parking space setting device 11 photographs the surroundings of the own vehicle with a plurality of cameras 112a to 112d. Further, the target parking space setting device 11 operates the camera, radar, sonar, or the like of the object detector 13 to detect an object such as a building, a parked vehicle, or a person around the vehicle.
  • The parking space detection unit 113 analyzes the captured images taken by the cameras 112a to 112d and detects a parking space available for parking based on the detection data, including the object positions, output from the object detector 13.
  • the target parking space setting device 11 generates a parking support image including a parking space that can be parked by the display control unit 114 and displays it on the display 115, prompting the operator to select a parking space in which the vehicle is to be parked.
  • the target parking space setting device 11 outputs the position information of the parking space to the parking route generation unit 14.
  • Alternatively, parking lot information may be used.
  • FIG. 3 is a diagram showing an arrangement example of cameras 112a to 112d mounted on the own vehicle V.
  • The camera 112a is arranged on the front grille of the own vehicle V, the camera 112d near the rear bumper, and the cameras 112b and 112c below the left and right door mirrors.
  • As the cameras 112a to 112d, cameras provided with a wide-angle lens having a large viewing angle can be used.
  • The display control unit 114 synthesizes the captured images taken by the cameras 112a to 112d and generates, as a parking support image, a bird's-eye view image of the surroundings, including the own vehicle V and the parking space in which it is to be parked, seen from a virtual viewpoint P above the own vehicle V.
  • For the image processing performed by the display control unit 114, the method described in, for example, "Masayasu Suzuki, Satoshi Chino, Teruhisa Takano, Development of the bird's-eye view system, Proceedings of the Society of Automotive Engineers of Japan, 116-07 (2007-10), pp. 17-22" can be used.
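The cited bird's-eye view method is not reproduced here, but the general idea of such a synthesis can be sketched as follows: each camera image is warped onto the ground plane through a homography, and the warped views are composited around the own vehicle icon. The homographies, image sizes, and the first-nonzero compositing rule below are illustrative assumptions, not details from the embodiment.

```python
import numpy as np

def warp_to_birds_eye(img, H, out_h, out_w):
    """Inverse-map every bird's-eye pixel through the homography H
    (bird's-eye coords -> camera image coords) and sample the source
    image with nearest neighbour.  `img` is a 2-D grayscale array;
    pixels that fall outside the camera image stay black (0)."""
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])  # homogeneous
    src = H @ pts
    sx = np.rint(src[0] / src[2]).astype(int)
    sy = np.rint(src[1] / src[2]).astype(int)
    ok = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out = np.zeros((out_h, out_w), dtype=img.dtype)
    out.ravel()[ok] = img[sy[ok], sx[ok]]
    return out

def compose_surround_view(views):
    """Stitch several already-warped views by taking, per pixel, the
    first view that has data (non-zero).  A real system blends along the
    composition boundary instead of picking one source."""
    out = np.zeros_like(views[0])
    for v in views:
        out = np.where(out == 0, v, out)
    return out
```

With an identity homography the warp is a no-op, which makes the geometry easy to check before plugging in calibrated camera matrices.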
  • the display control unit 114 has a normal mode and a garage mode as an image generation mode for generating a bird's-eye view image as a parking support image.
  • The normal mode is used, for example, when parking the vehicle in a general parking space whose parking lot premises are divided by lane marking lines.
  • The garage mode is used when parking the vehicle in a parking space that has obstacles beside its entrance/exit, such as the frame of a garage doorway or the operation panel of a mechanical parking lot.
  • The garage mode is also used when a parked vehicle stands beside the entrance/exit of the parking space, since the parked vehicle becomes an obstacle existing beside the entrance/exit.
  • In the surrounding area (hereinafter referred to as the image composition area) that includes the boundary line (hereinafter referred to as the composition boundary line) generated when a plurality of captured images are combined, the transparency of the images is increased before they are combined. Therefore, if an obstacle such as a garage doorway frame or the operation panel of a mechanical parking lot lies in the image composition area, the obstacle disappears from the parking support image, which gives the driver a sense of discomfort. The remote parking system 1 of the present embodiment therefore includes the garage mode.
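The disappearance described above can be illustrated numerically: inside the image composition area the overlapping views are blended with increased transparency, so a thin obstacle visible in only one of them loses contrast. The 50 % alpha value and the toy one-row images are illustrative assumptions, not values from the embodiment.

```python
import numpy as np

def blend_seam(view_a, view_b, alpha=0.5):
    """Alpha-blend two overlapping warped views inside the image
    composition area, as the normal mode does along the seam."""
    return alpha * view_a + (1.0 - alpha) * view_b

# A thin, bright doorway-frame edge appears in view A only (it is out of
# view or occluded for the camera that produced view B).
view_a = np.zeros((1, 5)); view_a[0, 2] = 200.0   # frame edge at column 2
view_b = np.zeros((1, 5))                          # frame not visible here
seam = blend_seam(view_a, view_b)
# After blending, the edge intensity drops from 200 to 100: half the
# contrast, which is why an obstacle inside the composition area fades.
```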
  • Switching of the image generation mode in the display control unit 114 is performed automatically based on the captured images of the cameras 112a to 112d and the detection data of the camera, radar, sonar, etc. of the object detector 13. That is, the cameras 112a to 112d and the object detector 13 correspond to the surrounding recognition sensor of the present invention.
  • When the display control unit 114 determines that an obstacle exists beside the entrance/exit of the parking space, it switches the image generation mode to the garage mode; when it determines that no obstacle exists beside the entrance/exit, it switches the image generation mode to the normal mode.
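A minimal sketch of this automatic switching decision, assuming detected obstacles are reduced to a lateral offset from the doorway and using an illustrative proximity margin (neither the representation nor the margin is specified in the embodiment):

```python
def choose_image_mode(obstacle_offsets_m, doorway_x_m, margin_m=1.0):
    """Return 'garage' when any detected obstacle lies within
    `margin_m` metres of the doorway side position, else 'normal'.
    `obstacle_offsets_m` holds lateral positions from the surrounding
    recognition sensor; all parameters here are illustrative."""
    beside_door = any(abs(x - doorway_x_m) <= margin_m
                      for x in obstacle_offsets_m)
    return "garage" if beside_door else "normal"
```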
  • This determination is executed when the parking space detection unit 113 detects a parking space, when the remote control mode is selected by the input switch 111 and started, when the vehicle speed falls to a predetermined value (for example, 10 km/h), or when the vehicle position detector 12 described later determines that the vehicle is in a parking lot.
  • The display control unit 114 may detect obstacles before generating the parking support image and switch the image generation mode according to the detection result.
  • the image generation mode of the display control unit 114 may be switched by the input switch 111 described above.
  • The input switch 111 is a group of switches of a plurality of types, including a switch for switching between the remote control mode and the assist mode as well as a switch for switching between the normal mode and the garage mode. Therefore, if it is known in advance that the vehicle will be parked in a garage, the image generation mode can be switched manually.
  • the vehicle position detector 12 is composed of a GPS unit, a gyro sensor, a vehicle speed sensor, and the like.
  • The GPS unit detects radio waves transmitted from a plurality of satellites and periodically acquires the position information of the own vehicle V.
  • The vehicle position detector 12 detects the current position of the own vehicle V based on the position information of the own vehicle V acquired by the GPS unit, the angle change information acquired from the gyro sensor, and the vehicle speed acquired from the vehicle speed sensor.
  • the position information of the own vehicle V detected by the vehicle position detector 12 is output to the parking route generation unit 14 and the route tracking control unit 16 at predetermined time intervals.
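Between GPS fixes, combining the gyro and the vehicle speed sensor as described above amounts to dead reckoning. A minimal sketch, assuming a planar pose (x, y, heading) and constant speed and yaw rate over each interval (the integration scheme is an illustrative choice, not from the embodiment):

```python
import math

def dead_reckon(x, y, heading, speed_mps, yaw_rate_rps, dt_s):
    """Advance the pose estimate between GPS fixes using the vehicle
    speed (vehicle speed sensor) and yaw rate (gyro sensor).
    `heading` is in radians, measured from the x axis."""
    heading += yaw_rate_rps * dt_s
    x += speed_mps * dt_s * math.cos(heading)
    y += speed_mps * dt_s * math.sin(heading)
    return x, y, heading
```

A real detector would periodically correct this drift-prone estimate with the GPS position.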
  • The object detector 13 corresponds to the surrounding recognition sensor of the present invention and searches for objects such as obstacles around the own vehicle V. It comprises a camera, a radar (millimeter wave radar, laser radar, ultrasonic radar, etc.), a sonar, or a combination of these, mounted on the outer panels around the vehicle.
  • The mounting position of the object detector 13 is not particularly limited; for example, it can be attached to all or part of the center and both sides of the front bumper, the center and both sides of the rear bumper, the sill outers under the left and right center pillars, and the like.
  • the object detector 13 includes a computer in which a software program for identifying the position of an object detected by a camera, radar, or the like is installed.
  • This computer generates the identified object information (target information) and its position information (relative position coordinates from the current position of the own vehicle, latitude/longitude, etc.) and outputs them to the target parking space setting device 11, the parking route generation unit 14, and the object deceleration calculation unit 15.
  • This object information and position information are used for switching the image generation mode in the target parking space setting device 11 and for generating the parking route in the parking route generation unit 14. They are also used by the object deceleration calculation unit 15 to decelerate or stop the own vehicle when an unexpected obstacle or other object is detected during autonomous parking control.
  • The parking route generation unit 14 calculates a parking route from the current position of the own vehicle to the target parking position (in the remote control mode this means the entry route; the same applies hereinafter) that does not collide with or interfere with objects. For this calculation, the size of the own vehicle (vehicle width, vehicle length, minimum turning radius, etc.) and the position information of the parking space are used.
  • FIG. 4 is a plan view showing a scene in which the own vehicle V is parked in the garage G by autonomous parking control as an example of the remote control mode.
  • the garage G is drawn without a roof in order to illustrate the internal situation.
  • the front wall surface Gw of the garage G is provided with an entrance / exit Gd for entering or leaving the own vehicle V in the garage G.
  • The frame Gf of the doorway Gd and the front wall surface Gw become obstacles when the own vehicle V is put into or taken out of the garage G.
  • The parking space detection unit 113 searches the inside of the garage G for a parking space TPS available for parking.
  • the display control unit 114 generates a parking support image including the parking space TPS and the entrance / exit Gd and displays it on the display 115.
  • For the example of FIG. 4, the parking route generation unit 14 calculates the parking route R1 from the current position P1 via the positions P2 and P3 to the parking position P4 in the parking space TPS, and outputs the parking route R1 to the route tracking control unit 16 and the target vehicle speed generation unit 17.
  • The object deceleration calculation unit 15 receives the position information of obstacles and other objects from the object detector 13, calculates the time to collision with the object (TTC: Time to Collision) based on the distance to the object and the vehicle speed, and calculates the deceleration start timing of the own vehicle.
  • For example, the frame Gf of the garage G, the front wall surface Gw, the inner walls Gi of the garage G, and the like are objects treated as obstacles.
  • The object deceleration calculation unit 15 keeps the vehicle speed at its initial setting value and starts decelerating the own vehicle V at the timing when the time TTC until it would collide with the obstacle becomes equal to or less than a predetermined value.
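The TTC computation above reduces to dividing the remaining distance by the closing speed and comparing against a threshold. A minimal sketch; the 3 s threshold is an illustrative assumption, not the embodiment's predetermined value:

```python
def time_to_collision(distance_m, speed_mps):
    """TTC = remaining distance / closing speed; infinite when the
    vehicle is not closing on the object."""
    return float("inf") if speed_mps <= 0 else distance_m / speed_mps

def should_decelerate(distance_m, speed_mps, ttc_threshold_s=3.0):
    """Start decelerating once the TTC falls to or below the threshold
    (illustrative value)."""
    return time_to_collision(distance_m, speed_mps) <= ttc_threshold_s
```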
  • The route tracking control unit 16 calculates, at predetermined time intervals, the target steering angle for keeping the own vehicle on the parking route, based on the parking route from the parking route generation unit 14 and the current position of the own vehicle from the vehicle position detector 12. For the parking route R1 of FIG. 4, it calculates the target steering angle at each current position of the own vehicle V, from the current position P1 to the parking position P4, at predetermined time intervals. The route tracking control unit 16 outputs the calculated target steering angle to the steering angle control unit 18.
  • In a narrow garage such as that of FIG. 4, the steering angle control may be performed so that the distances between both side surfaces of the own vehicle V and the inner walls Gi of the garage G facing them remain constant. For detecting these distances, the detection result of a camera, an ultrasonic sensor, or the like can be used.
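Keeping the gaps to the left and right inner walls Gi equal can be sketched as a simple proportional controller on the gap difference. The gain, angle limit, and sign convention (positive = steer left) are illustrative assumptions, not details from the embodiment:

```python
def centering_steering(left_gap_m, right_gap_m, gain=0.5, max_angle_rad=0.6):
    """Proportional steering command (radians) that drives the
    difference between the left and right wall gaps toward zero.
    Positive output steers left; the gain and limit are illustrative."""
    angle = gain * (left_gap_m - right_gap_m)
    return max(-max_angle_rad, min(max_angle_rad, angle))
```

A vehicle closer to the left wall (smaller left gap) receives a negative command, steering it right toward the centerline of the garage.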
  • The target vehicle speed generation unit 17 calculates, at predetermined time intervals, the target vehicle speed for keeping the own vehicle on the parking route, based on the parking route from the parking route generation unit 14 and the deceleration start timing from the object deceleration calculation unit 15. For the parking route R1 of FIG. 4, it calculates, for each current position of the own vehicle V, the target vehicle speed for starting from the current position P1 and stopping close to the parking space TPS, and outputs it to the vehicle speed control unit 19. When an unexpected obstacle is detected on the parking route R1 during this autonomous parking control, the object deceleration calculation unit 15 outputs the deceleration or stop timing, and the corresponding target vehicle speed is output to the vehicle speed control unit 19.
  • the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the path tracking control unit 16. Further, the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17. By simultaneously controlling the steering angle control unit 18 and the vehicle speed control unit 19, autonomous parking control is executed.
  • The international standard for autonomous driving control of vehicles stipulates, as a condition for allowing remote control of a vehicle, that the distance between the vehicle and the operator be within a predetermined remote control distance (for example, within 6 m). The remote parking system 1 of the present embodiment therefore uses the slave unit 22 carried by the operator U and the master unit 20 mounted on the own vehicle V to detect the relative position of the slave unit 22 with respect to the own vehicle V, that is, the relative position of the operator U carrying the slave unit 22.
  • the slave unit 22 and the master unit 20 form a so-called keyless entry system.
  • antennas 202a to 202d connected to the master unit 20 are installed at predetermined locations around the own vehicle V.
  • the master unit 20 transmits a slave unit search signal from the antennas 202a to 202d.
  • The slave unit 22 receives the slave unit search signal transmitted from each of the antennas 202a to 202d and measures the radio field strength of the search signal from each antenna.
  • the radio field strength of the slave unit search signal changes depending on the distance between the slave unit 22 and the antennas 202a to 202d.
  • In the illustrated example, the radio wave strength of the slave unit search signal received from the antenna 202b is the strongest, while that of the search signal received from the antenna 202c near the right side of the rear bumper is the weakest.
  • the slave unit 22 transmits the radio wave strength of the slave unit search signal of each of the measured antennas 202a to 202d to the master unit 20.
  • The position detector 201 of the master unit 20 is, for example, a computer on which a software program is installed that calculates the position of the slave unit 22 from the radio wave strengths of the antennas 202a to 202d reported by the slave unit 22, using triangulation or the like.
  • Based on the radio wave strengths of the antennas 202a to 202d reported by the slave unit 22, the position detector 201 detects the relative position of the slave unit 22 with respect to the own vehicle V, that is, the position of the operator U carrying the slave unit 22 relative to the own vehicle V.
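Triangulation from the per-antenna radio wave strengths can be sketched in two steps: convert each strength to a distance with a log-distance path-loss model, then solve the linearized circle equations by least squares. The path-loss parameters and the antenna layout used in the example are illustrative assumptions, not values from the embodiment:

```python
import numpy as np

def rssi_to_distance(rssi_dbm, power_at_1m_dbm=-40.0, path_loss_exp=2.0):
    """Log-distance path-loss model: estimated distance in metres from
    the received power (both calibration values are illustrative)."""
    return 10 ** ((power_at_1m_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(antennas, distances):
    """Least-squares position of the slave unit from >= 3 antenna
    positions and the distances estimated from each antenna's signal.
    Subtracting the first circle equation from the others linearizes
    the system:  2*(xi-x1)*x + 2*(yi-y1)*y = ci."""
    (x1, y1), d1 = antennas[0], distances[0]
    a, b = [], []
    for (xi, yi), di in zip(antennas[1:], distances[1:]):
        a.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(xi**2 - x1**2 + yi**2 - y1**2 + d1**2 - di**2)
    sol, *_ = np.linalg.lstsq(np.array(a, float), np.array(b, float),
                              rcond=None)
    return sol  # estimated (x, y)
```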
  • The position detector 201 outputs the detected relative position of the slave unit 22 to the route tracking control unit 16 and the target vehicle speed generation unit 17 (or, instead, to the steering angle control unit 18 and the vehicle speed control unit 19), and transmits it to the remote controller 21.
  • The remote controller 21 is a device with which the operator U commands, from outside the vehicle, whether to continue or stop execution of the autonomous parking control set by the target parking space setting device 11. The remote controller 21 therefore has a wireless communication function for transmitting the execution command signal to the route tracking control unit 16 and the target vehicle speed generation unit 17 (or, instead, to the steering angle control unit 18 and the vehicle speed control unit 19), and communicates with the wireless communication function provided in the own vehicle V.
  • the remote controller 21 includes, for example, a mobile information terminal such as a smartphone on which application software for remote control (hereinafter referred to as an application) is installed.
  • the smartphone on which the application is installed functions as the remote controller 21 of the remote parking system 1 by activating the application.
  • The international standard for autonomous driving control of vehicles stipulates that the vehicle may execute autonomous driving control only while the operator continuously operates the remote controller. In the remote parking system 1 of the present embodiment, the remote controller 21 therefore keeps transmitting the execution command signal to the own vehicle V only while a predetermined command gesture is continuously input on the touch panel display (hereinafter referred to as the touch panel) 211 of the remote controller 21, and the own vehicle V executes the autonomous parking control only while it receives the execution command signal. That is, when input of the command gesture to the remote controller 21 stops, the execution command signal is no longer transmitted to the vehicle, and execution of the autonomous parking control is interrupted or stopped. The remote controller 21 also has a function of remotely starting a drive source such as the vehicle's engine or motor, so that a vehicle parked in a narrow parking space can be remotely controlled from outside.
  • the execution command signal transmitted to the own vehicle V is input to the route tracking control unit 16 and the target vehicle speed generation unit 17. Further, as described above, the relative positions of the own vehicle V and the slave unit 22 are input from the position detector 201 to the route tracking control unit 16 and the target vehicle speed generation unit 17.
  • When the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input, the route tracking control unit 16 outputs the target steering angle to the steering angle control unit 18, and the target vehicle speed generation unit 17 outputs the target vehicle speed to the vehicle speed control unit 19.
  • the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the path tracking control unit 16. Further, the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17.
  • On the other hand, when the distance between the own vehicle V and the slave unit 22 is greater than the remote control distance, the route tracking control unit 16 does not output the target steering angle to the steering angle control unit 18, and the target vehicle speed generation unit 17 does not output the target vehicle speed to the vehicle speed control unit 19, even if the execution command signal from the remote controller 21 is input. That is, when the operator is farther away than the remote control distance, the autonomous parking control is not executed even if the command gesture is input to the remote controller 21.
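The two conditions described above (operator within the remote control distance, and the execution command signal still arriving) combine into a simple dead-man gate. A minimal sketch, using the 6 m figure given earlier as an example limit:

```python
REMOTE_CONTROL_DISTANCE_M = 6.0  # example limit from the standard cited above

def may_execute(operator_distance_m, command_signal_received):
    """Autonomous parking control runs only while the operator is within
    the remote control distance AND the execution command signal keeps
    arriving from the remote controller (dead-man behaviour)."""
    return (command_signal_received
            and operator_distance_m <= REMOTE_CONTROL_DISTANCE_M)
```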
  • FIG. 6A shows the parking support image Pa1 in the normal mode generated when the own vehicle V is stopped at the current position P1 in FIG. 4.
  • the parking support image Pa1 in the normal mode corresponds to the first parking support image of the present invention.
  • the parking support image Pa1 is generated by synthesizing the own vehicle image Pa1v, which shows the own vehicle V viewed from above, the front image Pa1f based on the captured image of the camera 112a, the rear image Pa1b based on the captured image of the camera 112b, and the left and right side images based on the captured images of the remaining cameras.
  • FIG. 6B shows the parking support image Pa2 in the normal mode generated when the own vehicle V is retracted to the position P2.
  • in the parking support image Pa2 in this normal mode, when the own vehicle V reverses and approaches the garage G, about half of the frame Gf and the front wall surface Gw of the garage G disappears from the parking support image Pa2 due to the influence of the image composition areas S3a and S4a.
  • as shown in FIG. 6C, when the own vehicle V reverses to the position P3, the frame Gf and the front wall surface Gw of the garage G disappear completely from the parking support image Pa3 due to the influence of the image composition areas S3a and S4a.
  • such disappearance of the frame Gf and the like from the parking support image also occurs when the own vehicle V leaves the garage G.
  • therefore, the remote parking system 1 of the present embodiment provides a garage mode that, when the vehicle is parked in a parking space having an obstacle beside its entrance/exit, such as the frame Gf of the garage G or the operation panel of a mechanical parking lot, prevents the entrance/exit frame and operation panel from disappearing from the parking support image.
  • FIG. 7A shows the parking support image Pb1 in the garage mode, which is generated when the own vehicle V is stopped at the current position P1 in FIG. 4.
  • the parking support image Pb1 in the garage mode corresponds to the second parking support image of the present invention.
  • the parking support image Pb1 in the garage mode is generated by synthesizing the own vehicle image Pb1v, the front image Pb1f, the rear image Pb1b, the left side image Pb1l, and the right side image Pb1r, similarly to the parking support image Pa1 in the normal mode.
  • the composite boundary lines in the traveling direction of the own vehicle V, that is, the composite boundary lines L3b and L4b between the rear image Pb1b and the left image Pb1l and the right image Pb1r, are set so as to be orthogonal to the front-rear central axis of the own vehicle image Pb1v, that is, in the direction opposite to the direction of the obstacle and in a direction away from the obstacle.
  • FIG. 7B shows the parking support image Pb2 in the garage mode generated when the own vehicle V is retracted to the position P2.
  • in the parking support image Pb2, even if the own vehicle V reverses and approaches the garage G, the frame Gf and the front wall surface Gw of the garage G do not disappear from the parking support image Pb2 under the influence of the image composition areas S3b and S4b.
  • as shown in FIG. 7C, even if the own vehicle V reverses to the position P3, the frame Gf and the front wall surface Gw of the garage G do not disappear from the parking support image Pb3 due to the influence of the image composition areas S3b and S4b.
  • in the parking support image Pc1 shown in FIG. 7D, the composite boundary lines L1c and L2c between the front image Pc1f and the left image Pc1l and the right image Pc1r are set so as to be orthogonal to the front-rear central axis of the own vehicle image Pc1v.
  • the changed position of the composite boundary line in the garage mode is not limited to the direction opposite to the direction of the obstacle, that is, the direction away from the obstacle.
  • for example, the composite boundary lines L3d and L4d may be set so that they extend straight rearward from both sides of the own vehicle image Pd1v and fit within the entrance/exit Gd of the garage G. Even in this case, since the image composition regions S3d and S4d do not affect the entrance/exit Gd of the garage G, the frame Gf and the front wall surface Gw of the garage G do not disappear from the parking support image Pd1.
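The seam placements discussed above (diagonal seams in the normal mode, seams orthogonal to — or parallel with — the vehicle axis in the garage mode) can be modeled as a per-pixel mask over the bird's-eye grid. The following is a minimal sketch assuming straight seams anchored at the vehicle's rear corners; the function name and geometry are illustrative, not from the specification.

```python
import numpy as np


def rear_camera_mask(h, w, rear_y, left_x, right_x, seam_slope):
    """Boolean mask of bird's-eye pixels taken from the rear camera.

    Seams start at the vehicle's rear corners. seam_slope is the lateral
    spread per pixel of depth behind the vehicle: 1.0 reproduces the usual
    45-degree diagonal seams of the normal mode, while a very large value
    makes the seams effectively orthogonal to the vehicle's front-rear axis
    (like L3b and L4b in the garage mode), so the rear camera alone covers
    the full width behind the vehicle and the garage front wall never falls
    into a side-image composition region."""
    ys, xs = np.mgrid[0:h, 0:w]
    dy = ys - rear_y  # depth behind the vehicle's rear edge (pixels)
    inside = (xs >= left_x - seam_slope * dy) & (xs <= right_x + seam_slope * dy)
    return (dy > 0) & inside
```

Switching from normal mode to garage mode then amounts to changing only this one stitching parameter, which matches the document's point that the change is a matter of composition parameters.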
  • FIG. 9 shows a state in which the parking support image Pb3 of FIG. 7C is transmitted to the remote controller 21 and displayed on the touch panel 211 of the remote controller 21.
  • FIG. 10 is a flowchart showing a control procedure executed by the remote parking system 1 of the present embodiment.
  • FIG. 11 is a flowchart showing a procedure for generating and displaying a parking support image in the display control unit 114.
  • in step S1 shown in FIG. 10, the operator U, such as the driver, operates the input switch 111 of the on-board target parking space setting device 11 to select the remote control mode. If the parking space is a garage or the like having an obstacle beside the entrance/exit, the input switch 111 is also operated to select the garage mode as the parking support image generation mode.
  • in step S2, the target parking space setting device 11 searches for a parking space in which the own vehicle V can park, using the plurality of on-board cameras 112a to 112d, and in step S3 it judges whether or not such a parking space exists. If there is an available parking space, the process proceeds to step S4; if not, the process returns to step S1. If no available parking space is detected in step S2, the operator may be notified by a text display such as "There is no parking space" or by voice, and this process may be terminated.
  • in step S4, the display control unit 114 of the target parking space setting device 11 generates a parking support image including the available parking spaces and displays it on the in-vehicle display 115 so that the operator U can select the desired parking space.
  • the display control unit 114 determines whether an obstacle exists beside the entrance/exit of the parking space, based on the captured images of the cameras 112a to 112d and the detection data of the camera, radar, sonar, etc. of the object detector 13. If no obstacle is detected beside the entrance/exit of the parking space, the image generation mode is set to the normal mode and the process proceeds to step S44.
  • in step S44, the display control unit 114 generates a parking support image in the normal mode as shown in FIG. 6A, and the generated normal-mode parking support image is displayed on the display 115.
  • in step S41, it is determined, based on the captured images of the cameras 112a to 112d and the detection data of the camera, radar, sonar, etc. of the object detector 13, whether an obstacle such as the frame Gf of the entrance/exit Gd of the garage G or the front wall surface Gw is present. If such an obstacle is detected, the image generation mode is set to the garage mode and the process proceeds to step S42.
  • in step S42, the position of the composite boundary line in the traveling direction of the own vehicle V is changed to generate a parking support image.
  • as shown in FIG. 7A, the display control unit 114 combines the own vehicle image Pb1v, the front image Pb1f, the rear image Pb1b, the left image Pb1l, and the right image Pb1r to generate the parking support image Pb1.
  • the display control unit 114 sets the composite boundary lines in the traveling direction of the own vehicle V, that is, the composite boundary lines L3b and L4b between the rear image Pb1b and the left image Pb1l and the right image Pb1r, so that they do not overlap the entrance/exit Gd of the garage G.
  • the display control unit 114 then proceeds to step S43 and displays the generated garage-mode parking support image on the display 115.
  • as a result, the frame Gf and the front wall surface Gw of the garage G can be prevented from disappearing from the parking support image Pb1. Further, even if the stop position of the own vehicle V is the position P2 or the position P3, closer to the garage G than the current position P1 in FIG. 4, the frame Gf and the front wall surface Gw of the garage G do not disappear from the parking support images Pb2 and Pb3, as shown in FIGS. 7B and 7C, so the operator U does not feel uncomfortable.
  • when the operator U selects the desired parking space, the target parking position information is output to the parking route generation unit 14, and the parking route generation unit 14 generates the parking route R1 shown in FIG. 4 from the current position P1 of the own vehicle V to the parking space TPS, which is the target parking position.
  • the object deceleration calculation unit 15 calculates the deceleration start timing at the time of autonomous parking control based on the object information detected by the object detector 13.
  • the parking route R1 generated by the parking route generation unit 14 is output to the route tracking control unit 16, and the deceleration start timing calculated by the object deceleration calculation unit 15 is output to the target vehicle speed generation unit 17.
  • the autonomous parking control is in the standby state.
  • the operator U determines whether to execute the autonomous parking control while remaining in the own vehicle V or by remote operation after getting off; when remote operation is chosen, use of the remote controller 21 is set by operating the input switch 111 or the display 115, and the operator U gets off the own vehicle V.
  • when the operator U executes the autonomous parking control after getting off the own vehicle V in step S7, the operator U activates the remote controller 21 in step S8.
  • examples of the start input of the remote operation by the remote controller 21 include activation of the application installed on the remote controller 21, a door unlocking operation, a door locking and unlocking operation, and a combination of these with the application activation.
  • in step S9, the pairing process of the remote controller 21 and the own vehicle V is performed, and the remote operation is started in step S10.
  • the operator U continuously performs a specified command gesture, for example, a touch operation for inputting a ring-shaped figure on the touch panel 211 of the remote controller 21.
  • the remote controller 21 determines whether or not the gesture input to the touch panel 211 is a specified command gesture. If the input gesture is a specified command gesture, the remote controller 21 generates an execution command signal and transmits it to the own vehicle V. This execution command signal is transmitted only while the command gesture is being input to the touch panel 211, and when the input of the command gesture is stopped, the transmission of the execution command signal is also stopped.
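A ring-shaped command gesture like the one described can be detected, for example, by checking that the touch trace stays near a constant radius about its centroid. This is only a plausible stand-in — the patent does not disclose the actual recognizer of the remote controller 21 — and the function name and tolerance value are assumptions.

```python
import math


def is_ring_gesture(points, tol=0.2):
    """Rough check that a touch trace approximates a ring: every sampled
    touch point lies within tol (fractional) of the mean radius about the
    trace centroid. While this predicate holds for fresh samples, the
    execution command signal keeps being transmitted; when the gesture
    stops, transmission stops as well."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_r = sum(radii) / len(radii)
    if mean_r == 0:
        return False
    return all(abs(r - mean_r) / mean_r <= tol for r in radii)
```

In use, the controller would evaluate this on each refresh of the touch trace and transmit the execution command signal only while it returns True, reproducing the dead-man behavior described above.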
  • in step S11, when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input, the route tracking control unit 16 outputs the target steering angle to the steering angle control unit 18.
  • similarly, when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input, the target vehicle speed generation unit 17 outputs the target vehicle speed to the vehicle speed control unit 19.
  • the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the route tracking control unit 16.
  • the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17.
  • autonomous parking control is executed in the next step S12.
  • the processes from step S10 to step S13 described below are executed at predetermined time intervals until the own vehicle V arrives at the target parking space TPS. In step S13, it is determined whether the own vehicle V has arrived at the target parking space TPS; if not, the process returns to step S10, and when the own vehicle V arrives at the target parking space TPS, the own vehicle V is stopped and the process ends.
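The S10–S13 loop can be summarized as a fixed-period iteration that repeats the gesture-check/command/control steps until arrival. The following schematic sketch uses hypothetical callbacks (`step`, `arrived`) rather than the actual controller interfaces, which the specification does not define.

```python
def run_remote_parking(step, arrived, max_iters=1000):
    """Skeleton of the S10-S13 loop: repeat the gesture check, command
    transmission, and vehicle control steps at a fixed period until the
    vehicle reaches the target parking space TPS, then stop it."""
    state = 0
    for _ in range(max_iters):
        if arrived(state):       # S13: arrived -> stop the vehicle, end
            return "stopped"
        state = step(state)      # S10-S12: gesture, command, control
    raise TimeoutError("vehicle did not reach TPS")
```

Here `state` stands in for whatever vehicle state the real controller tracks (pose along the parking route R1, for instance).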
  • in this way, the autonomous traveling control in the remote control mode is executed along the traveling route from the current position P1 of the own vehicle V to the parking space TPS in the garage G.
  • as described above, in the present embodiment, the target parking space setting device (controller) 11 generates, from the captured images, the parking support image (first parking support image) Pa1, which is a bird's-eye view image from a virtual viewpoint above the own vehicle V, and displays it.
  • the target parking space setting device 11 determines, from the captured data of the cameras (surrounding recognition sensors) 112a to 112d provided in the own vehicle V for recognizing the situation around the vehicle and from the detection data of the object detector (surrounding recognition sensor) 13, whether or not the own vehicle V is to be parked in the garage G. When it is determined that the own vehicle V is to be parked in the garage G, obstacles such as the frame Gf and the front wall surface Gw existing beside the entrance/exit Gd of the garage G are identified from the captured data of the cameras 112a to 112d and the detection data of the object detector 13, the parking support image (second parking support image) Pb1 including these obstacles is generated, and a support image including the obstacles is displayed.
  • as a result, information on the entrance/exit Gd of the garage G can be provided to the operator U, and if an obstacle such as the frame Gf or the front wall surface Gw exists beside the entrance/exit Gd, information on the obstacle can also be provided.
  • further, in the present embodiment, a bird's-eye view image formed by synthesizing a plurality of captured images of the surroundings of the own vehicle V is generated as each of the parking support image (first parking support image) Pa1 and the parking support image (second parking support image) Pb1, and in the parking support image Pb1, the composite boundary lines L1b to L1d of the plurality of captured images are changed with respect to the parking support image Pa1.
  • this prevents the composite boundary lines L1b to L1d from overlapping with obstacles such as the frame Gf and the front wall surface Gw beside the entrance/exit Gd. Since obstacles such as the frame Gf and the front wall surface Gw beside the entrance/exit Gd can thus be prevented from disappearing from the parking support image Pb1, information on the entrance/exit Gd of the parking space TPS can be provided appropriately.
  • further, in the present embodiment, the positions of the composite boundary lines L1b to L1d are changed to the direction opposite to the direction of the obstacle so as to avoid obstacles such as the frame Gf and the front wall surface Gw. This can therefore be achieved merely by changing the parameters used when synthesizing the bird's-eye view image, and can be easily adopted.
  • further, in the present embodiment, the frame Gf of the entrance/exit Gd of the garage G is treated as an obstacle, so the frame Gf of the entrance/exit Gd and the front wall surface Gw can be properly displayed on the parking support image.
  • further, in the present embodiment, the parking support image is displayed on the touch panel 211 of the remote controller 21 that operates the own vehicle V from outside, so that information on the entrance/exit Gd of the parking space TPS can be provided to the operator U outside the vehicle.
  • (Second Embodiment) Next, a second embodiment of the remote parking system to which the parking support method and the parking support device of the present invention are applied will be described.
  • the same reference numerals as those of the first embodiment will be used for the same configurations as those of the first embodiment, and detailed description thereof will be omitted.
  • in the present embodiment, instead of changing the position of the composite boundary line, the transparency of the image composition area including the composite boundary line is changed. That is, as shown in FIG. 12, steps S41, S44, and S43 proceed in the same procedure as in the first embodiment, but in step S42a a parking support image is generated with the transparency of the image composition region changed. In other words, the first parking support image and the second parking support image are not generated by switching according to the image generation mode; instead, a part of the first parking support image is changed so that the obstacles beside the entrance/exit Gd of the garage G do not disappear.
  • specifically, the parking support image Pa1 shown in FIG. 6A is generated as the first parking support image, and when the image generation mode is the garage mode, the transparency of the image composition regions S3a and S4a is lowered. As a result, the composite boundary lines L3a and L4a become conspicuous, but the entrance/exit Gd, the frame Gf, and the front wall surface Gw of the garage G can be prevented from disappearing from the parking support image.
  • information on the entrance/exit Gd of the parking space TPS can therefore be appropriately provided. Further, the present embodiment requires only a change of the parameters used when synthesizing the bird's-eye view image, and can be easily adopted.
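The transparency change of step S42a amounts to re-weighting the alpha blend inside an image composition region. The following is a sketch assuming simple linear blending of the two camera contributions; the actual weighting scheme and the alpha values are not given in the specification.

```python
import numpy as np


def blend_overlap(side_img, rear_img, side_alpha):
    """Blend the side-camera and rear-camera contributions inside an
    image composition region. Lowering side_alpha weights the rear image
    more strongly, so a garage frame seen by the rear camera is no longer
    washed out by the side image, even though the composite boundary line
    itself becomes more conspicuous."""
    side = side_img.astype(np.float32)
    rear = rear_img.astype(np.float32)
    out = side_alpha * side + (1.0 - side_alpha) * rear
    return out.astype(np.uint8)
```

With `side_alpha = 0.5` both cameras contribute equally; pushing it toward 0 reproduces the garage-mode behavior of letting the rear camera dominate the overlap.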
  • this embodiment may also be applied together with the first embodiment. That is, in the garage mode, the position of the composite boundary line may be changed and, in addition, the transparency of the image composition region including the composite boundary line may be changed. In this way, even if an obstacle overlaps the changed composite boundary line, the obstacle can be prevented from disappearing and the parking support image can be displayed appropriately.
  • in the present embodiment, an obstacle is drawn on the parking support image composed of a bird's-eye view image. That is, as in the second embodiment, the first parking support image and the second parking support image are not generated by switching according to the image generation mode; instead, a part of the first parking support image is changed to display the obstacles beside the entrance/exit Gd of the garage G.
  • specifically, the display control unit 114 detects the contour of the garage G from the images captured by the cameras 112a to 112d and generates a garage border F in which the front wall surface and the side wall surfaces of the garage G are drawn by computer graphics or the like. The garage border F is superimposed on the actual garage G, or the actual garage G is erased by image processing, and the result is displayed on the parking support image Pe1. FIG. 13 shows a state in which the garage G is erased by image processing and the garage border F is drawn. The garage border F moves in the parking support image Pe1 according to the movement of the own vehicle V.
  • according to the present embodiment, an obstacle whose display may be unclear in a parking support image composed of a bird's-eye view image can be clarified by drawing it explicitly. The present embodiment may be applied in combination with the first embodiment and the second embodiment as appropriate.
  • in step S42b, instead of generating a parking support image composed of a bird's-eye view image, the captured image taken by the cameras 112a to 112d in the traveling direction of the own vehicle V is processed and displayed as the parking support image in place of the bird's-eye view image. For example, using the captured image of the camera 112d behind the vehicle, the parking support image Pf1 showing a front view of the garage G is displayed on the display 115 as shown in FIG. 15.
  • according to the present embodiment, obstacles whose display may be unclear in a parking support image composed of a bird's-eye view image can be clearly displayed by the image captured in the traveling direction of the own vehicle V. In the present embodiment, the image captured in the traveling direction of the own vehicle V is displayed instead of the bird's-eye view image, but the bird's-eye view image and the image captured in the traveling direction may also be displayed side by side at the same time.
  • in the present embodiment, the position of the virtual viewpoint of the second parking support image is set higher than that of the first parking support image. That is, the bird's-eye view image seen from the position of the virtual viewpoint VP1 is generated as the first parking support image, and the bird's-eye view image seen from the position of the virtual viewpoint VP2, which is higher than the virtual viewpoint VP1, is generated as the second parking support image.
  • as a result, in the second parking support image, the garage G is displayed smaller than in the first parking support image, and the entrance/exit Gd of the garage G is less likely to overlap with the composite boundary lines, so the frame Gf and the like can be prevented from disappearing from the second parking support image.
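The effect of raising the virtual viewpoint follows from the pinhole projection of a ground-plane feature: its size in pixels falls inversely with viewpoint height. This is an illustrative model only; the focal length and the heights used below are assumptions, not values from the specification.

```python
def ground_pixels(length_m, focal_px, viewpoint_height_m):
    """Pinhole model of the virtual top-down viewpoint: a ground-plane
    feature of length_m metres directly below the camera projects to
    roughly focal_px * length_m / height pixels. Raising the virtual
    viewpoint (VP1 -> VP2) therefore shrinks the garage in the image,
    making its entrance/exit less likely to cross a composite boundary
    line."""
    return focal_px * length_m / viewpoint_height_m
```

Doubling the viewpoint height halves the projected size of the garage entrance, which is exactly why the fifth embodiment keeps the entrance/exit Gd clear of the seams.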
  • the present embodiment may be applied in combination with the first to fourth embodiments as appropriate.
  • in each of the above embodiments, the case where the own vehicle V reverses into the garage G and the case where the own vehicle V advances out of the garage G have been described, but each of the above embodiments can also be applied to the case where the own vehicle V advances into the garage G and the case where the own vehicle V reverses out of the garage G.
  • further, although the garage is illustrated as a parking space having an obstacle beside the entrance/exit, the invention can also be applied to a mechanical parking lot having an operation panel on the frame of the entrance/exit.
  • in each of the above embodiments, the composite boundary line is changed to a predetermined position, but the position to which the composite boundary line is changed may instead be set according to the size (width, height, etc.) of the obstacle existing beside the entrance/exit, its position relative to the own vehicle V, and the like. For example, the size and relative position of an obstacle may be detected from the images captured by the cameras 112a to 112d, and the composite boundary line may be set, based on the detection result and the parking route of the own vehicle V, at a position where the obstacle and the composite boundary line do not overlap.
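Choosing the boundary position from the detected obstacle extent, as suggested above, reduces to picking a seam depth outside the obstacle's interval along the parking path. A minimal sketch with hypothetical inputs follows; the candidate depths and the interval representation are assumptions for illustration.

```python
def choose_seam_depth(obstacle_near_m, obstacle_far_m, candidates):
    """Pick a composite-boundary-line depth behind the vehicle that does
    not fall inside the obstacle's extent along the parking path, so the
    obstacle never straddles a blend region. Returns the first acceptable
    candidate depth in metres, or None if every candidate overlaps the
    obstacle."""
    for d in candidates:
        if not (obstacle_near_m <= d <= obstacle_far_m):
            return d
    return None
```

In practice the obstacle interval would come from the cameras 112a to 112d and the object detector 13, and the candidates from the planned parking route.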
  • although each of the above embodiments has been described for a vehicle having an autonomous driving control function, the invention can also be applied to a vehicle not having such a function. That is, by applying the present invention to a vehicle that allows the parking space to be confirmed in a bird's-eye view image (hereinafter referred to as a vehicle capable of displaying a bird's-eye view image), even if the vehicle does not have an autonomous driving control function, it is possible to prevent the phenomenon in which obstacles beside the entrance/exit of the parking space disappear from the bird's-eye view image.
  • in a vehicle capable of displaying a bird's-eye view image, whether the vehicle is at a parking lot may be determined when the bird's-eye view image display mode is selected, when the vehicle speed reaches a predetermined speed (for example, 10 km/h), or based on position information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

Parking support method in which captured images of an area around a vehicle (V), captured by cameras (112a-112d), are acquired, a first parking support image (Pa1), which is a bird's-eye view image from a virtual viewpoint above the vehicle (V), is generated from the captured images by a controller (11), and the bird's-eye view image is displayed on a display (15). The controller (11) determines, on the basis of data from a surrounding recognition sensor (13) that is provided on the vehicle (V) and is intended to recognize the situation around the vehicle, whether or not the vehicle (V) is to be parked in a garage (G), and, when it is determined that the vehicle (V) is to be parked in the garage (G), identifies, using the data from the surrounding recognition sensor (13), an obstacle present beside an entrance of the garage (G), generates a second parking support image (Pb1) including the obstacle, and causes a support image including the obstacle to be displayed.
PCT/IB2019/001101 2019-09-12 2019-09-12 Procédé et appareil d'aide au stationnement WO2021048584A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2019/001101 WO2021048584A1 (fr) 2019-09-12 2019-09-12 Procédé et appareil d'aide au stationnement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2019/001101 WO2021048584A1 (fr) 2019-09-12 2019-09-12 Procédé et appareil d'aide au stationnement

Publications (1)

Publication Number Publication Date
WO2021048584A1 true WO2021048584A1 (fr) 2021-03-18

Family

ID=74865721

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/001101 WO2021048584A1 (fr) 2019-09-12 2019-09-12 Procédé et appareil d'aide au stationnement

Country Status (1)

Country Link
WO (1) WO2021048584A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011057101A (ja) * 2009-09-10 2011-03-24 Toshiba Alpine Automotive Technology Corp Obstacle detection device for vehicle
WO2011036892A1 (fr) * 2009-09-24 2011-03-31 Panasonic Corporation Driving support display device
JP2013030833A (ja) * 2011-07-26 2013-02-07 Aisin Seiki Co Ltd Vehicle periphery monitoring system
JP2013530867A (ja) * 2010-06-09 2013-08-01 Valeo Schalter und Sensoren GmbH Method for assisting a driver of a motor vehicle when parking in a parking space, driver assistance device, and motor vehicle
JP2015015527A (ja) * 2013-07-03 2015-01-22 Clarion Co Ltd Video display system, video composition device, and video composition method
JP2019145951A (ja) * 2018-02-19 2019-08-29 Denso Ten Ltd Vehicle remote operation device, vehicle remote operation system, and vehicle remote operation method


Similar Documents

Publication Publication Date Title
US20200398827A1 (en) Parking assist system
JP7467202B2 (ja) 駐車支援システム
WO2016158236A1 (fr) Dispositif de commande de véhicule
US11338826B2 (en) Parking assist system
US20210300349A1 (en) Vehicle movement assist system
US11458959B2 (en) Parking assist system
US11584297B2 (en) Display device for vehicle and parking assist system
US11433921B2 (en) Parking assist system
US11548501B2 (en) Parking assist system
US11827212B2 (en) Parking assist system
US20200398824A1 (en) Parking assist system
US11623636B2 (en) Display device for vehicle and parking assist system
US11753001B2 (en) Parking assist system
US20210179079A1 (en) Parking assist system
US11327495B2 (en) Vehicle control system
US11327479B2 (en) Vehicle control system
US20200310411A1 (en) Vehicle control system
US11938931B2 (en) Stop assist system
WO2021048584A1 (fr) Procédé et appareil d'aide au stationnement
JPWO2020115517A1 (ja) 駐車時の車両走行制御方法及び車両走行制御装置
JP7206103B2 (ja) 車両走行制御方法及び車両走行制御装置
US12038770B2 (en) Remote operation system
US11548500B2 (en) Parking assist system
US20240227675A1 (en) Driving support device
US20230125351A1 (en) Vehicle driving assist system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19945054

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19945054

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP