WO2017179205A1 - Display control device for parking assistance, and display control method for parking assistance - Google Patents


Info

Publication number
WO2017179205A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
parking space
parking
display
Prior art date
Application number
PCT/JP2016/062155
Other languages
French (fr)
Japanese (ja)
Inventor
直志 宮原 (Naoshi Miyahara)
下谷 光生 (Mitsuo Shimotani)
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to JP2018511865A (granted as patent JP6611918B2)
Priority to PCT/JP2016/062155
Publication of WO2017179205A1

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/14Traffic control systems for road vehicles indicating individual free spaces in parking areas

Definitions

  • The present invention relates to a parking assistance display control device and a parking assistance display control method.
  • An object of the present invention is to provide a parking assistance display control device and a parking assistance display control method capable of presenting a parking space in a display form that can be intuitively understood by a driver.
  • The display control device for parking assistance includes: a parking space determination unit that determines whether there is a space around the vehicle in which the vehicle can park, based on the situation around the vehicle detected by the periphery detection device; an image composition unit that acquires image information representing vehicle side images of the side of the vehicle sequentially captured along the traveling direction of the vehicle by the peripheral photographing device and, based on the acquired image information, generates a composite image in which the vehicle side images are joined along the traveling direction; a control unit that, when the parking space determination unit determines that a parking space exists, generates a parking space image in which a parking space display object indicating the position of the parking space is superimposed on the composite image generated by the image composition unit; and an image output unit that outputs the parking space image generated by the control unit to the display device. The device is characterized in that at least a part of the parking space image is a stereoscopically viewable stereoscopic image.
  • The display control method for parking assistance acquires image information representing vehicle side images of the side of the vehicle sequentially captured along the traveling direction of the vehicle by the peripheral photographing device, and generates, based on the acquired image information, a composite image in which the vehicle side images are joined along the traveling direction of the vehicle. When it is determined, based on the situation around the vehicle detected by the periphery detection device, that a parking space exists, a parking space display object indicating the position of the parking space is superimposed on the composite image to generate a parking space image, at least a part of which is a stereoscopically viewable stereoscopic image, and the generated parking space image is output to a display device capable of stereoscopically displaying the stereoscopic image of the parking space image.
  • The parking space determination unit determines whether there is a parking space around the vehicle based on the situation around the vehicle detected by the periphery detection device. The image composition unit acquires image information representing vehicle side images sequentially captured along the traveling direction of the vehicle by the peripheral photographing device and, based on the acquired image information, generates a composite image in which the vehicle side images are joined along the traveling direction of the vehicle.
  • When it is determined that a parking space exists, the control unit superimposes the parking space display object on the composite image generated by the image composition unit and generates a parking space image, at least a part of which is a stereoscopic image.
  • The image output unit outputs the parking space image generated by the control unit to the display device. Accordingly, a parking space can be presented in a display form that the driver can grasp intuitively.
  • Image information representing vehicle side images sequentially captured along the traveling direction of the vehicle by the peripheral photographing device is acquired, and based on the acquired image information, a composite image is generated in which the vehicle side images are joined along the traveling direction of the vehicle. Whether there is a parking space around the vehicle is determined based on the situation around the vehicle detected by the periphery detection device.
  • The parking space display object is superimposed on the composite image, and a parking space image, at least a part of which is a stereoscopic image, is generated. The generated parking space image is output to the display device. Accordingly, a parking space can be presented in a display form that the driver can grasp intuitively.
  • Brief description of the drawings: a block diagram showing the configuration of the parking assistance apparatus 100 in the first embodiment of the present invention; a block diagram showing the configuration of the parking assistance apparatus 200 in the second embodiment; a block diagram showing the hardware configuration of the parking assistance display control device 10 in the second embodiment; a diagram schematically showing how a vehicle side image is captured; a diagram showing an example of the left-eye drawing plane 41; a diagram showing an example of the right-eye drawing plane 42; a diagram for explaining how the divided images are pasted onto the drawing planes; a diagram showing an example of the depth of a stereoscopic image; a diagram showing an example of a parking space image including a stereoscopic image; a diagram showing the relationship between the depth of the stereoscopic images and the position of each stereoscopic image in the parking space image; flowcharts showing the processing sequence of the display control processing in the parking assistance display control device 10 of the second embodiment; a diagram showing the vehicle unit image a1 cut out in the second embodiment; a diagram showing another example of a vehicle unit image; a diagram showing another example of the position of the background of a parked vehicle; a simplified diagram of the configuration of a binocular camera device; and a diagram schematically showing how a left side image is captured.
  • FIG. 1 is a block diagram showing a configuration of a parking assistance apparatus 100 according to the first embodiment of the present invention.
  • the parking assistance device 100 includes a parking assistance display control device 1, a peripheral photographing device 2, a peripheral detection device 3, and a display device 4.
  • the parking assistance display control device 1 includes an image composition unit 11, a parking space determination unit 12, a control unit 13, and an image output unit 14.
  • The parking assistance display control device 1 is mounted on and used in a vehicle.
  • the peripheral photographing device 2, the peripheral detection device 3, and the display device 4 are provided in a vehicle (hereinafter sometimes referred to as “own vehicle”) on which the parking assist display control device 1 is mounted.
  • The parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control device 1 of the present embodiment.
  • the peripheral photographing device 2 is constituted by a camera.
  • the peripheral photographing device 2 is provided on the right side, the left side, and the rear in the traveling direction of the host vehicle.
  • the peripheral imaging device 2 captures images of the vicinity of the host vehicle, specifically, images on the right side, left side, and rear of the traveling direction of the host vehicle.
  • Hereinafter, an image on the right side of the traveling direction of the host vehicle is referred to as a "right side image", an image on the left side of the traveling direction as a "left side image", and an image behind the host vehicle in its traveling direction as a "rear image".
  • the peripheral imaging device 2 sequentially captures images of the side of the host vehicle (hereinafter sometimes referred to as “vehicle side image”) along the traveling direction of the host vehicle.
  • the peripheral photographing device 2 is connected to the image composition unit 11 of the parking assistance display control device 1. Image information representing an image photographed by the peripheral photographing device 2 is given to the image composition unit 11.
  • the image composition unit 11 acquires image information representing a vehicle side image and the like from the peripheral photographing device 2. Based on the acquired image information, the image composition unit 11 generates a composite image obtained by joining the vehicle side images along the traveling direction of the vehicle.
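As an illustrative sketch only (the patent does not specify the joining algorithm, and all names here are hypothetical), the composition performed by the image composition unit can be pictured as concatenating the sequentially captured side frames along the travel direction:

```python
import numpy as np

def compose_side_images(frames):
    """Join sequentially captured vehicle side images along the
    traveling direction by horizontal concatenation.

    frames: list of H x W x 3 uint8 arrays, ordered by capture time.
    A real implementation would additionally register the overlap
    between consecutive frames using the distance traveled between
    shots; this sketch assumes the frames abut exactly.
    """
    if not frames:
        raise ValueError("no frames to compose")
    height = frames[0].shape[0]
    if any(f.shape[0] != height for f in frames):
        raise ValueError("all frames must share the same height")
    return np.concatenate(frames, axis=1)
```

Three frames of equal height and width W would thus produce one composite of width 3W, ordered along the traveling direction.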
  • the periphery detection device 3 detects the situation around the host vehicle. Specifically, the periphery detection device 3 detects an object existing around the host vehicle and detects the relative position between the host vehicle and the object. Objects existing around the host vehicle are, for example, other vehicles and features.
  • the peripheral detection device 3 is configured by at least one of an ultrasonic sensor, an image processing sensor, a millimeter wave radar, and a laser radar, for example.
  • the image processing sensor may be a camera.
  • the periphery detection device 3 is connected to the parking space determination unit 12 of the parking assistance display control device 1.
  • the surroundings detection device 3 generates detection information representing the detected situation around the host vehicle.
  • the periphery detection device 3 generates detection information indicating the detected object and the relative position between the host vehicle and the object.
  • the surroundings detection device 3 gives the generated detection information to the parking space determination unit 12.
  • The parking space determination unit 12 determines whether there is a space around the host vehicle in which the vehicle can park, based on the situation around the host vehicle detected by the periphery detection device 3, specifically based on the detection information provided from the periphery detection device 3. In the present embodiment, the parking space determination unit 12 determines whether such a parking space exists.
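One plausible form of this determination, shown purely as an assumed sketch (the interval representation and names are not taken from the patent), is a gap search over the obstacle intervals reported in the detection information:

```python
def find_parking_space(obstacles, min_gap):
    """Return (start, end) of the first gap along the traveling
    direction that is at least min_gap long, or None if no such
    gap exists.

    obstacles: list of (start_x, end_x) intervals, in meters along
    the traveling direction, occupied by detected objects such as
    parked vehicles.
    """
    occupied = sorted(obstacles)
    # Examine each pair of adjacent obstacles for a sufficient gap.
    for (_, end_a), (start_b, _) in zip(occupied, occupied[1:]):
        if start_b - end_a >= min_gap:
            return (end_a, start_b)
    return None
```

For example, with parked vehicles at 0–4.5 m, 5–9.5 m, and 15–19.5 m and a required length of 5 m, the gap between 9.5 m and 15 m would be reported as the parking space.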
  • When the parking space determination unit 12 determines that a parking space exists, the control unit 13 generates a parking space display object indicating the position of the parking space.
  • the control unit 13 superimposes the generated parking space display object on the composite image generated by the image composition unit 11 to generate a parking space image.
  • At least a part of the parking space image is a stereoscopically viewable stereoscopic image.
  • the image output unit 14 outputs the parking space image generated by the control unit 13, specifically, image information representing the parking space image to the display device 4.
  • the display device 4 is configured by a liquid crystal display, a head-up display, or the like, for example.
  • the display device 4 displays an image represented by the image information given from the image output unit 14 on the display screen.
  • Various information, for example a parking space, can thereby be presented to the user of the parking assistance display control device 1, for example the driver of the host vehicle.
  • the display device 4 is configured to display a stereoscopic image of the parking space image in a three-dimensional manner. Accordingly, the parking space image can be presented to the user of the parking assistance display control device 1, for example, the driver of the host vehicle, in a state where the stereoscopic image can be viewed stereoscopically.
  • When the parking space determination unit 12 determines that a parking space exists, the control unit 13 superimposes the parking space display object on the composite image generated by the image composition unit 11 to generate a parking space image.
  • the parking space image generated by the control unit 13 is output to the display device 4 by the image output unit 14.
  • the parking space image is a stereoscopic image that is at least partially visible in a stereoscopic manner. Accordingly, it is possible to present a parking space in a display form that can be intuitively understood by the driver.
  • Although the peripheral photographing device 2 and the periphery detection device 3 are provided separately in the present embodiment, the peripheral photographing device 2 may also serve as the periphery detection device 3.
  • FIG. 2 is a block diagram showing the configuration of the parking assistance apparatus 200 according to the second embodiment of the present invention.
  • the parking assistance device 200 includes a parking assistance display control device 10, a peripheral photographing device 2, a peripheral detection device 3, a display device 4, an in-vehicle LAN (Local Area Network) 5, and an operation input device 6.
  • the parking assistance display control device 10 includes an image composition unit 11, a parking space determination unit 12, a control unit 13, an image output unit 14, and an in-vehicle LAN interface 15.
  • The parking assistance display control device 10 is mounted on and used in a vehicle.
  • the peripheral photographing device 2, the peripheral detection device 3, the display device 4, the in-vehicle LAN 5, and the operation input device 6 are provided in the own vehicle on which the parking assistance display control device 10 is mounted.
  • Since the parking assistance device 200 of the present embodiment includes the same components as the parking assistance device 100 of the first embodiment shown in FIG. 1, the same reference numerals are assigned to the same components and common descriptions are omitted.
  • the parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control apparatus 10 of the present embodiment.
  • the in-vehicle LAN 5 is configured by, for example, CAN (Controller Area Network).
  • the in-vehicle LAN 5 is connected to the in-vehicle LAN interface 15 of the parking assistance display control device 10.
  • the in-vehicle LAN 5 gives various vehicle information related to the own vehicle to the in-vehicle LAN interface 15.
  • The vehicle information includes, for example, sensor information detected by various sensors provided on the host vehicle, information on vehicle devices such as indicator lamps and warning lamps provided on the host vehicle, information indicating the state of those vehicle devices, vehicle speed information indicating the traveling speed of the vehicle, information indicating the steering angle, information indicating the state of the brake, information indicating the state of the shift lever, and the like.
  • the indicator lamp is, for example, a blinker
  • the information on the vehicle device is, for example, blinker information on the blinker.
  • the in-vehicle LAN interface 15 provides the vehicle information given from the in-vehicle LAN 5 to the image composition unit 11, the parking space determination unit 12, and the control unit 13.
  • The control unit 13 obtains, from the vehicle information given from the in-vehicle LAN 5, vehicle movement trajectory information representing the trajectory of the vehicle's movement.
  • the vehicle movement trajectory information obtained by the control unit 13 is used in each unit of the parking assistance display control device 10.
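The patent does not state how the trajectory is derived from the vehicle information; one common approach, sketched here purely as an assumption, is dead reckoning from the vehicle speed and steering angle using a simple kinematic bicycle model (the wheelbase value and all names are illustrative):

```python
import math

def dead_reckon(samples, wheelbase=2.7):
    """Accumulate an approximate movement trajectory from in-vehicle
    LAN data, as one plausible way to obtain vehicle movement
    trajectory information.

    samples: list of (speed_mps, steering_rad, dt_s) tuples taken
    from the vehicle speed and steering angle information.
    Returns a list of (x, y) positions in the vehicle's starting
    frame, beginning at the origin.
    """
    x = y = heading = 0.0
    trajectory = [(x, y)]
    for speed, steering, dt in samples:
        # Kinematic bicycle model: yaw rate = v / L * tan(steering).
        heading += speed / wheelbase * math.tan(steering) * dt
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        trajectory.append((x, y))
    return trajectory
```

Driving straight at 10 m/s for three 1 s steps, for instance, yields points advancing 10 m each along the x axis.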
  • the operation input device 6 includes an operation input unit (not shown) operated by the user.
  • the operation input unit includes, for example, an operation switch and an operation button.
  • the operation input device 6 is used when the user inputs various information such as numerical information, character information, and instruction information to the parking assistance display control device 10.
  • When the operation input unit is operated by the user, the operation input device 6 generates operation information corresponding to the user's input operation and supplies the operation information to the control unit 13.
  • When the user is not performing any operation, the operation input device 6 generates operation information indicating that no operation is being performed.
  • the operation input unit of the operation input device 6 may be configured by a touch panel or a voice input device capable of voice operation input instead of the operation switch and the operation button.
  • the touch panel is installed on the display screen of the display device 4 and detects the touch operation and the touch position of the user.
  • the touch panel generates operation information corresponding to the detected touch operation and the touch position, and supplies the operation information to the control unit 13.
  • When the operation input unit of the operation input device 6 is composed of a voice input device, the voice input device recognizes the input voice, generates operation information corresponding to the recognized voice, and gives it to the control unit 13.
  • The control unit 13 functions as an image superimposing unit, and has the same function as the control unit 13 in the first embodiment described above. Specifically, when the parking space determination unit 12 determines that a parking space exists, the control unit 13 generates a parking space display object indicating the position of the parking space and superimposes it on the composite image generated by the image composition unit 11 to generate a parking space image.
  • The control unit 13 also controls the entire parking assistance display control device 10 based on operation information given from the operation input device 6.
  • FIG. 3 is a block diagram showing a hardware configuration of the parking assistance display control apparatus 10 according to the second embodiment of the present invention.
  • The image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 in the parking assistance display control device 10 are realized by a processing circuit.
  • That is, the parking assistance display control device 10 comprises a processing circuit for: determining, by the parking space determination unit 12, whether there is a parking space around the vehicle in which the vehicle can park, based on the situation around the vehicle detected by the periphery detection device 3; acquiring, by the image composition unit 11, image information representing vehicle side images of the side of the vehicle sequentially captured along the traveling direction of the vehicle by the peripheral photographing device 2, and generating, based on the acquired image information, a composite image in which the vehicle side images are joined along the traveling direction of the vehicle; generating, by the control unit 13, when the parking space determination unit 12 determines that a parking space exists, a parking space image in which a parking space display object indicating the position of the parking space is superimposed on the composite image generated by the image composition unit 11; and outputting, by the image output unit 14, the generated parking space image to the display device 4.
  • Dedicated hardware may be applied to the processing circuit, or a processor 101 (a CPU (Central Processing Unit), GPU (Graphics Processing Unit), processing device, arithmetic device, microprocessor, microcomputer, or DSP (Digital Signal Processor)) that executes a program stored in the memory 102 may be applied.
  • the processing circuit can be, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array). Or a combination of these.
  • The functions of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 may each be realized by separate processing circuits, or the functions of the units may be combined and realized by a single processing circuit.
  • When the processing circuit is a processor 101, each function of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 in the parking assistance display control device 10 is realized by software, firmware, or a combination of software and firmware.
  • Software and firmware are described as programs and stored in the memory 102.
  • The processing circuit implements the functions of each unit by reading and executing the program stored in the memory 102. That is, the parking assistance display control device 10 comprises the memory 102 for storing a program that, when executed by the processing circuit, results in the execution of: a step of acquiring image information representing vehicle side images of the side of the vehicle sequentially captured along the traveling direction of the vehicle by the peripheral photographing device 2, and generating, based on the acquired image information, a composite image in which the vehicle side images are joined along the traveling direction of the vehicle; a step of determining whether there is a parking space around the vehicle based on the situation around the vehicle detected by the periphery detection device 3; a step of generating, when it is determined that a parking space exists, a parking space image in which a parking space display object indicating the position of the parking space is superimposed on the composite image; and a step of outputting the generated parking space image to the display device 4.
  • these programs cause the computer to execute procedures and methods of processing performed by the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14.
  • The memory 102 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, flexible disk, optical disc, compact disc, mini disc, DVD (Digital Versatile Disc), or the like.
  • As described above, each function of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 can be realized by either hardware or software. However, the configuration is not limited to this: a part of these units may be realized by dedicated hardware and another part by software or the like. For example, the function of the control unit 13 can be realized by a processing circuit as dedicated hardware, while the remaining functions are realized by the processing circuit as a processor 101 reading and executing a program stored in the memory 102.
  • the processing circuit can realize the aforementioned functions by hardware, software, or the like, or a combination thereof.
  • FIGS. 4 to 12 are diagrams for explaining a method of forming a parking space image including a stereoscopic image.
  • The parking assistance display control device 10 first composes a two-dimensional image, then divides the composed two-dimensional image into images of the individual parked vehicles, and generates the parking space image so that each image becomes a stereoscopic image with a depth corresponding to the position of the corresponding parked vehicle.
  • Specifically, a parking space image is generated as follows.
  • FIG. 4 is a diagram schematically illustrating a state in which a vehicle side image is captured by the monocular camera 21 of the peripheral imaging device 2.
  • the peripheral photographing device 2 includes a monocular camera 21.
  • FIG. 4 shows a case where the monocular camera 21 is provided on the side surface on the left in the traveling direction of the vehicle (hereinafter also referred to as the "left side surface").
  • Hereinafter, the monocular camera 21 provided on the left side surface of the vehicle may be referred to as the "left monocular camera 21".
  • The image composition unit 11 creates a stereoscopic image using two images Pic(t) and Pic(t+Δt) captured by the monocular camera 21 of the peripheral photographing device 2 at two positions separated by a predetermined distance D1.
  • When the monocular camera 21 is provided on the left side surface of the vehicle, the monocular camera 21 captures the first image Pic(t) at a first position at time t.
  • The monocular camera 21 then captures the second image Pic(t+Δt) at a second position, a distance D1 away from the first position, at time t+Δt, that is, Δt after time t.
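Because the two shots come from a single camera fixed to the moving vehicle, the stereo baseline D1 is set by the travel between exposures, D1 = v · Δt. A minimal sketch of the implied timing, with hypothetical names:

```python
def stereo_capture_interval(baseline_m, speed_mps):
    """Interval dt between the first shot Pic(t) and the second shot
    Pic(t + dt) so that the two camera positions are separated by the
    desired stereo baseline D1 (baseline_m), given the vehicle speed.

    dt = D1 / v; a stationary vehicle cannot form a baseline this way.
    """
    if speed_mps <= 0:
        raise ValueError("vehicle must be moving to form a baseline")
    return baseline_m / speed_mps
```

For example, a 0.5 m baseline at a speed of 2 m/s calls for a 0.25 s interval between the two shots.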
  • FIG. 5 is a diagram schematically showing a state in which a left side image is taken.
  • The host vehicle 31 travels in the traveling direction 32 from the position Ts of the host vehicle 31 at the start of the search for a parking space (hereinafter sometimes referred to as the "parking space search start position") toward the position Te of the host vehicle 31 at the time the parking space is presented (hereinafter sometimes referred to as the "parking space presentation position").
  • The peripheral photographing device 2 shown in FIG. 2 described above uses the monocular camera 21 shown in FIG. 4 to capture, intermittently or continuously, vehicle side images of the host vehicle 31, for example the left side images P1, P2, ..., Pn.
  • the monocular camera 21 is provided on the left side surface of the host vehicle 31, for example.
  • the peripheral photographing device 2 may include a monocular camera 22 provided on the right side surface on the right side in the traveling direction 32 of the host vehicle 31.
  • the peripheral photographing device 2 obtains a distance Yln between the own vehicle 31 and an object on the side of the own vehicle 31, for example, the parked vehicles L1, L3, L4, from the vehicle side images taken by the monocular cameras 21, 22.
  • the horizontal axis indicates the distance X from the host vehicle 31
  • the vertical axis indicates the depth Yl on the left side of the host vehicle 31.
  • the direction of the horizontal axis corresponds to the horizontal direction X of the display screen of the display device 4, for example.
  • the distance X from the host vehicle 31 is a distance from the parking space presentation position Te in the traveling direction 32 of the host vehicle 31.
  • The peripheral photographing device 2 records the distance Yl1 between the host vehicle 31 and the first parked vehicle L1, the distance Yl3 between the host vehicle 31 and the third parked vehicle L3, and the distance Yl4 between the host vehicle 31 and the fourth parked vehicle L4.
  • The peripheral photographing device 2 generates image information representing the captured vehicle side images, for example the left side images P1, P2, ..., Pn.
  • The peripheral photographing device 2 provides the generated image information and the recorded distances between the host vehicle 31 and the objects on its side, for example the distances Yl1, Yl3, Yl4 between the host vehicle 31 and the parked vehicles L1, L3, L4, to the image composition unit 11 of the parking assistance display control device 10.
  • the distance between the host vehicle 31 and an object on the side of the host vehicle 31 may be obtained by the periphery detection device 3.
  • In that case, the periphery detection device 3 obtains the distance between the host vehicle 31 and the object on the side of the host vehicle 31 and provides it to the parking assistance display control device 10.
  • The distance between the host vehicle 31 and an object on the side of the host vehicle 31 is the shortest distance between that object and the travel trajectory that the host vehicle 31 followed during the parking space search.
  • Likewise, the distances Yl1, Yl3, and Yl4 between the host vehicle 31 and the parked vehicles L1, L3, and L4 are the shortest distances between the respective parked vehicles and the travel trajectory of the host vehicle 31 during the parking space search.
  • FIG. 6 is a diagram illustrating an example of the composite image 40.
  • the image composition unit 11 generates a composite image 40 in which the vehicle side images are joined along the traveling direction 32 of the host vehicle 31 based on the image information given from the peripheral photographing device 2.
  • the image composition unit 11 gives the generated composite image 40 to the control unit 13.
  • the horizontal axis indicates the distance X from the host vehicle 31, and the vertical axis indicates the height Z.
  • the height Z is a height from the road surface on which the host vehicle 31 travels.
  • FIG. 7 is a diagram illustrating an example of a vehicle unit image.
  • the control unit 13 cuts out an image of a parked vehicle unit (hereinafter sometimes referred to as “vehicle unit image”) from the composite image 40 given from the image composition unit 11 by image processing.
  • The control unit 13 divides the composite image 40 into seven images a1, a3, a4, and b1 to b4 (hereinafter sometimes referred to as "divided images").
  • The three vehicle unit images a1, a3, and a4, which respectively include the parked vehicles L1, L3, and L4, are cut out.
  • the control unit 13 may estimate the position of each parked vehicle L1, L3, L4 using the distance measurement result by the periphery detection device 3, and may cut out the vehicle unit images a1, a3, a4.
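The patent leaves the segmentation method open; as an assumed illustration, once the pixel span of each parked vehicle along the travel axis has been estimated (for example from the range measurements), cutting out the vehicle unit images amounts to slicing the composite (names and the span representation are hypothetical):

```python
import numpy as np

def cut_vehicle_unit_images(composite, vehicle_spans):
    """Cut vehicle unit images out of the composite image.

    composite: H x W x 3 array (the stitched side view).
    vehicle_spans: mapping from a label such as "a1" to the
    (x_start, x_end) pixel range that the parked vehicle occupies
    along the travel axis.
    Returns a mapping from label to the cut-out image.
    """
    return {label: composite[:, x0:x1]
            for label, (x0, x1) in vehicle_spans.items()}
```

The remaining spans between and around the cut-out vehicles then correspond to the background divided images (b1 to b4 in the example above).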
  • FIG. 8 is a diagram illustrating an example of the drawing plane 41 for the left eye
  • FIG. 9 is a diagram illustrating an example of the drawing plane 42 for the right eye.
  • the control unit 13 provides a left-eye drawing plane 41 shown in FIG. 8 and a right-eye drawing plane 42 shown in FIG. 9 as drawing planes for creating a stereoscopic image.
  • FIG. 10 is a diagram for explaining a method of pasting a divided image onto a drawing plane.
  • FIG. 10A is a diagram illustrating an example of divided images a1, a3, a4, and b1 to b4 including vehicle unit images a1, a3, and a4.
  • FIG. 10B is a diagram showing an example of a state in which the divided images a1, a3, a4, b1 to b4 are pasted on the left-eye drawing plane 41.
• FIG. 10C is a diagram showing an example of a state in which the divided images a1, a3, a4, b1 to b4 are pasted on the right-eye drawing plane 42.
  • the control unit 13 generates a parking space display object 51 representing the position of the parking space SL shown in FIG. 6 and superimposes the generated parking space display object 51 on the corresponding divided image b2 of the composite image 40.
  • the parking space display object 51 is a mark shown in FIG. 10, for example.
• the control unit 13 calculates binocular parallax so that each of the parked vehicles L1, L3, and L4 is displayed at a depth position on the Y axis corresponding to its distance from the host vehicle 31.
• Based on the calculated parallax, the positions at which the divided images a1, a3, a4, b1 to b4 are pasted on the left and right drawing planes 41 and 42 are determined.
• As the depth of a parked vehicle increases, the binocular parallax decreases, so the difference between the pasting positions of the vehicle unit images a1, a3, a4 on the left-eye drawing plane 41 and their pasting positions on the right-eye drawing plane 42 becomes smaller.
• Conversely, as the depth of the parked vehicles L1, L3, and L4 decreases, the binocular parallax increases, so the difference between the pasting positions of the vehicle unit images a1, a3, a4 on the left-eye drawing plane 41 and those on the right-eye drawing plane 42 becomes larger.
  • the control unit 13 pastes the divided images a1, a3, a4, b1 to b4 at the calculated positions of the drawing planes 41 and 42.
  • the left-eye divided images a1, a3, a4, b1 to b4 pasted on the left-eye drawing plane 41 are denoted by reference numerals a1-l, a3-l, a4-l, and b1-l to b4-l.
  • the right-eye divided images a1, a3, a4, b1 to b4 to be pasted on the right-eye drawing plane 42 are represented by reference symbols a1-r, a3-r, a4-r, b1-r to b4-r.
• To prevent gaps from appearing between images, the control unit 13 performs interpolation by enlarging or reducing the background-side image.
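The parallax calculation described above can be sketched as follows: the left/right pasting offset of each divided image shrinks as the depth of the corresponding parked vehicle grows. The function name, the constant k, and the sign convention are illustrative assumptions, not values from the embodiment.

```python
# Illustrative sketch of the pasting-position calculation: the binocular
# parallax (disparity) assigned to a divided image is inversely
# proportional to its depth Y, so near vehicles get a larger left/right
# offset on the two drawing planes.

def paste_positions(x, depth_y, k=60.0):
    """Return (x_left, x_right): pasting positions on the left-eye and
    right-eye drawing planes for an image centred at x with depth depth_y."""
    disparity = k / depth_y          # larger for near objects
    return x - disparity / 2, x + disparity / 2

near = paste_positions(100, depth_y=2.0)   # large left/right difference
far = paste_positions(100, depth_y=10.0)   # small left/right difference
print(near, far)
```

A near vehicle at depth 2.0 gets a 30-unit disparity while a far one at depth 10.0 gets only 6, matching the relationship stated in the text.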
  • FIG. 11 is a diagram illustrating an example of the depth of a stereoscopic image.
  • FIG. 12 is a diagram illustrating an example of a parking space image including a stereoscopic image.
  • FIG. 13 is a diagram illustrating a relationship between the depth of the stereoscopic image and the position of each stereoscopic image in the parking space image.
  • the left-eye image 43L and the right-eye image 43R shown in FIG. 10 generated by the control unit 13 as described above are given to the image output unit 14 and output from the image output unit 14 to the display device 4.
• Thereby, a parking space image 45 including the stereoscopic images L1, L3, L4, and 51 with depth is displayed on the display screen 44 of the display device 4 and observed by the driver of the host vehicle 31.
• the depth display positions of the stereoscopic images L1, L3, L4, and 51 included in the parking space image 45 have the positional relationship shown in FIGS. 11 and 13.
  • the depth display positions of the divided images a1, a3, a4, b1 to b4 are indicated by the corresponding reference symbols a1, a3, a4, b1 to b4.
  • FIG. 14 and FIG. 15 are flowcharts showing a processing procedure related to the display control process in the parking assistance display control apparatus 10 according to the second embodiment of the present invention.
  • display control processing for displaying the parking space image 45 shown in FIG. 13 on the display device 4 is performed according to the processing procedure shown in FIGS. 14 and 15.
• The steps in FIGS. 14 and 15 are executed by the image composition unit 11, the parking space determination unit 12, the control unit 13, the image output unit 14, and the in-vehicle LAN interface 15 of the parking assistance display control device 10.
• The processing of the flowcharts shown in FIGS. 14 and 15 is started when the parking assistance display control device 10 is turned on, after which the process proceeds to step S1 of FIG. 14.
• In step S1, the in-vehicle LAN interface 15 acquires vehicle information from the in-vehicle LAN 5. When the vehicle information is acquired, the process proceeds to step S2.
• In step S2, the control unit 13 acquires operation information from the operation input device 6. When the operation information is acquired, the process proceeds to step S3.
• In step S3, the control unit 13 determines whether or not a parking space search start condition is satisfied.
• The parking space search start condition is that the speed of the host vehicle 31 (hereinafter sometimes referred to as “vehicle speed”) is less than a predetermined speed, for example less than 10 km/h, or that a parking space detection command is input from the user.
• If it is determined that the parking space search start condition is satisfied, the process proceeds to step S4. If it is determined that the parking space search start condition is not satisfied, the process returns to step S1.
• In step S4, the image composition unit 11 acquires image information from the peripheral photographing device 2.
• Specifically, the image composition unit 11 acquires image information representing the aforementioned vehicle side images, for example, the left side images P1, P2, ..., Pn.
  • the process proceeds to step S5.
• In step S5, the parking space determination unit 12 acquires vehicle periphery information from the periphery detection device 3.
• The vehicle periphery information is, for example, the relative position of, or distance to, an obstacle such as another vehicle or a feature around the host vehicle. When the vehicle periphery information is acquired, the process proceeds to step S6.
• In step S6, the parking space determination unit 12 performs a parking space detection process.
  • the parking space determination unit 12 determines whether there is a parking space using the vehicle periphery information acquired in step S5.
• Specifically, in the parking space detection process, the parking space determination unit 12 obtains a parking space, and determines that parking is possible when there is a route for parking the host vehicle at that position.
• The parking space determination unit 12 obtains the travel route of the host vehicle from the vehicle information given from the in-vehicle LAN interface 15, and obtains the time attributes and position attributes of the images captured by the peripheral photographing device 2 and of the detection information detected by the periphery detection device 3. Using these, the parking space determination unit 12 determines whether or not there is a route by which the host vehicle can park at the position of the determined parking space. After the parking space detection process ends, the process proceeds to step S7.
• In step S7, the control unit 13 acquires operation information from the operation input device 6.
• When the operation information is acquired, the process proceeds to step S8.
• In step S8, the in-vehicle LAN interface 15 acquires vehicle information from the in-vehicle LAN 5. When the vehicle information is acquired, the process proceeds to step S9 of FIG. 15.
• In step S9, the control unit 13 determines whether or not the parking space search end condition is satisfied.
  • the parking space search end condition is that the vehicle speed is equal to or higher than a predetermined speed, for example, 10 km / h or higher, or that a parking space search end command is input from the user.
• In step S10, the control unit 13 determines whether or not a parking space display condition is satisfied.
• The parking space display condition is satisfied when any of the following holds: a parking space has been found, the host vehicle has stopped, or a parking space display command has been input from the user.
• If the parking space display condition is satisfied, the process proceeds to step S11.
• If the parking space display condition is not satisfied, the process proceeds to step S15.
• In step S11, the image composition unit 11 generates a composite image, for example, the composite image 40 shown in FIG. 6.
  • the process proceeds to step S12.
• In step S12, the control unit 13 generates a parking space display object, for example, the parking space display object 51 shown in FIG. 10.
• In step S13, the control unit 13 superimposes the parking space display object generated in step S12 on the composite image generated in step S11 to generate a parking space image, for example, the parking space image 45 described above.
• In step S14, the image output unit 14 outputs the parking space image generated in step S13 to the display device 4. After outputting the parking space image to the display device 4, the process returns to step S4 in FIG. 14.
• In step S15, the control unit 13 erases the parking space image output in step S14. After erasing the parking space image, the process returns to step S4 in FIG. 14.
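The flow of steps S1 to S15 above can be summarized in Python-style pseudocode. Every helper name here (`dev.lan`, `dev.judge`, `dev.output`, and so on) is a hypothetical stand-in for the corresponding unit of the display control device 10, not an API from the embodiment.

```python
# Hedged pseudocode sketch of the display control loop of FIGS. 14 and 15.
# Each comment names the flowchart step the line corresponds to.

def display_control_loop(dev):
    while True:
        vehicle_info = dev.lan.get_vehicle_info()              # S1
        op_info = dev.input.get_operation_info()               # S2
        if not (vehicle_info.speed < 10 or op_info.search_requested):
            continue                                           # S3: start condition not met
        while True:
            images = dev.cameras.get_side_images()             # S4
            periphery = dev.sensors.get_periphery_info()       # S5
            space = dev.judge.detect_parking_space(periphery)  # S6
            op_info = dev.input.get_operation_info()           # S7
            vehicle_info = dev.lan.get_vehicle_info()          # S8
            if vehicle_info.speed >= 10 or op_info.search_end: # S9: end condition
                break
            if space or vehicle_info.stopped or op_info.show_requested:  # S10
                composite = dev.compose.make_composite(images)           # S11
                marker = dev.ctrl.make_space_object(space)               # S12
                image = dev.ctrl.superimpose(composite, marker)          # S13
                dev.output.show(image)                                   # S14
            else:
                dev.output.erase()                                       # S15
```

Breaking out of the inner loop when the search end condition holds corresponds to returning to the idle state before step S1.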
• As described above, the parking space determination unit 12 determines whether or not there is a parking space available around the host vehicle 31 based on the situation around the host vehicle 31 detected by the periphery detection device 3.
• Image information representing vehicle side images sequentially photographed along the traveling direction 32 of the host vehicle 31 by the peripheral photographing device 2 is acquired by the image composition unit 11, and based on the acquired image information, a composite image 40 in which the vehicle side images are joined along the traveling direction 32 of the host vehicle 31 is generated.
• When it is determined by the parking space determination unit 12 that there is a parking space, the control unit 13 superimposes the parking space display object 51 on the composite image 40 generated by the image composition unit 11 to generate the parking space image 45.
  • the parking space image 45 generated by the control unit 13 is output to the display device 4 by the image output unit 14.
• At least a part of the parking space image is displayed as a stereoscopic image. Accordingly, the parking space can be presented in a display form that the driver can understand intuitively.
• In addition, the parking space image 45 contains the parked vehicle areas as stereoscopic images.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• The parking available space is represented by the parking space display object 51, which is displayed as a stereoscopic image.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
  • FIG. 16 and FIG. 17 are diagrams for explaining a method of cutting a vehicle unit image.
• FIG. 16 is a diagram showing the vehicle unit image a1 cut out in the second embodiment of the present invention described above.
  • FIG. 17 is a diagram illustrating another example of the vehicle unit image.
• The control unit 13 may divide the vehicle unit image a1 into the parked vehicle portion a1-1 of the first parked vehicle L1 and the portion a1-2 other than the parked vehicle, and generate the parking space image such that only the parked vehicle portion a1-1 is displayed at the depth corresponding to the distance Yl1 between the host vehicle 31 and the first parked vehicle L1, while the portion a1-2 other than the parked vehicle is displayed with a greater depth than the parked vehicle portion a1-1.
  • FIG. 18 is a diagram illustrating another example of the background position of the parked vehicle.
• The divided images b1 to b4 that do not include a parked vehicle, that is, the background, may be displayed such that the distance from the host vehicle 31, namely the depth distance d0, is the same.
  • the peripheral photographing device 2 includes the monocular cameras 21 and 22, but is not limited thereto.
  • FIG. 19 is a diagram showing a simplified configuration of the binocular camera device 60.
  • the peripheral photographing device 2 may include a binocular camera device 60 shown in FIG. 19 instead of the monocular cameras 21 and 22.
  • the binocular camera device 60 includes two cameras, that is, a first camera 61 and a second camera 62.
  • the first camera 61 and the second camera 62 are provided separated by a predetermined distance D2.
  • the binocular camera device 60 is provided such that the first camera 61 captures a left image and the second camera 62 captures a right image.
• For example, when the binocular camera device 60 is provided on the left side surface of the host vehicle 31, it is provided such that the first camera 61 is disposed on the rear side in the traveling direction 32 of the host vehicle 31 and the second camera 62 is disposed on the front side in the traveling direction 32 of the host vehicle 31.
• A stereoscopic image is generated using two images taken at the same time t by the first camera 61 and the second camera 62 of the binocular camera device 60, that is, the pair consisting of the left image PicL(t) taken by the first camera 61 and the right image PicR(t) taken by the second camera 62.
• The left image PicL(t) taken by the first camera 61 is equivalent to the first image Pic(t) taken by the monocular camera 21 at time t, and the right image PicR(t) taken by the second camera 62 is equivalent to the second image Pic(t+Δt) taken by the monocular camera 21 at time t+Δt.
  • FIG. 20 is a diagram schematically illustrating a state in which a left-side image is captured using the binocular camera device 60.
• In this case, the left and right pair images for the stereoscopic image, individually photographed by the binocular camera device 60 of the peripheral photographing device 2, are composited on the drawing planes.
• The peripheral photographing device 2 uses the binocular camera device 60 shown in FIG. 19 described above to take vehicle side images, for example left side images, of the side of the host vehicle 31 while the host vehicle 31 travels in the traveling direction 32.
• At each position, the binocular camera device 60 takes, with the first camera 61, the left-eye left side images Pl1, Pl2, ..., Pln as the left images PicL(t), and takes, with the second camera 62, the right-eye left side images Pr1, Pr2, ..., Prn as the right images PicR(t).
  • n is a natural number.
• The left-eye left side images Pl1, Pl2, ..., Pln and the right-eye left side images Pr1, Pr2, ..., Prn are pairs of left and right images taken at the same point, and each pair has a distance attribute indicating the distance to the parked vehicles L1, L3, L4 or to an obstacle.
• The distance between the host vehicle 31 and the parked vehicles L1, L3, L4 or the obstacle may be measured by an ultrasonic sensor serving as the periphery detection device 3, or may be measured from the images captured by the binocular camera device 60.
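When the distance is measured from the image pair itself, standard stereo triangulation can be used. The sketch below assumes a pinhole camera model; the focal length and the baseline value standing in for D2 are illustrative, and the formula is the textbook relation rather than one stated in the embodiment.

```python
# Minimal sketch of distance measurement from the left/right image pair of
# the binocular camera device 60 by stereo triangulation:
#   Z = f * D2 / disparity
# where f is the focal length in pixels and D2 the camera baseline.

def stereo_distance(x_left, x_right, focal_px=800.0, baseline_m=0.3):
    """Distance Z to a feature seen at pixel column x_left in the left
    image and x_right in the right image of the pair."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

print(stereo_distance(420, 380))  # disparity 40 px -> about 6.0 m
```

A larger disparity between the paired images means a nearer object, which is the same relationship exploited when assigning depth to the stereoscopic display.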
  • FIG. 21 is a diagram illustrating an example of the drawing plane 71 for the left eye
  • FIG. 22 is a diagram illustrating an example of the drawing plane 72 for the right eye.
  • the control unit 13 provides a left-eye drawing plane 71 shown in FIG. 21 and a right-eye drawing plane 72 shown in FIG. 22 as drawing planes for creating a stereoscopic image.
• The depth of the left-eye drawing plane 71 and the depth of the right-eye drawing plane 72 are set to the same distance.
  • FIG. 23 is a diagram for explaining a method of pasting on a drawing plane.
• From the left-eye left side images Pl1, Pl2, ..., Pln and the right-eye left side images Pr1, Pr2, ..., Prn, the control unit 13 prepares left and right pair images corresponding to the photographed parked vehicles L1, L3, and L4, and left and right parking space display objects.
• Specifically, a left-eye vehicle image an-l and a right-eye vehicle image an-r, which form a left and right pair, and a left-eye display object 51L and a right-eye display object 51R, which are the left and right parking space display objects, are prepared.
• The control unit 13 pastes the prepared left and right pair images an-l and an-r and the left and right parking space display objects 51L and 51R on the corresponding left and right drawing planes 71 and 72, respectively.
  • the positions on the left and right drawing planes 71 and 72 to which the left and right image pairs are pasted are the same.
• Thereby, a left-eye image 73L to which the left-eye display object 51L is pasted and a right-eye image 73R to which the right-eye display object 51R is pasted are generated.
  • FIG. 24 is a diagram illustrating another example of the depth of a stereoscopic image.
  • FIG. 25 is a diagram illustrating another example of a parking space image including a stereoscopic image.
• The left-eye image 73L and the right-eye image 73R generated by the control unit 13 as described above are given to the image output unit 14 and output from the image output unit 14 to the display device 4.
  • the background portions are displayed such that the depth distance d0 is the same.
  • the parked vehicles L1, L3, and L4 are displayed with a depth corresponding to the distance from the host vehicle 31.
  • the fourth parked vehicle L4 is displayed with a depth distance d4 corresponding to the distance Yl4 from the host vehicle 31.
• Thereby, the parking space image 75 including the three stereoscopic images L1, L3, and L4 having depth is displayed on the display screen 74 of the display device 4 and observed by the driver of the host vehicle 31.
• The control unit 13 may be configured to construct the vehicles and the parking space in a three-dimensional virtual space from the images captured by the monocular camera 21 shown in FIG. 4 or by the binocular camera device 60 shown in FIG. 19, and to generate a stereoscopic image from that virtual space.
  • the vehicle side image is taken by the monocular camera 21 or the binocular camera device 60 and given to the control unit 13 via the image synthesis unit 11.
  • the control unit 13 three-dimensionally measures the shape and position of each vehicle from the given vehicle side image.
• FIG. 26 is a diagram schematically showing a state in which the vehicles are mapped in the three-dimensional virtual space. Based on the shape and position of each vehicle measured as described above, the control unit 13 maps each vehicle in the three-dimensional virtual space as shown in FIG. 26. At this time, for a portion that could not be photographed by the monocular camera 21 or the binocular camera device 60 and therefore could not be measured, the missing information may be compensated for by applying an appropriate vehicle model.
• Then, a virtual viewpoint Pver(x, y, z, θ, φ) is provided in the three-dimensional virtual space, a virtual binocular camera device 80 is installed at that position, and a left-eye image 81 and a right-eye image 82 are generated as images taken by the binocular camera device 80.
• Known three-dimensional image processing can be applied to this image generation processing.
  • the left-eye image 81 and the right-eye image 82 thus generated by the control unit 13 are given to the image output unit 14 and output from the image output unit 14 to the display device 4.
  • a parking space image including a stereoscopic image having a depth is displayed on the display screen of the display device 4 and is observed by the driver of the host vehicle 31.
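Generating the left-eye image 81 and the right-eye image 82 from the virtual binocular camera device 80 amounts to projecting the mapped 3-D points through two pinhole cameras separated by a baseline. The sketch below is a minimal illustration under that assumption; the projection model, baseline, and focal length are not taken from the embodiment.

```python
# Hedged sketch: a virtual binocular camera at the virtual viewpoint
# projects 3-D points of the mapped vehicles into a left/right image pair.

def project(point, cam_x, cam_y=0.0, focal=500.0):
    """Project a 3-D point (x, y, z) through a pinhole camera located at
    (cam_x, cam_y, 0) and looking along +y. Returns (u, v) coordinates."""
    x, y, z = point
    depth = y - cam_y
    return (focal * (x - cam_x) / depth, focal * z / depth)

def virtual_stereo_pair(points, viewpoint_x=0.0, baseline=0.065):
    """Render the same points from two cameras straddling the viewpoint."""
    left = [project(p, viewpoint_x - baseline / 2) for p in points]
    right = [project(p, viewpoint_x + baseline / 2) for p in points]
    return left, right

# A single point on a mapped vehicle 5 m ahead of the virtual viewpoint:
left, right = virtual_stereo_pair([(1.0, 5.0, 0.5)])
print(left[0][0] - right[0][0])  # horizontal disparity of the pair
```

Moving the virtual viewpoint simply changes `viewpoint_x` (and, in a full model, the orientation angles θ and φ), which is what allows the viewpoint conversions described later.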
• The display device 4 that displays a parking space image including a stereoscopic image with depth is not limited to a display device capable of displaying stereoscopic images; it may be a head-up display capable of three-dimensional expression, or a three-dimensional image display device of another type.
  • FIG. 27 is a diagram for explaining a method of detecting a parking space.
• The peripheral photographing device 2 uses, for example, the binocular camera device 60 installed on the left side of the host vehicle 31 to photograph, in pairs, the right-eye images L1-r, L3-r, L4-r and the left-eye images L1-l, L3-l, L4-l for the stereoscopic image.
  • the control unit 13 generates a composite image from the images L1-l, L1-r, L3-l, L3-r, L4-l, and L4-r taken by the binocular camera device 60.
• The control unit 13 detects parking spaces SL-l and SL-r from the generated composite image.
• The left and right empty spaces are detected with an ultrasonic sensor in combination with the photographing.
  • a known technique can be used for the detection algorithm and parking availability determination.
• The control unit 13 presents the result to the driver by causing the display device 4 to display an image in which display objects indicating the parking spaces SL-l and SL-r detected in this way are superimposed on the composite image. For example, as shown in FIG. 12 described above, both the vehicles and the parking space are displayed as stereoscopic images.
  • FIG. 28 is a diagram showing another example of a parking space image.
• The parking space image may be configured such that the vehicle side images L1, L3, and L4 are displayed on the display screen of the display device 4 in order from the lower side toward the upper side, along the traveling direction of the host vehicle.
• In this example, the third parked vehicle L3 is parked farther away than the fourth parked vehicle L4.
• That is, the distance Yl3 between the host vehicle 31 and the third parked vehicle L3 is larger than the distance Yl4 between the host vehicle 31 and the fourth parked vehicle L4 (Yl3 > Yl4). Therefore, in the parking space image 90 shown in FIG. 28B, the solid angle of the third parked vehicle L3 is slightly smaller than the solid angle of the fourth parked vehicle L4.
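The reason the more distant vehicle subtends a smaller solid angle follows from simple geometry: the angular width of an object of width w at distance Y is 2·atan(w/(2Y)), which decreases as Y grows. The vehicle width and distances below are illustrative values, not from the embodiment.

```python
import math

# Angular width of an object of width w seen from lateral distance y.
def angular_width(w, y):
    return 2 * math.atan(w / (2 * y))

yl3, yl4 = 6.0, 5.0            # Yl3 > Yl4, as in the example above
w = 1.8                        # assumed vehicle width in metres
print(angular_width(w, yl3) < angular_width(w, yl4))  # True
```

Since Yl3 > Yl4, the farther vehicle L3 is rendered slightly smaller, matching the description of the parking space image 90.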
  • FIG. 29 to FIG. 32 are diagrams schematically showing viewpoints of parking space images.
  • the parking space image may be displayed by changing the viewpoint of the display area.
• The parking space image may be displayed as an image viewed from a viewpoint Peye1 from the front in the horizontal direction.
• The parking space image may be displayed as an image viewed from a viewpoint Peye2 from slightly above the front.
• The parking space image may be displayed as an image viewed from a viewpoint Peye3 from directly above.
• The parking space image may be displayed as an image viewed from a viewpoint Peye4 from diagonally above.
• The parking space image may also be displayed by changing the viewpoint of only a predetermined designated area, for example, the parking space SL.
  • control unit 13 selects a parking space and performs viewpoint conversion only on the parking space.
• The area other than the designated area may be displayed from the original viewpoint, or may be displayed with its viewpoint changed in the same manner as the designated area. Further, only the designated area may be displayed enlarged compared with the area other than the designated area.
  • a virtual three-dimensional display object 51 that represents the vehicle as a stereoscopic image may be displayed in the parking space SL.
• For example, with the parking space selected as the designated area, the parking space image may be displayed as an image of the parking space viewed from the viewpoint Peye1 from the front in the horizontal direction.
• In this case, the virtual three-dimensional display object 51 and the wheel stopper 91 are displayed in the parking space.
• The parking space image may be displayed as an image in which the parking space including the virtual three-dimensional display object 51 and the wheel stopper 91 is viewed from the viewpoint Peye2 from slightly above the front.
• The parking space image may be displayed as an image in which the parking space including the virtual three-dimensional display object 51 and the wheel stopper 91 is viewed from the viewpoint Peye3 from directly above.
• The parking space image may be displayed as an image in which the parking space including the virtual three-dimensional display object 51 and the wheel stopper 91 is viewed from the viewpoint Peye4 from diagonally above.
• As described above, the parking space image may be converted into an image viewed from a viewpoint different from the viewpoint at which the vehicle side images were captured by the peripheral photographing device 2, and then displayed.
• Likewise, a predetermined designated area in the parking space image may be converted into an image viewed from a viewpoint different from the viewpoint at which the vehicle side images were captured by the peripheral photographing device 2, and then displayed.
  • FIG. 37 is a diagram showing another example of a parking space image.
• The parking space image may be displayed such that the area E3 of the parking space SL where no vehicle is present is displayed as a stereoscopic image, while the vehicle areas E1 and E2 where vehicles are present are displayed as two-dimensional images.
• The actual depths of the parked vehicles are as shown in FIG. 37(a), but the vehicle areas E1 and E2 are displayed as two-dimensional images. Therefore, when combining a two-dimensional image with a stereoscopic image, as shown in FIG. 37(b), the depth distance Yl of the parked vehicles L1, L3, L4 is unified to the distance between one of the parked vehicles and the host vehicle 31, for example, the distance Yl4 between the fourth parked vehicle L4 and the host vehicle 31.
  • FIG. 38 is a diagram illustrating another example of the parking space image.
  • the parking space image may include a three-dimensional display object that indicates the position of the host vehicle 31.
  • the position of the host vehicle on which the stereoscopic display object is displayed corresponds to the actual position of the vehicle in the three-dimensional space.
• The parked vehicle area may be displayed emphasized compared with other areas.
  • the parked vehicle area may be displayed as a deformed display object that deforms and displays the vehicles L1, L3, and L4.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• The left and right vehicle groups are displayed as a composite image including a stereoscopic image obtained by joining a plurality of images photographed from the host vehicle, as seen from the driver's viewpoint.
• The third parked vehicle L3 is parked farther away than the fourth parked vehicle L4, and the distance Yl3 between the third parked vehicle L3 and the host vehicle is larger than the distance Yl4 between the fourth parked vehicle L4 and the host vehicle (Yl3 > Yl4). Therefore, in the stereoscopic image shown in FIG. 38, the solid angle of the third parked vehicle L3 is slightly smaller than that of the fourth parked vehicle L4.
  • FIG. 39 is a diagram illustrating an example of a parking space image when there are parking space candidates on the left and right of the vehicle.
  • the peripheral photographing device 2 captures left and right binocular images with the monocular camera 21 or the binocular camera device 60 installed on the left and right of the host vehicle 31. Then, image information representing the photographed image is given to the image composition unit 11.
  • the image composition unit 11 generates composite images 261 and 262 from the provided image information.
• The parking space determination unit 12 searches for parking spaces on the left and right. Thereby, for example, the left parking space SL and the right parking space SR are detected.
• The control unit 13 superimposes display objects indicating the parking spaces SL and SR on the left and right composite images 261 and 262, that is, the left composite image 261 and the right composite image 262. Thereby, a parking space image 140 including the left and right parking spaces SL and SR is generated. By presenting such a parking space image 140, the parking space can be presented to the driver in a display form that can be understood more intuitively.
  • the host vehicle 31 is displayed on a predetermined reference line 141 on the display screen of the display device 4, for example, the center line of the display screen. Specifically, the host vehicle 31 is displayed such that a line representing the travel locus of the host vehicle 31 coincides with the reference line 141 on the display screen.
• The parking space image 140 shown in FIG. 39 may be generated such that the display positions of the left composite image 261 and the right composite image 262 on the display screen of the display device 4 are changed according to the positional relationship between the host vehicle 31 and the vehicles or parking areas existing on the right side and the left side in the traveling direction 32 of the host vehicle 31.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• The display positions of the left composite image 261 and the right composite image 262 on the display screen of the display device 4 are changed according to the side distances dl, dr and the traveling direction distances Hl, Hr of each parked vehicle.
• The side distances dl and dr are the distances between the host vehicle 31 and objects on the sides of the host vehicle 31.
  • the side distances dl and dr are specifically the shortest distances between the travel locus 33 at the time of the parking space search of the host vehicle 31 and the objects on the side of the host vehicle 31.
• In the example shown here, the side distances dl and dr are the distances between the travel locus 33 at the time of the parking space search of the host vehicle 31 and the front ends, on the host vehicle 31 side, of the vehicles L1, L3, L4, R1, R2, R4.
  • the lateral distances dl and dr correspond to the distance Yln from the host vehicle 31 described above.
  • the traveling direction distances Hl and Hr are distances between the host vehicle 31 and an object on the side of the host vehicle 31 in the traveling direction of the host vehicle 31.
  • the traveling direction distances Hl and Hr are specifically the shortest distances between the end portion on the rear side in the traveling direction of the host vehicle 31 and the object on the side of the host vehicle 31 at the stop position.
• In the example shown here, the traveling direction distances Hl and Hr are the distances between the rear end portion of the host vehicle 31 in the traveling direction at the stop position and the side surfaces, on the host vehicle 31 side, of the vehicles L1, L3, L4, R1, R2, R4.
  • the traveling direction distances Hl and Hr correspond to the distance X from the host vehicle 31 described above.
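One possible way to map the side distances dl, dr and the traveling direction distances Hl, Hr to display positions is sketched below, with the host vehicle's travel locus on the screen center line. The screen size and the pixels-per-metre scale are assumptions for illustration, not values from the embodiment.

```python
# Illustrative sketch: convert a parked vehicle's side distance (dl or dr)
# and traveling-direction distance (Hl or Hr) into a screen position, with
# the host vehicle's travel locus on the vertical center line.

def screen_position(side_dist, travel_dist, side, width=800, height=600,
                    px_per_m=40):
    """side: 'left' or 'right' of the traveling direction.
    Returns (x, y) in pixels; larger travel_dist is drawn nearer the top."""
    center_x = width // 2
    offset = side_dist * px_per_m
    x = center_x - offset if side == 'left' else center_x + offset
    y = height - travel_dist * px_per_m   # nearer objects drawn lower
    return (x, y)

print(screen_position(2.0, 3.0, 'left'))   # (320.0, 480.0)
print(screen_position(2.5, 6.0, 'right'))  # (500.0, 360.0)
```

Vehicles with a larger side distance are thus drawn farther from the center line, and vehicles farther ahead in the traveling direction are drawn higher on the screen, mirroring the layout described above.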
  • the parking space image 150 may include a host vehicle position display object 151 that indicates the position of the host vehicle 31.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• The parking space image 150 is generated such that the left composite image 261 is displayed in the left region of the display screen of the display device 4, as viewed toward the display screen, which corresponds to a predetermined first display region, and the right composite image 262 is displayed in the right region, as viewed toward the display screen, which corresponds to a predetermined second display region.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• A parking space image 150 is generated such that the left composite image 261 and the right composite image 262 are displayed rotated by predetermined rotation angles in opposite directions from the traveling direction 32 of the host vehicle 31, with the position of the host vehicle 31 as the fulcrum.
• Specifically, the parking space image 150 is generated such that the left composite image 261 and the right composite image 262 are displayed rotated 90° in opposite directions from the center line 152 of the display screen, which coincides with the traveling direction 32 of the host vehicle 31.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
• The parking space image 160 may include, in addition to the first display area 161 in which the left composite image 261 is displayed and the second display area 162 in which the right composite image 262 is displayed, a third display area 163 indicating a recommended parking space.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
  • the parking space image 170 may include a host vehicle position display object 151 indicating the position of the host vehicle 31.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
  • the parking space image 170 may be configured to be divided into a host vehicle display area 171 in which the host vehicle position display object 151 is displayed and a composite image display area 172 in which the composite images 261 and 262 are displayed.
  • the control unit 13 may generate the parking space image 170 so that a rear image 173 of the area behind the host vehicle 31 in the traveling direction 32, photographed by the peripheral photographing device 2, is inserted between the left composite image 261 and the right composite image 262, joining the left composite image 261, the rear image 173, and the right composite image 262 together.
  • the parking space can be presented to the driver in a display form that can be understood more intuitively.
  • the control unit 13 may generate the parking space image 180 so that a part of the parking space image 180 is displayed in a display form different from the remaining part.
  • a part of the parking space image 180, specifically the host vehicle display area 182 in which the host vehicle 31 is displayed, is displayed as an overhead image.
  • areas other than the host vehicle display area 182, for example, the composite image display areas 181 and 183 where the composite images 261 and 262 are displayed may be displayed as two-dimensional images.
  • FIG. 44 is a diagram showing another example of a parking space image.
  • the horizontal axis indicates the horizontal direction X
  • the vertical axis indicates the depth Y.
  • the parking space image may be combined with the overhead image and displayed.
  • the parking space image is displayed as a composite of the image of the host vehicle 31 shown in FIG. 44 (a) and the bird's-eye view images shown in FIGS. 44 (c) and 44 (d).
  • the image of the vehicle 31 synthesized or photographed as a bird's-eye view and the stereoscopic images L4 and R4 photographed from the driver's viewpoint are displayed side by side.
  • when the traveling area, which is not the parking area, is displayed as an overhead image, it is preferable that the depth plane on which the overhead image is displayed matches the depth of the parking space.
  • the overhead image may be a stereoscopic image as shown in FIG. 44
  • y1 indicates the distance from the tip of the left parking space 210 on the left side of the host vehicle 31 to the tip of the fourth left vehicle L4, that is, the width of the area 211 around the parking area 212 where the fourth left vehicle L4 is parked.
  • y2 represents the distance from the left side surface of the host vehicle 31 to the tip of the left parking space 210.
  • y3 indicates the distance from the right side surface of the host vehicle 31 to the tip of the right parking space 215 on the right side of the host vehicle 31.
  • y4 indicates the corresponding distance from the tip of the right parking space 215 on the right side of the host vehicle 31.
  • zr indicates the vehicle height of the fourth right vehicle R4.
  • zl represents the vehicle height of the fourth left vehicle L4.
  • za indicates the vehicle height of the host vehicle display object representing the host vehicle 31.
  • xa indicates the vehicle width of the host vehicle display object.
  • xw indicates the width of the travel area.
  • y2 is the distance from the left side surface of the host vehicle 31 to the tip of the left parking space 210, but half (1/2) the width of the travel area may be used instead, or the distance from the center of the host vehicle 31 to the tip of the left parking space 210 may be used. The same applies to y3.
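A minimal numeric sketch of the alternative definitions of y2 (and y3) given above; the function name and the example widths are illustrative assumptions, not values from the patent:

```python
def lateral_gap(travel_area_width: float,
                vehicle_width: float,
                mode: str = "side") -> float:
    """Distance from the host vehicle to the tip of a parking space.

    mode="side":   measured from the vehicle's side surface
                   (the free gap on that side of the travel area)
    mode="half":   half the travel-area width, as suggested above
    mode="center": measured from the vehicle's center line
    Assumes the vehicle travels along the center of the travel area.
    """
    if mode == "half":
        return travel_area_width / 2.0
    free = (travel_area_width - vehicle_width) / 2.0  # gap on one side
    if mode == "side":
        return free
    if mode == "center":
        return free + vehicle_width / 2.0
    raise ValueError(mode)

print(lateral_gap(6.0, 2.0, "side"))    # 2.0
print(lateral_gap(6.0, 2.0, "half"))    # 3.0
print(lateral_gap(6.0, 2.0, "center"))  # 3.0
```

For a vehicle centered in the travel area, "half the travel-area width" and "measured from the vehicle center" coincide, which is why the text offers them as interchangeable alternatives.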
  • since the road is basically a plane, the road does not have to be a stereoscopic image; however, by displaying at least a part of the parking space image as an overhead image, an obstacle such as a stone, for example, can be displayed stereoscopically.
  • FIG. 45 is a diagram illustrating an example of a parking space image when a recommended parking space exists. For example, as shown in FIG. 45, it is assumed that seven parked vehicles exist in the parking area and two parking spaces PL1 and PL2 exist.
  • FIG. 45 shows a display example when the second parking space PL2 is recommended.
  • FIG. 45(a) is a diagram for explaining the parking situation in the parking area, and corresponds to an overhead view seen from above. The display seen from the front is shown in FIG. 45(c), and the depth representation of the stereoscopic display is shown in FIG. 45(b).
  • the recommended parking space PL2 and the adjacent parked vehicles L6 and L8 are displayed as a stereoscopic image, and the others are displayed as a planar image.
  • the area Q2 including the parking space PL2 and the adjacent parked vehicles L6 and L8 is set as a stereoscopic image display area displayed as a stereoscopic image, and the areas Q1 and Q3 other than the stereoscopic image display area Q2 are set as planar image display areas displayed as planar images.
  • FIG. 46 is a diagram showing another example of a parking space image.
  • FIG. 46(a) is a diagram for explaining the state of the depth display, and FIG. 46(b) is a diagram illustrating a display example of a recommended parking space.
  • a display object 220 indicating a vehicle may be added and displayed in the recommended parking space PL2 instead of the planar image.
  • the display object 220 is a display object represented by a stereoscopic image, for example.
  • FIG. 47 is a diagram showing another example of a parking space image.
  • FIG. 47(a) is a diagram illustrating the depths of the recommended parking spaces SL and SR, and FIG. 47(b) is a diagram illustrating a display example of the recommended parking spaces.
  • the display mode may be changed between a parking area on the side where a recommended parking space exists and other parking areas.
  • in the parking space image 230 of the example shown in FIG. 47, the right parking space SR is recommended; with respect to the center line 233 of the display screen of the display device 4, the right parking area 232 is displayed as a stereoscopic image, and the left parking area 231 is displayed as a planar image.
  • a display object may be used instead of the planar image.
  • the composite image on the side where the recommended parking space exists, that is, the right composite image 262, may be displayed as a stereoscopic image, and the remaining composite image, that is, the left composite image 261, may be displayed as a planar image.
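The side-dependent display modes described above (stereoscopic for the side containing the recommended space, planar for the rest) can be sketched as a simple mapping; the region names and mode strings below are hypothetical, not identifiers from the patent:

```python
def region_display_modes(regions, recommended):
    """Assign a display mode to each screen region: the region containing
    the recommended parking space is rendered stereoscopically, every
    other region as a planar image."""
    return {name: ("stereoscopic" if name == recommended else "planar")
            for name in regions}

# Example: the right parking area holds the recommended space.
modes = region_display_modes(["left_area", "right_area"],
                             recommended="right_area")
print(modes)  # {'left_area': 'planar', 'right_area': 'stereoscopic'}
```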
  • FIG. 48 is a block diagram showing the configuration of the parking assistance apparatus 201 according to the third embodiment of the present invention.
  • the parking assistance device 201 includes a parking assistance display control device 10A, a peripheral photographing device 2, a peripheral detection device 3, a display device 4, an in-vehicle LAN 5, an operation input device 6, and a travel drive device 7.
  • the parking assistance display control device 10A includes an image composition unit 11, a parking space determination unit 12, a control unit 13, an image output unit 14, and an automatic parking control unit 16.
  • the parking support display control device 10A is mounted on a moving body such as a vehicle, for example.
  • the peripheral photographing device 2, the peripheral detection device 3, the display device 4, the in-vehicle LAN 5, the operation input device 6, and the travel drive device 7 are provided in the host vehicle on which the parking assist display control device 10A is mounted.
  • since the parking support apparatus 201 of the present embodiment includes the same configuration as the parking support apparatus 100 of the first embodiment and the parking support apparatus 200 of the second embodiment, the same configurations are denoted by the same reference numerals, and a common description is omitted.
  • the parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control apparatus 10A of the present embodiment.
  • the automatic parking control unit 16 controls the travel drive device 7 based on the parking space information and the parking space selection information, and executes automatic parking.
  • the traveling drive device 7 controls the steering wheel, brake, and accelerator of the vehicle to drive the vehicle.
  • FIG. 49 is a diagram showing an example of a parking space image 290 according to the third embodiment of the present invention.
  • the parking space image 290 includes a display object 291 for selecting a parking available space.
  • the display object 291 includes a character display area 292 for displaying characters such as “Please select a parking space”, a button 293 for selecting the left parking space SL, and a button for selecting the right parking space SR.
  • since the parking space image 290 includes a vehicle side image, it is possible to obtain the same effects as those of the first and second embodiments described above.
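How the selection buttons might feed the automatic parking control can be sketched as follows; the types, field names, and button identifiers are hypothetical illustrations, not the patent's interface:

```python
from dataclasses import dataclass

@dataclass
class ParkingSpace:
    side: str       # "left" or "right"
    depth_m: float  # illustrative value

def handle_selection(button_id: str, spaces: dict) -> ParkingSpace:
    """Map a pressed button (e.g. the SL/SR buttons in the parking space
    image) to the parking space whose selection information is handed to
    the automatic parking control."""
    if button_id not in spaces:
        raise KeyError(f"unknown button: {button_id}")
    return spaces[button_id]

spaces = {"SL": ParkingSpace("left", 5.0), "SR": ParkingSpace("right", 5.0)}
chosen = handle_selection("SR", spaces)
print(chosen.side)  # right
```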
  • the parking support display control device of the present embodiment described above can be applied not only to a navigation device that can be mounted on a vehicle, but also to a parking support system constructed by appropriately combining communication terminal devices, server devices, and the like.
  • the communication terminal device is, for example, a PND (Portable Navigation Device) and a portable communication device having a function of communicating with a server device.
  • the mobile communication device is, for example, a mobile phone, a smartphone, and a tablet terminal device.
  • each component of the parking assistance display control device may be arranged in a distributed manner among the devices that construct the system, or may be concentrated in any one device.
  • the image composition unit, the parking space determination unit, and the control unit provided in the parking assistance display control device may be arranged in the server device or in a communication terminal device such as a portable communication device.
  • the parking assistance apparatus in the case where the image composition unit, the parking space determination unit, and the control unit of the parking assistance display control device described above are arranged in the server device has the configuration shown in the following fourth embodiment. The parking assistance apparatus in the case where they are arranged in a portable communication device has the configuration shown in the following fifth embodiment.
  • FIG. 50 is a block diagram showing a configuration of a parking assistance apparatus 500 according to the fourth embodiment of the present invention.
  • the parking assistance apparatus 500 according to the present embodiment includes an information providing apparatus 300 and a server apparatus 400. Since the parking support device 500 of the present embodiment includes the same configuration as the parking support devices 100 and 200 of the first and second embodiments described above, the same reference numerals are used for the same configuration. In addition, a common description is omitted.
  • the information providing apparatus 300 is mounted on a vehicle.
  • the information providing apparatus 300 includes an information providing apparatus main body 310, a peripheral photographing apparatus 2, a peripheral detection apparatus 3, a display apparatus 4, an in-vehicle LAN 5, and an operation input apparatus 6.
  • the information providing apparatus main body 310 includes an image output unit 14, an in-vehicle LAN interface 15, an in-vehicle side control unit 311, and an in-vehicle side communication unit 312.
  • the in-vehicle side control unit 311 includes, for example, a CPU (Central Processing Unit) and a memory such as a writable RAM.
  • the memory stores a control program.
  • the CPU controls the image output unit 14, the in-vehicle LAN interface 15, and the in-vehicle side communication unit 312 in an integrated manner by executing a control program stored in the memory.
  • the server device 400 includes an image composition unit 11, a parking space determination unit 12, a server side communication unit 401, and a server side control unit 402.
  • the server side communication unit 401 communicates with the information providing apparatus 300.
  • the server side communication unit 401 is configured to be able to communicate with the information providing apparatus 300 via a communication network such as the Internet, for example.
  • the server-side control unit 402 includes, for example, a CPU and a memory such as a writable RAM.
  • the memory stores a control program.
  • the CPU executes the control program stored in the memory, thereby comprehensively controlling the image composition unit 11, the parking space determination unit 12, and the server side communication unit 401.
  • the image composition unit 11, the parking space determination unit 12, and the server-side control unit 402 are arranged in the server device 400. Even in this arrangement, the same effects as those of the first to third embodiments can be obtained.
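A toy sketch of the fourth embodiment's split, in which the vehicle side uploads captured images and detection results and the server side performs composition and parkable space determination; all message types, field names, and the 2.5 m gap threshold are invented purely for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UploadRequest:
    side_images: List[bytes]  # frames from the peripheral photographing device
    detections: List[dict]    # results from the periphery detection device

@dataclass
class ParkingSpaceImageResponse:
    image: bytes
    has_parkable_space: bool

def server_handle(req: UploadRequest) -> ParkingSpaceImageResponse:
    """Stand-in for the server-side pipeline (image composition unit plus
    parkable space determination unit); only the decision is simulated."""
    parkable = any(d.get("gap_m", 0.0) >= 2.5 for d in req.detections)
    return ParkingSpaceImageResponse(image=b"", has_parkable_space=parkable)

resp = server_handle(UploadRequest(side_images=[b""],
                                   detections=[{"gap_m": 3.0}]))
print(resp.has_parkable_space)  # True
```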
  • FIG. 51 is a block diagram showing a configuration of a parking assistance apparatus 510 according to the fifth embodiment of the present invention.
  • the parking assistance apparatus 510 according to the present embodiment includes an information providing apparatus 300 and a portable communication apparatus 410. Since the parking support device 510 of the present embodiment includes the same configuration as the parking support device 500 of the fourth embodiment described above, the same configurations are denoted by the same reference numerals, and a common description is omitted.
  • the mobile communication device 410 is realized by, for example, a mobile phone, a smartphone, or a tablet terminal device.
  • the mobile communication device 410 includes an image composition unit 11, a parking space determination unit 12, a mobile side communication unit 411, and a mobile side control unit 412.
  • the mobile communication unit 411 communicates with the information providing apparatus 300.
  • the mobile communication unit 411 may be configured to communicate with the information providing apparatus 300 via a communication network such as the Internet, by short-range wireless communication such as Bluetooth (registered trademark) or wireless LAN, or by a wired line such as a USB (Universal Serial Bus) cable or a LAN cable.
  • the portable side control unit 412 is constituted by, for example, a CPU and a memory such as a writable RAM.
  • the memory stores a control program.
  • the CPU executes the control program stored in the memory, thereby comprehensively controlling the image composition unit 11, the parking space determination unit 12, and the mobile communication unit 411.
  • the image composition unit 11, the parking space determination unit 12, and the portable side control unit 412 are arranged in the portable communication device 410. Even in this arrangement, the same effects as those of the first to third embodiments can be obtained.
  • 1, 10, 10A parking support display control device, 2 peripheral photographing device, 3 peripheral detection device, 4 display device, 5 in-vehicle LAN, 6 operation input device, 7 travel drive device, 11 image compositing unit, 12 parking space judgment unit, 13 control unit, 14 image output unit, 15 in-vehicle LAN interface, 16 automatic parking control unit, 21 left camera, 22 right camera, 100, 200, 201, 500, 510 parking assist device, 300 information providing device, 310 information providing device body, 311 in-vehicle side control unit, 312 in-vehicle side communication unit, 400 server device, 401 server side communication unit, 402 server side control unit, 410 mobile communication device, 411 mobile side communication unit, 412 mobile side control unit.

Abstract

Image information representing vehicle side images (P1, P2, ..., Pn) of the sides of a vehicle (31) photographed sequentially along the advancing direction (32) of the vehicle (31) by peripheral photographing devices (21, 22) is acquired. A composite image, in which the vehicle side images (P1, P2, ..., Pn) are spliced together along the advancing direction (32) of the vehicle (31), is generated on the basis of the acquired image information. On the basis of the state of the periphery of the vehicle (31) detected by a periphery detection device, it is determined whether there is a parkable space. If it is determined that there is a parkable space, a parking space image, in which a parking space display object indicating the position of the parkable space is superimposed on the composite image, is generated and outputted to a display device. At least a portion of the parking space image is a stereoscopic image viewable in three dimensions.
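The pipeline in the abstract (splice side images along the advancing direction, then superimpose a parking space display object) can be sketched minimally as follows; the grayscale arrays and the bright-band marker are illustrative stand-ins, not the patented image processing:

```python
import numpy as np

def stitch(frames):
    """Join side images captured in sequence along the advancing direction."""
    return np.concatenate(frames, axis=1)  # side by side, left to right

def overlay_marker(composite, x0, x1, value=255):
    """Superimpose a parking space display object (here just a bright band)
    over the columns covering the detected parkable space."""
    out = composite.copy()
    out[:, x0:x1] = value
    return out

# Four sequentially captured side frames (H x W grayscale placeholders).
frames = [np.zeros((100, 80), dtype=np.uint8) for _ in range(4)]
composite = stitch(frames)
print(composite.shape)      # (100, 320)
marked = overlay_marker(composite, 160, 240)
print(int(marked[0, 200]))  # 255
```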

Description

Parking assistance display control apparatus and parking assistance display control method
The present invention relates to a parking assistance display control device and a parking assistance display control method.
In order to assist parking of a vehicle, technologies for detecting a parkable space and presenting it to the driver have been proposed. For example, in the technologies disclosed in Patent Documents 1 and 2, a bird's-eye view image as seen from above the vehicle is synthesized from images captured by a plurality of cameras mounted on the vehicle and presented to the driver.
JP 2006-129021 A
JP 2013-133098 A
When the vehicle is moving at a relatively low speed in search of a parkable space, the driver looks at the surroundings from the position of the driver's seat. In the technologies disclosed in Patent Documents 1 and 2 described above, the parkable space is presented to the driver as a bird's-eye view image, which differs from the scene the driver actually sees. Therefore, there is a problem in that the driver may not be able to immediately understand the presented parkable space.
In view of the above, a technology that can present a parkable space to the driver more intuitively is required.
An object of the present invention is to provide a parking assistance display control device and a parking assistance display control method capable of presenting a parkable space in a display form that the driver can intuitively understand.
The parking assistance display control device of the present invention includes: a parkable space determination unit that determines, based on the situation around the vehicle detected by a periphery detection device, whether there is a parkable space around the vehicle; an image composition unit that acquires image information representing vehicle side images of the sides of the vehicle captured sequentially along the traveling direction of the vehicle by a peripheral photographing device, and generates, based on the acquired image information, a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle; a control unit that, when the parkable space determination unit determines that there is a parkable space, generates a parking space image by superimposing a parking space display object indicating the position of the parkable space on the composite image generated by the image composition unit; and an image output unit that outputs the parking space image generated by the control unit to a display device, wherein at least a part of the parking space image is a stereoscopic image that can be viewed stereoscopically.
The parking assistance display control method of the present invention includes: acquiring image information representing vehicle side images of the sides of a vehicle captured sequentially along the traveling direction of the vehicle by a peripheral photographing device; generating, based on the acquired image information, a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle; determining, based on the situation around the vehicle detected by a periphery detection device, whether there is a parkable space around the vehicle; when it is determined that there is a parkable space, generating a parking space image, at least a part of which is a stereoscopic image that can be viewed stereoscopically, by superimposing a parking space display object indicating the position of the parkable space on the composite image; and outputting the generated parking space image to a display device capable of stereoscopically displaying the stereoscopic image of the parking space image.
According to the parking assistance display control device of the present invention, whether there is a parkable space around the vehicle is determined by the parkable space determination unit based on the situation around the vehicle detected by the periphery detection device. Image information representing vehicle side images captured sequentially along the traveling direction of the vehicle by the peripheral photographing device is acquired by the image composition unit, and a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle is generated based on the acquired image information. When the parkable space determination unit determines that there is a parkable space, the control unit superimposes a parking space display object on the composite image generated by the image composition unit to generate a parking space image, at least a part of which is a stereoscopic image. The parking space image generated by the control unit is output to the display device by the image output unit. This makes it possible to present a parkable space in a display form that the driver can intuitively understand.
According to the parking assistance display control method of the present invention, image information representing vehicle side images captured sequentially along the traveling direction of the vehicle by the peripheral photographing device is acquired, and a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle is generated based on the acquired image information. Whether there is a parkable space around the vehicle is determined based on the situation around the vehicle detected by the periphery detection device. When it is determined that there is a parkable space, a parking space display object is superimposed on the composite image to generate a parking space image, at least a part of which is a stereoscopic image. The generated parking space image is output to the display device. This makes it possible to present a parkable space in a display form that the driver can intuitively understand.
The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
A block diagram showing the configuration of the parking assistance apparatus 100 according to the first embodiment of the present invention.
A block diagram showing the configuration of the parking assistance apparatus 200 according to the second embodiment of the present invention.
A block diagram showing the hardware configuration of the parking assistance display control device 10 according to the second embodiment of the present invention.
A diagram schematically showing how a vehicle side image is captured by the monocular camera 21 of the peripheral photographing device 2.
A diagram schematically showing how a left side image is captured.
A diagram showing an example of the composite image 40.
A diagram showing an example of a vehicle unit image.
A diagram showing an example of the left-eye drawing plane 41.
A diagram showing an example of the right-eye drawing plane 42.
A diagram for explaining a method of pasting divided images onto the drawing planes.
A diagram showing an example of the depth of a stereoscopic image.
A diagram showing an example of a parking space image including a stereoscopic image.
A diagram showing the relationship between the depth of stereoscopic images and the position of each stereoscopic image in the parking space image.
A flowchart showing a processing procedure relating to the display control processing in the parking assistance display control device 10 according to the second embodiment of the present invention.
A flowchart showing a processing procedure relating to the display control processing in the parking assistance display control device 10 according to the second embodiment of the present invention.
A diagram showing the vehicle unit image a1 cut out in the second embodiment of the present invention shown in FIG. 7.
A diagram showing another example of a vehicle unit image.
A diagram showing another example of the position of the background of a parked vehicle.
A diagram showing a simplified configuration of the binocular camera device 60.
A diagram schematically showing how a left side image is captured using the binocular camera device 60.
A diagram showing an example of the left-eye drawing plane 71.
A diagram showing an example of the right-eye drawing plane 72.
A diagram for explaining a method of pasting images onto the drawing planes.
A diagram showing another example of the depth of a stereoscopic image.
A diagram showing another example of a parking space image including a stereoscopic image.
A diagram schematically showing a state in which vehicles are mapped in a three-dimensional virtual space.
A diagram for explaining a method of detecting a parkable space.
A diagram showing another example of a parking space image.
A diagram schematically showing the viewpoint of a parking space image.
A diagram schematically showing the viewpoint of a parking space image.
A diagram schematically showing the viewpoint of a parking space image.
A diagram schematically showing the viewpoint of a parking space image.
A diagram schematically showing another example of the viewpoint of a parking space image.
A diagram schematically showing another example of the viewpoint of a parking space image.
A diagram schematically showing another example of the viewpoint of a parking space image.
A diagram schematically showing another example of the viewpoint of a parking space image.
A diagram showing another example of a parking space image.
A diagram showing another example of a parking space image.
A diagram showing an example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A diagram showing another example of a parking space image when there are parkable space candidates on the left and right of the vehicle.
A block diagram showing the configuration of the parking assistance apparatus 201 according to the third embodiment of the present invention.
A diagram showing an example of the parking space image 290 according to the third embodiment of the present invention.
A block diagram showing the configuration of the parking assistance apparatus 500 according to the fourth embodiment of the present invention.
A block diagram showing the configuration of the parking assistance apparatus 510 according to the fifth embodiment of the present invention.
<First Embodiment>
FIG. 1 is a block diagram showing the configuration of a parking assistance device 100 according to the first embodiment of the present invention. The parking assistance device 100 comprises a parking assistance display control device 1, a peripheral photographing device 2, a periphery detection device 3, and a display device 4. The parking assistance display control device 1 includes an image composition unit 11, a parking space determination unit 12, a control unit 13, and an image output unit 14.
The parking assistance display control device 1 is mounted on and used in a vehicle. The peripheral photographing device 2, the periphery detection device 3, and the display device 4 are provided in the vehicle on which the parking assistance display control device 1 is mounted (hereinafter sometimes referred to as the "host vehicle"). A parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control device 1 of the present embodiment.
The peripheral photographing device 2 is constituted by cameras. In the present embodiment, peripheral photographing devices 2 are provided on the right side, the left side, and the rear of the host vehicle with respect to its traveling direction. The peripheral photographing device 2 captures images of the surroundings of the host vehicle, specifically images to the right, to the left, and to the rear of the host vehicle in its traveling direction. In the following description, an image to the right of the host vehicle in its traveling direction may be referred to as a "right side image", an image to the left as a "left side image", and an image to the rear as a "rear image".
The peripheral photographing device 2 sequentially captures images of the side of the host vehicle (hereinafter sometimes referred to as "vehicle side images") along the traveling direction of the host vehicle. The peripheral photographing device 2 is connected to the image composition unit 11 of the parking assistance display control device 1, and image information representing the images captured by the peripheral photographing device 2 is supplied to the image composition unit 11.
The image composition unit 11 acquires, from the peripheral photographing device 2, image information representing the vehicle side images and the like. Based on the acquired image information, the image composition unit 11 generates a composite image in which the vehicle side images are stitched together along the traveling direction of the vehicle.
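As a minimal illustration of the stitching step, the sketch below concatenates successively captured side frames along the travel axis. The helper name and the use of plain array concatenation are assumptions for illustration; a real implementation would also compensate for vehicle speed, frame overlap, and lens distortion.

```python
import numpy as np

def stitch_side_images(frames):
    """Stitch side-camera frames captured in travel order into one
    panorama-like composite (hypothetical helper; frames are assumed to
    share the same height and to be captured at equal intervals)."""
    # Each frame: H x W x 3 uint8 array; concatenation follows the
    # vehicle's traveling direction.
    return np.concatenate(frames, axis=1)

# Three dummy 2x3-pixel frames captured at successive positions
frames = [np.full((2, 3, 3), v, dtype=np.uint8) for v in (10, 20, 30)]
composite = stitch_side_images(frames)
print(composite.shape)  # (2, 9, 3)
```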
The periphery detection device 3 detects the situation around the host vehicle. Specifically, it detects objects present around the host vehicle and the relative positions of those objects with respect to the host vehicle. Objects around the host vehicle are, for example, other vehicles and ground features. The periphery detection device 3 is constituted by at least one of, for example, an ultrasonic sensor, an image processing sensor, a millimeter-wave radar, and a laser radar. The image processing sensor may be a camera.
The periphery detection device 3 is connected to the parking space determination unit 12 of the parking assistance display control device 1. The periphery detection device 3 generates detection information representing the detected situation around the host vehicle, for example information indicating the detected objects and their relative positions with respect to the host vehicle, and supplies the generated detection information to the parking space determination unit 12.
Based on the situation around the host vehicle detected by the periphery detection device 3, specifically based on the detection information supplied from the periphery detection device 3, the parking space determination unit 12 determines whether there is a parking available space, that is, a space around the host vehicle in which the vehicle can be parked. In the present embodiment, the parking space determination unit 12 determines, as the parking available space, whether there is a section in which the vehicle can be parked.
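The text leaves the concrete determination algorithm open; a common approach, sketched below under that assumption, is to look for gaps between detected obstacles along the travel direction that are long enough for the vehicle. The function name, the interval representation, and the threshold test are all hypothetical.

```python
def find_parking_gaps(obstacle_spans, required_length, search_range):
    """Sketch of a gap-based parking-space check (assumed logic).
    obstacle_spans are (start, end) intervals along the travel
    direction occupied by detected objects such as parked vehicles."""
    spans = sorted(obstacle_spans)
    gaps, cursor = [], search_range[0]
    for start, end in spans:
        if start - cursor >= required_length:
            gaps.append((cursor, start))   # gap before this obstacle
        cursor = max(cursor, end)
    if search_range[1] - cursor >= required_length:
        gaps.append((cursor, search_range[1]))  # gap after the last one
    return gaps

# Two parked vehicles leave a 6 m gap between them and 5 m at the end
print(find_parking_gaps([(0.0, 4.5), (10.5, 15.0)], 5.0, (0.0, 20.0)))
```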
When the parking space determination unit 12 determines that there is a parking available space, the control unit 13 generates a parking space display object indicating the position of that space. The control unit 13 superimposes the generated parking space display object on the composite image generated by the image composition unit 11 to generate a parking space image. In the present embodiment, at least part of the parking space image is a stereoscopic image that can be viewed three-dimensionally.
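One simple way to superimpose such a display object, sketched here purely as an illustration, is to blend a marker color into the columns of the composite image that correspond to the detected space. The helper name and the 50/50 blend are assumptions; the text does not specify the rendering of the display object.

```python
import numpy as np

def overlay_parking_marker(composite, x0, x1, color=(0, 255, 0)):
    """Hypothetical sketch: highlight the columns [x0, x1) of the
    composite image that correspond to the parking available space by
    blending them 50/50 with a marker color."""
    out = composite.copy()
    band = out[:, x0:x1].astype(np.uint16)       # widen to avoid overflow
    marker = np.array(color, dtype=np.uint16)
    out[:, x0:x1] = ((band + marker) // 2).astype(np.uint8)
    return out

composite = np.zeros((2, 10, 3), dtype=np.uint8)
marked = overlay_parking_marker(composite, 2, 5)
print(int(marked[0, 2, 1]), int(marked[0, 0, 1]))  # 127 0
```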
The image output unit 14 outputs the parking space image generated by the control unit 13, specifically image information representing the parking space image, to the display device 4.
The display device 4 is constituted by, for example, a liquid crystal display or a head-up display. The display device 4 displays, on its display screen, the image represented by the image information supplied from the image output unit 14. In this way, various information, such as a parking available space, can be presented to the user of the parking assistance display control device 1, for example the driver of the host vehicle.
In the present embodiment, the display device 4 is configured to be able to display the stereoscopic portion of the parking space image three-dimensionally. The parking space image can thus be presented to the user of the parking assistance display control device 1, for example the driver of the host vehicle, with the stereoscopic image visible in three dimensions.
As described above, according to the present embodiment, whether there is a parking available space around the host vehicle is determined by the parking space determination unit 12 based on the situation around the host vehicle detected by the periphery detection device 3. Image information representing the vehicle side images sequentially captured by the peripheral photographing device 2 along the traveling direction of the host vehicle is acquired by the image composition unit 11, and based on the acquired image information, a composite image is generated in which the vehicle side images are stitched together along the traveling direction of the host vehicle.
When the parking space determination unit 12 determines that there is a parking available space, the control unit 13 superimposes a parking space display object on the composite image generated by the image composition unit 11 to generate a parking space image. The parking space image generated by the control unit 13 is output to the display device 4 by the image output unit 14. At least part of the parking space image is a stereoscopic image that can be viewed three-dimensionally. A parking space can thereby be presented in a display form that the driver can understand intuitively.
In the present embodiment described above, the peripheral photographing device 2 and the periphery detection device 3 are provided separately, but the peripheral photographing device 2 may also serve as the periphery detection device 3.
<Second Embodiment>
FIG. 2 is a block diagram showing the configuration of a parking assistance device 200 according to the second embodiment of the present invention. The parking assistance device 200 comprises a parking assistance display control device 10, a peripheral photographing device 2, a periphery detection device 3, a display device 4, an in-vehicle LAN (Local Area Network) 5, and an operation input device 6. The parking assistance display control device 10 includes an image composition unit 11, a parking space determination unit 12, a control unit 13, an image output unit 14, and an in-vehicle LAN interface 15.
The parking assistance display control device 10 is mounted on and used in a vehicle. The peripheral photographing device 2, the periphery detection device 3, the display device 4, the in-vehicle LAN 5, and the operation input device 6 are provided in the host vehicle on which the parking assistance display control device 10 is mounted.
Since the parking assistance device 200 of the present embodiment includes the same components as the parking assistance device 100 of the first embodiment shown in FIG. 1 described above, the same components are given the same reference numerals and their common description is omitted. A parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control device 10 of the present embodiment.
The in-vehicle LAN 5 is constituted by, for example, a CAN (Controller Area Network). The in-vehicle LAN 5 is connected to the in-vehicle LAN interface 15 of the parking assistance display control device 10 and supplies various vehicle information about the host vehicle to the in-vehicle LAN interface 15.
The vehicle information includes, for example, sensor information detected by various sensors provided in the host vehicle, information about vehicle equipment such as indicator lamps and warning lamps provided in the host vehicle, information indicating the state of the vehicle equipment, vehicle speed information indicating the traveling speed of the vehicle, information indicating the steering angle, information indicating the state of the brakes, and information indicating the state of the shift lever. The indicator lamp is, for example, a turn signal, and the information about the vehicle equipment is, for example, turn-signal information.
The in-vehicle LAN interface 15 supplies the vehicle information received from the in-vehicle LAN 5 to the image composition unit 11, the parking space determination unit 12, and the control unit 13. From the vehicle information received from the in-vehicle LAN 5, the control unit 13 obtains vehicle movement trajectory information representing the trajectory of the vehicle's movement. The vehicle movement trajectory information obtained by the control unit 13 is used by each unit of the parking assistance display control device 10.
The operation input device 6 includes an operation input unit (not shown) operated by the user, which includes, for example, operation switches and operation buttons. The operation input device 6 is used when the user inputs various information such as numerical information, character information, and instruction information for the parking assistance display control device 10. When the user operates the operation input unit, the operation input device 6 generates operation information corresponding to the user's input operation and supplies it to the control unit 13. When the user is not operating anything, the operation input device 6 generates operation information indicating that no operation is being performed.
Instead of operation switches and operation buttons, the operation input unit of the operation input device 6 may be constituted by a touch panel or by a voice input device that accepts voice operation input.
When the operation input unit of the operation input device 6 is constituted by a touch panel, the touch panel is installed on the display screen of the display device 4 and detects the user's touch operation and touch position. The touch panel generates operation information corresponding to the detected touch operation and touch position and supplies it to the control unit 13.
When the operation input unit of the operation input device 6 is constituted by a voice input device, the voice input device recognizes the input voice, generates operation information corresponding to the recognized voice, and supplies it to the control unit 13.
The control unit 13 functions as an image superimposing unit. When functioning as an image superimposing unit, the control unit 13 has the same function as the control unit 13 in the first embodiment described above. Specifically, when the parking space determination unit 12 determines that there is a parking available space, the control unit 13 generates a parking space display object indicating the position of the parking available space and superimposes the generated parking space display object on the composite image generated by the image composition unit 11 to generate a parking space image.
The control unit 13 also controls the entire parking assistance display control device 10 based on the operation information supplied from the operation input device 6.
FIG. 3 is a block diagram showing the hardware configuration of the parking assistance display control device 10 according to the second embodiment of the present invention. The image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 in the parking assistance display control device 10 are realized by a processing circuit. That is, the parking assistance display control device 10 comprises a processing circuit with which: the parking space determination unit 12 determines, based on the situation around the vehicle detected by the periphery detection device 3, whether there is a parking available space around the vehicle; the image composition unit 11 acquires image information representing the vehicle side images sequentially captured by the peripheral photographing device 2 along the traveling direction of the vehicle and, based on the acquired image information, generates a composite image in which the vehicle side images are stitched together along the traveling direction of the vehicle; the control unit 13, when the parking space determination unit 12 determines that there is a parking available space, superimposes a parking space display object indicating the position of the parking available space on the composite image generated by the image composition unit 11 to generate a parking space image; and the image output unit 14 outputs the parking space image generated by the control unit 13 to the display device.
Dedicated hardware may be applied to the processing circuit, or a processor 101 that executes programs stored in a memory 102 (also referred to as a central processing unit, a graphics processing unit, a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor)) may be applied.
When the processing circuit is dedicated hardware, the processing circuit corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination of these. The functions of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 may each be realized by separate processing circuits, or the functions of the units may be realized collectively by a single processing circuit.
When the processing circuit is configured using a processor, the functions of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 in the parking assistance display control device 10 are realized by software, firmware, or a combination of software and firmware. The software and firmware are written as programs and stored in the memory 102.
The processing circuit realizes the function of each unit by reading and executing the programs stored in the memory 102. That is, the parking assistance display control device 10 comprises the memory 102 for storing programs which, when executed by the processing circuit, result in the execution of: a step of acquiring image information representing the vehicle side images sequentially captured by the peripheral photographing device 2 along the traveling direction of the vehicle and generating, based on the acquired image information, a composite image in which the vehicle side images are stitched together along the traveling direction of the vehicle; a step of determining, based on the situation around the vehicle detected by the periphery detection device 3, whether there is a parking available space around the vehicle; a step of, when it is determined that there is a parking available space, superimposing a parking space display object indicating the position of the parking available space on the composite image to generate a parking space image; and a step of outputting the generated parking space image to the display device 4.
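The four steps above can be sketched as a single pipeline. The sketch below is purely illustrative: strings stand in for images, the gap list stands in for the determination result, and all names are assumptions rather than anything fixed by the text.

```python
def parking_assist_pipeline(frames, gaps, display):
    """Minimal sketch of the four steps the stored program carries out
    (hypothetical data types: frames are strings standing in for side
    images, gaps is the list of detected parking gaps)."""
    composite = "|".join(frames)           # step 1: stitch side images
    if gaps:                               # step 2: parking space found?
        composite += f" [P at {gaps[0]}]"  # step 3: superimpose marker
    display(composite)                     # step 4: output to display device

shown = []
parking_assist_pipeline(["f1", "f2"], [(4.5, 10.5)], shown.append)
print(shown[0])  # f1|f2 [P at (4.5, 10.5)]
```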
It can also be said that these programs cause a computer to execute the procedures and methods of the processing performed by the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14.
Here, the memory 102 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
As described above, in the present embodiment, the functions of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 are each realized by either hardware or software or the like. However, the configuration is not limited to these; part of the image composition unit 11, the parking space determination unit 12, the control unit 13, and the image output unit 14 may be realized by dedicated hardware and another part by software or the like. For example, the function of the control unit 13 can be realized by a processing circuit as dedicated hardware, while the other functions can be realized by a processing circuit as a processor reading and executing programs stored in the memory 102.
As described above, the processing circuit can realize each of the aforementioned functions by hardware, software, or the like, or a combination of these.
FIGS. 4 to 12 are diagrams for explaining a method of forming a parking space image including a stereoscopic image. In the present embodiment, the parking assistance display control device 10 first composes a two-dimensional image, then divides the composed two-dimensional image into individual parked-vehicle images, and generates the parking space image so that each of these images becomes a stereoscopic image with depth corresponding to the position of the corresponding parked vehicle. Specifically, the parking space image is generated as follows.
FIG. 4 is a diagram schematically showing how a vehicle side image is captured by the monocular camera 21 of the peripheral photographing device 2. The peripheral photographing device 2 includes the monocular camera 21. FIG. 4 shows the case where the monocular camera 21 is provided on the left side surface of the vehicle with respect to its traveling direction (hereinafter sometimes referred to as the "left side surface"). In the following description, the monocular camera 21 provided on the left side surface of the vehicle may be referred to as the "left monocular camera 21".
In the present embodiment, the image composition unit 11 creates a stereoscopic image using two images Pic(t) and Pic(t+Δt) captured by the monocular camera 21 of the peripheral photographing device 2 at two positions separated by a predetermined distance D1.
Here, it is assumed that the subject being photographed is not moving, and that the distance D1 between the two photographing positions is relatively short, such that the difference Δt between the photographing times at the two positions is within about one second.
For example, when the monocular camera 21 is provided on the left side surface of the vehicle, the monocular camera 21 captures a first image Pic(t) at a first position at time t. At time t+Δt, Δt after time t, the monocular camera 21 captures a second image Pic(t+Δt) at a second position separated from the first position by the distance D1.
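In such a two-shot scheme, a pair of frames whose capture positions are about D1 apart has to be selected from the monocular stream, for example by using the vehicle's travelled distance from the odometry on the in-vehicle LAN. The sketch below shows one assumed way to do this; the frame-selection logic is not specified in the text.

```python
def pick_stereo_pair(timestamps, positions, baseline):
    """Sketch of choosing two frames from a monocular side camera whose
    capture positions are about `baseline` (D1) apart (hypothetical
    helper).  positions are the vehicle's travelled distances at each
    frame's capture time."""
    for i, xi in enumerate(positions):
        for j in range(i + 1, len(positions)):
            if positions[j] - xi >= baseline:
                return timestamps[i], timestamps[j]  # t and t + Δt
    return None  # vehicle has not yet moved far enough

# Vehicle positions (m) sampled every 0.2 s; baseline D1 = 0.5 m
ts = [0.0, 0.2, 0.4, 0.6, 0.8]
xs = [0.0, 0.3, 0.6, 0.9, 1.2]
print(pick_stereo_pair(ts, xs, 0.5))  # (0.0, 0.4)
```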
FIG. 5 is a diagram schematically showing how left side images are captured. As shown in FIG. 5, the host vehicle 31 travels along the traveling direction 32 from the position Ts of the host vehicle 31 at the time the search for a parking available space is started (hereinafter sometimes referred to as the "parking space search start position") toward the position Te of the host vehicle 31 at the time the parking available space is presented (hereinafter sometimes referred to as the "parking available space presentation position"). While the host vehicle 31 travels, the peripheral photographing device 2 shown in FIG. 2 intermittently or continuously captures vehicle side images, that is, images of the side of the host vehicle 31, for example left side images P1, P2, ..., Pn (n is a natural number), with the monocular camera 21 shown in FIG. 4. The monocular camera 21 is provided, for example, on the left side surface of the host vehicle 31. The peripheral photographing device 2 may also include a monocular camera 22 provided on the right side surface of the host vehicle 31 with respect to its traveling direction 32.
From the vehicle side images captured by the monocular cameras 21 and 22, the peripheral photographing device 2 obtains and records the distances Yln between the host vehicle 31 and objects to the side of the host vehicle 31, for example the parked vehicles L1, L3, and L4. In FIG. 5(a), the horizontal axis indicates the distance X from the host vehicle 31 and the vertical axis indicates the depth Yl to the left of the host vehicle 31. The direction of the horizontal axis corresponds, for example, to the horizontal direction X of the display screen of the display device 4. The distance X from the host vehicle 31 is, specifically, the distance from the parking available space presentation position Te along the traveling direction 32 of the host vehicle 31.
For example, when there are three parked vehicles to the left of the host vehicle 31, namely a first parked vehicle L1, a third parked vehicle L3, and a fourth parked vehicle L4, the peripheral photographing device 2 obtains and records the distance Yl1 between the host vehicle 31 and the first parked vehicle L1, the distance Yl3 between the host vehicle 31 and the third parked vehicle L3, and the distance Yl4 between the host vehicle 31 and the fourth parked vehicle L4.
The peripheral photographing device 2 generates image information representing the captured vehicle side images, for example the left side images P1, P2, ..., Pn. Together with the generated image information, the peripheral photographing device 2 supplies the recorded distances between the host vehicle 31 and the objects to its side, for example the distances Yl1, Yl3, and Yl4 between the host vehicle 31 and the parked vehicles L1, L3, and L4, to the image composition unit 11 of the parking assistance display control device 10.
The distances between the host vehicle 31 and the objects to its side may instead be obtained by the periphery detection device 3. In this case, the periphery detection device 3 obtains the distance between the host vehicle 31 and an object to its side with, for example, an ultrasonic sensor. Specifically, the distance between the host vehicle 31 and an object to its side is the shortest distance between the object and the travel trajectory along which the host vehicle 31 has traveled during the parking space search, that is, while looking for a parking available space. For example, when the objects to the side of the host vehicle 31 are the parked vehicles L1, L3, and L4, the distances Yl1, Yl3, and Yl4 between the host vehicle 31 and the parked vehicles L1, L3, and L4 are the distances between the travel trajectory of the host vehicle 31 during the parking space search and the end of each parked vehicle L1, L3, L4 nearest the host vehicle 31.
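The shortest distance between the travel trajectory and an object can be computed as a standard point-to-polyline distance. The sketch below assumes the trajectory is represented as a polyline of (x, y) points and the object by the point of its near end; this representation is an illustrative assumption.

```python
import math

def shortest_distance(trajectory, point):
    """Minimum distance from a point (e.g. the near end of a parked
    vehicle) to a travel trajectory given as a polyline of (x, y)
    points (hypothetical representation of the recorded trajectory)."""
    best = float("inf")
    px, py = point
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        dx, dy = x1 - x0, y1 - y0
        seg2 = dx * dx + dy * dy
        # Parameter of the closest point on the segment, clamped to [0, 1]
        t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((px - x0) * dx + (py - y0) * dy) / seg2))
        cx, cy = x0 + t * dx, y0 + t * dy
        best = min(best, math.hypot(px - cx, py - cy))
    return best

# Straight trajectory along y = 0; parked-vehicle end at (5, 2)
print(shortest_distance([(0.0, 0.0), (10.0, 0.0)], (5.0, 2.0)))  # 2.0
```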
 FIG. 6 is a diagram illustrating an example of the composite image 40. Based on the image information supplied from the peripheral photographing device 2, the image composition unit 11 generates the composite image 40 by joining the vehicle side images together along the traveling direction 32 of the host vehicle 31, and supplies the generated composite image 40 to the control unit 13. In FIG. 6, the horizontal axis indicates the distance X from the host vehicle 31, and the vertical axis indicates the height Z, specifically the height above the road surface on which the host vehicle 31 travels.
 FIG. 7 is a diagram illustrating an example of vehicle unit images. From the composite image 40 supplied by the image composition unit 11, the control unit 13 cuts out, by image processing, an image for each parked vehicle (hereinafter sometimes referred to as a "vehicle unit image"). In the example shown in FIG. 7, the control unit 13 divides the composite image 40 into seven images a1, a3, a4, b1 to b4 (hereinafter sometimes referred to as "divided images") and cuts out the three vehicle unit images a1, a3, a4, which contain the parked vehicles L1, L3, L4, respectively.
 The control unit 13 may also estimate the position of each parked vehicle L1, L3, L4 using the distance measurement results of the periphery detection device 3 and cut out the vehicle unit images a1, a3, a4 accordingly.
 FIG. 8 is a diagram illustrating an example of the left-eye drawing plane 41, and FIG. 9 is a diagram illustrating an example of the right-eye drawing plane 42. As drawing planes for creating a stereoscopic image, the control unit 13 provides the left-eye drawing plane 41 shown in FIG. 8 and the right-eye drawing plane 42 shown in FIG. 9.
 FIG. 10 is a diagram for explaining how the divided images are pasted onto the drawing planes. FIG. 10(a) shows an example of the divided images a1, a3, a4, b1 to b4, which include the vehicle unit images a1, a3, a4. FIG. 10(b) shows an example of a state in which the divided images a1, a3, a4, b1 to b4 have been pasted onto the left-eye drawing plane 41, and FIG. 10(c) shows an example of a state in which they have been pasted onto the right-eye drawing plane 42.
 The control unit 13 generates the parking space display object 51, which represents the position of the parkable space SL shown in FIG. 6, and superimposes the generated parking space display object 51 on the corresponding divided image b2 of the composite image 40. The parking space display object 51 is, for example, the mark shown in FIG. 10.
 The control unit 13 calculates the binocular parallax so that each parked vehicle L1, L3, L4 is displayed at the Y-axis position, that is, the depth position, corresponding to its distance, and from this calculates where on the left and right drawing planes 41, 42 to paste the divided images a1, a3, a4, b1 to b4.
 The greater the depth of a parked vehicle L1, L3, L4, the smaller the binocular parallax, so the difference between the pasting positions of the vehicle unit images a1, a3, a4 on the left-eye drawing plane 41 and those on the right-eye drawing plane 42 becomes smaller.
 Conversely, the smaller the depth of a parked vehicle L1, L3, L4, the larger the binocular parallax, so the difference between the pasting positions of the vehicle unit images a1, a3, a4 on the left-eye drawing plane 41 and those on the right-eye drawing plane 42 becomes larger.
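 For a pinhole-camera model, this inverse relation between depth and binocular parallax can be sketched as follows (the baseline and focal-length values are illustrative assumptions; the embodiment does not specify how the parallax is computed):

```python
def paste_offsets(depth_m, baseline_m=0.065, focal_px=800.0):
    """Horizontal pasting offsets (in pixels) on the left-eye and
    right-eye drawing planes for a subject at the given depth.
    Parallax = baseline * focal / depth, so a smaller depth yields a
    larger difference between the left and right pasting positions."""
    parallax = baseline_m * focal_px / depth_m
    return parallax / 2.0, -parallax / 2.0  # (left-plane shift, right-plane shift)
```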
 The control unit 13 pastes the divided images a1, a3, a4, b1 to b4 at the calculated positions on the drawing planes 41, 42. In the following description, the left-eye divided images a1, a3, a4, b1 to b4 pasted onto the left-eye drawing plane 41 are denoted by reference symbols a1-l, a3-l, a4-l, b1-l to b4-l, and the right-eye divided images a1, a3, a4, b1 to b4 pasted onto the right-eye drawing plane 42 are denoted by reference symbols a1-r, a3-r, a4-r, b1-r to b4-r.
 Here, letting an (n being a natural number) denote the n-th vehicle unit image, the position of the n-th left-eye vehicle unit image an-l and the position of the n-th right-eye vehicle unit image an-r are not necessarily the same on the two drawing planes 41, 42, because the binocular parallax has been taken into account. The control unit 13 therefore performs interpolation by enlarging or reducing the background-side image so that no gaps arise between the images.
 In this way, the left-eye image 43L, in which the left-eye divided images a1-l, a3-l, a4-l, b1-l to b4-l are pasted onto the left-eye drawing plane 41, and the right-eye image 43R, in which the right-eye divided images a1-r, a3-r, a4-r, b1-r to b4-r are pasted onto the right-eye drawing plane 42, are generated.
 FIG. 11 is a diagram illustrating an example of the depth of the stereoscopic images. FIG. 12 is a diagram illustrating an example of a parking space image including stereoscopic images. FIG. 13 is a diagram illustrating the relationship between the depth of the stereoscopic images and the position of each stereoscopic image in the parking space image.
 The left-eye image 43L and right-eye image 43R shown in FIG. 10, generated by the control unit 13 as described above, are supplied to the image output unit 14 and output from the image output unit 14 to the display device 4. As a result, as shown in FIG. 12, the parking space image 45, which includes the stereoscopic images L1, L3, L4, 51 with depth, is displayed on the display screen 44 of the display device 4 and observed by the driver of the host vehicle 31.
 The depth display positions of the stereoscopic images L1, L3, L4, 51 included in the parking space image 45 have the positional relationships shown in FIGS. 11 and 13. In FIG. 11, the depth display positions of the divided images a1, a3, a4, b1 to b4 are indicated by the corresponding reference symbols a1, a3, a4, b1 to b4.
 FIGS. 14 and 15 are flowcharts showing the processing procedure of the display control processing in the parking assistance display control device 10 according to the second embodiment of the present invention. In the present embodiment, the display control processing for displaying the parking space image 45 shown in FIG. 13 on the display device 4 is performed according to the processing procedure shown in FIGS. 14 and 15.
 Each process in the flowcharts of FIGS. 14 and 15 is executed by the image composition unit 11, the parkable space determination unit 12, the control unit 13, the image output unit 14, and the in-vehicle LAN interface 15 of the parking assistance display control device 10. The processing of these flowcharts starts when the parking assistance display control device 10 is powered on, and proceeds to step S1 in FIG. 14.
 In step S1, the in-vehicle LAN interface 15 acquires vehicle information from the in-vehicle LAN 5. When the vehicle information has been acquired, the process proceeds to step S2.
 In step S2, the control unit 13 acquires operation information from the operation input device 6. When the operation information has been acquired, the process proceeds to step S3.
 In step S3, the control unit 13 determines whether a parking space search start condition is satisfied. The parking space search start condition is that the speed of the host vehicle 31 (hereinafter sometimes referred to as the "vehicle speed") falls below a predetermined speed, for example below 10 km/h, or that a parking space detection command is input by the user. If it is determined that the parking space search start condition is satisfied, the process proceeds to step S4; otherwise, the process returns to step S1.
 In step S4, the image composition unit 11 acquires image information from the peripheral photographing device 2. In the present embodiment, the image composition unit 11 acquires image information representing the aforementioned vehicle side images, for example the left side images P1, P2, ..., Pn shown in FIG. 5(b) described above. When the image information has been acquired, the process proceeds to step S5.
 In step S5, the parkable space determination unit 12 acquires vehicle periphery information from the periphery detection device 3. The vehicle periphery information is, for example, the relative position of, or the distance to, obstacles such as other vehicles or ground features. When the vehicle periphery information has been acquired, the process proceeds to step S6.
 In step S6, the parkable space determination unit 12 performs parkable space detection processing, in which it determines whether a parkable space exists using the vehicle periphery information acquired in step S5.
 In the present embodiment, in the parkable space detection processing, the parkable space determination unit 12 finds a candidate parkable space and determines that parking is possible when a route exists by which the host vehicle can park at that position.
 At this time, the parkable space determination unit 12 obtains the travel route of the host vehicle from the vehicle information supplied by the in-vehicle LAN interface 15, and uses it as the time and position attributes of the images photographed by the peripheral photographing device 2 and of the detection information detected by the periphery detection device 3. Using these, the parkable space determination unit 12 determines whether a route exists by which the host vehicle can park at the position of the found parkable space. When the parkable space detection processing has finished, the process proceeds to step S7.
 In step S7, the control unit 13 acquires operation information from the operation input device 6. When the operation information has been acquired, the process proceeds to step S8.
 In step S8, the in-vehicle LAN interface 15 acquires vehicle information from the in-vehicle LAN 5. When the vehicle information has been acquired, the process proceeds to step S9 in FIG. 15.
 In step S9 of FIG. 15, the control unit 13 determines whether a parking space search end condition is satisfied. The parking space search end condition is that the vehicle speed becomes equal to or higher than the predetermined speed, for example 10 km/h or higher, or that a parking space search end command is input by the user. If it is determined that the parking space search end condition is satisfied, the entire processing procedure ends; otherwise, the process proceeds to step S10.
 In step S10, the control unit 13 determines whether a parkable space display condition is satisfied. The parkable space display condition is satisfied in any of the following cases: a parkable space has been found, the host vehicle has stopped, or a parking space display command has been input by the user. If it is determined that the parkable space display condition is satisfied, the process proceeds to step S11; otherwise, the process proceeds to step S15.
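 The three decisions in steps S3, S9, and S10 can be condensed into a single dispatch function. This is only an interpretive sketch of the flowchart; the function and argument names are assumptions:

```python
def decide_action(speed_kmh, search_cmd, end_cmd,
                  space_found, stopped, display_cmd, threshold_kmh=10.0):
    """Return one step's outcome: 'idle' while the search start condition
    (S3) is unmet, 'end' when the search end condition (S9) holds,
    'show' when the display condition (S10) holds, otherwise 'clear' (S15)."""
    if not (speed_kmh < threshold_kmh or search_cmd):
        return "idle"   # S3: vehicle too fast and no user search command
    if speed_kmh >= threshold_kmh or end_cmd:
        return "end"    # S9: speed recovered or user ended the search
    if space_found or stopped or display_cmd:
        return "show"   # S10 satisfied: generate and output the image (S11 to S14)
    return "clear"      # S10 not satisfied: erase the parking space image (S15)
```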
 In step S11, the image composition unit 11 generates a composite image, for example the composite image 40 shown in FIG. 6 described above. When the composite image has been generated, the process proceeds to step S12.
 In step S12, the control unit 13 generates a parking space display object, for example the parking space display object 51 shown in FIG. 10 described above. When the parking space display object has been generated, the process proceeds to step S13.
 In step S13, the control unit 13 superimposes the parking space display object generated in step S12 on the composite image generated in step S11, thereby generating a parking space image, for example the parking space image 45 shown in FIG. 12. When the parking space image has been generated, the process proceeds to step S14.
 In step S14, the image output unit 14 outputs the parking space image generated in step S13 to the display device 4. After the parking space image has been output to the display device 4, the process returns to step S4 in FIG. 14.
 In step S15, the control unit 13 erases the parking space image generated in step S13. After the parking space image has been erased, the process returns to step S4 in FIG. 14.
 According to the present embodiment described above, the same effects as in the first embodiment can be obtained. Specifically, the parkable space determination unit 12 determines whether a parkable space exists around the host vehicle 31 based on the situation around the host vehicle 31 detected by the periphery detection device 3. Image information representing the vehicle side images photographed in sequence along the traveling direction 32 of the host vehicle 31 by the peripheral photographing device 2 is acquired by the image composition unit 11, and based on the acquired image information, the composite image 40 is generated by joining the vehicle side images together along the traveling direction 32 of the host vehicle 31.
 When the parkable space determination unit 12 determines that a parkable space exists, the control unit 13 superimposes the parking space display object 51 on the composite image 40 generated by the image composition unit 11, thereby generating the parking space image 45. The parking space image 45 generated by the control unit 13 is output to the display device 4 by the image output unit 14. At least part of the parking space image is a stereoscopic image that can be viewed three-dimensionally. This makes it possible to present a parkable space to the driver in a display form that can be understood intuitively.
 In the present embodiment, as shown in FIG. 12, the parking space image 45 also includes parked vehicle regions, in which vehicles L1, L3, L4 other than the host vehicle 31 are parked, and these parked vehicle regions are represented by stereoscopic images. This allows the parkable space to be presented to the driver in a display form that can be understood even more intuitively.
 Also in the present embodiment, as shown in FIG. 12, the parkable space is represented by a stereoscopic image, specifically the parking space display object 51, which is itself a stereoscopic image. This likewise allows the parkable space to be presented to the driver in a display form that can be understood even more intuitively.
 In the present embodiment, as shown in FIG. 7 described above, the vehicle unit images a1, a3, a4 are cut out of the composite image 40 for each parked vehicle, but the cutting method is not limited to this. FIGS. 16 and 17 are diagrams for explaining methods of cutting out vehicle unit images. FIG. 16 shows the vehicle unit image a1 cut out in the second embodiment of the present invention shown in FIG. 7, and FIG. 17 shows another example of a vehicle unit image.
 As shown in FIG. 17, the control unit 13 may divide the image into the parked vehicle portion a1-1 of the first parked vehicle L1 and the remaining portion a1-2, and generate the parking space image so that only the parked vehicle portion a1-1 is displayed at the depth corresponding to the distance Yl1 between the host vehicle 31 and the first parked vehicle L1, while the remaining portion a1-2 is displayed at a greater depth than the parked vehicle portion a1-1.
 Also, in the present embodiment, as shown in FIG. 11 described above, the divided images b1 to b4, which form the background of the parked vehicles and do not contain a parked vehicle, are displayed at different distances, but the invention is not limited to this. FIG. 18 is a diagram illustrating another example of the background position of the parked vehicles. For example, as shown in FIG. 18, the background divided images b1 to b4, which do not contain a parked vehicle, may be displayed so that their distance from the host vehicle 31, that is, the depth distance d0, is the same.
 Also, in the present embodiment, the peripheral photographing device 2 includes the monocular cameras 21, 22, but it is not limited to this. FIG. 19 is a diagram showing a simplified configuration of a binocular camera device 60. The peripheral photographing device 2 may instead include the binocular camera device 60 shown in FIG. 19 in place of the monocular cameras 21, 22.
 The binocular camera device 60 includes two cameras, namely a first camera 61 and a second camera 62, which are separated from each other by a predetermined distance D2.
 The binocular camera device 60 is mounted so that the first camera 61 captures the left image and the second camera 62 captures the right image. For example, when the binocular camera device 60 is mounted on the left side surface of the host vehicle 31, the first camera 61 is arranged on the rear side and the second camera 62 on the front side with respect to the traveling direction 32 of the host vehicle 31.
 A stereoscopic image is generated using the pair of images photographed at the same time t by the first camera 61 and the second camera 62 of the binocular camera device 60, namely the left image PicL(t) photographed by the first camera 61 and the right image PicR(t) photographed by the second camera 62.
 When the distance D2 between the first camera 61 and the second camera 62 is equal to the distance D1 between the two photographing positions of the monocular camera 21 shown in FIG. 4 described above, the left image PicL(t) photographed by the first camera 61 is identical to the first image Pic(t) photographed by the monocular camera 21 at time t, and the right image PicR(t) photographed by the second camera 62 is identical to the second image Pic(t+Δt) photographed by the monocular camera 21 at time t+Δt.
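 The equivalence holds when the vehicle covers the baseline in the interval Δt between the two monocular shots, that is, D1 = v·Δt and hence Δt = D1 / v. A minimal sketch of this relation (the function name is an assumption):

```python
def capture_interval_s(baseline_m, speed_kmh):
    """Interval dt between two monocular shots such that the camera
    moves exactly the stereo baseline: D1 = v * dt, hence dt = D1 / v."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return baseline_m / speed_ms
```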
 FIG. 20 is a diagram schematically showing how left side images are photographed using the binocular camera device 60. In the example shown in FIG. 20, the peripheral photographing device 2 composites, on the drawing planes, the left and right paired images for stereoscopic viewing that were photographed individually by the binocular camera device 60.
 Specifically, while the host vehicle 31 travels in the traveling direction 32, the peripheral photographing device 2 photographs vehicle side images, for example left side images, using the binocular camera device 60 shown in FIG. 19 described above. At each position, the binocular camera device 60 photographs, with the first camera 61 and the second camera 62, the left-eye left side images Pl1, Pl2, ..., Pln as the left images PicL(t) and the right-eye left side images Pr1, Pr2, ..., Prn as the right images PicR(t). Here, n is a natural number.
 The left-eye left side images Pl1, Pl2, ..., Pln and the right-eye left side images Pr1, Pr2, ..., Prn are pairs of left and right images photographed at the same point, and each pair carries a distance attribute giving the distance from the host vehicle 31 to the parked vehicles L1, L3, L4 or to an obstacle. This distance may be measured by an ultrasonic sensor serving as the periphery detection device 3, or may be obtained by ranging from the images photographed by the binocular camera device 60.
 FIG. 21 is a diagram illustrating an example of a left-eye drawing plane 71, and FIG. 22 is a diagram illustrating an example of a right-eye drawing plane 72. As drawing planes for creating a stereoscopic image, the control unit 13 provides the left-eye drawing plane 71 shown in FIG. 21 and the right-eye drawing plane 72 shown in FIG. 22. Here, for convenience of explanation, the depth of the left-eye drawing plane 71 and the depth of the right-eye drawing plane 72 are set to the same distance, as shown in FIG. 12 described above.
 FIG. 23 is a diagram for explaining how images are pasted onto the drawing planes. Based on the left-eye left side images Pl1, Pl2, ..., Pln and the right-eye left side images Pr1, Pr2, ..., Prn shown in FIG. 20, photographed by the binocular camera device 60, the control unit 13 prepares, as shown in FIG. 23(a), left and right paired images corresponding to the photographed parked vehicles L1, L3, L4 and left and right parking space display objects: for example, the left-eye vehicle image an-l and right-eye vehicle image an-r as a left-right image pair, and the left-eye display object 51L and right-eye display object 51R as the left and right parking space display objects.
 The control unit 13 pastes the prepared left and right paired images an-l, an-r and the left and right parking space display objects 51L, 51R onto the corresponding left and right drawing planes 71, 72. Here, each left-right image pair is pasted at the same position on the left and right drawing planes 71, 72.
 In this way, the left-eye image 73L, in which the left-eye vehicle image an-l and the left-eye display object 51L are pasted onto the left-eye drawing plane 71, and the right-eye image 73R, in which the right-eye vehicle image an-r and the right-eye display object 51R are pasted onto the right-eye drawing plane 72, are generated.
 FIG. 24 is a diagram illustrating another example of the depth of stereoscopic images, and FIG. 25 is a diagram illustrating another example of a parking space image including stereoscopic images. The left-eye image 73L and right-eye image 73R shown in FIG. 21, generated by the control unit 13 as described above, are supplied to the image output unit 14 and output from the image output unit 14 to the display device 4.
 As shown in FIG. 24, the background portions are displayed so that their depth distance d0 is the same, while the parked vehicles L1, L3, L4 are displayed at depths corresponding to their distances from the host vehicle 31. For example, the fourth parked vehicle L4 is displayed at the depth distance d4 corresponding to its distance Yl4 from the host vehicle 31.
 As a result, as shown in FIG. 25, the parking space image 75, which includes the three stereoscopic images L1, L3, L4 with depth, is displayed on the display screen 74 of the display device 4 and observed by the driver of the host vehicle 31.
 The control unit 13 may also be configured to construct the vehicles and the parkable space in a three-dimensional virtual space from the images photographed by the monocular camera 21 shown in FIG. 4 described above or the binocular camera device 60 shown in FIG. 19, and to generate the stereoscopic image by viewpoint conversion processing. In this case, the vehicle side images are photographed by the monocular camera 21 or the binocular camera device 60 and supplied to the control unit 13 via the image composition unit 11, and the control unit 13 measures the shape and position of each vehicle three-dimensionally from the supplied vehicle side images.
 FIG. 26 is a diagram schematically showing a state in which the vehicles are mapped into the three-dimensional virtual space. Based on the shape and position of each vehicle measured as described above, the control unit 13 maps each vehicle into the three-dimensional virtual space as shown in FIG. 26. Portions that could not be photographed by the monocular camera 21 or the binocular camera device 60, and therefore could not be measured, may be filled in by fitting an appropriate vehicle model to compensate for the missing information.
 Specifically, a virtual viewpoint Pver(x, y, z, θ, φ) is set in the three-dimensional virtual space, a virtual binocular camera device 80 is placed at that position, and a left-eye image 81 and a right-eye image 82 are generated as if photographed by the binocular camera device 80. Known three-dimensional image processing, for example, is applied to this image generation processing.
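 As a minimal pinhole sketch of such viewpoint conversion (all names and values here are assumptions; the embodiment only states that known three-dimensional image processing is applied), a point in the virtual space can be projected onto the two virtual cameras, whose horizontal image coordinates then differ by the binocular parallax:

```python
def project_stereo(point_xy, cam_x=0.0, cam_y=0.0, baseline_m=0.065, focal_px=800.0):
    """Project a point (x, y) in the virtual space onto a virtual binocular
    camera pair at (cam_x -/+ baseline/2, cam_y), both looking along +y.
    Returns the horizontal image coordinates (u_left, u_right); their
    difference is the binocular parallax, baseline * focal / depth."""
    px, py = point_xy
    depth = py - cam_y
    u_left = focal_px * (px - (cam_x - baseline_m / 2.0)) / depth
    u_right = focal_px * (px - (cam_x + baseline_m / 2.0)) / depth
    return u_left, u_right
```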
 The left-eye image 81 and the right-eye image 82 generated by the control unit 13 in this way are supplied to the image output unit 14 and output from the image output unit 14 to the display device 4. As a result, a parking space image including a stereoscopic image with depth is displayed on the display screen of the display device 4 and observed by the driver of the host vehicle 31.
 The display device 4 that displays a parking space image including a stereoscopic image with depth is not limited to a display device capable of stereoscopic display; it may be a head-up display capable of three-dimensional representation, or a three-dimensional image display device of some other type.
 FIG. 27 is a diagram for explaining a method of detecting a parking space. As described above, the peripheral photographing device 2, for example by means of the binocular camera device 60 installed on the left side of the vehicle 31, captures right-eye images L1-r, L3-r, L4-r and left-eye images L1-l, L3-l, L4-l in pairs for stereoscopic viewing. The control unit 13 generates, for example, a composite image from the images L1-l, L1-r, L3-l, L3-r, L4-l, L4-r captured by the binocular camera device 60, and detects the parking spaces SL-l and Sl-r from the generated composite image.
 When detecting a parking space, for example, the empty spaces on the left and right are detected with an ultrasonic sensor in combination with the image capture. Known techniques can be used for the detection algorithm and for the determination of whether parking is possible.
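As one illustration of the kind of known technique referred to here, an empty space can be found by scanning the lateral ultrasonic range readings taken while driving past the parked vehicles for a run of large distances long enough to hold the vehicle. The sampling format, clearance threshold, and minimum gap length below are assumptions made for this sketch.

```python
def find_gaps(samples, clearance, min_len):
    """samples: (odometer_position_m, lateral_distance_m) pairs recorded while
    driving past parked vehicles.  A candidate space is a run of readings whose
    lateral distance is at least `clearance` and whose length along the travel
    direction is at least `min_len`.  Returns (start, end) position pairs."""
    if not samples:
        return []
    gaps, start = [], None
    for pos, dist in samples:
        if dist >= clearance:
            if start is None:
                start = pos                      # gap begins here
        else:
            if start is not None and pos - start >= min_len:
                gaps.append((start, pos))        # gap long enough to park in
            start = None
    if start is not None and samples[-1][0] - start >= min_len:
        gaps.append((start, samples[-1][0]))     # gap runs to the last sample
    return gaps

# One reading per metre: parked cars near the start and end, open in between.
samples = [(0, 1.0), (1, 1.0), (2, 4.0), (3, 4.0), (4, 4.0),
           (5, 4.0), (6, 4.0), (7, 1.0), (8, 1.0)]
gaps = find_gaps(samples, clearance=2.5, min_len=5.0)
```

A real implementation would additionally validate each candidate against the vehicle dimensions and manoeuvring clearance before declaring it a parking space.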
 The control unit 13 presents the parking spaces SL-l and Sl-r detected in this way to the driver by causing the display device 4 to display an image in which display objects indicating these spaces are superimposed on the composite image. For example, as shown in FIG. 12 described above, both the vehicles and the parking spaces are displayed as stereoscopic images.
 FIG. 28 is a diagram showing another example of the parking space image. As shown in FIG. 28(b), the parking space image may be configured so that the vehicle side images L1, L3, and L4 are displayed in order from the bottom to the top of the display screen of the display device 4, along the traveling direction of the corresponding vehicle.
 In the example shown in FIG. 28(b), as shown in FIG. 28(a), the third parked vehicle L3 is parked farther away than the fourth parked vehicle L4, and the distance Yl3 between the host vehicle 31 and the third parked vehicle L3 is larger than the distance Yl4 between the host vehicle 31 and the fourth parked vehicle L4 (Yl3 > Yl4). Therefore, in the parking space image 90 shown in FIG. 28(b), the solid angle of the third parked vehicle L3 is slightly smaller than that of the fourth parked vehicle L4.
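The relationship described here follows from simple pinhole geometry: the apparent angular width of a vehicle shrinks as its distance from the viewpoint grows, so the more distant vehicle L3 subtends a slightly smaller angle than L4. A quick check, with assumed (not source) values for the distances and the vehicle width:

```python
import math

def angular_width(width_m, distance_m):
    """Apparent angular width (radians) of an object of physical width
    `width_m` viewed from `distance_m` away, by simple pinhole geometry."""
    return 2.0 * math.atan(width_m / (2.0 * distance_m))

car_width = 1.8        # assumed typical vehicle width, in metres
Yl4, Yl3 = 3.0, 4.0    # assumed distances satisfying Yl3 > Yl4, as in FIG. 28
angle_L3 = angular_width(car_width, Yl3)
angle_L4 = angular_width(car_width, Yl4)
```

Since Yl3 > Yl4, `angle_L3 < angle_L4`, which is why L3 occupies a slightly smaller solid angle in the displayed image.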
 FIGS. 29 to 32 are diagrams schematically showing viewpoints of the parking space image. As shown in FIGS. 29 to 32, the parking space image may be displayed with the viewpoint of the display area converted. For example, the parking space image may be displayed as an image seen from a horizontal, frontal viewpoint Peye1, as shown in FIG. 29. The parking space image may also be displayed as an image seen from a horizontal viewpoint Peye2 slightly to the front, as shown in FIG. 30; as an image seen from a viewpoint Peye3 looking down from directly above, as shown in FIG. 31; or as an image seen from a viewpoint Peye4 diagonally above, as shown in FIG. 32.
 FIGS. 33 to 36 are diagrams schematically showing other examples of the viewpoint of the parking space image. As shown in FIGS. 33 to 36, the parking space image may be displayed with the viewpoint of a predetermined designated area, for example the parking space SL, converted.
 In this case, the control unit 13 selects the parking space and performs viewpoint conversion only on the parking space. Areas other than the designated area may be displayed from the original viewpoint, or may be displayed with their viewpoint converted in the same manner as the designated area. Alternatively, only the designated area may be displayed larger than the other areas. A virtual three-dimensional display object 51 representing a vehicle as a stereoscopic image may also be displayed in the parking space SL.
 For example, as shown in FIG. 33, the parking space image may be displayed by selecting the parking space as the designated area and showing only the parking space as an image seen from the horizontal, frontal viewpoint Peye1. In the example shown in FIG. 33, the virtual three-dimensional display object 51 and a wheel stopper 91 are displayed in the parking space.
 As shown in FIG. 34, the parking space image may also be displayed so that the parking space including the virtual three-dimensional display object 51 and the wheel stopper 91 is seen from the horizontal viewpoint Peye2 slightly to the front. Likewise, as shown in FIG. 35, the parking space including the virtual three-dimensional display object 51 and the wheel stopper 91 may be displayed as seen from the viewpoint Peye3 looking down from directly above, or, as shown in FIG. 36, as seen from the viewpoint Peye4 diagonally above.
 As described above, the parking space image may be converted into and displayed as an image seen from a viewpoint different from that at which the vehicle side images were captured by the peripheral photographing device 2. In particular, a predetermined designated area of the parking space image may be converted into and displayed as an image seen from a viewpoint different from that at which the vehicle side images were captured by the peripheral photographing device 2. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 FIG. 37 is a diagram showing another example of the parking space image. As shown in FIG. 37, for example, the parking space image may display the area E3 of the parking space SL where no vehicle is present as a stereoscopic image, and display the vehicle areas E1 and E2 where vehicles are present as two-dimensional images.
 This reduces the amount of image processing. In this case, the actual depths of the parked vehicles are as shown in FIG. 37(a), but the vehicle areas E1 and E2 are displayed as two-dimensional images. Therefore, when combining the two-dimensional images with the stereoscopic image, the depth distances Yl of the parked vehicles L1, L3, and L4 are unified, as shown in FIG. 37(b), to the distance between one of the parked vehicles and the host vehicle 31, for example the distance Yl4 between the fourth parked vehicle L4 and the host vehicle 31.
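A sketch of the depth-unification step described above: each parked vehicle's two-dimensional sprite is moved onto the single depth plane of a reference vehicle (L4 here) and rescaled so that its apparent on-screen size is preserved (pinhole geometry: on-screen size is proportional to 1/depth). The data layout and numeric values are assumptions for illustration only.

```python
def unify_depths(vehicles, ref_key):
    """vehicles: name -> {"depth": Yl in metres, "width": sprite width in px}.
    Place every 2-D vehicle sprite on the single depth plane of `ref_key`
    while keeping each sprite's apparent on-screen size unchanged
    (moving from depth d to ref_depth requires scaling by ref_depth / d)."""
    ref_depth = vehicles[ref_key]["depth"]
    out = {}
    for name, v in vehicles.items():
        scale = ref_depth / v["depth"]   # < 1 for vehicles farther than the reference
        out[name] = {"depth": ref_depth, "width": v["width"] * scale}
    return out

# Hypothetical layout: L3 is farther away (Yl3 > Yl4), so its sprite shrinks.
cars = {"L1": {"depth": 3.0, "width": 200},
        "L3": {"depth": 4.0, "width": 200},
        "L4": {"depth": 3.0, "width": 200}}
flat = unify_depths(cars, "L4")
```

The shrinking of the more distant sprites preserves the depth cue even though all vehicles now sit on one plane.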
 FIG. 38 is a diagram showing another example of the parking space image. As shown in FIG. 38, the parking space image may include a three-dimensional display object indicating the position of the host vehicle 31. The position at which the three-dimensional display object of the host vehicle is displayed corresponds to the actual position of the vehicle in three-dimensional space. Apart from the three-dimensional display object of the host vehicle, the image is displayed, for example, in the same manner as in FIG. 28(b) described above.
 Although not illustrated, the parked vehicle areas may be displayed with emphasis relative to the other areas. Specifically, the parked vehicle areas may be displayed as deformed display objects that show the vehicles L1, L3, and L4 in a stylized, exaggerated form. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 In the example shown in FIG. 38, the left and right vehicle groups are displayed as composite images including stereoscopic images obtained by joining together a plurality of images captured from the host vehicle at the driver's viewpoint. The third parked vehicle L3 is parked farther away than the fourth parked vehicle L4, and the distance Yl3 between the third parked vehicle L3 and the host vehicle is larger than the distance Yl4 between the fourth parked vehicle L4 and the host vehicle (Yl3 > Yl4). Therefore, in the stereoscopic image shown in FIG. 38, the solid angle of the third parked vehicle L3 is slightly smaller than that of the fourth parked vehicle L4.
 FIG. 39 is a diagram showing an example of the parking space image when there are parking space candidates on both the left and right of the vehicle. When there are parking space candidates on the left and right of the host vehicle 31, the peripheral photographing device 2 captures images for both eyes with the monocular cameras 21 or binocular camera devices 60 installed on the left and right of the host vehicle 31, and supplies image information representing the captured images to the image combining unit 11. The image combining unit 11 generates composite images 261 and 262 from the supplied image information. The parking space determination unit 12 searches the left and right parking spaces. As a result, for example, a left parking space SL and a right parking space SR are detected.
 The control unit 13 superimposes display objects indicating the parking spaces SL and SR on the left and right composite images 261 and 262, that is, on the left composite image 261 and the right composite image 262. A parking space image 140 including the left and right parking spaces SL and SR is thereby generated. By using such a parking space image 140, the parking space can be presented to the driver in a display form that can be understood more intuitively. The host vehicle 31 is displayed on a predetermined reference line 141 of the display screen of the display device 4, for example on the center line of the display screen. Specifically, the host vehicle 31 is displayed so that the line representing its travel locus coincides with the reference line 141 of the display screen.
 The parking space image 140 shown in FIG. 39 may also be generated so that the display positions of the left composite image 261 and the right composite image 262 on the display screen of the display device 4 are changed according to the positional relationship between the host vehicle 31 and the vehicles or parking sections present on the right and left sides in the traveling direction 32 of the host vehicle 31. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively. In this case, for example, the display positions of the left composite image 261 and the right composite image 262 on the display screen of the display device 4 are changed according to the lateral distances dl, dr and the traveling direction distances Hl, Hr of each parked vehicle.
 Here, the lateral distances dl and dr are the distances, to the side of the host vehicle 31, between the host vehicle 31 and objects to its side. Specifically, the lateral distances dl and dr are the shortest distances between the travel locus 33 of the host vehicle 31 during the parking space search and the objects to the side of the host vehicle 31. For example, when the objects to the side of the host vehicle 31 are the vehicles L1, L3, L4, R1, R2, and R4, the lateral distances dl and dr are the distances between the travel locus 33 of the host vehicle 31 during the parking space search and the near ends, on the host vehicle 31 side, of the vehicles L1, L3, L4, R1, R2, and R4. The lateral distances dl and dr correspond to the distance Yln from the host vehicle 31 described above.
 The traveling direction distances Hl and Hr are the distances, in the traveling direction of the host vehicle 31, between the host vehicle 31 and objects to its side. Specifically, the traveling direction distances Hl and Hr are the shortest distances between the rear end, in the traveling direction, of the host vehicle 31 at the stop position and the objects to the side of the host vehicle 31. For example, when the objects to the side of the host vehicle 31 are the vehicles L1, L3, L4, R1, R2, and R4, the traveling direction distances Hl and Hr are the distances between the rear end, in the traveling direction, of the host vehicle 31 at the stop position and the side surfaces, on the host vehicle 31 side, of the vehicles L1, L3, L4, R1, R2, and R4. The traveling direction distances Hl and Hr correspond to the distance X from the host vehicle 31 described above.
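The two distance definitions can be illustrated with a small sketch. It assumes the travel locus 33 during the space search is a straight line (modelled here as y = track_y), that the rear end of the stopped host vehicle lies at x = rear_x, and that each parked vehicle is represented by sampled outline points such as its corners; the helper names are hypothetical.

```python
def lateral_distance(track_y, obstacle_points):
    """dl / dr: shortest perpendicular offset between the straight travel
    locus (the line y = track_y) and any sampled point of the obstacle,
    e.g. the corners of a parked vehicle."""
    return min(abs(py - track_y) for _, py in obstacle_points)

def heading_distance(rear_x, obstacle_points):
    """Hl / Hr: shortest along-track distance between the rear end of the
    stopped host vehicle (at x = rear_x) and the obstacle's nearest point."""
    return min(abs(px - rear_x) for px, _ in obstacle_points)

# A parked vehicle approximated by its four corners (x along the travel
# direction, y to the side, both in metres):
corners = [(5.0, 2.0), (9.5, 2.0), (5.0, 3.8), (9.5, 3.8)]
dl = lateral_distance(0.0, corners)   # locus runs along y = 0
hl = heading_distance(3.0, corners)   # host vehicle's rear end at x = 3
```

With these sample corners, dl is the 2.0 m offset to the vehicle's near side and hl the 2.0 m gap ahead of the host vehicle's rear end, matching the "shortest distance" definitions above.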
 FIGS. 40 to 47 are diagrams showing other examples of the parking space image when there are parking space candidates on both the left and right of the vehicle. As shown in FIG. 40, the parking space image 150 may include a host vehicle position display object 151 indicating the position of the host vehicle 31. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 Also, in FIG. 40, the parking space image 150 is generated so that, on the display screen of the display device 4, the left composite image 261 is displayed in the area on the left side of the screen, corresponding to a predetermined first display area, and the right composite image 262 is displayed in the area on the right side of the screen, corresponding to a predetermined second display area. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 Further, in FIG. 40, the parking space image 150 is generated so that the left composite image 261 and the right composite image 262 are displayed rotated by a predetermined rotation angle in mutually opposite directions from the traveling direction 32 of the host vehicle 31, with the position of the host vehicle 31 as the pivot. Specifically, the parking space image 150 is generated so that the left composite image 261 and the right composite image 262 are displayed rotated 90° in mutually opposite directions from the center line 152 of the display screen, which coincides with the traveling direction 32 of the host vehicle 31. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 As shown in FIG. 41, the parking space image 160 may include, in addition to a first display area 161 in which the left composite image 261 is displayed and a second display area 162 in which the right composite image 262 is displayed, a third display area 163 indicating a recommended space. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 As shown in FIG. 42, the parking space image 170 may include the host vehicle position display object 151 indicating the position of the host vehicle 31. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively. In this case, as shown in FIG. 42, the parking space image 170 may be divided into a host vehicle display area 171 in which the host vehicle position display object 151 is displayed and a composite image display area 172 in which the composite images 261 and 262 are displayed.
 Also, as shown in FIG. 42, the control unit 13 may generate the parking space image 170 by inserting a rear image 173, captured by the peripheral photographing device 2 on the rear side in the traveling direction 32 of the host vehicle 31, between the left composite image 261 and the right composite image 262, so that the left composite image 261, the rear image 173, and the right composite image 262 are joined together. This makes it possible to present the parking space to the driver in a display form that can be understood more intuitively.
 As shown in FIG. 43, the control unit 13 may generate the parking space image 180 so that a part of it is displayed in a display form different from the remaining part. In the example shown in FIG. 43, a part of the parking space image 180, specifically the host vehicle display area 182 in which the host vehicle 31 is displayed, is displayed as an overhead image. Areas other than the host vehicle display area 182, for example the composite image display areas 181 and 183 in which the composite images 261 and 262 are displayed, may be displayed as two-dimensional images.
 FIG. 44 is a diagram showing another example of the parking space image. In FIG. 44(b), the horizontal axis indicates the lateral direction X and the vertical axis indicates the depth Y. As shown in FIG. 44, the parking space image may be displayed combined with an overhead image. In the example shown in FIG. 44, the parking space image is displayed as a composite of the image of the host vehicle 31 shown in FIG. 44(a) and the overhead images shown in FIGS. 44(c) and 44(d).
 In the example shown in FIG. 44, the image of the host vehicle 31, synthesized or captured as an overhead image, and the stereoscopic images L4 and R4 captured from the driver's viewpoint are displayed side by side. In this case, the travel area, which is not a parking area, is shown as an overhead image; the position of the depth plane on which the overhead image is displayed is preferably matched to the far-side distance of the parking space.
 The overhead image may itself be a stereoscopic image, as shown in FIG. 44. In FIG. 44, y1 denotes the distance from the near end of the left parking space 210, on the left side of the host vehicle 31, to the near end of the fourth left vehicle L4, that is, the width of the region 211 surrounding the parking region 212 in which the fourth left vehicle L4 is parked. y2 denotes the distance from the left side surface of the host vehicle 31 to the near end of the left parking space 210.
 y3 denotes the distance from the right side surface of the host vehicle 31 to the near end of the right parking space 215, on the right side of the host vehicle 31. y4 denotes the distance from the near end of the right parking space 215 to the near end of the fourth right vehicle R4, that is, the width of the region 216 surrounding the parking region 217 in which the fourth right vehicle R4 is parked.
 zr denotes the vehicle height of the fourth right vehicle R4, and zl the vehicle height of the fourth left vehicle L4. za denotes the vehicle height of the host vehicle display object representing the host vehicle 31, xa the vehicle width of the host vehicle display object, and xw the width of the travel area.
 In FIG. 44, y2 is taken as the distance from the left side surface of the host vehicle 31 to the near end of the left parking space 210, but it may instead be set to half (1/2) the width of the travel area, or to the distance from the center of the host vehicle 31 to the near end of the left parking space 210. The same applies to y3.
 Since a road is basically flat, it need not be shown as a stereoscopic image; however, by displaying at least a part of the parking space image as an overhead image, obstacles such as stones, for example, can be displayed three-dimensionally.
 FIG. 45 is a diagram showing an example of the parking space image when a recommended parking space exists. For example, as shown in FIG. 45, assume a case in which seven parked vehicles exist in the parking area and two parking spaces PL1 and PL2 are available.
 FIG. 45 shows a display example for the case in which the second parking space PL2 is recommended. FIG. 45(a) is a diagram for explaining the parking situation within the parking area and corresponds to an overhead view from above. The display as seen from the front is shown in FIG. 45(c), and the depth representation of the stereoscopic display in FIG. 45(b). In the example shown in FIG. 45, the recommended parking space PL2 and the parked vehicles L6 and L8 on either side of it are displayed as a stereoscopic image, and everything else is displayed as a planar image. That is, the region Q2 including the parking space PL2 and the adjacent parked vehicles L6 and L8 is set as a stereoscopic image display region displayed as a stereoscopic image, and the regions Q1 and Q3 other than the stereoscopic image display region Q2 are set as planar image display regions displayed as planar images.
 FIG. 46 is a diagram showing another example of the parking space image. FIG. 46(a) is a diagram for explaining the state of the depth display, and FIG. 46(b) is a diagram showing a display example of the recommended parking space. As shown in FIG. 46, a display object 220 representing a vehicle may be added to and displayed in the recommended parking space PL2 instead of a planar image. The display object 220 is, for example, a display object represented by a stereoscopic image. By displaying a display object represented by a stereoscopic image in the recommended parking space in this way, the parking space can be presented to the driver in a display form that can be understood more intuitively.
 FIG. 47 is a diagram showing another example of the parking space image. FIG. 47(a) is a diagram showing the depths of the recommended parking spaces SL and SR, and FIG. 47(b) is a diagram showing a display example of the recommended parking space. As shown in FIG. 47, the display mode may be changed between the parking area on the side where the recommended parking space exists and the other parking area. In the parking space image 230 of the example shown in FIG. 47, the right parking space SR is recommended; with respect to the center line 233 of the display screen of the display device 4, the right parking area 232 is displayed as a stereoscopic image and the left parking area 231 as a planar image. A display object may be used instead of the planar image.
 In this way, by generating the parking space image 230 so that, of the left composite image 261 and the right composite image 262, the composite image on the side where the recommended parking space exists, that is, the right composite image 262, and the remaining composite image, that is, the left composite image 261, are displayed in different display forms, the parking space can be presented to the driver in a display form that can be understood more intuitively.
 In particular, by representing the composite image on the side where the recommended parking space exists, that is, the right composite image 262, as a stereoscopic image, the parking space can be presented to the driver in an even more intuitively understandable display form.
 <Third Embodiment>
 FIG. 48 is a block diagram showing the configuration of a parking assistance device 201 according to the third embodiment of the present invention. The parking assistance device 201 comprises a parking assistance display control device 10A, the peripheral photographing device 2, the peripheral detection device 3, the display device 4, the in-vehicle LAN 5, an operation input device 6, and a travel drive device 7. The parking assistance display control device 10A includes the image combining unit 11, the parking space determination unit 12, the control unit 13, the image output unit 14, and an automatic parking control unit 16.
 The parking assistance display control device 10A is mounted, for example, on a moving body such as a vehicle. The peripheral photographing device 2, the peripheral detection device 3, the display device 4, the in-vehicle LAN 5, the operation input device 6, and the travel drive device 7 are provided in the host vehicle on which the parking assistance display control device 10A is mounted.
Since the parking assistance device 201 of the present embodiment includes the same components as the parking assistance device 100 of the first embodiment and the parking assistance device 200 of the second embodiment described above, the same components are denoted by the same reference numerals and duplicate description is omitted. A parking assistance display control method according to another embodiment of the present invention is executed by the parking assistance display control device 10A of the present embodiment.
The automatic parking control unit 16 controls the travel drive device 7 based on the available parking space information and the parking space selection information, and executes automatic parking.
The travel drive device 7 drives the vehicle by controlling its steering wheel, brake, and accelerator.
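As a rough illustration of this flow — from a selected parking space to steering, brake, and accelerator commands for the travel drive device 7 — the following Python sketch can be given. The data types, the proportional steering rule, and the stop threshold are all illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass

# Hypothetical sketch of one step of the automatic parking control unit 16:
# given the driver's selected parking space, emit a drive command for the
# travel drive device 7. The control law here is a deliberately trivial
# stand-in for real path planning.

@dataclass
class ParkingSpace:
    side: str          # "left" or "right" of the travel direction
    distance_m: float  # remaining longitudinal distance to the space

@dataclass
class DriveCommand:
    steering_deg: float  # positive = steer right
    accelerator: float   # 0.0 .. 1.0
    brake: float         # 0.0 .. 1.0

def plan_step(selected: ParkingSpace) -> DriveCommand:
    """One control step toward the selected space (toy proportional rule)."""
    if selected.distance_m < 0.5:           # close enough: stop the vehicle
        return DriveCommand(0.0, 0.0, 1.0)
    steer = 20.0 if selected.side == "right" else -20.0
    return DriveCommand(steer, 0.3, 0.0)    # creep toward the space

cmd = plan_step(ParkingSpace(side="right", distance_m=4.0))
print(cmd.steering_deg, cmd.accelerator, cmd.brake)
# → 20.0 0.3 0.0
```

In an actual system this step would run repeatedly, updating `distance_m` from the peripheral detection device 3 on each cycle.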
 図49は、本発明の第3の実施の形態における駐車スペース画像290の一例を示す図である。本実施の形態では、駐車スペース画像290には、駐車可能スペースを選択するための表示オブジェクト291が含まれる。表示オブジェクト291は、「駐車スペースを選択してください」などの文字が表示される文字表示領域292と、左側駐車可能スペースSLを選択するためのボタン293と、右側駐車可能スペースSRを選択するためのボタン294とを含む。運転者は、表示オブジェクト291内のボタン293,294のいずれかを選択することによって、駐車可能スペースを選択する。 FIG. 49 is a diagram showing an example of a parking space image 290 according to the third embodiment of the present invention. In the present embodiment, the parking space image 290 includes a display object 291 for selecting a parking available space. The display object 291 includes a character display area 292 for displaying characters such as “Please select a parking space”, a button 293 for selecting the left parking space SL, and a right parking space SR. Button 294. The driver selects a parking space by selecting one of the buttons 293 and 294 in the display object 291.
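The button-to-space resolution described above can be sketched as follows. The identifiers SL and SR follow the figure; the dict-based dispatch and function name are illustrative assumptions.

```python
# Minimal sketch: resolve the driver's button press (button 293 or 294 in
# display object 291) to parking space selection information.

BUTTON_TO_SPACE = {
    293: "SL",  # left available parking space
    294: "SR",  # right available parking space
}

def on_button_pressed(button_id: int) -> str:
    """Return the parking space selection for the pressed button."""
    if button_id not in BUTTON_TO_SPACE:
        raise ValueError(f"unknown button id: {button_id}")
    return BUTTON_TO_SPACE[button_id]

print(on_button_pressed(294))
# → SR
```

The returned selection would then be passed, as parking space selection information, to the automatic parking control unit 16.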
Also in the present embodiment, since the parking space image 290 includes vehicle side images, the same effects as those of the first and second embodiments described above can be obtained.
The parking assistance display control device of the present embodiment described above can be applied not only to a navigation device mountable on a vehicle, but also to a parking assistance system constructed by appropriately combining a communication terminal device, a server device, and the like. The communication terminal device is, for example, a PND (Portable Navigation Device) or a portable communication device having a function of communicating with the server device. The portable communication device is, for example, a mobile phone, a smartphone, or a tablet terminal device.
When a system is constructed by appropriately combining the navigation device, the communication terminal device, and the server device as described above, the components of the parking assistance display control device of the present embodiment may be distributed among the devices constituting the system, or may be concentrated in any one of the devices.
For example, the image compositing unit, the parking space determination unit, and the control unit provided in the parking assistance display control device may be arranged in the server device, or in a communication terminal device such as a portable communication device.
A parking assistance device in which the image compositing unit, the parking space determination unit, and the control unit of the parking assistance display control device described above are arranged in the server device has the configuration shown in the following fourth embodiment. A parking assistance device in which these units are arranged in the portable communication device has the configuration shown in the following fifth embodiment.
<Fourth Embodiment>
FIG. 50 is a block diagram showing the configuration of a parking assistance device 500 according to the fourth embodiment of the present invention. The parking assistance device 500 comprises an information providing device 300 and a server device 400. Since the parking assistance device 500 of the present embodiment includes the same components as the parking assistance devices 100 and 200 of the first and second embodiments described above, the same components are denoted by the same reference numerals and duplicate description is omitted.
The information providing device 300 is mounted on a vehicle. The information providing device 300 comprises an information providing device main body 310, the peripheral photographing device 2, the peripheral detection device 3, the display device 4, the in-vehicle LAN 5, and the operation input device 6.
The information providing device main body 310 includes the image output unit 14, the in-vehicle LAN interface 15, an on-vehicle control unit 311, and an on-vehicle communication unit 312. The on-vehicle control unit 311 is constituted by, for example, a CPU (Central Processing Unit) and a memory such as a writable RAM. The memory stores a control program. By executing the control program stored in the memory, the CPU comprehensively controls the image output unit 14, the in-vehicle LAN interface 15, and the on-vehicle communication unit 312.
The server device 400 comprises the image compositing unit 11, the parking space determination unit 12, a server-side communication unit 401, and a server-side control unit 402.
The server-side communication unit 401 communicates with the information providing device 300. When doing so, the server-side communication unit 401 is configured to be able to communicate with the information providing device 300 via a communication network such as the Internet, for example.
The server-side control unit 402 is constituted by, for example, a CPU and a memory such as a writable RAM. The memory stores a control program. By executing the control program stored in the memory, the CPU comprehensively controls the image compositing unit 11, the parking space determination unit 12, and the server-side communication unit 401.
As described above, in the present embodiment, the image compositing unit 11, the parking space determination unit 12, and the server-side control unit 402 are arranged in the server device 400. Even with this arrangement, the same effects as those of the first to third embodiments can be obtained.
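The split in this embodiment — the in-vehicle information providing device 300 supplying captured images to a server device 400 that hosts the image compositing and parking space determination units — can be sketched as a simple request/reply exchange. This is an illustrative assumption: the patent prescribes no message format, the stitching is a stand-in string join, and the availability result is a dummy value.

```python
import json

# Hypothetical sketch of the fourth embodiment's client/server split:
# the vehicle side encodes side-image frames (placeholder strings here),
# the server side "composites" them and returns a space-availability flag.

def vehicle_request(frames: list[str]) -> str:
    """Information providing device 300: encode frames for the server."""
    return json.dumps({"type": "side_images", "frames": frames})

def server_handle(request: str) -> str:
    """Server device 400: run compositing and the (dummy) space check."""
    msg = json.loads(request)
    composite = "|".join(msg["frames"])  # stand-in for image stitching
    return json.dumps({"composite": composite, "space_available": True})

reply = json.loads(server_handle(vehicle_request(["f0", "f1", "f2"])))
print(reply["composite"])
# → f0|f1|f2
```

In a real deployment the transport would be the Internet connection mentioned above, and the reply would carry the generated parking space image or the data needed to render it.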
<Fifth Embodiment>
FIG. 51 is a block diagram showing the configuration of a parking assistance device 510 according to the fifth embodiment of the present invention. The parking assistance device 510 comprises an information providing device 300 and a portable communication device 410. Since the parking assistance device 510 of the present embodiment includes the same components as the parking assistance device 500 of the fourth embodiment described above, the same components are denoted by the same reference numerals and duplicate description is omitted.
The portable communication device 410 is realized by, for example, a mobile phone, a smartphone, or a tablet terminal device. The portable communication device 410 comprises the image compositing unit 11, the parking space determination unit 12, a portable-side communication unit 411, and a portable-side control unit 412.
The portable-side communication unit 411 communicates with the information providing device 300. When doing so, the portable-side communication unit 411 may be configured to communicate with the information providing device 300 via a communication network such as the Internet. Alternatively, it may be configured to communicate with the information providing device 300 wirelessly, for example by Bluetooth (registered trademark), a short-range wireless communication standard, or by a wireless LAN. The portable-side communication unit 411 may also be configured to communicate with the information providing device 300 by wire, such as a USB (Universal Serial Bus) cable or a LAN cable.
The portable-side control unit 412 is constituted by, for example, a CPU and a memory such as a writable RAM. The memory stores a control program. By executing the control program stored in the memory, the CPU comprehensively controls the image compositing unit 11, the parking space determination unit 12, and the portable-side communication unit 411.
As described above, in the present embodiment, the image compositing unit 11, the parking space determination unit 12, and the portable-side control unit 412 are arranged in the portable communication device 410. Even with this arrangement, the same effects as those of the first to third embodiments can be obtained.
Within the scope of the invention, the embodiments of the present invention can be freely combined, and any component of each embodiment can be modified or omitted as appropriate.
Although the present invention has been described in detail, the above description is in all aspects illustrative, and the present invention is not limited thereto. It is understood that innumerable modifications not illustrated can be devised without departing from the scope of the present invention.
1, 10, 10A parking assistance display control device; 2 peripheral photographing device; 3 peripheral detection device; 4 display device; 5 in-vehicle LAN; 6 operation input device; 7 travel drive device; 11 image compositing unit; 12 parking space determination unit; 13 control unit; 14 image output unit; 15 in-vehicle LAN interface; 16 automatic parking control unit; 21 left camera; 22 right camera; 100, 200, 201, 500, 510 parking assistance device; 300 information providing device; 310 information providing device main body; 311 on-vehicle control unit; 312 on-vehicle communication unit; 400 server device; 401 server-side communication unit; 402 server-side control unit; 410 portable communication device; 411 portable-side communication unit; 412 portable-side control unit.

Claims (19)

  1.  A parking assistance display control device comprising:
     a parking space determination unit that determines, based on a situation around a vehicle detected by a periphery detection device, whether there is an available parking space in which the vehicle can park around the vehicle;
     an image compositing unit that acquires image information representing vehicle side images of a side of the vehicle sequentially captured along a traveling direction of the vehicle by a peripheral photographing device, and generates, based on the acquired image information, a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle;
     a control unit that, when the parking space determination unit determines that the available parking space exists, generates a parking space image by superimposing a parking space display object indicating a position of the available parking space on the composite image generated by the image compositing unit; and
     an image output unit that outputs the parking space image generated by the control unit to a display device,
     wherein at least a part of the parking space image is a stereoscopic image that is stereoscopically viewable.
  2.  The parking assistance display control device according to claim 1, wherein the parking space image includes a parked vehicle area in which a vehicle other than the vehicle is parked, and the parked vehicle area is represented by the stereoscopic image.
  3.  The parking assistance display control device according to claim 1, wherein the available parking space is represented by the stereoscopic image.
  4.  The parking assistance display control device according to claim 1, wherein the parking space display object is represented by the stereoscopic image.
  5.  The parking assistance display control device according to claim 1, wherein the parking space image is converted into and displayed as an image viewed from a viewpoint different from the viewpoint from which the vehicle side images were captured by the peripheral photographing device.
  6.  The parking assistance display control device according to claim 1, wherein a predetermined designated area of the parking space image is converted into and displayed as an image viewed from a viewpoint different from the viewpoint from which the vehicle side images were captured by the peripheral photographing device.
  7.  The parking assistance display control device according to claim 1, wherein the parking space image includes a parked vehicle area in which a vehicle other than the vehicle is parked, and the parked vehicle area is represented by a planar image that is viewed two-dimensionally.
  8.  The parking assistance display control device according to claim 1, wherein the parking space image includes a parked vehicle area in which a vehicle other than the vehicle is parked, and the parked vehicle area is displayed with emphasis relative to other areas.
  9.  The parking assistance display control device according to claim 1, wherein the control unit generates the parking space image by further superimposing, on the composite image, a vehicle position display object indicating the position of the vehicle.
  10.  The parking assistance display control device according to claim 1, wherein the image compositing unit generates, as the composite image, a right composite image in which right side images on the right side in the traveling direction of the vehicle among the vehicle side images are joined together along the traveling direction of the vehicle, and a left composite image in which left side images on the left side in the traveling direction of the vehicle are joined together along the traveling direction of the vehicle, and
     the control unit generates the parking space image by superimposing the parking space display object on each of the right composite image and the left composite image.
  11.  The parking assistance display control device according to claim 10, wherein the control unit generates the parking space image so that the left composite image is displayed in a predetermined first display area and the right composite image is displayed in a predetermined second display area of a display screen of the display device.
  12.  The parking assistance display control device according to claim 10, wherein the control unit generates the parking space image so that the right composite image and the left composite image are displayed rotated by a predetermined rotation angle in mutually opposite directions from the traveling direction of the vehicle, with the position of the vehicle as a fulcrum.
  13.  The parking assistance display control device according to claim 10, wherein the control unit generates the parking space image so as to change display positions of the right composite image and the left composite image on a display screen of the display device in accordance with a positional relationship between the vehicle and vehicles or parking sections present on the right and left sides in the traveling direction of the vehicle.
  14.  The parking assistance display control device according to claim 10, wherein the control unit generates the parking space image so that a rear image of the rear side in the traveling direction of the vehicle captured by the peripheral photographing device is inserted between the right composite image and the left composite image, joining the right composite image, the rear image, and the left composite image together.
  15.  The parking assistance display control device according to claim 1, wherein the control unit generates the parking space image so that a recommended available parking space and the remaining area are displayed in different display forms.
  16.  The parking assistance display control device according to claim 15, wherein the parking space image includes, in the recommended available parking space, a display object represented by the stereoscopic image.
  17.  The parking assistance display control device according to claim 10, wherein the control unit generates the parking space image so that, of the right composite image and the left composite image, the composite image on the side where a recommended available parking space exists and the remaining composite image are displayed in different display forms.
  18.  The parking assistance display control device according to claim 17, wherein the composite image on the side where the recommended available parking space exists is represented by the stereoscopic image.
  19.  A parking assistance display control method comprising:
     acquiring image information representing vehicle side images of a side of a vehicle sequentially captured along a traveling direction of the vehicle by a peripheral photographing device;
     generating, based on the acquired image information, a composite image in which the vehicle side images are joined together along the traveling direction of the vehicle;
     determining, based on a situation around the vehicle detected by a periphery detection device, whether there is an available parking space in which the vehicle can park around the vehicle;
     when it is determined that the available parking space exists, generating a parking space image, at least a part of which is a stereoscopically viewable stereoscopic image, by superimposing a parking space display object indicating the position of the available parking space on the composite image; and
     outputting the generated parking space image to a display device.
PCT/JP2016/062155 2016-04-15 2016-04-15 Display control device for parking assistance, and display control method for parking assistance WO2017179205A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018511865A JP6611918B2 (en) 2016-04-15 2016-04-15 Parking assistance display control apparatus and parking assistance display control method
PCT/JP2016/062155 WO2017179205A1 (en) 2016-04-15 2016-04-15 Display control device for parking assistance, and display control method for parking assistance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/062155 WO2017179205A1 (en) 2016-04-15 2016-04-15 Display control device for parking assistance, and display control method for parking assistance

Publications (1)

Publication Number Publication Date
WO2017179205A1 true WO2017179205A1 (en) 2017-10-19

Family

ID=60042392

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/062155 WO2017179205A1 (en) 2016-04-15 2016-04-15 Display control device for parking assistance, and display control method for parking assistance

Country Status (2)

Country Link
JP (1) JP6611918B2 (en)
WO (1) WO2017179205A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002274304A (en) * 2001-03-21 2002-09-25 Nissan Motor Co Ltd Parking position setting device
JP2009083735A (en) * 2007-10-01 2009-04-23 Nissan Motor Co Ltd Parking assistant device and method
JP2012175483A (en) * 2011-02-23 2012-09-10 Renesas Electronics Corp Device and method for traffic lane recognition
JP2013154730A (en) * 2012-01-30 2013-08-15 Fujitsu Ten Ltd Apparatus and method for processing image, and parking support system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012158227A (en) * 2011-01-31 2012-08-23 Fujitsu Ten Ltd Image processor, parking control system, and image processing method
JP2016016681A (en) * 2014-07-04 2016-02-01 クラリオン株式会社 Parking frame recognition device


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200025421A (en) * 2018-08-30 2020-03-10 이준서 Augmented Reality Based Parking Guidance System in Indoor Parking Lot
KR102197704B1 (en) * 2018-08-30 2021-01-07 주식회사 큐엘씨 Augmented Reality Based Parking Guidance System in Indoor Parking Lot
CN111507269A (en) * 2020-04-17 2020-08-07 浙江大华技术股份有限公司 Parking space state identification method and device, storage medium and electronic device
CN111507269B (en) * 2020-04-17 2023-05-09 浙江大华技术股份有限公司 Parking space state identification method and device, storage medium and electronic device
CN113823112A (en) * 2021-07-31 2021-12-21 浙江慧享信息科技有限公司 Park parking space reservation auxiliary system and auxiliary method based on 3D projection
CN113823112B (en) * 2021-07-31 2023-01-03 浙江慧享信息科技有限公司 Park parking space reservation auxiliary system and auxiliary method based on 3D projection

Also Published As

Publication number Publication date
JP6611918B2 (en) 2019-11-27
JPWO2017179205A1 (en) 2018-09-27

Similar Documents

Publication Publication Date Title
CA3069114C (en) Parking assistance method and parking assistance device
US7554461B2 (en) Recording medium, parking support apparatus and parking support screen
JP6304628B2 (en) Display device and display method
US9862316B2 (en) Method and device for visualizing the surroundings of a vehicle
JP4476719B2 (en) Navigation system
CN106043306A (en) Vehicle control device
JP5726201B2 (en) Three-dimensional stereoscopic display device, three-dimensional stereoscopic display control device, and LSI circuit
JP6281289B2 (en) Perimeter monitoring apparatus and program
JP6760122B2 (en) Peripheral monitoring device
JP2016090344A (en) Navigation device and navigation program
JP5914114B2 (en) Parking assistance device and parking assistance method
JP6611918B2 (en) Parking assistance display control apparatus and parking assistance display control method
WO2019008762A1 (en) Parking assistance method and parking assistance device
JP2017069739A (en) Periphery monitoring device
JP2019054420A (en) Image processing system
CN110997409B (en) Peripheral monitoring device
WO2017179206A1 (en) Display control device for parking assistance, and display control method for parking assistance
JP6720729B2 (en) Display controller
JP7009785B2 (en) Peripheral monitoring device
JP2012066615A (en) Driving support system
JP5212422B2 (en) Driving assistance device
JP2017065572A (en) Surrounding state monitoring device
WO2021171397A1 (en) Display control device, display device, and display control method
JP2017218008A (en) Driving support device
JP2022048454A (en) Vehicle display device

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018511865

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16898665

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16898665

Country of ref document: EP

Kind code of ref document: A1