WO2010137684A1 - Image generation device and image display system - Google Patents
Image generation device and image display system
- Publication number
- WO2010137684A1 (PCT/JP2010/059074)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- mode
- display
- image generation
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/28—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/14—Display of multiple viewports
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Definitions
- the present invention relates to a technique for generating an image to be displayed on a display device mounted on a vehicle.
- a related-art image display system is mounted on a vehicle such as an automobile, captures the surroundings of the vehicle, generates an image, and displays the image on a display in the passenger compartment.
- the outer region of the front fender on the opposite side of the driver's seat tends to be a blind spot from the driver's seat.
- with an image display system that displays an image of the outside area of the front fender, the driver can easily check the clearance between the vehicle body on the side opposite the driver's seat and an obstacle when passing an oncoming vehicle on a narrow road.
- in Patent Document 1, an image showing a wider area around the vehicle is displayed in the vehicle interior, rather than a limited area such as the outer area of the front fender.
- in Patent Document 1, three images obtained by in-vehicle cameras installed on the front and on the left and right sides of the vehicle are arranged and displayed on one screen.
- in Patent Document 2, a technique has been proposed for providing an image showing the periphery of the vehicle as seen from a virtual viewpoint set directly above the vehicle, using a plurality of captured images obtained by capturing the periphery of the vehicle with a plurality of in-vehicle cameras. Patent Document 2 also proposes a technique for moving the position of the vehicle in the image in accordance with the direction indicated by the direction indicator.
- in these techniques, the viewing direction of the displayed image is either the direction of the optical axis of the in-vehicle camera or the direction looking down from directly above the vehicle. A driver viewing such an image therefore grasps the positional relationship between an actual object and the vehicle through a mental process of coordinate transformation based on the position of the object in the image, and it may be difficult to determine that positional relationship instantaneously.
- in Patent Document 1, when both the front and the left and right sides of the vehicle are shown on one screen, the viewing direction differs between the image showing the front and the images showing the sides; the driver may be confused about the direction in which an object is present, making it difficult to instantaneously determine the positional relationship between the actual object and the vehicle.
- the present invention has been made in view of the above problems, and an object of the present invention is to provide a technique that allows the driver to intuitively grasp the positional relationship between the vehicle and an object.
- An image generation device that generates an image to be displayed on a display device mounted on a vehicle, comprising: a composite image generation unit that, based on a plurality of images of the vehicle's surroundings taken by a plurality of cameras, generates a composite image of the surroundings viewed from a virtual viewpoint positioned behind the vehicle and directed toward the front of the vehicle; an output unit that outputs the generated composite image to the display device; and an input unit that inputs a direction instruction from the driver of the vehicle. When no direction instruction is input by the input unit, the composite image generation unit generates a first composite image in which the left side region and the right side region of the vehicle are displayed at substantially the same size; when a direction instruction is input, it generates a second composite image in which the side region indicated by the direction instruction is displayed larger than the side region on the other side.
- when the input unit stops inputting the direction instruction, the composite image generation unit continues to generate the second composite image for a predetermined period; when no direction instruction is input during the predetermined period, it starts generating the first composite image after the predetermined period.
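The hold behavior above can be sketched as a small state holder. This is an illustrative sketch, not the patent's implementation; the class name, the 3-second hold value, and the injectable clock are assumptions made here for testability.

```python
import time


class TurnSignalHold:
    """Keep the enlarged (second) composite image for a grace period after
    the turn signal turns off. The 3-second default is an illustrative
    value; the patent only specifies 'a predetermined period'."""

    def __init__(self, hold_seconds=3.0, clock=time.monotonic):
        self.hold = hold_seconds
        self.clock = clock          # injectable for testing
        self._off_at = None         # time the signal was released
        self._active = False        # currently showing the second image?

    def update(self, signal_on):
        """Return which composite image to generate: 'first' or 'second'."""
        now = self.clock()
        if signal_on:
            self._active, self._off_at = True, None
        elif self._active and self._off_at is None:
            self._off_at = now      # signal just released: start the timer
        if self._off_at is not None and now - self._off_at >= self.hold:
            self._active = False    # grace period expired
        return "second" if self._active else "first"
```

Signaling again during the grace period simply resets the timer, so a driver who re-engages the indicator never sees the view flip back and forth.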
- a display image generation unit generates a display image including a front image captured by a camera provided at the front of the vehicle and the composite image generated by the composite image generation unit, and the output unit outputs the generated display image to the display device.
- when the virtual viewpoint of the composite image is changed from a first position to a second position, the composite image generation unit moves the virtual viewpoint stepwise from the first position to the second position and generates a plurality of composite images for creating an animation in which the virtual viewpoint moves continuously.
- An image generation device that generates an image to be displayed on a display device mounted on a vehicle, comprising: a composite image generation unit that, based on a plurality of images of the vehicle's surroundings taken by a plurality of cameras, generates a composite image of the surroundings, including the right side region and the left side region of the vehicle, viewed from a virtual viewpoint positioned behind the vehicle and directed toward the front of the vehicle; a display image generation unit that generates a display image including a front image captured by a camera provided at the front of the vehicle and the composite image generated by the composite image generation unit; and an output unit that outputs the generated display image to the display device.
- An image generation device that generates an image to be displayed on a display device mounted on a vehicle, comprising: a composite image generation unit that generates a composite image viewed from a virtual viewpoint based on a plurality of images of the vehicle's surroundings taken by a plurality of cameras; and an output unit that outputs the generated composite image to the display device. When changing the virtual viewpoint of the composite image from a first position to a second position, the composite image generation unit moves the virtual viewpoint stepwise from the first position to the second position and generates a plurality of composite images for creating an animation in which the virtual viewpoint moves continuously.
- the composite image generation unit changes the virtual viewpoint from a reference position.
- the reference position is a position corresponding to the viewpoint of the driver of the vehicle.
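The stepwise movement of the virtual viewpoint described above can be sketched as an interpolation between two viewpoint positions, one composite image being rendered per step. Linear interpolation and the function name are assumptions; the patent only requires that the viewpoint move stepwise so the sequence reads as an animation.

```python
def viewpoint_steps(p1, p2, n_steps):
    """Return the intermediate (x, y, z) viewpoint positions between
    p1 (first position) and p2 (second position).

    One composite image is generated at each returned position, so the
    displayed sequence appears as a continuously moving virtual viewpoint.
    Linear interpolation is an illustrative choice."""
    steps = []
    for k in range(1, n_steps + 1):
        t = k / n_steps  # fraction of the way from p1 to p2
        steps.append(tuple(a + (b - a) * t for a, b in zip(p1, p2)))
    return steps
```

Starting the interpolation from the reference position (the driver's viewpoint) is what lets the driver track where the new viewpoint ends up relative to their own.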
- this prevents the viewpoint of the composite image displayed on the display device from being switched frequently and becoming difficult to view.
- the area to which the driver pays attention when driving can be confirmed on the same screen without switching screens.
- since the direction of the field of view of the front image, the direction of the field of view of the composite image, and the direction of the driver's own field of view are approximately the same, the driver can intuitively grasp the positional relationship between objects displayed on the screen without complicated judgments such as coordinate conversion in the head. For this reason, even if a large amount of information is provided, the driver can make accurate determinations and safety can be ensured.
- an animation representation of the composite image is performed on the display device such that the virtual viewpoint moves continuously from the first position to the second position. This makes it easier for the driver to intuitively grasp the position of the virtual viewpoint of the composite image after the change than when the viewpoint is switched instantaneously from the first position to the second position.
- the front area of the vehicle and the left and right side areas of the vehicle are displayed on the same screen of the display device, so the area to which the driver wants to pay attention when driving can be checked on the same screen without switching screens.
- since the direction of the field of view of the front image, the direction of the field of view of the composite image, and the direction of the driver's own field of view are approximately the same, the driver can intuitively grasp the positional relationship between objects displayed on the screen without complicated judgments such as coordinate conversion in the head. For this reason, even if a large amount of information is provided, the driver can make accurate determinations and safety can be ensured.
- an animation representation of a composite image is performed such that the virtual viewpoint moves from the first position to the second position on the display device. This makes it easier for the driver to intuitively grasp the position of the virtual viewpoint of the composite image after the virtual viewpoint is changed than when the virtual viewpoint is instantaneously switched from the first position to the second position.
- the position of the virtual viewpoint of the composite image after the change is easy to grasp intuitively in relation to the reference position.
- since the reference position corresponds to the driver's viewpoint, the position of the virtual viewpoint of the composite image after the change is even easier to grasp intuitively.
- FIG. 1 is a block diagram of an image display system.
- FIG. 2 is a diagram illustrating a position where the in-vehicle camera is arranged in the vehicle.
- FIG. 3 is a diagram illustrating a method for generating a composite image viewed from a virtual viewpoint.
- FIG. 4 is a diagram illustrating transition of operation modes of the image display system.
- FIG. 5 is a diagram illustrating display mode transition in the front mode.
- FIG. 6 is a diagram illustrating an example of a display image in the two-image mode.
- FIG. 7 is a diagram for explaining the field-of-view range shown in the two-image mode.
- FIG. 8 is a diagram illustrating a screen display example in the two-image mode.
- FIG. 9 is a diagram illustrating the transition of the viewpoint position of the virtual viewpoint.
- FIG. 10 is a diagram illustrating an example of a display image in the two-image mode.
- FIG. 11 is a diagram illustrating an example of a display image in the two-image mode.
- FIG. 12 is a diagram illustrating a flow of processing for changing the viewpoint position of the virtual viewpoint.
- FIG. 13 is a diagram illustrating a screen display example in the single image mode.
- FIG. 14 is a diagram illustrating how the virtual viewpoint moves.
- FIG. 15 is a diagram illustrating a screen display example in the side camera mode.
- FIG. 16 is a diagram illustrating display mode transition in the back mode.
- FIG. 17 is a diagram illustrating the horizontal angle of the visual field range in the standard mode.
- FIG. 18 is a diagram illustrating the horizontal angle of the viewing range in the wide mode.
- FIG. 19 is a diagram illustrating an example of a display image in the combined standard mode.
- FIG. 20 is a diagram illustrating an example of a display image in the composite overhead mode.
- FIG. 21 is a diagram illustrating the viewpoint position of the virtual viewpoint in the composite overhead view mode.
- FIG. 22 is a diagram showing a virtual viewpoint setting screen.
- FIG. 23 is a diagram showing a virtual viewpoint setting screen.
- FIG. 24 is a diagram illustrating an example of a display image in the combined standard mode.
- FIG. 25 is a diagram showing a flow of processing for storing mode information.
- FIG. 26 is a diagram showing a flow of processing at the start time of the back mode.
- FIG. 27 is a diagram showing a flow of processing for storing mode information.
- FIG. 28 is a diagram showing the flow of processing at the start of the back mode.
- FIG. 1 is a block diagram of an image display system 100 according to the present embodiment.
- This image display system 100 is mounted on a vehicle (in this embodiment, an automobile), and has a function of photographing the periphery of the vehicle, generating an image, and displaying the image in the passenger compartment.
- a driver of a vehicle serving as a user of the image display system 100 can easily grasp the periphery of the vehicle by using the image display system 100.
- the image display system 100 mainly includes an imaging unit 5 that captures the periphery of the vehicle, an image generation device 10 that generates a display image showing the periphery of the vehicle, and a navigation device 20 that provides various information to the driver of the vehicle.
- the image generation device 10 is configured as an ECU (Electronic Control Unit) having an image generation function, and is disposed at a predetermined position of the vehicle.
- Navigation device 20 provides navigation guidance to the driver.
- the navigation device 20 includes a display 21 such as a liquid crystal having a touch panel function, an operation unit 22 that is operated by a driver, and a control unit 23 that controls the entire device.
- the navigation device 20 is installed on an instrument panel or the like of the vehicle so that the screen of the display 21 is visible from the driver.
- Various instructions from the driver are received by the operation unit 22 and the display 21 as a touch panel.
- the control unit 23 is configured as a computer including a CPU, RAM, ROM, and the like, and various functions including the navigation function are implemented by the CPU performing arithmetic processing according to a predetermined program.
- the navigation device 20 is communicably connected to the image generation device 10, and can transmit and receive various control signals to and from the image generation device 10 and receive display images generated by the image generation device 10.
- a map image for navigation guidance is usually displayed on the display 21, but in response to a predetermined driver operation or a signal from the image generation device 10, a display image generated by the image generation device 10 showing the situation around the vehicle is displayed.
- the navigation device 20 also functions as a display device that receives and displays the display image generated by the image generation device 10.
- the photographing unit 5 is electrically connected to the image generation device 10 and operates based on a signal from the image generation device 10.
- the photographing unit 5 includes a front camera 51, a side camera 52, and a back camera 53 that are in-vehicle cameras. These in-vehicle cameras 51, 52, and 53 include a lens and an image sensor and electronically acquire an image.
- FIG. 2 is a diagram showing positions where the in-vehicle cameras 51, 52, 53 are arranged on the vehicle 9.
- the front camera 51 is provided in the vicinity of the license plate mounting position at the front end of the vehicle 9, and its optical axis 51a is directed in the straight traveling direction of the vehicle 9.
- the side cameras 52 are provided on the left and right door mirrors 93, respectively, and their optical axes 52a are directed outward of the vehicle 9 so as to be orthogonal to the straight traveling direction.
- the back camera 53 is provided in the vicinity of the license plate mounting position at the rear end of the vehicle 9, and its optical axis 53a is directed in the direction opposite to the straight traveling direction of the vehicle 9.
- the front camera 51 and the back camera 53 are desirably attached at substantially the lateral center of the vehicle, but may be shifted slightly to the left or right of center.
- a fish-eye lens or the like is employed for these in-vehicle cameras 51, 52, and 53, which each have an angle of view of 180 degrees or more. For this reason, the entire periphery of the vehicle 9 can be photographed using the four in-vehicle cameras (the front camera 51, the two side cameras 52, and the back camera 53).
- the image generation device 10 includes a control unit 1 that controls the entire device, an image processing unit 3 that processes captured images acquired by the imaging unit 5 and generates display images, and a communication unit 42 that communicates with the navigation device 20. Various instructions from the driver received by the operation unit 22 or the display 21 of the navigation device 20 are received by the communication unit 42 as control signals and input to the control unit 1. Thus, the image generation device 10 can also operate in response to driver operations on the navigation device 20.
- the image processing unit 3 is configured as a hardware circuit capable of various types of image processing, and includes a captured image adjustment unit 31, a composite image generation unit 32, and a display image generation unit 33 as main functions.
- the captured image adjustment unit 31 adjusts the captured image acquired by the imaging unit 5 for display.
- the photographed image adjustment unit 31 adjusts image quality, such as the brightness and contrast of the photographed image, and corrects image distortion so that the image looks natural when displayed.
- the composite image generation unit 32 generates a composite image viewed from an arbitrary virtual viewpoint around the vehicle 9 based on the plurality of captured images acquired by the plurality of in-vehicle cameras 51, 52, and 53 of the imaging unit 5. A method by which the composite image generation unit 32 generates a composite image viewed from a virtual viewpoint will be described later.
- the display image generation unit 33 combines one or more of the captured images adjusted by the captured image adjustment unit 31 and the composite image generated by the composite image generation unit 32 to generate a display image to be provided to the driver.
- the generated display image is output to the navigation device 20 by the communication unit 42 and displayed on the display 21 of the navigation device 20.
- the control unit 1 is configured as a computer including a CPU, a RAM, a ROM, and the like, and various control functions are realized by the CPU performing arithmetic processing according to a predetermined program.
- the functions of the control unit 1 realized in this way include a function of controlling image processing executed by the image processing unit 3, that is, a function of controlling the content of the display image.
- Various parameters necessary for generating the composite image generated by the composite image generation unit 32 are instructed by the function of the control unit 1.
- the control unit 1 includes a nonvolatile memory 11 constituted by a flash memory and a timer 12 having a time measuring function.
- the image generation device 10 includes a signal input unit 41 for inputting signals from various devices provided in the vehicle 9.
- a signal from the outside of the image generation apparatus 10 is input to the control unit 1 via the signal input unit 41.
- signals indicating various types of information are input to the control unit 1 from the shift sensor 81, the vehicle speed sensor 82, the direction indicator 83, the steering sensor 84, the changeover switch 85, and the like.
- a part or all of the shift sensor 81, the vehicle speed sensor 82, the direction indicator 83, the steering sensor 84, and the changeover switch 85 may be included in the image display system 100.
- from the shift sensor 81, the operating position of the shift lever of the transmission of the vehicle 9, i.e., the shift position such as "P (parking)", "D (forward)", "N (neutral)", or "R (reverse)", is input.
- from the vehicle speed sensor 82, the traveling speed (km/h) of the vehicle 9 at that time is input.
- from the direction indicator 83, a turn signal indicating a direction instruction based on the operation of the turn signal switch, i.e., the direction instruction intended by the driver of the vehicle 9, is input. When the turn signal switch is operated, a turn signal indicating the operated direction (left or right) is generated; when the turn signal switch is in the neutral position, the turn signal is turned off.
- from the steering sensor 84, the direction and angle of rotation of the steering wheel by the driver are input.
- the changeover switch 85 is a switch that receives an instruction from the driver to change the mode of the display image. A signal indicating a driver instruction is input to the control unit 1 from the changeover switch 85.
- FIG. 3 is a diagram for explaining a method of generating a composite image viewed from an arbitrary virtual viewpoint.
- when the front camera 51, the side cameras 52, and the back camera 53 of the photographing unit 5 photograph simultaneously, four captured images P1 to P4 showing the front, left side, right side, and rear of the vehicle 9, respectively, are acquired.
- the obtained four captured images P1 to P4 are multiplexed and then projected onto a virtual three-dimensional solid curved surface SP.
- the three-dimensional curved surface SP has, for example, a substantially hemispherical shape (a bowl shape), and a central portion (a bottom portion of the bowl) is determined as the position of the vehicle 9.
- the correspondence between the positions of the pixels included in the captured images P1 to P4 and the positions of the pixels of the solid curved surface SP is determined in advance. Therefore, the value of each pixel of the three-dimensional curved surface SP can be determined based on this correspondence and the value of each pixel included in the captured images P1 to P4.
- the correspondence between the positions of the pixels of the captured images P1 to P4 and the positions of the pixels of the three-dimensional curved surface SP is stored as table data in the nonvolatile memory 11 of the control unit 1 or the like.
- the virtual viewpoint VP for the three-dimensional curved surface SP is set by the control unit 1 at an arbitrary viewpoint position around the vehicle in an arbitrary visual field direction.
- the virtual viewpoint VP is defined by the viewpoint position and the viewing direction. Then, according to the set virtual viewpoint VP, a necessary area on the three-dimensional curved surface SP is cut out as an image, so that a composite image viewed from an arbitrary virtual viewpoint is generated.
- when a virtual viewpoint VP1 is set with the viewpoint position directly above the vehicle 9 and the viewing direction looking straight down, a composite image CP1 looking down on the vehicle 9 from directly above is generated.
- when a virtual viewpoint VP2 is set with the viewpoint position at the left rear of the vehicle 9 and the viewing direction roughly toward the front of the vehicle 9, a composite image CP2 overlooking the vehicle 9 and its surroundings from the left rear is generated.
- the relationship between the virtual viewpoint VP and the necessary area on the three-dimensional curved surface SP is determined in advance, and is stored in the nonvolatile memory 11 of the control unit 1 as table data.
- the image of the vehicle 9 shown in the composite image is prepared in advance as data such as a bitmap and stored in the nonvolatile memory 11 or the like.
- data of the image of the vehicle 9 having a shape corresponding to the viewpoint position and the visual field direction of the virtual viewpoint VP of the composite image is read and superimposed on the composite image.
- the same reference numeral 9 is used for both the actual vehicle and the vehicle image shown in the image.
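The two precomputed correspondences described above (captured-image pixels to curved-surface pixels, and virtual viewpoint to the required surface region) make composite generation a pair of table lookups. The sketch below illustrates that two-stage lookup only; the data layout, function name, and flat indexing of the surface are assumptions, and the vehicle-image overlay step is omitted.

```python
import numpy as np


def build_composite(frames, surface_lut, view_lut):
    """Two-stage lookup sketch of the FIG. 3 method (illustrative only).

    frames      -- dict of camera name -> HxWx3 captured image (P1..P4)
    surface_lut -- per-surface-pixel (camera, y, x) tuples, playing the
                   role of the table data held in the nonvolatile memory 11
    view_lut    -- per-output-pixel flat index into the curved surface,
                   i.e. the region needed for the current virtual viewpoint VP
    """
    # Stage 1: paint the bowl-shaped surface SP from the captured images.
    surface = np.zeros((len(surface_lut), 3), dtype=np.uint8)
    for i, (cam, y, x) in enumerate(surface_lut):
        surface[i] = frames[cam][y, x]

    # Stage 2: cut out the area of SP required for the virtual viewpoint VP.
    h, w = view_lut.shape
    return surface[view_lut.ravel()].reshape(h, w, 3)
```

Because both tables are fixed ahead of time, changing the virtual viewpoint only swaps `view_lut`; the expensive camera-to-surface mapping never has to be recomputed per frame.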
- FIG. 4 is a diagram illustrating transition of operation modes of the image display system 100.
- the image display system 100 has three operation modes: a navigation mode M0, a front mode M1, and a back mode M2. These operation modes are switched by the control of the control unit 1 in accordance with the driver's operation and the vehicle running state.
- the navigation mode M0 is an operation mode in which a map image for navigation guidance is displayed on the display 21 by the function of the navigation device 20.
- the functions of the photographing unit 5 and the image generation device 10 are not used, and various displays are performed using the functions of the navigation device 20 alone. For this reason, when the navigation device 20 has a function of receiving and displaying television broadcast radio waves, a television broadcast screen may be displayed instead of the map image for navigation guidance.
- the front mode M1 and the back mode M2 are operation modes that use the functions of the photographing unit 5 and the image generation device 10 to display on the display 21 a display image showing the situation around the vehicle 9 in real time.
- the front mode M1 is an operation mode for enabling a front monitor function for displaying a display image mainly showing a front area and a side area of the vehicle 9 that are required when the vehicle moves forward.
- the back mode M2 is an operation mode for enabling a back monitor function for displaying a display image mainly showing a rear area of the vehicle 9 that is required when the vehicle is going backward.
- in the navigation mode M0, when the traveling speed input from the vehicle speed sensor 82 falls below, for example, 10 km/h, the mode is switched to the front mode M1.
- in the front mode M1, when the traveling speed becomes, for example, 10 km/h or higher, the mode is switched to the navigation mode M0.
- when the traveling speed is relatively high, the front mode M1 is canceled so that the driver can concentrate on driving.
- conversely, when the traveling speed of the vehicle 9 is relatively low, there are many scenes in which the driver must drive while paying attention to the situation around the vehicle 9, such as approaching an intersection with poor visibility, changing direction, or pulling over to the side. For this reason, when the traveling speed is relatively low, the mode is switched to the front mode M1, which mainly shows the front and side areas of the vehicle 9.
- a condition that there is an explicit operation instruction from the driver may be added to the condition that the traveling speed is less than 10 km / h.
- when the transmission of the vehicle 9 is operated to the "R (reverse)" position, the vehicle 9 is in a reversing state, and the mode is therefore switched to the back mode M2, which mainly shows the rear region of the vehicle 9.
- when the position of the shift lever becomes other than "R (reverse)" in the back mode M2, the mode is switched to the navigation mode M0 or the front mode M1 based on the traveling speed at that time. That is, if the traveling speed is 10 km/h or higher, the mode is switched to the navigation mode M0, and if it is less than 10 km/h, to the front mode M1.
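The mode-switching rules above can be condensed into a short sketch, using the stated 10 km/h threshold and the shift position "R"; this is a simplification of the control unit 1's behavior, and the function and mode identifiers are shorthand:

```python
# Simplified sketch of the operation-mode transitions of FIG. 4:
# M0 = navigation mode, M1 = front mode, M2 = back mode.

def next_mode(current, speed_kmh, shift_position):
    if shift_position == "R":
        return "M2"                      # back mode whenever reversing
    if current == "M2":                  # leaving reverse: choose by speed
        return "M0" if speed_kmh >= 10 else "M1"
    if current == "M0" and speed_kmh < 10:
        return "M1"                      # slow -> front mode
    if current == "M1" and speed_kmh >= 10:
        return "M0"                      # fast -> navigation mode
    return current
```

(The optional condition that the driver must also give an explicit instruction before entering M1 is omitted here.)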
- FIG. 5 is a diagram showing display mode transition in the front mode M1.
- in the front mode M1, there are three display modes, a two-image mode M11, a one-image mode M12, and a side camera mode M13, each with a different display format.
- These display modes are switched under the control of the control unit 1 in the order of the two-image mode M11, the one-image mode M12, and the side camera mode M13 each time the driver presses the changeover switch 85.
- the changeover switch 85 is pressed in the side camera mode M13, the mode returns to the two-image mode M11 again.
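The cyclic switching of the three front-mode display modes on each press of the changeover switch 85 amounts to a simple rotation, sketched here (the mode identifiers follow the text; the function name is an assumption):

```python
# Cyclic display-mode switching in the front mode M1:
# two-image M11 -> one-image M12 -> side camera M13 -> back to M11.

FRONT_MODES = ["M11", "M12", "M13"]

def on_switch_pressed(current):
    i = FRONT_MODES.index(current)
    return FRONT_MODES[(i + 1) % len(FRONT_MODES)]
```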
- the two-image mode M11 is a display mode in which a display image including the front image FP1 obtained by photographing with the front camera 51 and the composite image FP2 viewed from the virtual viewpoint VP is displayed on the display 21. That is, in the two-image mode M11, two images of the front image FP1 and the composite image FP2 are shown on the same screen.
- the one image mode M12 is a display mode in which a display image including only the composite image FP3 viewed from the virtual viewpoint VP is displayed on the display 21.
- the side camera mode M13 is a display mode in which a display image including only the side image FP4 obtained by photographing with the side camera 52 is displayed on the display 21.
- FIG. 6 is a diagram illustrating an example of a display image displayed on the display 21 in the two-image mode M11. As shown in the figure, in the display image in the two-image mode M11, the front image FP1 is arranged at the top and the composite image FP2 is arranged at the bottom.
- the front image FP1 is not a composite image from the virtual viewpoint VP, but a display image obtained by adjusting a captured image obtained by capturing with the front camera 51 for display by the captured image adjusting unit 31.
- the composite image FP2 is a composite image including a side region of the vehicle 9 viewed from the virtual viewpoint VP directed forward from the rear position of the vehicle 9.
- FIG. 7 is a diagram for explaining the visual field range indicated by the two-image mode M11 around the vehicle 9.
- a range FV1 indicated by a broken line is a visual field range indicated by a front image FP1
- a range FV2 indicated by a two-dot chain line is a visual field range indicated by a composite image FP2.
- the visual field range FV1 and the visual field range FV2 partially overlap in the region A2 that is a boundary between them.
- a region having a horizontal angle of 180° extending in the left-right direction in front of the vehicle 9 is set as the visual field range FV1. For this reason, by viewing the front image FP1, the driver can grasp objects existing to the left and right front of the vehicle 9, which easily become blind spots when entering an intersection with poor visibility.
- a range including both the left and right lateral regions from the front to the rear of the vehicle 9, as well as the rear region of the vehicle 9, is set as the visual field range FV2, so that the entire surroundings of the vehicle 9 can be grasped.
- in addition, when changing direction or pulling over, it is possible to easily confirm regions that tend to be blind spots from the driver's seat, for example, the region A1 near the outside of the front fender 94 that is not reflected in the door mirror 93.
- since the two images FP1 and FP2 covering the visual field ranges FV1 and FV2 can be viewed simultaneously without switching the screen (see also FIG. 6), the driver can grasp the situation around the vehicle 9 at a glance.
- the viewpoint position of the virtual viewpoint VP of the composite image FP2 is set to the rear position of the vehicle 9, and the visual field direction is set to the front direction of the vehicle 9.
- the state of the periphery of the vehicle 9 is shown together with the image of the vehicle 9 when viewed in the front direction from the rear position of the vehicle 9.
- since the direction of the field of view of the front image FP1, the direction of the field of view of the composite image FP2, and the direction of the driver's own field of view are approximately the same, the driver is not confused about the direction in which an object shown on the display exists.
- the driver since the driver does not go through a complicated thinking process such as coordinate conversion in the head, the driver can intuitively grasp the positional relationship between the object displayed on the display 21 and the vehicle 9.
- FIG. 8 is a diagram illustrating a screen display example of the display 21 in the two-image mode M11 when the object T moves as described above.
- the object T moves from the position TP1 outside the visual field range to the position TP2 within the visual field range FV2 of the composite image FP2, first, the object T is shown in the composite image FP2 below the screen (state ST1). At this time, the object T is not shown in the front image FP1 in the upper part of the screen. Subsequently, when the object T moves to a position TP3 within the overlapping area A2 of the visual field range FV1 and the visual field range FV2, the object T is shown in both the front image FP1 and the composite image FP2 (state ST2). Further, when the object T moves to a position TP4 within the visual field range FV1 of the front image FP1, the object is shown in the front image FP1 (state ST3).
- even when the object T moves around the vehicle 9 across the visual field ranges FV1 and FV2 of the two images FP1 and FP2, since the direction of the field of view of the front image FP1, that of the composite image FP2, and that of the driver's own field of view are approximately the same, the object T moves in approximately the same direction in both images FP1 and FP2, so the driver can intuitively grasp the movement of the object T.
- furthermore, since there is a region A2 where the visual field ranges FV1 and FV2 overlap, there are scenes in which the object T is shown simultaneously in both the front image FP1 and the composite image FP2, so the movement of the object T can be grasped continuously.
- the driver can intuitively grasp the situation around the vehicle 9 as described above. For this reason, the driver can make an accurate determination and can sufficiently ensure driving safety.
- FIG. 9 is a diagram illustrating the transition of the viewpoint position of the virtual viewpoint VP.
- normally, the viewpoint position of the virtual viewpoint VP is set to the position VPC at the approximate left-right center behind the vehicle 9, and the visual field direction is set to the forward direction of the vehicle 9. Thereby, as shown in FIG. 6, a composite image FP2 is generated in which the left side region of the vehicle 9 is displayed at substantially the same size as the right side region.
- when the direction indicator 83 gives a turn signal, the visual field direction of the virtual viewpoint VP remains set to the front direction of the vehicle 9, while the viewpoint position is moved to a position on the side indicated by the turn signal.
- for example, when the turn signal indicates the left direction, the viewpoint position of the virtual viewpoint VP is set to the position VPL on the left side of the vehicle 9.
- as a result, a composite image FP2 is generated in which the left side area of the vehicle 9, the side indicated by the turn signal of the direction indicator 83, is displayed larger than the right side area, and this image is displayed on the display 21.
- the composite image FP2 shows the state of the periphery of the vehicle 9 in a state where the front direction is viewed from the rear position of the vehicle 9.
- the viewpoint position of the virtual viewpoint VP is set to the position VPR on the right side of the vehicle 9.
- similarly, a composite image FP2 in which the right side area of the vehicle 9, the side indicated by the turn signal of the direction indicator 83, is displayed larger than the left side area is generated and displayed on the display 21. Also in this case, the composite image FP2 shows the state of the periphery of the vehicle 9 as viewed in the forward direction from the rear position of the vehicle 9.
- in the direction indicated by the direction indicator 83, there is a high likelihood of an object that the vehicle 9 may contact when changing direction or pulling over. Therefore, by showing the side area in the indicated direction in this way, the driver's attention can be directed to objects that may come into contact, and contact between the vehicle 9 and such objects can be effectively prevented.
- the composite image FP2 shows the surroundings of the vehicle 9 together with the image of the vehicle 9 when viewed from the rear position of the vehicle 9 in the forward direction.
- the driver since the lateral region of the vehicle 9 is shown with the same visual field direction as the driver's own visual field direction, the driver is not confused about the direction in which the object shown in the image exists.
- the driver since the driver does not involve a complicated thinking process such as coordinate conversion in the head, the driver can intuitively grasp the positional relationship between the vehicle 9 and the object.
- the side region in the direction opposite to that indicated by the turn signal is also included in the composite image. For this reason, even if an object exists in the side area opposite to the indicated direction, it can be grasped. Thus, for example, even when the steering wheel is turned in the direction opposite to that indicated by the turn signal in order to avoid an object, contact with an object existing in that opposite direction can be prevented.
- when the turn signal is turned off, the viewpoint position of the virtual viewpoint VP of the composite image FP2 is returned to the position VPC at the left-right center.
- more precisely, even after the turn signal is turned off, the composite image generation unit 32 continues to generate the composite image FP2 from the shifted viewpoint for a predetermined period, and only when no direction instruction is input during that period does it resume generating the composite image from the center position.
- the viewpoint position of the virtual viewpoint VP is returned to the position VPC at the center of the left and right after a predetermined time has elapsed with the turn signal being turned off, not immediately after the turn signal is turned off.
- FIG. 12 is a diagram showing a flow of processing for changing the viewpoint position of the virtual viewpoint VP. This process is executed under the control of the control unit 1 when the turn signal is turned on by operating the turn signal switch.
- first, the direction indicated by the turn signal of the direction indicator 83 is determined (step S11). If the turn signal indicates the left direction, the process proceeds to step S12; if it indicates the right direction, the process proceeds to step S17.
- in step S12, the virtual viewpoint VP is set to the left side position VPL.
- as a result, a composite image FP2 showing the left side region larger than the right side region is generated and displayed on the display 21.
- next, the state of the turn signal is monitored with the virtual viewpoint VP set to the left side position VPL (step S13).
- while the turn signal continues to indicate the left direction, the process returns to step S12 and the virtual viewpoint VP is maintained at the left side position VPL; that is, the display of the composite image FP2 showing a relatively large left side region is maintained.
- if the turn signal indicates the right direction, the process proceeds to step S17.
- when the turn signal is turned off in step S13, that is, when the direction instruction changes from present to absent, the timer 12 starts measuring time at the moment the turn signal is turned off (step S14).
- the state of the turn signal is then monitored until a predetermined time has elapsed from the start of timing (steps S15 and S16).
- This predetermined time is, for example, 3 seconds in the present embodiment.
- the virtual viewpoint VP is maintained at the left side position VPL, and the display of the composite image FP2 that shows a relatively large side region in the left direction is maintained.
- if the predetermined time has elapsed with the turn signal remaining off (Yes in step S16), the viewpoint position of the virtual viewpoint VP is returned to the position VPC at the approximate left-right center (step S22).
- the composite image FP2 including both the left and right side areas of the vehicle 9 is displayed substantially equally.
- if the turn signal indicates the left direction again in step S15 before the predetermined time has elapsed from the start of timing, the process returns to step S12, and the virtual viewpoint VP is maintained at the left side position VPL; that is, the display of the composite image FP2 showing a relatively large left side region is maintained. If the turn signal indicates the right direction in step S15, the process proceeds to step S17.
- in some driving situations, the turn signal switch of the direction indicator 83 may return from the operated position to the neutral position regardless of the driver's intention. In such situations, the driver may operate the turn signal switch repeatedly in the same direction within a short time. If the viewpoint position of the virtual viewpoint VP were changed immediately in response to each turn-signal on/off, the viewpoint of the composite image FP2 displayed on the display 21 would switch frequently and become difficult to view.
- therefore, the viewpoint position of the virtual viewpoint VP is maintained until a predetermined time elapses even after the turn signal is turned off, and is returned to the center position only on condition that the predetermined time has elapsed with the turn signal remaining off.
- the shorter the predetermined time used for this determination, the more frequently the viewpoint position of the composite image FP2 may be switched; the longer it is, the slower the viewpoint position returns to the approximate center. For this reason, it is desirable to set the predetermined time to, for example, between 2 and 4 seconds.
- in step S17, the virtual viewpoint VP is set to the right side position VPR.
- as a result, a composite image FP2 showing the right side region larger than the left side region is generated and displayed on the display 21.
- next, the state of the turn signal is monitored with the virtual viewpoint VP set to the right side position VPR (step S18).
- while the turn signal continues to indicate the right direction, the process returns to step S17 and the virtual viewpoint VP is maintained at the right side position VPR; when the turn signal indicates the left direction, the process proceeds to step S12.
- when the turn signal is turned off in step S18, the timer 12 starts measuring time at that moment (step S19), and the state of the turn signal is monitored until a predetermined time elapses from the start of timing (steps S20 and S21). During this time, the virtual viewpoint VP is maintained at the right side position VPR. If the predetermined time elapses with the turn signal remaining off (Yes in step S21), the viewpoint position of the virtual viewpoint VP is returned to the position VPC at the approximate left-right center (step S22). As a result, the composite image FP2 showing both the left and right side areas of the vehicle 9 at substantially equal size is displayed.
- if the turn signal indicates the right direction again in step S20 before the predetermined time has elapsed from the start of timing, the process returns to step S17, and the virtual viewpoint VP is maintained at the right side position VPR; that is, the display of the composite image FP2 showing a relatively large right side region is maintained. If the turn signal indicates the left direction in step S20, the process proceeds to step S12.
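The flow of FIG. 12 (steps S11 to S22) can be condensed into a small controller that moves the viewpoint immediately when a turn signal is given but returns it to center only after the signal has stayed off for the predetermined time (3 seconds in this embodiment). This is a simplified sketch: the class and method names are assumptions, and time is passed in explicitly instead of using the timer 12.

```python
# Sketch of the turn-signal viewpoint hysteresis of FIG. 12.
HOLD_SECONDS = 3.0  # the embodiment's predetermined time

class ViewpointController:
    def __init__(self):
        self.position = "VPC"      # left-right center by default
        self.off_since = None      # time at which the signal turned off

    def update(self, signal, now):
        """signal: 'left', 'right', or None (off); now: time in seconds."""
        if signal == "left":
            self.position, self.off_since = "VPL", None    # step S12
        elif signal == "right":
            self.position, self.off_since = "VPR", None    # step S17
        elif self.position != "VPC":
            if self.off_since is None:
                self.off_since = now                        # steps S14/S19
            elif now - self.off_since >= HOLD_SECONDS:
                self.position, self.off_since = "VPC", None  # step S22
        return self.position
```

Brief on/off flicker of the signal thus never moves the displayed viewpoint back and forth.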
- the direction of the visual field of the composite image FP3 and the direction of the visual field of the driver itself are approximately the same, so that the driver can intuitively grasp the positional relationship of the objects displayed on the display 21.
- FIG. 13 is a diagram illustrating an example of display when animation expression is performed in the one-image mode M12.
- FIG. 14 is a diagram showing how the virtual viewpoint VP moves at this time.
- the control unit 1 changes the viewpoint position of the virtual viewpoint VP so as to move linearly from a position VPD corresponding to the driver's viewpoint to a position VPA behind the vehicle 9.
- the composite image generation unit 32 creates a plurality of composite images FP3 by moving the virtual viewpoint VP stepwise from the position VPD to the position VPA.
- the image generation unit 32 generates an animation in which the virtual viewpoint VP continuously moves by sequentially generating the plurality of composite images FP3.
- the parameters for generating the plurality of composite images FP3 are derived by the control unit 1 by linear interpolation based on the parameters of the position VPD before the change and the parameters of the position VPA after the change.
- the plurality of generated composite images FP3 are sequentially output to the navigation device 20 and are displayed on the display 21 continuously in time.
- as a result, an animation expression is shown on the display 21 in which the virtual viewpoint VP of the composite image FP3 moves from the position corresponding to the driver's viewpoint to the rear position of the vehicle 9. That is, in the composite image FP3, the image of the vehicle 9 is not visible at first because the viewpoint coincides with the driver's (state ST11); the upper part of the vehicle 9 then gradually comes into view (states ST12 and ST13); and finally the image of the vehicle 9 is shown as if looking down on the vehicle 9 from behind (state ST14).
- Such animation expression takes about 1 second.
- compared to the case where the virtual viewpoint VP is switched instantaneously, the driver can easily and intuitively grasp from which position the displayed composite image is viewed.
- the position corresponding to the viewpoint of the driver of the vehicle 9 is used as the reference position, and the viewpoint position of the virtual viewpoint VP is moved from this reference position. Since the changed viewpoint position is thus shown relative to the driver's own viewpoint, it is easy to grasp intuitively.
- the reference position for starting the animation expression may be a position that is easy for the driver to intuitively grasp even if it is not a position corresponding to the driver's viewpoint. For example, a position directly above the center of the vehicle 9 or a substantially center position on the left and right of the front bumper may be used as the reference position.
- Such animation expression can be performed not only when the display mode is changed but also when the virtual viewpoint VP is changed. For example, it is desirable to perform animation expression even when the viewpoint position of the virtual viewpoint VP is changed in the horizontal direction in response to the operation of the direction indicator 83 in the above-described two-image mode M11.
- in this case as well, a plurality of composite images that form an animation when displayed in succession may be generated while moving the position of the virtual viewpoint in stages.
- the parameter for generating the composite image can be derived by linear interpolation based on the position parameter before the change and the position parameter after the change.
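The linear interpolation of viewpoint parameters described above can be sketched as follows; the coordinate values used for the positions VPD and VPA are made-up examples:

```python
# Deriving intermediate viewpoint parameters by linear interpolation
# between the pre-change position and the post-change position, so that
# displaying the resulting composite images in sequence forms an animation.

def interpolate_viewpoints(start, end, steps):
    """Return `steps` viewpoint positions moving from start to end."""
    frames = []
    for i in range(1, steps + 1):
        t = i / steps
        frames.append(tuple(a + (b - a) * t for a, b in zip(start, end)))
    return frames

# e.g. driver's eye position VPD to a position VPA behind the vehicle
# (coordinates are illustrative, in meters relative to the vehicle)
frames = interpolate_viewpoints((0.0, 0.0, 1.2), (-4.0, 0.0, 2.0), 4)
```

With the roughly one-second animation mentioned above, the number of steps would be chosen to match the display frame rate.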
- the side image FP4 is not a composite image from the virtual viewpoint VP, but is a display image obtained by adjusting the captured image obtained by capturing with the left side camera 52 for display by the captured image adjusting unit 31.
- the region outside the front fender 94 on the left side of the vehicle 9, the side opposite the driver's seat, tends to be a blind spot.
- the outer region of the left front fender 94 is shown enlarged. This makes it easier to grasp the state of the object present in the blind spot than in other display modes.
- the range displayed on the display 21 can be switched by pressing the changeover switch 85. Specifically, it is possible to switch between a side image FP4 showing an enlarged area near the front wheel 96 on the left side of the vehicle 9 and a side image FP5 showing an enlarged rear area from the front wheel 96.
- FIG. 16 is a diagram illustrating display mode transition in the back mode M2.
- in the back mode M2, there are four display modes, a standard mode M21, a wide mode M22, a composite standard mode M23, and a composite overhead mode M24, each with a different display format.
- the driver can select any one display mode and set it to the current display mode (hereinafter referred to as “current mode”).
- each time the changeover switch 85 is pressed, the current mode is set in the order of the standard mode M21, the wide mode M22, and the composite standard mode M23.
- when the composite standard mode M23 is the current mode, the display mode can also be switched by pressing a command button on the screen instead of the changeover switch 85.
- specifically, when the composite standard mode M23 is the current mode and the viewpoint switching button CB1 displayed as a command button on the screen of the display 21 is pressed, the composite overhead mode M24 is set to the current mode.
- conversely, when the composite overhead mode M24 is the current mode and the viewpoint switching button CB1 is pressed, the composite standard mode M23 is set to the current mode.
- when the composite overhead mode M24 is the current mode and the changeover switch 85 is pressed, the standard mode M21 is set to the current mode.
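The back-mode switching rules can be sketched as two small functions. Note one assumption: the text states that the changeover switch 85 cycles M21, M22, M23 and returns to M21 from M24, so pressing it in M23 is assumed here to wrap back to M21:

```python
# Sketch of back-mode display-mode switching: the changeover switch 85
# cycles the modes, while the viewpoint switching button CB1 toggles
# between the composite standard mode M23 and composite overhead mode M24.

def on_changeover_switch(current):
    order = {"M21": "M22", "M22": "M23",
             "M23": "M21",   # assumed wrap-around, not stated explicitly
             "M24": "M21"}   # stated: switch 85 in M24 returns to M21
    return order[current]

def on_viewpoint_button(current):
    toggle = {"M23": "M24", "M24": "M23"}
    return toggle.get(current, current)  # CB1 is only shown in M23/M24
```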
- a display image corresponding to the current mode is generated by the image processing unit 3 and displayed on the display 21.
- when the standard mode M21 or the wide mode M22 is the current mode, a display image including only the back image BP1 or BP2 obtained by photographing with the back camera 53 is displayed on the display 21.
- when the composite standard mode M23 is the current mode, a display image including the composite image BP4 viewed from the virtual viewpoint VP and the back image BP5 obtained by photographing with the back camera 53 is displayed on the display 21.
- when the composite overhead mode M24 is the current mode, a composite image BP6 viewed from the virtual viewpoint VP is displayed instead of the back image BP5 of the composite standard mode M23.
- in the standard mode M21, as shown in FIG. 16, only the back image BP1 is displayed on the display 21.
- the back image BP1 is not a composite image from the virtual viewpoint VP, but a display image obtained by adjusting the captured image obtained by capturing with the back camera 53 for display by the captured image adjusting unit 31.
- the horizontal angle of the visual field range of the back image BP1 is 135 °, and in the standard mode M21, the rear region of the vehicle 9 is shown in a natural manner.
- a guideline GL indicating a predicted course when the vehicle 9 moves backward is superimposed and displayed.
- the guide line GL is moved according to the rotation direction and rotation angle of the steering wheel input from the steering sensor 84.
- the driver can move the vehicle 9 backward using the guideline GL as a guide.
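The patent only states that the guideline GL moves according to the steering angle from the steering sensor 84; it does not give the geometry. As one common approach, offered purely as a hypothetical illustration, the predicted course can be approximated by a circular arc whose radius follows from a kinematic bicycle model; the wheelbase value here is an assumption:

```python
import math

# Hypothetical sketch: turning radius of the predicted course from the
# steering angle via the kinematic bicycle model, R = wheelbase / tan(angle).
# A guideline like GL could then be drawn as an arc of this radius.
# The patent itself only says GL moves with the steering sensor 84 input.

WHEELBASE_M = 2.7   # assumed wheelbase

def turning_radius(steer_deg):
    """Radius of the predicted path; infinite when steering straight."""
    if steer_deg == 0:
        return math.inf
    return WHEELBASE_M / math.tan(math.radians(abs(steer_deg)))
```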
- the back image BP2 is not a composite image from the virtual viewpoint VP, but is a display image obtained by adjusting the captured image obtained by capturing with the back camera 53 for display by the captured image adjusting unit 31. Also in the back image BP2, a guideline GL indicating a predicted course when the vehicle 9 is moving backward is displayed superimposed.
- the horizontal angle of the visual field range of the back image BP2 is 180°, so objects in a wider horizontal range than in the standard mode M21 can be confirmed. Therefore, when backing out of a head-in parking space, the left and right areas that easily become blind spots can be confirmed by using the wide mode M22.
- FIG. 19 is a diagram illustrating an example of a display image displayed on the display 21 in the combined standard mode M23.
- in the display image of the composite standard mode M23, the composite image BP4 is disposed on the left side, and the back image BP5 is disposed on the right side.
- the composite image BP4 is a composite image viewed from a virtual viewpoint VP that looks down on the entire periphery of the vehicle 9.
- the back image BP5 is not a composite image from the virtual viewpoint VP, but a display image obtained by adjusting the captured image obtained by capturing with the back camera 53 for display by the captured image adjusting unit 31.
- the horizontal angle of the visual field range of the back image BP5 is 135 °.
- the driver can confirm both the appearance of the entire periphery of the vehicle 9 and the appearance of the rear area of the vehicle 9 by visually recognizing the display image of the composite standard mode M23 having such a display mode. Accordingly, the driver can safely move the vehicle 9 backward while being aware of the objects around the entire vehicle 9.
- FIG. 20 is a diagram illustrating an example of a display image displayed on the display 21 in the composite overhead mode M24.
- a composite image BP4 viewed from a virtual viewpoint VP overlooking the entire periphery of the vehicle 9 is arranged on the left side of the display image in the composite bird's-eye view mode M24, similarly to the composite standard mode M23.
- on the right side of the display image, a composite image BP6 viewed from a virtual viewpoint VP that looks down almost directly on the vicinity of the rear end of the vehicle 9 is arranged.
- the viewpoint position of the virtual viewpoint VP of the composite image BP6 is set to a position just above the rear end of the vehicle 9, and the visual field direction is set to a direction almost directly below.
- the composite image BP6 shows an enlarged region in the vicinity of the rear end of the vehicle 9 so as to look down from the upper side thereof.
- in the composite image BP6, the rearward direction of the vehicle 9, which is the traveling direction when the vehicle 9 moves backward, corresponds to the upward direction of the image.
- by viewing the display image in the composite overhead mode M24, the driver can easily confirm the clearance between the vehicle 9 and surrounding objects, particularly objects existing near the rear end of the vehicle 9.
- the composite overhead mode M24 is a display mode that is effective in specific situations, such as making final adjustments to the parking position when the vehicle 9 is being parked in reverse.
- the viewpoint position of the virtual viewpoint VP can be set to any position desired by the driver.
- the setting button CB2 displayed as a command button on the screen is pressed in the composite standard mode M23 or the composite overhead mode M24, a setting screen for setting the viewpoint position of the virtual viewpoint VP is displayed.
- FIG. 22 and 23 are diagrams showing a setting screen for this virtual viewpoint VP.
- an index indicating the position of the virtual viewpoint VP relative to the vehicle 9 is shown along with an illustration showing the side of the vehicle 9.
- the index can be moved by pressing the command buttons CB4 and CB5 on the screen, and the position of the moved index with respect to the illustration of the vehicle 9 is set as the viewpoint position of the virtual viewpoint VP.
- the visual field direction of the virtual viewpoint VP is set toward the approximate center 9 c of the vehicle 9.
- FIG. 22 shows an example where the viewpoint position of the virtual viewpoint VP is set to a position immediately above the approximate center of the vehicle 9, and
- FIG. 23 shows an example where the viewpoint position of the virtual viewpoint VP is set to the rear position of the vehicle 9.
- on this setting screen, a composite image BP7 as seen when the virtual viewpoint VP is placed at the position of the index is displayed. Therefore, by viewing this setting screen, the driver can easily confirm what composite image BP7 will be obtained when the virtual viewpoint VP is moved, and can move the virtual viewpoint VP to a desired position.
- The set contents are reflected in the display of the composite standard mode M23 and the composite overhead mode M24.
- For example, when the viewpoint position of the virtual viewpoint VP is set to a position behind the vehicle 9, the vehicle 9 is shown as viewed from behind in the composite image BP4 on the left side of the display image, as shown in FIG. Since the viewpoint position of the virtual viewpoint VP can be set in this way, the driver can grasp the positional relationship between the vehicle 9 and surrounding objects from a desired angle when the vehicle 9 moves backward.
- The image display system 100 stores the display mode most recently set as the current mode in the back mode M2, and the next time the back mode M2 is entered, in principle sets that most recent display mode as the current mode.
- That is, the control unit 1 sets the most recently set display mode as the current mode immediately after entering the back mode M2. This eliminates the need for the driver to perform the cumbersome operation of selecting a desired display mode every time the vehicle 9 is reversed.
- FIG. 25 is a diagram showing the flow of the process of storing information indicating the most recent current mode. This process is repeatedly executed by the control unit 1 while the operation mode is the back mode M2.
- In step S31, it is determined whether or not the current mode has been switched to another display mode.
- If so, mode information indicating the current mode after switching is stored in the nonvolatile memory 11 (step S32). Since this process is performed every time the current mode is switched, the nonvolatile memory 11 stores mode information indicating the display mode most recently set as the current mode. This mode information is retained in the nonvolatile memory 11 even when the operation mode is other than the back mode M2 (when the transmission of the vehicle 9 is in a position other than "R (reverse)") or when the image display system 100 is powered off.
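The FIG. 25 flow can be sketched as follows; the class and attribute names are illustrative stand-ins, not taken from the patent.

```python
class NonvolatileMemory:
    """Stand-in for the nonvolatile memory 11; its contents survive power-off."""
    def __init__(self):
        self.mode_info = "M21_standard"   # assumed factory default

class ModeRecorder:
    """Repeatedly executed while the operation mode is the back mode M2."""
    def __init__(self, nvm):
        self.nvm = nvm
        self.current_mode = nvm.mode_info

    def on_mode_switch(self, new_mode):
        if new_mode != self.current_mode:  # step S31: was the mode switched?
            self.current_mode = new_mode
            self.nvm.mode_info = new_mode  # step S32: store the new mode

nvm = NonvolatileMemory()
rec = ModeRecorder(nvm)
rec.on_mode_switch("M24_composite_overhead")
```

Because every switch is written through, `nvm.mode_info` always names the display mode most recently set as the current mode.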
- FIG. 26 is a diagram showing the flow of the process at the start of the back mode M2. This process is executed by the control unit 1 in response to the transmission of the vehicle 9 being operated to the "R (reverse)" position and the operation mode becoming the back mode M2.
- First, the mode information stored in the nonvolatile memory 11 is read (step S41). Then, it is determined which display mode the read mode information indicates (step S42).
- If the display mode indicated by the mode information is other than the composite overhead mode M24 (No in step S42), the display mode indicated by the mode information is set as the current mode (step S43).
- If the display mode indicated by the mode information is the composite overhead mode M24 (Yes in step S42),
- the composite standard mode M23 is set as the current mode (step S44).
- In this way, the display mode that most recently became the current mode is, in principle, set as the current mode immediately after entering the back mode M2.
- However, when the display mode that most recently became the current mode is the composite overhead mode M24,
- the composite standard mode M23, a display mode different from the composite overhead mode M24, is set as the current mode immediately after entering the back mode M2. Since the composite overhead mode M24 is used mainly for the final adjustment of the parking position of the vehicle 9, it is rarely needed when the vehicle 9 starts to reverse. For this reason, when the display mode that most recently became the current mode is the composite overhead mode M24, exceptionally setting a different display mode as the current mode immediately after entering the back mode M2 eliminates the cumbersome operation of changing the current mode from the composite overhead mode M24 to another display mode.
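The FIG. 26 decision (steps S41 to S44) reduces to a small function; the mode identifiers are illustrative.

```python
def mode_at_back_mode_start(stored_mode_info):
    """Return the current mode to set immediately after entering back mode M2.
    stored_mode_info is what step S41 reads from the nonvolatile memory 11."""
    if stored_mode_info == "M24_composite_overhead":   # step S42: Yes
        return "M23_composite_standard"                # step S44: exception
    return stored_mode_info                            # step S43: as stored
```

A stored wide mode is restored as-is, while a stored composite overhead mode comes back as the composite standard mode.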
- The process of treating the composite overhead mode M24 as an exception can also be realized by processing different from that shown in FIGS.
- FIG. 27 is a diagram showing another example of the flow of the process of storing information indicating the most recent current mode. This process is also repeatedly executed by the control unit 1 while the operation mode is the back mode M2.
- In step S51, it is determined whether or not the current mode has been switched to another display mode.
- If so, in step S52 it is determined what the current mode after switching is.
- If the current mode after switching is other than the composite overhead mode M24 in step S52, mode information indicating the current mode after switching is stored in the nonvolatile memory 11 (step S53).
- If the current mode after switching is the composite overhead mode M24, mode information indicating the composite standard mode M23 is stored in the nonvolatile memory 11 (step S54).
- In this way, in principle, mode information indicating the display mode most recently set as the current mode is stored in the nonvolatile memory 11; however, when the most recent current mode is the composite overhead mode M24, mode information indicating the composite standard mode M23 is exceptionally stored.
- FIG. 28 is a diagram showing the flow of the process at the start of the back mode M2 when the process of FIG. 27 is performed. This process is also executed by the control unit 1 in response to the transmission of the vehicle 9 being operated to the "R (reverse)" position and the operation mode becoming the back mode M2.
- First, the mode information stored in the nonvolatile memory 11 is read (step S61). Then, the display mode indicated by the read mode information is set as the current mode as it is (step S62). With this process as well, when the display mode that most recently became the current mode is the composite overhead mode M24, a different display mode can exceptionally be set as the current mode immediately after entering the back mode M2. As a result, the cumbersome operation of changing the current mode from the composite overhead mode M24 to another display mode can be eliminated.
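The FIG. 27/28 variant moves the exception from read time to write time; sketched below with illustrative names, the two variants yield the same mode at the start of back mode M2.

```python
def store_mode_info(new_current_mode):
    """Steps S52-S54: decide what to write to the nonvolatile memory 11."""
    if new_current_mode == "M24_composite_overhead":
        return "M23_composite_standard"   # step S54: store M23 instead
    return new_current_mode               # step S53: store the mode as-is

def restore_at_back_mode_start(stored_mode_info):
    """Steps S61-S62: set the stored mode as the current mode unconditionally."""
    return stored_mode_info
```

Doing the substitution when storing keeps the startup path trivial, at the cost of never recording that the overhead mode was the last one used.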
- In the above embodiment, the image generation device 10 and the navigation device 20 have been described as separate devices,
- but the navigation device 20 and the image generation device 10 may be arranged in the same housing and configured as an integrated device.
- The display device that displays the display image generated by the image generation device 10 has been described as the navigation device 20, but it may be a general display device that does not have special functions such as a navigation function.
- The functions of the control unit 1 of the image generation device 10 may be realized by the control unit 23 of the navigation device 20.
- In the above embodiment, signals from the shift sensor 81, the vehicle speed sensor 82, the direction indicator 83, the steering sensor 84, and the changeover switch 85 are input to the image generation device 10, but some or all of these signals may instead be input to the navigation device 20. In that case, some or all of these signals may be input to the control unit 1 of the image generation device 10 via the communication unit 42.
- In the above embodiment, a direction instruction intended by the driver of the vehicle 9 is input from the direction indicator 83, but it may be input by other means.
- For example, the movement of the driver's viewpoint may be detected from an image capturing the driver's eyes, and a direction instruction intended by the driver may be input based on the detection result.
- In the above embodiment, the specific display mode treated as an exception in the back mode M2 has been described as the composite overhead mode M24,
- but a display mode different from the composite overhead mode M24 may be set as the specific display mode.
- For example, the specific display mode may be a display mode in which only an image from a virtual viewpoint looking down almost directly onto the vicinity of the rear end of the vehicle 9 is displayed as the display image. It is desirable that a display mode used under special conditions, such as the final adjustment of the parking position, be set as the specific display mode.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- Physics & Mathematics (AREA)
- Closed-Circuit Television Systems (AREA)
- Controls And Circuits For Display Device (AREA)
- Traffic Control Systems (AREA)
- Studio Devices (AREA)
- User Interface Of Digital Computer (AREA)
- Navigation (AREA)
Abstract
Description
(1) An image generation device that generates an image to be displayed on a display device mounted on a vehicle, the device comprising: a composite image generation unit that generates, based on a plurality of images of the surroundings of the vehicle captured by a plurality of cameras, a composite image of the surroundings of the vehicle viewed from a virtual viewpoint positioned behind the vehicle and directed toward the front of the vehicle; an output unit that outputs the generated composite image to the display device; and an input unit that inputs a direction instruction from the driver of the vehicle, wherein the composite image generation unit generates, when the direction instruction is not input via the input unit, a first composite image in which a left side region of the vehicle is displayed at substantially the same size as a right side region of the vehicle, and generates, when the direction instruction is input via the input unit, a second composite image in which the one side region of the vehicle indicated by the direction instruction is displayed at a larger size than the other side region of the vehicle.
FIG. 1 is a block diagram of the image display system 100 of the present embodiment. The image display system 100 is mounted on a vehicle (in this embodiment, an automobile) and has a function of capturing images of the surroundings of the vehicle, generating images, and displaying them in the vehicle cabin. A driver of the vehicle, as the user of the image display system 100, can easily grasp the surroundings of the vehicle by using the image display system 100.
Next, a method by which the composite image generation unit 32 of the image processing unit 3 generates a composite image viewed from an arbitrary virtual viewpoint based on a plurality of captured images obtained by the image capturing unit 5 will be described. FIG. 3 is a diagram for explaining the method of generating a composite image viewed from an arbitrary virtual viewpoint.
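The virtual-viewpoint synthesis referred to here is only sketched below, under a common surround-view assumption that the text does not spell out: each calibrated camera maps pixels onto a ground plane, and the composite image re-renders that plane from the chosen virtual viewpoint. The function and homography names are illustrative, not from the patent.

```python
def ground_point_for_pixel(pixel_xy, homography):
    """Map an image pixel to a ground-plane point via a per-camera 3x3
    homography (given as nested lists, assumed pre-calibrated)."""
    u, v = pixel_xy
    x, y, w = (row[0] * u + row[1] * v + row[2] for row in homography)
    return (x / w, y / w)

# The identity homography passes pixel coordinates through unchanged.
IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

In a full pipeline one such mapping per camera places all captured pixels into a shared ground frame, from which any virtual viewpoint can be rendered.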
Next, an overview of the operation of the image display system 100 will be described. FIG. 4 is a diagram showing transitions of the operation modes of the image display system 100. The image display system 100 has three operation modes: a navigation mode M0, a front mode M1, and a back mode M2. These operation modes are switched under the control of the control unit 1 in accordance with driver operations and the running state of the vehicle.
First, the display forms of the front mode M1 will be described. FIG. 5 is a diagram showing transitions of the display modes in the front mode M1. The front mode M1 has three display modes, a two-image mode M11, a one-image mode M12, and a side camera mode M13, which differ from one another in display form. Each time the driver presses the changeover switch 85, these display modes are switched under the control of the control unit 1 in the order of the two-image mode M11, the one-image mode M12, and the side camera mode M13. When the changeover switch 85 is pressed in the side camera mode M13, the display returns to the two-image mode M11.
<4-1-1. Field-of-View Range>
In the two-image mode M11, the display image generation unit generates a display image including a front image FP1 captured by the front camera 51 and a composite image FP2 generated by the composite image generation unit 32, and the communication unit 42 outputs the generated display image to the navigation device 20. FIG. 6 is a diagram showing an example of a display image shown on the display 21 in the two-image mode M11. As shown in the figure, in the display image in the two-image mode M11, the front image FP1 is arranged at the top and the composite image FP2 at the bottom. The front image FP1 is not a composite image from a virtual viewpoint VP, but a display image obtained by adjusting, for display, a captured image captured by the front camera 51 in the captured image adjustment unit 31. The composite image FP2, on the other hand, is a composite image including the side regions of the vehicle 9 viewed from a virtual viewpoint VP positioned behind the vehicle 9 and directed forward.
In the two-image mode M11, the viewpoint position of the virtual viewpoint VP of the composite image FP2 is moved under the control of the control unit 1 in response to the driver operating the turn-signal switch of the direction indicator 83. FIG. 9 is a diagram showing transitions of the viewpoint position of the virtual viewpoint VP.
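The effect of this turn-signal operation, which is also the first/second composite image distinction of claim 1, can be sketched as a layout choice; the specific weights below are illustrative, not from the patent.

```python
def side_region_weights(direction_instruction):
    """Return (left_weight, right_weight) for the side regions of the
    composite image. direction_instruction: None, "left", or "right"."""
    if direction_instruction is None:
        return (0.5, 0.5)    # first composite image: equal sizes
    if direction_instruction == "left":
        return (0.7, 0.3)    # second composite image: left side enlarged
    return (0.3, 0.7)        # second composite image: right side enlarged
```

Without a direction instruction both side regions appear at substantially the same size; with one, the indicated side is rendered larger than the other.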
Next, returning to FIG. 5, the display form in the one-image mode M12 will be described. In the one-image mode M12, only the composite image FP3 is displayed on the display 21. The viewpoint position of the virtual viewpoint VP is set at a position behind the vehicle 9, approximately centered left-to-right, and the visual field direction is set to the forward direction of the vehicle 9. The composite image FP3 therefore shows both the left and right side regions of the vehicle 9 as viewed forward from a position behind the vehicle 9. When passing an oncoming vehicle on a narrow road or the like, the driver needs to watch both sides of the vehicle 9, so the one-image mode M12 with this display form can be used effectively.
Next, returning to FIG. 5, the display form in the side camera mode M13 will be described. In the side camera mode M13, only the side image FP4 is displayed on the display 21. The side image FP4 is not a composite image from a virtual viewpoint VP, but a display image obtained by adjusting, for display, a captured image captured by the left side camera 52 in the captured image adjustment unit 31.
Next, the display forms of the back mode M2, the operation mode when the shift lever is in the "R (reverse)" position, will be described. FIG. 16 is a diagram showing transitions of the display modes in the back mode M2. The back mode M2 has four display modes that differ from one another in display form: a standard mode M21, a wide mode M22, a composite standard mode M23, and a composite overhead mode M24. By performing a predetermined operation, the driver can select any one display mode and set it as the current display mode (hereinafter referred to as the "current mode").
As shown in FIG. 16, in the standard mode M21, only the back image BP1 is displayed on the display 21. The back image BP1 is not a composite image from a virtual viewpoint VP, but a display image obtained by adjusting, for display, a captured image captured by the back camera 53 in the captured image adjustment unit 31. As shown in FIG. 17, the horizontal angle of the visual field range of the back image BP1 is 135°, and in the standard mode M21 the rear region of the vehicle 9 is shown in a natural manner.
As shown in FIG. 16, in the wide mode M22 as well, only the back image BP2 is displayed on the display 21. The back image BP2 is also not a composite image from a virtual viewpoint VP, but a display image obtained by adjusting, for display, a captured image captured by the back camera 53 in the captured image adjustment unit 31. In this back image BP2 as well, a guideline GL indicating the predicted course of the vehicle 9 when reversing is superimposed and displayed.
FIG. 19 is a diagram showing an example of a display image shown on the display 21 in the composite standard mode M23. As shown in the figure, in the display image in the composite standard mode M23, the composite image BP4 is arranged on the left side and the back image BP5 on the right side. The composite image BP4 is a composite image viewed from a virtual viewpoint VP looking down over the entire surroundings of the vehicle 9. The back image BP5, on the other hand, is not a composite image from a virtual viewpoint VP, but a display image obtained by adjusting, for display, a captured image captured by the back camera 53 in the captured image adjustment unit 31. The horizontal angle of the visual field range of the back image BP5 is 135°.
FIG. 20 is a diagram showing an example of a display image shown on the display 21 in the composite overhead mode M24. As shown in the figure, on the left side of the display image in the composite overhead mode M24, as in the composite standard mode M23, the composite image BP4 viewed from a virtual viewpoint VP looking down over the entire surroundings of the vehicle 9 is arranged. On the right side of the display image, on the other hand, a composite image BP6 viewed from a virtual viewpoint VP looking down almost directly onto the vicinity of the rear end of the vehicle 9 is arranged.
Incidentally, for the composite image BP4 arranged on the left side in the composite standard mode M23 (see FIG. 19) and the composite overhead mode M24 (see FIG. 20), the viewpoint position of its virtual viewpoint VP can be set to a position desired by the driver. When the setting button CB2 displayed as a command button on the screen is pressed in the composite standard mode M23 or the composite overhead mode M24, a setting screen for setting the viewpoint position of the virtual viewpoint VP is displayed.
As described above, the back mode M2 has four display modes that differ from one another in display form; a driver instruction is received via the changeover switch 85 or the like, and one display mode is set as the current mode. In general, a driver frequently uses one desired display mode out of the four, according to personal preference and the environment of the parking lot used daily. If the driver had to set the desired display mode as the current mode every time the vehicle 9 is reversed, the operation would be cumbersome.
With such processing as well, when the display mode that most recently became the current mode is the composite overhead mode M24, a different display mode can exceptionally be set as the current mode immediately after entering the back mode M2. As a result, the cumbersome operation of changing the current mode from the composite overhead mode M24 to another display mode can be eliminated.
The embodiments of the present invention have been described above, but the present invention is not limited to the above embodiments, and various modifications are possible. Such modifications will be described below. Of course, the forms described below may be combined as appropriate.
This application is based on a Japanese patent application filed on May 29, 2009 (Japanese Patent Application No. 2009-130100), the contents of which are incorporated herein by reference.
3 Image processing unit
5 Image capturing unit
10 Image generation device
11 Nonvolatile memory
21 Display
32 Composite image generation unit
42 Communication unit
100 Image display system
Claims (11)
- An image generation device that generates an image to be displayed on a display device mounted on a vehicle, the image generation device comprising:
a composite image generation unit that generates, based on a plurality of images of the surroundings of the vehicle captured by a plurality of cameras, a composite image of the surroundings of the vehicle viewed from a virtual viewpoint positioned behind the vehicle and directed toward the front of the vehicle;
an output unit that outputs the generated composite image to the display device; and
an input unit that inputs a direction instruction from a driver of the vehicle, wherein
the composite image generation unit
generates, when the direction instruction is not input via the input unit, a first composite image in which a left side region of the vehicle is displayed at substantially the same size as a right side region of the vehicle, and
generates, when the direction instruction is input via the input unit, a second composite image in which the one side region of the vehicle indicated by the direction instruction is displayed at a larger size than the other side region of the vehicle. - The image generation device according to claim 1, wherein
when the input unit stops inputting the direction instruction, the image generation unit continues to generate the second composite image for a predetermined period, and
when the direction instruction is not input during the predetermined period, the image generation unit starts generating the first composite image after the predetermined period. - The image generation device according to claim 1, further comprising
a display image generation unit that generates a display image including a front image captured by a camera provided at the front of the vehicle and the composite image generated by the composite image generation unit, wherein
the output unit outputs the generated display image to the display device. - The image generation device according to claim 1, wherein
when changing the virtual viewpoint of the composite image from a first position to a second position, the composite image generation unit moves the virtual viewpoint stepwise from the first position to the second position to generate a plurality of composite images for creating an animation in which the virtual viewpoint moves continuously. - An image generation device that generates an image to be displayed on a display device mounted on a vehicle, the image generation device comprising:
a composite image generation unit that generates, based on a plurality of images of the surroundings of the vehicle captured by a plurality of cameras, a composite image of the surroundings of the vehicle, including a right side region and a left side region of the vehicle, viewed from a virtual viewpoint positioned behind the vehicle and directed toward the front of the vehicle;
a display image generation unit that generates a display image including a front image captured by a camera provided at the front of the vehicle and the composite image generated by the composite image generation unit; and
an output unit that outputs the generated display image to the display device. - An image generation device that generates an image to be displayed on a display device mounted on a vehicle, the image generation device comprising:
a composite image generation unit that generates, based on a plurality of images of the surroundings of the vehicle captured by a plurality of cameras, a composite image viewed from a virtual viewpoint; and
an output unit that outputs the generated composite image to the display device, wherein
when changing the virtual viewpoint of the composite image from a first position to a second position, the composite image generation unit moves the virtual viewpoint stepwise from the first position to the second position to generate a plurality of composite images for creating an animation in which the virtual viewpoint moves continuously. - The image generation device according to claim 6, wherein
the composite image generation unit changes the virtual viewpoint from a reference position. - The image generation device according to claim 7, wherein
the reference position is a position corresponding to a viewpoint of a driver of the vehicle. - An image display system mounted on a vehicle, comprising:
the image generation device according to claim 1; and
a display device that displays an image generated by the image generation device. - An image display system mounted on a vehicle, comprising:
the image generation device according to claim 5; and
a display device that displays an image generated by the image generation device. - An image display system mounted on a vehicle, comprising:
the image generation device according to claim 6; and
a display device that displays an image generated by the image generation device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/322,654 US8941737B2 (en) | 2009-05-29 | 2010-05-28 | Image generating apparatus and image display system |
CN2010800236866A CN102448773A (zh) | 2009-05-29 | 2010-05-28 | Image generation device and image display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009130100A JP2010274813A (ja) | 2009-05-29 | 2009-05-29 | Image generation device and image display system |
JP2009-130100 | 2009-05-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010137684A1 true WO2010137684A1 (ja) | 2010-12-02 |
Family
ID=43222789
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/059074 WO2010137684A1 (ja) | 2010-05-28 | Image generation device and image display system | 2009-05-29 | 2010-05-28 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8941737B2 (ja) |
JP (1) | JP2010274813A (ja) |
CN (1) | CN102448773A (ja) |
WO (1) | WO2010137684A1 (ja) |
Families Citing this family (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2511137B1 (en) * | 2011-04-14 | 2019-03-27 | Harman Becker Automotive Systems GmbH | Vehicle Surround View System |
EP2554434B1 (en) * | 2011-08-05 | 2014-05-21 | Harman Becker Automotive Systems GmbH | Vehicle surround view system |
JP5811804B2 (ja) | 2011-11-24 | 2015-11-11 | トヨタ自動車株式会社 | 車両用周辺監視装置 |
KR101265711B1 (ko) * | 2011-11-30 | 2013-05-20 | 주식회사 이미지넥스트 | 3d 차량 주변 영상 생성 방법 및 장치 |
US9760092B2 (en) * | 2012-03-16 | 2017-09-12 | Waymo Llc | Actively modifying a field of view of an autonomous vehicle in view of constraints |
JP5971700B2 (ja) * | 2012-05-17 | 2016-08-17 | アルパイン株式会社 | 表示装置 |
CN102689613A (zh) * | 2012-05-26 | 2012-09-26 | 柴晓斌 | 一种为驾驶员指示汽车预期行驶位置的方法 |
DE102012014466A1 (de) * | 2012-07-21 | 2014-01-23 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Darstellung einer Szene für ein Fahrzeug |
JP6054738B2 (ja) * | 2012-12-25 | 2016-12-27 | 京セラ株式会社 | カメラモジュール、カメラシステムおよび画像表示方法 |
JP6084097B2 (ja) * | 2013-03-29 | 2017-02-22 | 富士通テン株式会社 | 画像生成装置、画像表示システム及び画像生成方法 |
US9674490B2 (en) | 2013-04-18 | 2017-06-06 | Magna Electronics Inc. | Vision system for vehicle with adjustable cameras |
JP6234701B2 (ja) * | 2013-05-15 | 2017-11-22 | クラリオン株式会社 | 車両用周囲モニタ装置 |
US10093247B2 (en) * | 2013-05-23 | 2018-10-09 | GM Global Technology Operations LLC | Enhanced front curb viewing system |
DE102013215484A1 (de) * | 2013-08-06 | 2015-02-12 | Bayerische Motoren Werke Aktiengesellschaft | Anzeigesystem und Visualisierungsverfahren für ein Fahrzeug |
JP6347934B2 (ja) * | 2013-10-11 | 2018-06-27 | 株式会社デンソーテン | 画像表示装置、画像表示システム、画像表示方法、及び、プログラム |
GB201406405D0 (en) * | 2014-04-09 | 2014-05-21 | Jaguar Land Rover Ltd | Apparatus and method for displaying information |
JP5961821B2 (ja) * | 2014-07-02 | 2016-08-02 | パナソニックIpマネジメント株式会社 | 運転支援装置 |
JP6327115B2 (ja) * | 2014-11-04 | 2018-05-23 | 株式会社デンソー | 車両周辺画像表示装置、車両周辺画像表示方法 |
CN105721789B (zh) * | 2014-12-01 | 2019-09-10 | 中国航空工业集团公司第六三一研究所 | 一种低延时全向导航视频多模式显示控制方法 |
JP6528193B2 (ja) * | 2015-02-10 | 2019-06-12 | 任天堂株式会社 | 電子機器 |
USD788170S1 (en) * | 2015-09-03 | 2017-05-30 | Continental Automotive Gmbh | Display screen with icon |
JP6610139B2 (ja) * | 2015-09-30 | 2019-11-27 | アイシン精機株式会社 | 周辺監視装置 |
CN105644442B (zh) * | 2016-02-19 | 2018-11-23 | 深圳市歌美迪电子技术发展有限公司 | 一种扩展显示视野的方法、系统及汽车 |
JP6477562B2 (ja) * | 2016-03-18 | 2019-03-06 | 株式会社デンソー | 情報処理装置 |
JP6370833B2 (ja) * | 2016-05-16 | 2018-08-08 | 株式会社デンソーテン | 画像生成装置、画像表示システム及び画像生成方法 |
JP6723820B2 (ja) * | 2016-05-18 | 2020-07-15 | 株式会社デンソーテン | 画像生成装置、画像表示システムおよび画像表示方法 |
JP6605396B2 (ja) * | 2016-05-26 | 2019-11-13 | アルパイン株式会社 | 車載表示システム |
JP6261654B2 (ja) * | 2016-05-31 | 2018-01-17 | 株式会社デンソーテン | 表示装置、表示方法及びプログラム |
JP6727971B2 (ja) * | 2016-07-19 | 2020-07-22 | 株式会社クボタ | 作業車 |
WO2018038177A1 (ja) * | 2016-08-23 | 2018-03-01 | 株式会社アルファ | 車両用ドアアウトサイドハンドル装置、車両用ドア及び車両 |
DE102016217488A1 (de) * | 2016-09-14 | 2018-03-15 | Robert Bosch Gmbh | Verfahren zum Bereitstellen einer Rückspiegelansicht einer Fahrzeugumgebung eines Fahrzeugs |
JP6816436B2 (ja) | 2016-10-04 | 2021-01-20 | アイシン精機株式会社 | 周辺監視装置 |
US10462354B2 (en) * | 2016-12-09 | 2019-10-29 | Magna Electronics Inc. | Vehicle control system utilizing multi-camera module |
CN106856008B (zh) * | 2016-12-13 | 2020-05-05 | 中国航空工业集团公司洛阳电光设备研究所 | 一种用于机载合成视景的三维地形渲染方法 |
JP6639379B2 (ja) * | 2016-12-21 | 2020-02-05 | トヨタ自動車株式会社 | 車両周辺監視装置 |
JP6730177B2 (ja) * | 2016-12-28 | 2020-07-29 | 株式会社デンソーテン | 画像生成装置および画像生成方法 |
JP6419894B2 (ja) * | 2017-06-02 | 2018-11-07 | 株式会社デンソーテン | 画像処理装置、表示装置、携帯用表示装置、画像処理方法、表示装置の表示方法、携帯用表示装置の表示方法及びプログラム |
JP6482696B2 (ja) | 2017-06-23 | 2019-03-13 | キヤノン株式会社 | 表示制御装置、表示制御方法、およびプログラム |
EP3480779A1 (en) * | 2017-11-01 | 2019-05-08 | Volvo Car Corporation | Method and system for handling images |
DE102018102051B4 (de) * | 2018-01-30 | 2021-09-02 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zum Darstellen eines Umgebungsbereichs eines Kraftfahrzeugs mit einem Bildfenster in einem Bild, Computerprogrammprodukt sowie Anzeigesystem |
JP7147255B2 (ja) | 2018-05-11 | 2022-10-05 | トヨタ自動車株式会社 | 画像表示装置 |
CN110843672A (zh) * | 2018-08-21 | 2020-02-28 | 上海博泰悦臻网络技术服务有限公司 | 模拟后车视角的车辆行驶全景监控方法、系统、及车辆 |
JP6731020B2 (ja) * | 2018-09-03 | 2020-07-29 | 株式会社Subaru | 車外環境認識装置および車外環境認識方法 |
JP7073991B2 (ja) * | 2018-09-05 | 2022-05-24 | トヨタ自動車株式会社 | 車両用周辺表示装置 |
US11987182B2 (en) | 2018-12-11 | 2024-05-21 | Sony Group Corporation | Image processing apparatus, image processing method, and image processing system |
CN113170083A (zh) * | 2018-12-11 | 2021-07-23 | 索尼集团公司 | 图像处理装置、图像处理方法和图像处理系统 |
CN113228157B (zh) * | 2018-12-27 | 2024-07-12 | 本田技研工业株式会社 | 图像显示装置、图像显示系统及图像显示方法 |
JP2020135206A (ja) * | 2019-02-15 | 2020-08-31 | パナソニックIpマネジメント株式会社 | 画像処理装置、車載用カメラシステム及び画像処理方法 |
JP7247851B2 (ja) * | 2019-10-11 | 2023-03-29 | トヨタ自動車株式会社 | 運転者支援装置 |
CN111918035B (zh) * | 2020-07-31 | 2022-04-15 | 上海励驰半导体有限公司 | 车载环视方法、装置、存储介质及车载终端 |
KR102473407B1 (ko) * | 2020-12-28 | 2022-12-05 | 삼성전기주식회사 | 틸트 카메라를 이용한 차량의 svm 시스템 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP2002109697A * | 2000-10-02 | 2002-04-12 | Matsushita Electric Ind Co Ltd | Driving assistance device |
- JP2002135765A * | 1998-07-31 | 2002-05-10 | Matsushita Electric Ind Co Ltd | Camera calibration instruction device and camera calibration device |
- JP2005170286A * | 2003-12-12 | 2005-06-30 | Ichikoh Ind Ltd | Video display device for vehicle |
- JP2006279511A * | 2005-03-29 | 2006-10-12 | Aisin Seiki Co Ltd | Surroundings monitoring system |
- JP2009093485A * | 2007-10-10 | 2009-04-30 | Nippon Soken Inc | Image generation device and image generation program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JPH0399952A (ja) * | 1989-09-12 | 1991-04-25 | Nissan Motor Co Ltd | Vehicle surroundings monitor |
- JP3468658B2 (ja) | 1997-03-19 | 2003-11-17 | Mitsubishi Motors Corp | Vehicle periphery visual recognition device |
- EP2309453A3 (en) * | 1998-07-31 | 2012-09-26 | Panasonic Corporation | Image displaying apparatus and image displaying method |
- EP1227683B1 (en) * | 1999-10-12 | 2006-07-12 | Matsushita Electric Industrial Co., Ltd. | Monitor camera, method of adjusting camera, and vehicle monitor system |
- JP2001114048A (ja) | 1999-10-20 | 2001-04-24 | Matsushita Electric Ind Co Ltd | In-vehicle driving assistance information display device |
- JP4218453B2 (ja) | 2003-07-15 | 2009-02-04 | Denso Corp | Vehicle forward visibility assistance device |
- JP2005223524A (ja) | 2004-02-04 | 2005-08-18 | Nissan Motor Co Ltd | Vehicle periphery monitoring device |
-
2009
- 2009-05-29 JP JP2009130100A patent/JP2010274813A/ja active Pending
-
2010
- 2010-05-28 CN CN2010800236866A patent/CN102448773A/zh active Pending
- 2010-05-28 US US13/322,654 patent/US8941737B2/en active Active
- 2010-05-28 WO PCT/JP2010/059074 patent/WO2010137684A1/ja active Application Filing
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2581268B1 (en) | 2011-10-13 | 2016-05-11 | Harman Becker Automotive Systems GmbH | Method of controlling an optical output device for displaying a vehicle surround view and vehicle surround view system |
EP2581268B2 (en) † | 2011-10-13 | 2019-09-11 | Harman Becker Automotive Systems GmbH | Method of controlling an optical output device for displaying a vehicle surround view and vehicle surround view system |
WO2014027489A1 (ja) * | 2012-08-17 | 2014-02-20 | 本田技研工業株式会社 | 運転支援装置 |
JP5836490B2 (ja) * | 2012-08-17 | 2015-12-24 | 本田技研工業株式会社 | 運転支援装置 |
US9789819B2 (en) | 2012-08-17 | 2017-10-17 | Honda Motor Co., Ltd. | Driving assistance device |
US20220122457A1 (en) * | 2017-10-13 | 2022-04-21 | Waymo Llc | Lane change notification |
US11837093B2 (en) * | 2017-10-13 | 2023-12-05 | Waymo Llc | Lane change notification |
USD991273S1 (en) * | 2021-11-17 | 2023-07-04 | Mazda Motor Corporation | Portion of a display screen with graphical user interface |
USD991949S1 (en) * | 2021-11-17 | 2023-07-11 | Mazda Motor Corporation | Portion of a display screen with graphical user interface |
Also Published As
Publication number | Publication date |
---|---|
CN102448773A (zh) | 2012-05-09 |
US20120069187A1 (en) | 2012-03-22 |
JP2010274813A (ja) | 2010-12-09 |
US8941737B2 (en) | 2015-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2010137684A1 (ja) | Image generation device and image display system | |
- JP5271154B2 (ja) | Image generation device and image display system | |
- JP5087051B2 (ja) | Image generation device and image display system | |
US8031225B2 (en) | Surroundings monitoring system for a vehicle | |
- JP4254887B2 (ja) | Image display system for vehicle | |
- JP5321267B2 (ja) | Vehicle image display device and method for displaying overhead image | |
EP2368768B1 (en) | Display device | |
- JP3916958B2 (ja) | Vehicle rear monitor system and monitor device | |
- WO2011078201A1 (ja) | Image processing device, image processing system, and image processing method | |
- CN107298050B (zh) | Image display device | |
- WO2011078183A1 (ja) | Image processing device, image processing system, and image processing method | |
- JP5251804B2 (ja) | Driving assistance device | |
US10793069B2 (en) | Method for assisting the driver of a motor vehicle in maneuvering the motor vehicle with a trailer, driver assistance system as well as vehicle/trailer combination | |
- JP5067169B2 (ja) | Vehicle parking assistance device and image display method | |
- JP2005186648A (ja) | Surroundings visual recognition device and display control device for vehicle | |
- JP2016063390A (ja) | Image processing device and image display system | |
- CN110895443A (zh) | Display control device | |
- JP4059309B2 (ja) | Image display control method for in-vehicle camera and device therefor | |
- JP2010184606A (ja) | Vehicle periphery display device | |
- JP5466743B2 (ja) | Image generation device and image display system | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080023686.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10780638 Country of ref document: EP Kind code of ref document: A1 |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 13322654 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10780638 Country of ref document: EP Kind code of ref document: A1 |