WO2019187283A1 - Image processing device, image display system, and image processing method - Google Patents

Image processing device, image display system, and image processing method

Info

Publication number
WO2019187283A1
WO2019187283A1 (PCT/JP2018/039984)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
vehicle
face
face position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/039984
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
直史 吉田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Priority to DE112018007360.2T (DE112018007360T5)
Publication of WO2019187283A1 publication Critical patent/WO2019187283A1/ja
Priority to US16/992,691 (US11034305B2)

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12Mirror assemblies combined with other articles, e.g. clocks
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/02Rear-view mirror arrangements
    • B60R1/04Rear-view mirror arrangements mounted inside vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/28Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with an adjustable field of view
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/65Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
    • B60K35/658Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive the instruments being ergonomically adjustable to the user
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12Mirror assemblies combined with other articles, e.g. clocks
    • B60R2001/1215Mirror assemblies combined with other articles, e.g. clocks with information displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/12Mirror assemblies combined with other articles, e.g. clocks
    • B60R2001/1253Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/202Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used displaying a blind spot scene on the vehicle part responsible for the blind spot
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/602Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
    • B60R2300/605Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • B60R2300/8026Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views in addition to a rear-view mirror system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8046Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for replacing a rear-view mirror system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8066Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring rearward traffic
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20112Image segmentation details
    • G06T2207/20132Image cropping
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30268Vehicle interior

Definitions

  • the present disclosure relates to an image processing device, an image display system, and an image processing method that perform image processing on an image obtained from an imaging device disposed in a vehicle.
  • Japanese Patent Application Laid-Open No. 2004-228561 discloses an image display apparatus that detects a user's viewpoint coordinates in a coordinate system defined on a rearview mirror provided with an image display, and controls the display form on the image display based on the detected viewpoint coordinates.
  • However, the apparatus of Patent Document 1 has a problem: it is difficult for the user to grasp the positional relationship between the displayed image of the area behind the vehicle and the vehicle itself.
  • In view of this, the present disclosure provides an image processing apparatus, an image display system, and an image processing method that can output a captured image of the area behind the vehicle in which the positional relationship with the vehicle is easy to grasp.
  • An image processing device according to the present disclosure includes: an acquisition unit that acquires, from an imaging device disposed in the vehicle and oriented to capture the area behind the vehicle, a first image captured by the imaging device; a position detection unit that detects the driver's face position; and an image processing unit that cuts out, from a target image including the first image, a second image in a range corresponding to the face position detected by the position detection unit, performs image processing that superimposes a position image indicating the position of the vehicle at a position of the second image corresponding to the face position, and outputs a third image resulting from the image processing.
  • The image processing apparatus and related aspects of the present disclosure can thus output a captured image of the area behind the vehicle in which the positional relationship with the vehicle is easy to grasp.
  • FIG. 1 is a schematic diagram illustrating an example of a vehicle according to an embodiment.
  • FIG. 2 is a block diagram illustrating an example of a functional configuration of the image display system according to the embodiment.
  • FIG. 3 is a diagram for describing an imaging range behind the vehicle that is imaged by the imaging device according to the embodiment.
  • FIG. 4 is an explanatory diagram showing images before and after the image processing by the image processing apparatus according to the embodiment and in the image processing process.
  • FIG. 5 is a diagram for explaining a range in which the image processing unit cuts out an image at a face position during normal driving.
  • FIG. 6 is a diagram for explaining a range in which the image processing unit cuts out an image at the face position after movement.
  • FIG. 7 is a diagram for explaining a difference in image processing by the image processing unit before and after the face position moves.
  • FIG. 8 is a sequence diagram illustrating an example of the operation of the image display system according to the embodiment.
  • FIG. 9 is a flowchart illustrating an example of image processing in the image processing apparatus according to the embodiment.
  • FIG. 10 is a block diagram illustrating an example of a functional configuration of an image display system according to the second modification.
  • FIG. 11 is a flowchart illustrating an example of image processing in the image processing apparatus according to the second modification.
  • In Patent Document 1, the scenery in the rear-window direction is displayed on the image display based on image information acquired by a camera, and scenery hidden in a blind spot is displayed using a wire frame or the like.
  • With this approach, however, the image becomes complicated, so the positional relationship between the vehicle and its surroundings, and the area behind the vehicle in the displayed image, are difficult for the user to grasp.
  • The present inventor therefore studied this problem intensively and arrived at an image processing apparatus, an image display system, and an image processing method having the following configuration.
  • An image processing device according to an aspect of the present disclosure includes: an acquisition unit that acquires, from an imaging device disposed in the vehicle and oriented to capture the area behind the vehicle, a first image captured by the imaging device; a position detection unit that detects the driver's face position; and an image processing unit that cuts out, from a target image including the first image, a second image in a range corresponding to the face position detected by the position detection unit, performs image processing that superimposes a position image indicating the position of the vehicle at a position of the second image corresponding to the face position, and outputs a third image resulting from the image processing.
  • With this configuration, the position of the range cut out from the target image, which includes the image captured by the imaging device, and the position of the position image superimposed on the second image are determined according to the detected face position of the driver. An image can therefore be cut out from the target image of the area behind the vehicle in a range appropriate to the driver's face position, and the position image can be superimposed at an appropriate position of the cut-out image according to that face position. It is thus possible to output a captured image of the area behind the vehicle in which the positional relationship with the vehicle is easy to grasp. By looking at the image displayed on a display device provided, for example, as a substitute for the rearview mirror, the driver can grasp the situation behind the vehicle with little discomfort.
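The flow described above — crop a face-dependent range from the target image, overlay a position image, output the result — can be sketched as follows. The array sizes, gains, and function names are illustrative assumptions for this sketch, not the patent's implementation:

```python
import numpy as np

# Hypothetical sketch of the claimed pipeline: cut out a range of the
# target image according to the detected face position (the "second
# image"), then overlay a small position image (vehicle schematic) at a
# face-dependent offset to produce the "third image".

CROP_W, CROP_H = 640, 360          # assumed size of the second image

def process_frame(target, face_x_norm, position_img):
    """target: H x W x 3 array (target image including the first image).
    face_x_norm: driver's face position across the cabin, 0.0..1.0.
    position_img: small RGB array to superimpose (vehicle schematic)."""
    h, w, _ = target.shape
    # Crop range moves opposite to the face, mirror-like:
    crop_x = int((1.0 - face_x_norm) * (w - CROP_W))
    second = target[0:CROP_H, crop_x:crop_x + CROP_W].copy()
    # Position image moves the same way as the face:
    ph, pw, _ = position_img.shape
    over_x = int(face_x_norm * (CROP_W - pw))
    second[CROP_H - ph:CROP_H, over_x:over_x + pw] = position_img
    return second                   # the third image

frame = np.zeros((720, 1280, 3), dtype=np.uint8)
schematic = np.full((60, 120, 3), 255, dtype=np.uint8)
out = process_frame(frame, 0.25, schematic)
print(out.shape)                    # (360, 640, 3)
```

A real implementation would blend the schematic with adjustable transparency rather than overwrite pixels, as described later for the illuminance-dependent case.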
  • Further, the image processing unit may cut out, from the target image, a range estimated to be visible to the driver via the display surface when the driver's face is at the position detected by the position detection unit.
  • With this, the image processing apparatus cuts out from the target image the range estimated to be visible when it is assumed that the driver is looking at a rearview mirror, so an image causing little discomfort can be provided even to a driver accustomed to a rearview mirror.
  • Further, when the face position moves to one side in the left-right direction of the vehicle, the image processing unit may move the range in which the second image is cut out from the target image to the other side in the left-right direction relative to the range before the movement, and may move the position at which the position image is superimposed to the one side in the left-right direction relative to the position before the movement.
  • With this, when the face position moves to one side, the image processing apparatus moves the cut-out range to the other side and moves the superimposition position of the position image to the one side. This is because, when the face position moves to one side in the left-right direction, the range reflected in a rearview mirror is estimated to shift to the other side compared with before the movement. Likewise, when the face position moves to one side, the line-of-sight direction through the rearview mirror moves to the other side, so the equipment in the vehicle interior appearing in the mirror is estimated to move to the one side. By determining the cut-out range and the superimposition position in this way, the image processing apparatus can provide an image with little discomfort to a driver accustomed to a rearview mirror.
  • Further, the image processing unit may make a first distance, by which the superimposition position of the position image is moved, longer than a second distance, by which the range in which the second image is cut out from the target image is moved.
  • Here, the imaging target of the image captured by the imaging device is outside the vehicle and is farther from the driver than the vehicle-interior equipment indicated by the position image. Therefore, when the driver moves the face position in the left-right direction and looks at a rearview mirror, nearby equipment appears to move a greater distance than objects outside the vehicle at a far position. By making the first distance longer than the second distance as described above, the position of the cut-out range and the position of the superimposed position image can be determined in the same way as when looking at a rearview mirror, so an image with little discomfort can be provided to a driver accustomed to a rearview mirror.
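This parallax-style behaviour can be expressed as two offsets with different gains. The gain values below are purely illustrative assumptions; the patent does not specify numbers:

```python
# When the face moves dx pixels to one side, nearby cabin equipment in
# a mirror appears to shift farther than the distant scene behind the
# vehicle, so the overlay offset (first distance) exceeds the crop
# offset (second distance), and the two move in opposite directions.

OVERLAY_GAIN = 1.5   # assumed gain for the position image (near object)
CROP_GAIN = 0.5      # assumed gain for the cut-out range (far scene)

def offsets_for_face_shift(dx_px):
    first_distance = OVERLAY_GAIN * dx_px    # position image: same side as face
    second_distance = CROP_GAIN * dx_px      # crop range: opposite side
    return first_distance, -second_distance

f, s = offsets_for_face_shift(100.0)
print(f, s)   # 150.0 -50.0
```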
  • Further, the position image may be an image smaller than the second image, and may be a schematic diagram showing equipment of the vehicle located behind the driver.
  • With this, since the position image is smaller than the cut-out image, the complexity of the superimposed image can be reduced.
  • Moreover, since the position image is a schematic diagram showing the equipment of the vehicle, the driver can, by viewing the image, intuitively grasp the positional relationship between the vehicle and its surroundings and the area behind the vehicle in the displayed image.
  • the image processing unit may change the transparency of the position image according to a predetermined input.
  • Further, an illuminance sensor that detects the illuminance around the vehicle may be provided; the predetermined input may be the illuminance detected by the illuminance sensor, and the image processing unit may increase the transparency as the illuminance decreases.
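A simple illuminance-to-transparency mapping might look like the following sketch. The thresholds and opacity range are assumptions for illustration; the patent only states that transparency increases as illuminance decreases:

```python
# Map ambient illuminance to overlay opacity: the darker the
# surroundings, the more transparent (less opaque) the position image,
# so it does not stand out at night. Linear interpolation between
# assumed dark/bright thresholds.

def position_image_alpha(illuminance_lux,
                         dark_lux=10.0, bright_lux=1000.0,
                         min_alpha=0.2, max_alpha=0.8):
    """Return overlay opacity (0 = fully transparent, 1 = fully opaque)."""
    if illuminance_lux <= dark_lux:
        return min_alpha
    if illuminance_lux >= bright_lux:
        return max_alpha
    t = (illuminance_lux - dark_lux) / (bright_lux - dark_lux)
    return min_alpha + t * (max_alpha - min_alpha)

print(position_image_alpha(5.0))     # 0.2  (night: highly transparent)
print(position_image_alpha(1500.0))  # 0.8  (daytime: more opaque)
```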
  • Further, a storage unit that stores a plurality of different types of position images may be provided, and the image processing unit may superimpose, on the second image, one or more position images selected in advance from the plurality of types of position images stored in the storage unit.
  • Further, a determination unit that determines whether a following vehicle is located within a predetermined distance from the vehicle may be provided, and the image processing unit may omit superimposing the position image in the image processing when the determination unit determines that a following vehicle is located within the predetermined distance.
  • With this, when a following vehicle is close, the image processing unit does not superimpose the position image, so the discomfort given to the driver by a large difference between the size of the following vehicle shown in the image and the size of the position image can be reduced.
  • An image display system includes the above-described image processing device, the imaging device, and a display device that displays the third image output by the image processing device.
  • Note that these general or specific aspects may be realized as a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a CD-ROM, or as any combination of a system, method, integrated circuit, computer program, and recording medium.
  • the vehicle 1 includes an image display system 100.
  • the image display system 100 includes an image processing device 10, imaging devices 12, 13 and 14, and a display device 40.
  • In the following, the front-rear and left-right directions are defined with the traveling direction of the vehicle 1 as the front; in other words, they are the front-rear and left-right directions as seen by a user of the vehicle 1.
  • the imaging device 12 is a camera that is fixed near the left door of the vehicle 1 and photographs an imaging range R1 on the left rear side of the vehicle 1.
  • the imaging device 12 captures an imaging range R1 on the left rear side of the vehicle 1 and generates an image 51.
  • the image 51 generated by the imaging device 12 is also referred to as a left rear image.
  • the imaging device 13 is a camera that is fixed in the vicinity of the right door of the vehicle 1 and photographs an imaging range R2 on the right rear side of the vehicle 1.
  • the imaging device 13 captures an imaging range R2 on the right rear side of the vehicle 1 to generate an image 52.
  • the image 52 generated by the imaging device 13 is also referred to as a right rear image.
  • the imaging device 14 is a camera that is fixed in the vicinity of the rear bumper or trunk hood of the vehicle 1 and shoots the imaging range R3 at the center rear of the vehicle 1.
  • the imaging device 14 captures an imaging range R3 at the center rear of the vehicle 1 and generates an image 53.
  • the image 53 generated by the imaging device 14 is also referred to as a center rear image.
  • Each of the imaging devices 12 to 14 is arranged in the vehicle 1 so as to image the area behind the vehicle 1. Since the imaging devices 12 to 14 image the imaging ranges R1 to R3, together they cover an imaging range R10 wider than any of the individual ranges R1 to R3.
  • Here, the imaging ranges R1 and R3 of the imaging devices 12 and 14 partially overlap, and the imaging ranges R2 and R3 of the imaging devices 13 and 14 also partially overlap. Therefore, a common subject appears in parts of the left rear image and the center rear image, and likewise in parts of the right rear image and the center rear image.
  • Each of the imaging devices 12, 13, and 14 generates images under different shooting conditions. Specifically, the imaging devices are arranged at different positions and in different directions, and each acquires images at, for example, 60 fps. The optical characteristics of their optical systems may also differ from one another.
  • the image processing apparatus 10 acquires the images 51 to 53 from the respective imaging devices 12 to 14 and outputs, based on the acquired images 51 to 53, an image to be displayed on the display device 40. Specifically, from the composite image 50 covering the wide imaging range R10 obtained from the acquired images 51 to 53, the image processing apparatus 10 cuts out, according to the driver's face position, an image 54 in the range estimated to be visible to the driver through a rearview mirror when the display surface of the display device 40 is assumed to be that mirror, and outputs an image 60 based on the cut-out image 54. Details of the image processing apparatus 10 will be described later.
  • the display device 40 displays an image 60 showing the rear of the vehicle 1.
  • since the display device 40 displays the image 60 output by the image processing device 10, it can be used as a substitute for a conventional rearview mirror that shows the area behind the vehicle 1 by light reflection.
  • the display device 40 is disposed in the vicinity of the front central portion of the ceiling of the vehicle 1.
  • the image processing apparatus 10 includes an acquisition unit 101, a position detection unit 102, an image processing unit 103, and a storage unit 104.
  • the acquiring unit 101 acquires images 51 to 53 captured by the imaging devices 12 to 14 from the imaging devices 12 to 14, respectively.
  • the acquisition unit 101 acquires images captured by each of the imaging devices 12 to 14 at a plurality of different timings (for example, 60 fps).
  • the position detection unit 102 detects the driver's face position.
  • the position detection unit 102 includes an imaging unit 30 that is arranged on the display device 40 and images the display-surface side of the display device 40 (that is, the driver side of the display device 40), and specifies the driver's face position with respect to the display device 40 by executing face recognition processing on the image captured by the imaging unit 30.
  • the position detection unit 102 can specify the direction from the reference position of the display device 40 to the face position.
  • the reference position may be the position of the display device 40 where the imaging unit 30 is disposed, or may be the center position of the display device 40 in the horizontal direction.
  • the imaging unit 30 is a camera, for example.
  • the position detection unit 102 may detect the position of the driver's eyes, or may estimate the driver's face position by detecting the position of the driver's head.
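The face-position detection described above can be illustrated with a small sketch. The patent does not give code, so `estimate_face_direction`, its parameters, and the linear pixel-to-angle mapping (a small-angle approximation of the true `atan`-based mapping) are all illustrative assumptions; the face detector producing the bounding box is assumed to exist.

```python
def estimate_face_direction(bbox, image_width, horizontal_fov_deg):
    """Convert a detected face bounding box (from the imaging unit 30's
    image, detector assumed) into a horizontal angle from the camera's
    optical axis; negative = left of center, positive = right.
    The linear mapping is a small-angle approximation."""
    x_min, y_min, x_max, y_max = bbox
    face_center_x = (x_min + x_max) / 2.0
    # Offset of the face center from the image center, normalized to [-0.5, 0.5].
    normalized = face_center_x / image_width - 0.5
    # Spread that offset over the camera's horizontal field of view.
    return normalized * horizontal_fov_deg

# A face centered in a 1280-px-wide frame lies on the optical axis (0 degrees).
angle = estimate_face_direction((600, 200, 680, 300), 1280, 90.0)
```

The same function works whether the detector reports the eyes, the face, or the whole head, since only the box center is used.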
  • the image processing unit 103 generates the combined image 50 by combining the images 51 to 53 as described above.
  • the image processing unit 103 performs image processing in which, according to the driver's face position detected by the position detection unit 102, a range corresponding to the face position is cut out from the composite image 50 serving as the target image, and a position image 80 indicating the position of the vehicle 1 is superimposed on the cut-out image at a position corresponding to the face position. The image processing unit 103 then outputs the image 60 after the image processing.
  • when the driver's face position detected by the position detection unit 102 moves to one side in the left-right direction of the vehicle 1, the image processing unit 103 moves the range cut out from the composite image 50 to the other side in the left-right direction relative to the range before the face position moved, and moves the position where the position image 80 is superimposed to the one side in the left-right direction relative to the position before the face position moved.
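The opposite movements of the cut-out range and the superimposing position can be sketched in a few lines. This is only an illustration: `shift_crop_and_overlay`, the coordinate convention (x increasing to the right), and the gain values are assumptions, not from the patent; the overlay gain is made larger than the crop gain to reflect the later statement that the first distance d1 exceeds the second distance d2.

```python
def shift_crop_and_overlay(face_dx_px, crop_x, overlay_x,
                           crop_gain=1.0, overlay_gain=2.0):
    """Given a rightward face movement face_dx_px (> 0 = right), move the
    crop window left (opposite the face) and the overlay right (with the
    face), as a real mirror would. Gains are illustrative."""
    new_crop_x = crop_x - crop_gain * face_dx_px        # other side
    new_overlay_x = overlay_x + overlay_gain * face_dx_px  # same side
    return new_crop_x, new_overlay_x
```

For a 10 px rightward face shift, the crop origin moves 10 px left while the overlay moves 20 px right, matching the d1 > d2 relationship described later.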
  • an example in which the driver's face position moves to the right will be described with reference to FIGS. 5 to 7.
  • FIG. 5 is a diagram for explaining a range in which the image processing unit cuts out an image at a face position during normal driving.
  • FIG. 6 is a diagram for explaining a range in which the image processing unit cuts out an image at the face position after movement.
  • FIG. 7 is a diagram for explaining a difference in image processing by the image processing unit before and after the face position moves.
  • in each of FIGS. 5 and 6, (a) is a plan view of the front side of the interior of the vehicle 1 as viewed from above, and (b) is a plan view of the vehicle 1 as viewed from above showing, within the imaging range R10 imaged by the imaging devices 12 to 14, the range of the image displayed on the display device 40.
  • (a1) and (a2) of FIG. 7 are diagrams for explaining image processing by the image processing unit 103 in the state of FIG. 5: (a1) shows a processing example of cutting out the image 54 from the composite image 50, and (a2) shows a processing example of superimposing the position image 80 on the cut-out image 54.
  • (b1) and (b2) of FIG. 7 are diagrams for explaining image processing by the image processing unit 103 in the state of FIG. 6: (b1) shows a processing example of cutting out the image 55 from the composite image 50, and (b2) shows a processing example of superimposing the position image 80 on the cut-out image 55.
  • FIG. 5 and 6 indicate the front-rear direction of the vehicle 1.
  • before the face position moves, it is assumed that, as shown in (a) of FIG. 5, the driver U1 holds the steering wheel 22 and performs normal driving, seated at the center in the left-right direction of the driver's seat 21 of the vehicle 1. In this case, the direction D1 of the center of the field of view through the rearview mirror, as seen by the driver U1 at the face position P1 during normal driving when the display device 40 is assumed to be a rearview mirror, points to the rear of the vehicle 1, and the range cut out by the image processing unit 103 is assumed to be adjusted so that the corresponding image is displayed on the display device 40.
  • that is, in this case, as shown in (b) of FIG. 5, the image processing unit 103 determines, as the range for cutting out an image, a range 71 of the wide imaging range R10 corresponding to the angle range θ10 centered on the direction D1, with the center position (the center position in the left-right direction) of the display device 40 of the vehicle 1 as the starting point.
  • the angle range θ10 is an angle range obtained by combining the left and right angle ranges θ11 centered on the direction D1, with the center position of the display device 40 as the starting point.
  • the display device 40 in this case has its angle adjusted so that when the display surface of the display device 40 is assumed to be a rearview mirror and the driver U1 at the face position P1 looks at the display surface, the range immediately behind the vehicle 1 is reflected in the display surface. That is, the angle formed by the display surface of the display device 40 and the line from the face position P1 of the driver U1 to the center position P3 in the left-right direction of the display device 40 is approximately equal to the angle formed by the display surface and the direction D1.
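The equal-angle relationship above is ordinary mirror reflection, and the view direction D can be computed by reflecting the face direction about the mirror normal. The following 2-D sketch is an assumption-laden illustration (the patent gives no formula): the mirror center is the origin, the normal points into the cabin toward the driver, and the returned unit vector is the center direction of the mirrored view.

```python
import math

def mirrored_view_direction(face_pos, mirror_center, mirror_normal):
    """Reflect the unit vector from the mirror center to the face about
    the mirror's unit normal, giving the center direction D of the view
    seen in the mirror (2-D sketch, illustrative names)."""
    vx = face_pos[0] - mirror_center[0]
    vy = face_pos[1] - mirror_center[1]
    n = math.hypot(vx, vy)
    vx, vy = vx / n, vy / n          # unit vector mirror -> face
    nx, ny = mirror_normal           # assumed already unit length
    dot = vx * nx + vy * ny
    # Reflection of v about the normal: r = 2(v.n)n - v.
    return (2 * dot * nx - vx, 2 * dot * ny - vy)
```

With the face to the right of the mirror axis, the returned direction tilts to the left, which is exactly the behavior described for the face positions P1 and P2: moving the face right brings the left rear into view.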
  • the image processing unit 103 obtains an image 54 by cutting out the determined range 71 from the composite image 50. Then, as illustrated in (a2) of FIG. 7, the image processing unit 103 superimposes the position image 80 at the center position P11 in the horizontal direction of the cut-out image 54, and outputs the image 60 obtained by the superimposition to the display device 40.
  • the image 60 is an example of an image that is output after the image processing unit 103 performs image processing before the driver moves the face position.
  • as the case after the face position has moved, consider the case where the driver U1 moves the face to the right side of the driver's seat 21. In this case, the direction D2 of the center of the field of view through the rearview mirror, as seen by the driver U1, becomes a direction shifted to the left of the direction D1.
  • in this case, the image processing unit 103 determines, as the range for cutting out an image, a range 72 of the wide imaging range R10 corresponding to the angle range θ20 centered on the direction D2, with the center position (the center position in the left-right direction) of the display device 40 of the vehicle 1 as the starting point.
  • the angle range θ20 is an angle range obtained by combining the left and right angle ranges θ11 centered on the direction D2, with the center position of the display device 40 as the starting point.
  • the angle formed by the display surface of the display device 40 and the line from the face position P2 of the driver U1 to the center position P3 of the display device 40 is approximately equal to the angle formed by the display surface and the direction D2.
  • the image processing unit 103 obtains an image 55 by cutting out the determined range 72 from the composite image 50. Then, as illustrated in (b2) of FIG. 7, the image processing unit 103 superimposes the position image 80 at a position P12 that is a first distance d1 to the right of the position P11 at which the position image 80 was superimposed in the case of the face position P1, and outputs the image 61 obtained by the superimposition to the display device 40. At this time, when the face position moves from the face position P1 to the face position P2, the image processing unit 103 makes the first distance d1, by which the superimposing position of the position image 80 is moved, longer than the second distance d2, by which the cut-out range is moved from the range 71 in the composite image 50. That is, the image 61 is an example of an image output after the image processing unit 103 performs image processing after the driver moves the face position.
  • the first distance d1 is a distance on the image 61, and is represented by, for example, the number of pixels.
  • the second distance d2 is a distance on the composite image 50, and may be expressed, for example, by the number of pixels in an image obtained by converting a region of the composite image 50 having the same size as the cut-out range 71 to the same resolution as the image 61.
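Why d1 should exceed d2 follows from motion parallax: when the viewpoint moves, nearby objects (the cabin equipment shown by the position image 80) shift farther across the mirror than distant objects (the scenery behind the vehicle). A minimal small-angle sketch, with `apparent_shift_px`, the scale factor, and the example distances all being illustrative assumptions rather than values from the patent:

```python
def apparent_shift_px(head_shift_m, object_distance_m, scale_px_per_rad=1000.0):
    """Small-angle parallax sketch: the on-screen shift of an object when
    the viewpoint translates is inversely proportional to its distance."""
    return scale_px_per_rad * head_shift_m / object_distance_m

# A 10 cm head movement shifts in-cabin equipment (~1 m away) far more
# than a vehicle ~20 m behind; hence the overlay distance d1 > crop
# distance d2.
d1 = apparent_shift_px(0.10, 1.0)    # near: in-cabin equipment
d2 = apparent_shift_px(0.10, 20.0)   # far: the road behind
```

The 1 m / 20 m distances are placeholders; the qualitative ordering d1 > d2 holds for any near/far pair.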
  • the image processing unit 103 may perform a process of adjusting external parameters, which indicate the position and orientation in three-dimensional space of the cameras of the imaging devices 12 to 14, and internal parameters, which indicate the characteristics of the optical system such as the focal length, aberration, and image center of each camera (that is, calibration).
  • the storage unit 104 stores a position image 80.
  • the position image 80 is an image smaller than the cut-out images 54 and 55, and is a schematic diagram (for example, CG) showing the equipment of the vehicle 1 at a position behind the driver U1.
  • the storage unit 104 may store a plurality of different types of position images.
  • the image processing unit 103 superimposes, on the cut-out images 54 and 55, one or more position images selected in advance from among the plurality of types of position images 80 stored in the storage unit 104. The selected position image may be an image selected in advance by the user, or an image selected by the image processing apparatus 10 in the initial setting at the time of factory shipment.
  • the position image 80 may be a schematic diagram of a seat of the vehicle 1, or a schematic diagram of other equipment of the vehicle 1 such as a rear wiper, a rear speaker, or a rear pillar.
  • FIG. 8 is a sequence diagram showing an example of the operation of the image display system 100 according to the embodiment.
  • FIG. 9 is a flowchart illustrating an example of image processing in the image processing apparatus according to the embodiment.
  • each of the imaging devices 12 to 14 outputs an image obtained by imaging (S1 to S3).
  • the position detection unit 102 detects the face position of the driver U1 (S4), and the image processing unit 103 executes image processing according to the detected face position on the images obtained by the imaging devices 12 to 14 (S5) and outputs the processed image. Details of the image processing in step S5 will be described later.
  • the display device 40 acquires the image output from the image processing device 10 and displays the image (S6).
  • by repeatedly executing the processing of steps S1 to S6, the image display system 100 subjects the three images captured at the same timing by the imaging devices 12 to 14 to image processing according to the detected face position in real time, and displays the processed image on the display device 40.
  • the acquisition unit 101 acquires three images 51 to 53 captured by the imaging apparatuses 12 to 14 (S11).
  • the image processing unit 103 combines the three images 51 to 53 to obtain a combined image 50 (S12).
  • the image processing unit 103 cuts out the image 54 in the range 71 corresponding to the face position in the composite image 50 according to the detected face position (S13).
  • the image processing unit 103 superimposes the position image 80 at a position corresponding to the face position of the cut-out image 54 (S14).
  • the image processing unit 103 outputs the image 60 (or the image 61) after the position image 80 is superimposed to the display device 40 (S15).
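Steps S11 to S15 above can be sketched end to end on toy one-dimensional "images" (plain Python lists of pixel values). This is only an illustration of the flow: `process_frame` and its parameters are assumed names, naive concatenation stands in for real stitching of the overlapping ranges, and the fixed 2x overlay gain is one way of realizing d1 > d2, not the patent's method.

```python
def process_frame(images, face_offset, position_image, crop_width):
    """Sketch of steps S11-S15 on 1-D toy images (illustrative)."""
    # S12: combine the camera images (naive concatenation; real stitching
    # would blend the overlapping regions of adjacent cameras).
    composite = []
    for img in images:
        composite.extend(img)
    # S13: cut out a window; it slides opposite to the face movement.
    center = len(composite) // 2 - face_offset
    start = max(0, min(center - crop_width // 2, len(composite) - crop_width))
    cropped = composite[start:start + crop_width]
    # S14: superimpose the position image; it slides with the face, and
    # the 2x gain makes its shift (d1) exceed the crop shift (d2).
    overlay_at = crop_width // 2 + 2 * face_offset
    for i, p in enumerate(position_image):
        j = overlay_at + i
        if 0 <= j < len(cropped):
            cropped[j] = p
    return cropped  # S15: the frame handed to the display device
```

With a centered face the overlay sits mid-frame; a one-pixel rightward face offset slides the crop left by one pixel and the overlay right by two.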
  • the frequency of performing the face position detection process may be lower than the frequency of the image processing. While the vehicle 1 is traveling, the scenery around the vehicle 1 keeps changing, so the latest images obtained by the imaging devices 12 to 14 must be used; the face position, on the other hand, moves less frequently, and the speed at which the face position moves is slower than the speed at which the vehicle 1 travels.
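The lower detection frequency described above amounts to simple throttling: run the expensive face detector only every Nth frame and reuse the last result in between. A minimal sketch (`ThrottledFaceTracker` and its interface are assumed names, not from the patent):

```python
class ThrottledFaceTracker:
    """Runs the (expensive) face detector only every `interval` frames,
    reusing the last known face position for the frames in between."""

    def __init__(self, detector, interval):
        self.detector = detector      # callable: frame -> face position
        self.interval = interval      # detect on every Nth frame
        self.frame_count = 0
        self.last_position = None

    def update(self, frame):
        if self.frame_count % self.interval == 0:
            self.last_position = self.detector(frame)
        self.frame_count += 1
        return self.last_position
```

At 60 fps image processing with `interval=3`, detection would run at 20 Hz, which is plausible given that head movement is slow compared with the changing scenery.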
  • the image processing apparatus 10 includes an acquisition unit 101, a position detection unit 102, and an image processing unit 103.
  • the acquisition unit 101 acquires images 51 to 53 captured by the imaging devices 12 to 14 arranged in the vehicle 1 in a direction to capture the rear of the vehicle 1 from the imaging devices 12 to 14.
  • the position detection unit 102 detects the face position of the driver U1.
  • according to the face position detected by the position detection unit 102, the image processing unit 103 performs image processing in which a range 71 corresponding to the face position P1 is cut out from the composite image 50 obtained by combining the images 51 to 53, and a position image 80 indicating the position of the vehicle 1 is superimposed on the resulting image 54 at a position P11 corresponding to the face position P1; it then outputs the image 60 (or the image 61) after the image processing.
  • the driver U1 can grasp the situation behind the vehicle 1 with less discomfort by looking at the image displayed on the display device 40 provided as a substitute for the rearview mirror, for example.
  • the image processing unit 103 cuts out from the composite image 50 the ranges 71 and 72 estimated to be visible, via the display surface, to the driver U1 at the face positions P1 and P2 detected by the position detection unit 102, on the assumption that the display surface of the display device 40 arranged in the vehicle interior of the vehicle 1 is a mirror.
  • since the image processing apparatus 10 cuts out from the target image the range estimated to be visible when the driver is assumed to be looking at a rearview mirror, it can provide an image that causes little discomfort even to a driver who is accustomed to a rearview mirror.
  • when the face position P2 detected by the position detection unit 102 has moved to one side (for example, the right side) in the left-right direction of the vehicle 1, the image processing unit 103 moves the range 72 from which the image is cut out of the composite image 50 to the other side (for example, the left side) relative to the range 71 before the face position moved, and moves the position P12 at which the position image 80 is superimposed to the one side (for example, the right side) relative to the position P11 before the movement.
  • in this way, when the face position moves to one side, the image processing apparatus 10 moves the cut-out range to the other side (for example, the left side) and moves the position where the position image is superimposed to the one side (for example, the right side). This is because, when the face position moves to one side (for example, the right side) in the left-right direction, the range on the other side (for example, the left side) is estimated to become visible in the rearview mirror compared with before the movement.
  • since the image processing apparatus 10 determines the range to be cut out and the position at which the position image is superimposed in this manner, it can provide an image with little discomfort to a driver accustomed to a rearview mirror.
  • when the detected face position moves, the image processing unit 103 makes the first distance d1, by which the position where the position image 80 is superimposed is moved, longer than the second distance d2, by which the range cut out from the composite image 50 is moved.
  • the imaging targets of the images captured by the imaging devices 12 to 14 are outside the vehicle 1 and are farther from the driver U1 than the equipment inside the vehicle 1 indicated by the position image 80. For this reason, when the driver U1 moves the face position in the left-right direction and looks at a rearview mirror, the nearby equipment of the vehicle 1 appears to move a greater distance in the mirror than objects outside the vehicle 1 located farther away. Therefore, by setting the first distance d1 longer than the second distance d2 as described above, the position of the range cut out from the target image containing the captured images and the position at which the position image is superimposed on the second image can be determined in the same manner as when looking at a rearview mirror. An image with little discomfort can thus be provided to a driver accustomed to a rearview mirror.
  • the position image 80 is an image smaller than the cut-out images 54 and 55, and is a schematic diagram showing equipment of the vehicle 1 located behind the driver U1.
  • since the position image 80 is smaller than the cut-out images 54 and 55, it is less likely that the superimposed images 60 and 61 become cluttered.
  • since the position image 80 is a schematic diagram showing equipment of the vehicle 1, by looking at the images 60 and 61 the driver U1 can intuitively grasp the positional relationship between the vehicle 1 and its surroundings shown in the images 54 and 55, and the area behind the vehicle in the displayed images 60 and 61.
  • the image processing apparatus according to Modification 1 is similar in configuration to that of the embodiment, except that the image processing unit 103 further executes the following processing.
  • the image processing unit 103 may further change the transparency of the position image 80 in accordance with a predetermined input. That is, the image processing unit 103 may adjust the transparency of the position image 80 to be superimposed on the cut-out images 54 and 55 and may superimpose the adjusted position image 80.
  • the image processing unit 103 may adjust the transparency of the position image 80 using a value set by the user as a predetermined input, for example.
  • the image processing unit 103 may adjust the transparency of the position image 80 using the illuminance detected by the illuminance sensor as a predetermined input, for example. In this case, the image processing unit 103 increases the transparency of the position image 80 as the illuminance detected by the illuminance sensor decreases.
  • the illuminance sensor in this case is a sensor that is provided in the vehicle 1 and detects the illuminance around the vehicle 1.
  • the illuminance sensor may be disposed inside the vehicle 1 or may be disposed outside the vehicle 1. As long as the illuminance outside the vehicle 1 can be estimated using the detection result of the illuminance sensor, it may be arranged at any position.
  • the image processing unit 103 changes the transparency of the position image 80 in accordance with a predetermined input. For this reason, for example, it is possible to provide an image on which a position image whose transparency is changed according to the preference of the driver, the brightness of the passenger compartment, and the like is superimposed.
  • the image processing unit 103 increases the transparency as the illuminance detected by the illuminance sensor decreases. For this reason, when the captured image of objects outside the vehicle 1 is estimated to be dark and those objects are difficult to see, increasing the transparency of the position image 80 can reduce the loss of visibility of the objects outside the vehicle 1.
  • FIG. 10 is a block diagram illustrating an example of a functional configuration of the image display system according to the second modification.
  • an image display system 100A according to the second modification differs from the image display system 100 according to the embodiment in that the image processing apparatus 10A further includes a determination unit 105, and in the processing performed by the image processing unit 103A.
  • the other configuration of the image display system 100A according to the modification 2 is the same as that of the image display system 100 according to the embodiment, and thus the description thereof is omitted.
  • the determination unit 105 determines whether or not a following vehicle is located within a predetermined distance from the vehicle 1.
  • the determination unit 105 may make this determination, for example, by performing image analysis on an image obtained by the imaging device 14.
  • alternatively, the determination unit 105 may make the determination using the detection result of a distance sensor.
  • in the image processing described in the embodiment, when a following vehicle is located within the predetermined distance, the image processing unit 103A cuts out from the composite image 50 the range corresponding to the detected face position but does not superimpose the position image 80.
  • FIG. 11 is a flowchart illustrating an example of image processing in the image processing apparatus according to the second modification.
  • the image processing according to the modified example 2 is different from the image processing according to the embodiment in that steps S21 and S22 are added. A description of the same processing as the image processing according to the embodiment is omitted.
  • steps S11 to S13 are performed as in the embodiment.
  • after step S13, the determination unit 105 determines whether or not a following vehicle is located within a predetermined distance from the vehicle 1 (S21).
  • when the determination unit 105 determines that a following vehicle is located within the predetermined distance from the vehicle 1 (Yes in S21), the image processing unit 103A outputs the cut-out image without superimposing the position image 80 (S22).
  • otherwise (No in S21), the image processing unit 103A performs steps S14 and S15.
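The S21/S22 branch can be sketched as a small conditional around the overlay step. `compose_output`, the 10 m threshold, and the 1-D list representation are illustrative assumptions; a `None` distance stands for "no following vehicle detected".

```python
def compose_output(cropped, position_image, following_distance_m,
                   threshold_m=10.0, overlay_at=0):
    """Skip the position-image overlay (S22) when a following vehicle is
    within threshold_m of the own vehicle; otherwise superimpose (S14)."""
    if following_distance_m is not None and following_distance_m <= threshold_m:
        return cropped                  # S22: output without the overlay
    out = list(cropped)                 # S14: superimpose the position image
    for i, p in enumerate(position_image):
        j = overlay_at + i
        if 0 <= j < len(out):
            out[j] = p
    return out                          # S15: output to the display
```

The branch keeps the close-up following vehicle from appearing next to a disproportionately small schematic, which is exactly the discomfort the modification avoids.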
  • when a following vehicle is close, the difference between the size of the following vehicle shown in the image and the size of the position image 80 becomes large; by not superimposing the position image 80 in this case, the discomfort that this size difference would give the driver U1 can be reduced.
  • the image display systems 100 and 100A according to the above embodiment and the first and second modifications thereof are configured to include the plurality of imaging devices 12 to 14, but the configuration is not limited thereto and may include a single imaging device.
  • the position detection unit 102 includes the imaging unit 30 and specifies the driver's face position with respect to the display device 40 by executing face recognition processing on the image obtained by the imaging unit 30.
  • the position detection unit may include a depth sensor and may specify the driver's face position using a result detected by the depth sensor.
  • the position detection unit may have a thermography and may specify the driver's face position using a result detected by the thermography.
  • each component of the image processing apparatus 10 may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the software that realizes the image processing apparatus according to each of the above embodiments is the following program.
  • the program causes a computer to execute an image processing method of: acquiring, from an imaging device arranged in a vehicle in a direction to image the rear of the vehicle, a first image captured by the imaging device; detecting a driver's face position; performing, in accordance with the detected face position, image processing in which a range corresponding to the face position is cut out from a target image including the first image and a position image indicating the position of the vehicle is superimposed on the second image obtained by the cutting at a position corresponding to the face position; and outputting a third image after the image processing.
  • the image processing device, the image display system, and the image processing method according to one or more aspects of the present disclosure have been described based on the embodiments.
  • however, the present disclosure is not limited to these embodiments. Unless they deviate from the gist of the present disclosure, forms obtained by applying various modifications conceived by those skilled in the art to the present embodiment, and forms configured by combining components of different embodiments, may also be included within the scope of one or more aspects of the present disclosure.
  • the present disclosure is useful as an image processing device, an image display system, an image processing method, and the like that can output an image captured behind a vehicle in which the positional relationship with the vehicle is easily grasped.

PCT/JP2018/039984 2018-03-28 2018-10-26 画像処理装置、画像表示システムおよび画像処理方法 Ceased WO2019187283A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112018007360.2T DE112018007360T5 (de) 2018-03-28 2018-10-26 Bildverarbeitungsvorrichtung, Bildanzeigesystem und Bildverarbeitungsverfahren
US16/992,691 US11034305B2 (en) 2018-03-28 2020-08-13 Image processing device, image display system, and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-062600 2018-03-28
JP2018062600A JP6890288B2 (ja) 2018-03-28 2018-03-28 画像処理装置、画像表示システムおよび画像処理方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/992,691 Continuation US11034305B2 (en) 2018-03-28 2020-08-13 Image processing device, image display system, and image processing method

Publications (1)

Publication Number Publication Date
WO2019187283A1 true WO2019187283A1 (ja) 2019-10-03

Family

ID=68061066

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/039984 Ceased WO2019187283A1 (ja) 2018-03-28 2018-10-26 画像処理装置、画像表示システムおよび画像処理方法

Country Status (4)

Country Link
US (1) US11034305B2 (en)
JP (1) JP6890288B2 (en)
DE (1) DE112018007360T5 (en)
WO (1) WO2019187283A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7147255B2 (ja) * 2018-05-11 2022-10-05 トヨタ自動車株式会社 画像表示装置
KR20210100608A (ko) * 2018-12-11 2021-08-17 소니그룹주식회사 화상 처리 장치, 화상 처리 방법 및 화상 처리 시스템
US20200294194A1 (en) * 2019-03-11 2020-09-17 Nvidia Corporation View synthesis using neural networks
CN110956134B (zh) * 2019-11-29 2023-08-25 华人运通(上海)云计算科技有限公司 人脸识别方法、装置、设备以及计算机可读存储介质
JP2024063450A (ja) 2022-10-26 2024-05-13 トヨタ自動車株式会社 車両周辺監視システム、車両周辺監視方法及びプログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003196645A (ja) * 2001-12-28 2003-07-11 Equos Research Co Ltd 車両の画像処理装置
JP2005335410A (ja) * 2004-05-24 2005-12-08 Olympus Corp 画像表示装置
JP2010109684A (ja) * 2008-10-30 2010-05-13 Clarion Co Ltd 車両周辺画像表示システム
JP2015168271A (ja) * 2014-03-04 2015-09-28 サカエ理研工業株式会社 ルームミラー及びそのルームミラーを用いた車両死角支援装置並びにそのルームミラー又は車両死角支援装置の表示画像の調整方法
JP2016195301A (ja) * 2015-03-31 2016-11-17 パナソニックIpマネジメント株式会社 画像処理装置、および、電子ミラーシステム

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6891563B2 (en) * 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
JP4855158B2 (ja) * 2006-07-05 2012-01-18 本田技研工業株式会社 運転支援装置
JP5088669B2 (ja) * 2007-03-23 2012-12-05 株式会社デンソー 車両周辺監視装置
JP2010128794A (ja) * 2008-11-27 2010-06-10 Aisin Seiki Co Ltd 車両周辺認知支援装置
US20140347488A1 (en) * 2011-12-09 2014-11-27 Nissan Motor Co., Ltd. Video display mirror and video display mirror system
JP5321711B2 (ja) 2012-04-23 2013-10-23 日産自動車株式会社 車両用周辺監視装置および映像表示方法
JP6364702B2 (ja) 2013-03-29 2018-08-01 アイシン精機株式会社 画像表示制御装置、画像表示システム、および表示ユニット
WO2016020808A1 (ja) * 2014-08-07 2016-02-11 株式会社半導体エネルギー研究所 表示装置、および運転支援システム
DE102015002923B4 (de) * 2015-03-06 2023-01-12 Mekra Lang Gmbh & Co. Kg Anzeigeeinrichtung für ein Fahrzeug insbesondere Nutzfahrzeug
EP3246664A3 (en) * 2016-05-19 2018-02-14 Ricoh Company, Ltd. Information processing system and information display apparatus
US10654422B2 (en) * 2016-08-29 2020-05-19 Razmik Karabed View friendly monitor systems
JP6643969B2 (ja) * 2016-11-01 2020-02-12 矢崎総業株式会社 車両用表示装置
JP6626817B2 (ja) * 2016-11-30 2019-12-25 京セラ株式会社 カメラモニタシステム、画像処理装置、車両及び画像処理方法
US10518702B2 (en) * 2017-01-13 2019-12-31 Denso International America, Inc. System and method for image adjustment and stitching for tractor-trailer panoramic displays
JP6665819B2 (ja) * 2017-03-17 2020-03-13 トヨタ自動車株式会社 車載表示装置
WO2019017198A1 (ja) * 2017-07-19 2019-01-24 株式会社デンソー 車両用表示装置及び表示制御装置
US20200210733A1 (en) * 2017-08-22 2020-07-02 Seeing Machines Limited Enhanced video-based driver monitoring using phase detect sensors
US11645840B2 (en) * 2017-08-31 2023-05-09 Sony Corporation Information processing apparatus, information processing method, program, and moving body
JP2019067220A (ja) * 2017-10-02 2019-04-25 Sharp Corporation Parking position display processing device, parking position display method, and program
JP7283059B2 (ja) * 2018-11-28 2023-05-30 Aisin Corporation Periphery monitoring device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003196645A (ja) * 2001-12-28 2003-07-11 Equos Research Co Ltd Vehicle image processing device
JP2005335410A (ja) * 2004-05-24 2005-12-08 Olympus Corp Image display device
JP2010109684A (ja) * 2008-10-30 2010-05-13 Clarion Co Ltd Vehicle periphery image display system
JP2015168271A (ja) * 2014-03-04 2015-09-28 Sakae Riken Kogyo Co., Ltd. Rearview mirror, vehicle blind-spot assistance device using the rearview mirror, and method of adjusting a display image of the rearview mirror or the vehicle blind-spot assistance device
JP2016195301A (ja) * 2015-03-31 2016-11-17 Panasonic Intellectual Property Management Co., Ltd. Image processing device and electronic mirror system

Also Published As

Publication number Publication date
JP6890288B2 (ja) 2021-06-18
DE112018007360T5 (de) 2021-01-14
US20200369207A1 (en) 2020-11-26
US11034305B2 (en) 2021-06-15
JP2019175133A (ja) 2019-10-10

Similar Documents

Publication Publication Date Title
US11034305B2 (en) Image processing device, image display system, and image processing method
JP4810953B2 (ja) Vehicle blind-spot image display device
US8330816B2 (en) Image processing device
US11958358B2 (en) Image processing apparatus, moving apparatus, method, and program
JP5093611B2 (ja) Vehicle periphery confirmation device
US20230191994A1 (en) Image processing apparatus, image processing method, and image processing system
US11987182B2 (en) Image processing apparatus, image processing method, and image processing system
JP2018129668A (ja) Image display device
US11794667B2 (en) Image processing apparatus, image processing method, and image processing system
US20240430379A1 (en) Image processing apparatus, image processing method, and image processing system
JP2006044596A (ja) Vehicle display device
JP2009183473A (ja) Gaze direction detection device and gaze direction detection method
JPWO2018221209A1 (ja) Image processing device, image processing method, and program
CN105793909A (zh) Method and device for generating a warning by means of two camera-captured images of a vehicle's surroundings
JP2005269010A (ja) Image generation device, image generation program, and image generation method
JP2019153237A (ja) Image processing device, image display system, and image processing method
WO2021192508A1 (ja) Image composition device and image composition method
JP2022040819A (ja) Image processing device and image processing method
JP4945315B2 (ja) Driving support system and vehicle
JP2020113066A (ja) Video processing device, video processing method, and program
JP2009006968A (ja) Vehicle display device
JP5083137B2 (ja) Driving support device
JP6136748B2 (ja) Two-dimensional/three-dimensional display device
JP4696825B2 (ja) Vehicle blind-spot image display device
JP2021101515A (ja) Display device, display method, and display program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18911735

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18911735

Country of ref document: EP

Kind code of ref document: A1