CN111052179A - Vehicle device, calibration result determination system, calibration result determination method, and program


Info

Publication number
CN111052179A
Authority
CN
China
Prior art keywords
image
beam pattern
vehicle
superimposed
light emission
Prior art date
Legal status
Granted
Application number
CN201880051609.8A
Other languages
Chinese (zh)
Other versions
CN111052179B (en)
Inventor
永井亮行
荒濑贵之
Current Assignee
JVCKenwood Corp
Original Assignee
JVCKenwood Corp
Application filed by JVCKenwood Corp
Publication of CN111052179A
Application granted
Publication of CN111052179B
Status: Active

Classifications

    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • B60R1/27 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, for viewing an area outside the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H04N23/56 Cameras or camera modules comprising electronic image sensors, provided with illuminating means
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B60R2300/105 Viewing arrangements using cameras and displays in a vehicle, using multiple cameras
    • B60R2300/304 Image processing using merged images, e.g. merging camera image with stored images
    • B60R2300/607 Monitoring and displaying vehicle exterior scenes from a transformed perspective, from a bird's eye viewpoint
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/30244 Camera pose
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle


Abstract

The present invention enables the calibration result of the camera parameters used to generate an overhead image to be confirmed easily and accurately. The in-vehicle device (16) includes: an image acquisition unit (21) that acquires images of the surroundings of the vehicle (1) from the cameras (11); a light emission control unit (25) that controls a light beam pattern emitted by a light emitting device (14), which irradiates the beam pattern onto the ground or floor at a predetermined position with respect to the vehicle (1); an image processing unit (24) that generates an overhead image including the beam pattern using the images acquired by the image acquisition unit (21); a superimposed image generation unit (26) that generates a superimposed image at the position corresponding to the beam pattern included in the overhead image; and an image determination processing unit (27) that detects the portion where the superimposed image and the image of the beam pattern overlap in the overhead image on which the superimposed image is superimposed, and thereby determines whether the camera parameters used to generate the overhead image have been calibrated correctly.

Description

Vehicle device, calibration result determination system, calibration result determination method, and program
Technical Field
The invention relates to a vehicle device, a calibration result determination system, a calibration result determination method, and a program.
Background
For the driving safety of vehicles such as automobiles, the following technique is used: cameras that photograph the surroundings are mounted on a vehicle so that blind spots and places that are difficult to see from the driver's position can be checked accurately, and the captured images are processed and displayed so as to be easy for the driver to view. Cameras for monitoring the rear of the vehicle were mounted first; cameras for monitoring the front and the sides of the vehicle followed, and the images obtained from these cameras are used to provide displays that assist the driver.
In recent years, the following technique has also come into wide use: image information obtained from ultra-wide-angle (e.g., 180-degree) and high-resolution (e.g., one million pixels or more) cameras installed at four places (front, rear, and both sides) is image-processed to generate and display a simulated overhead image, as if the vehicle were viewed from above. Such a technique is used, for example, to accurately check areas that are blind spots from the driver's position or that are difficult to see when backing into a garage, parallel parking, and the like.
In an apparatus that generates and displays a simulated overhead image, an image as seen from above the vehicle is produced by applying viewpoint conversion processing and synthesis processing to the images captured by the four cameras. To convert the captured images into an image as if viewed from above the vehicle, the position of a virtual camera above the vehicle is determined, and the conversion parameters are determined on the premise that the positions of the actual cameras are known. In practice, however, it is very difficult to install four cameras without any error, and the parameters of the individual cameras do not necessarily match the design values exactly. Therefore, in order to execute the viewpoint conversion processing and the synthesis processing correctly, it is necessary not only to adjust the mounting orientation and angle of view of each camera, but also to perform calibration (correction) that corrects distortion and the like at the time of image conversion in the apparatus that generates the overhead image.
Conventionally, there is a technique that can perform calibration even in a situation where the loading state of the vehicle changes, estimating all camera parameters precisely without relying on, for example, the parallelism of the vehicle to a white line (for example, patent document 1).
Prior art documents
Patent document
Patent document 1: japanese patent laid-open No. 2016-070814.
Disclosure of Invention
Problems to be solved by the invention
In general, camera parameters for generating an overhead image are calibrated by placing a mark, a calibration chart, or the like at a predetermined position and evaluating the imaging result. Specifically, for example, a predetermined pattern set precisely at a predetermined position is photographed by the four cameras, and the camera parameters are corrected so that the actually captured images match those that would be captured by cameras conforming to the design values.
In the conversion to the overhead image, it is most important that the boundary portions between the images of the individual cameras are not deformed in the synthesized image. For example, as shown in fig. 11, when white lines 91 and 92 located one meter from the vehicle 81 continue accurately, without shift or deformation, across the image boundaries 93 to 96 of the cameras in the synthesized overhead image, and appear parallel to and one meter away from the vehicle 81, the user can judge that the overhead image is correct.
Therefore, in an inspection by a dealer or the like, after the prescribed calibration using a mark, a calibration chart, or the like, an overhead image is generated and displayed with white lines (or tape; hereinafter the parenthetical is omitted) drawn at predetermined positions around the vehicle, and it is confirmed that the displayed image is normal, with no distortion or displacement of the white lines. For example, when an overhead image is displayed with white lines drawn parallel to the vehicle on both sides, as shown in fig. 11, if the white lines 91 or 92 in the displayed overhead image are skewed, or if they are offset at the image boundaries 93 to 96 of the synthesized portions, the calibration has not been performed correctly and the camera parameters must be adjusted again.
Thus, in an inspection by a dealer or the like, considerable labor is required to draw white lines or the like parallel to the vehicle at predetermined positions every time the post-calibration verification is performed. It is also conceivable to confirm the post-calibration overhead image at a place where white lines already exist within the cameras' field of view, but this is very difficult in practice because the vehicle must be stopped so that the positional relationship between the white lines and the vehicle is exactly the prescribed one.
Patent document 1 requires that a predetermined number or more of linear structures be captured by each camera during calibration, assuming, for example, that white lines on the road are captured while driving. Therefore, even if the technique described in patent document 1 is applied to a dealer inspection, it is still necessary, as in the conventional art, either to create linear structures by drawing white lines or sticking tape at predetermined positions around the vehicle, or to move the vehicle under inspection to a position where linear structures such as white lines already exist within the cameras' field of view.
In addition, during a dealer inspection it may be difficult to recognize a white line or the like as a linear structure for calibration, depending on the surrounding brightness or the color of the ground or floor; in that case, extra time may be needed to prepare tapes of different colors in advance.
Further, in an inspection by a dealer or the like, the corrected overhead image obtained using a mark, a calibration chart, or the like is confirmed visually, so the confirmation result is unstable. Moreover, since the linear structures used to confirm the overhead image are prepared manually, by drawing white lines or applying tape, how easily they can be recognized in the calibrated overhead image is not uniform.
Accordingly, an object of the present invention is to solve the above-described problems, and to provide a vehicle device, a calibration result determination system, a calibration result determination method, and a program that enable easy and accurate confirmation of the calibration result of camera parameters used when generating an overhead view image.
Means for solving the problems
In order to solve the above problem, one aspect of the vehicle device of the present invention includes: an image acquisition unit that acquires an image captured by a camera that photographs the surroundings of a vehicle; a light emission control unit that controls a light beam pattern emitted by a light emitting device, which irradiates the beam pattern onto the ground or floor at a predetermined position with respect to the vehicle; an image processing unit that generates an overhead image including the beam pattern whose light emission is controlled by the light emission control unit, using the image acquired by the image acquisition unit; a superimposed image generation unit that generates a superimposed image at the position corresponding to the beam pattern included in the overhead image generated by the image processing unit; and an image determination processing unit that determines whether the camera parameters for generating the overhead image have been calibrated correctly by detecting the portion where the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the superimposed image generation unit is superimposed.
In addition, an aspect of the calibration result determination system according to the present invention includes: the vehicle device of the present invention described above; a camera that photographs the surroundings of the vehicle; and a light emitting device that emits a light beam pattern that irradiates the ground or floor at a predetermined position with respect to the vehicle.
In addition, one aspect of the calibration result determination method according to the present invention includes: a light emission control step of controlling a light beam pattern emitted by a light emitting device, which irradiates the beam pattern onto the ground or floor at a predetermined position with respect to the vehicle; an image generation step of generating an overhead image including the beam pattern whose light emission is controlled by the processing of the light emission control step, using images captured by a camera that photographs the surroundings of the vehicle; a superimposed image generation step of generating a superimposed image at the position corresponding to the beam pattern included in the overhead image generated by the processing of the image generation step; and an image determination step of determining whether the camera parameters for generating the overhead image have been calibrated correctly by detecting the portion where the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the processing of the superimposed image generation step is superimposed.
In addition, one aspect of the program according to the present invention causes a computer to execute processing including the following steps: a light emission control step of controlling a light beam pattern emitted by a light emitting device, which irradiates the beam pattern onto the ground or floor at a predetermined position with respect to the vehicle; an image generation step of generating an overhead image including the beam pattern whose light emission is controlled by the processing of the light emission control step, using images captured by a camera that photographs the surroundings of the vehicle; a superimposed image generation step of generating a superimposed image at the position corresponding to the beam pattern included in the overhead image generated by the processing of the image generation step; and an image determination step of determining whether the camera parameters for generating the overhead image have been calibrated correctly by detecting the portion where the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the processing of the superimposed image generation step is superimposed.
Effects of the invention
According to the present invention, the calibration result of the camera parameters used when generating the overhead image can be easily and accurately confirmed.
Drawings
Fig. 1 is a diagram showing a functional configuration of a calibration result determination system.
Fig. 2 is a diagram for explaining the beam pattern 42.
Fig. 3 is a diagram for explaining the beam pattern 42.
Fig. 4 is a diagram for explaining display of an image 61 of a beam pattern, a superimposed image 62, and a repeated portion 63.
Fig. 5 is a diagram for explaining display of the image 61 of the beam pattern, the superimposed image 62, and the repeated section 63.
Fig. 6 is a diagram for explaining display of an image 61 of a beam pattern, a superimposed image 62, and a repeated portion 63.
Fig. 7 is a diagram for explaining display of the image 61 of the beam pattern, the superimposed image 62, and the repeated section 63.
Fig. 8 is a diagram for explaining display of the image 61 of the beam pattern, the superimposed image 62, and the repeated section 63.
Fig. 9 is a diagram for explaining an image in the vicinity of the synthesis boundary of the overhead image.
Fig. 10 is a flowchart for explaining the verification process of the calibration result.
Fig. 11 is a diagram for explaining an overhead image.
Detailed Description
A calibration result determination system according to an embodiment of the present invention will be described below with reference to figs. 1 to 10.
Fig. 1 is a functional block diagram showing the functional configuration of a calibration result determination system including an in-vehicle device according to one embodiment of the present invention.
The calibration result determination system includes the cameras 11-1 to 11-4, the display device 12, the sensor 13, the light emitting devices 14-1 to 14-4, the control device 15, and the in-vehicle device 16, and is mounted on the vehicle 1. The in-vehicle device 16 corresponds to the vehicle device. The calibration result determination system can share most of the components already provided in the conventional vehicle 1 for generating and displaying the images around the vehicle 1 for driver assistance, the overhead image obtained by synthesis processing, and the like.
The cameras 11-1 to 11-4 are, for example, a camera for monitoring the rear of the vehicle 1, a camera for monitoring the front of the vehicle 1, and cameras for monitoring the directions of both side surfaces of the vehicle 1. The cameras 11-1 to 11-4 are mounted at predetermined positions on the vehicle 1, based on known design information determined in advance, so as to capture images of the entire periphery of the vehicle 1. The cameras 11-1 to 11-4 are preferably ultra-wide-angle (e.g., 180 degrees) and high-resolution (e.g., one million pixels or more). The images obtained by the cameras 11-1 to 11-4 are image-processed by the in-vehicle device 16, described later, and displayed on the display device 12 to assist the driver. Specifically, the images obtained by the cameras 11-1 to 11-4 are image-processed by the in-vehicle device 16 and displayed on the display device 12 as images of the front, rear, and sides of the vehicle 1, which assists in checking the portions that become the driver's blind spots. Further, the in-vehicle device 16 applies image processing such as viewpoint conversion processing and synthesis processing to the images captured by the cameras 11-1 to 11-4 to generate an overhead image and display it in simulated form on the display device 12. Hereinafter, when the cameras 11-1 to 11-4 need not be distinguished, they are collectively referred to as the cameras 11.
The display device 12 is configured by, for example, a liquid crystal display or the like, and displays, as necessary, the images of the front, rear, and sides of the vehicle 1 generated by the in-vehicle device 16, described later, which assist in checking the portions that become the driver's blind spots, as well as the overhead image and the like. The display device 12 also receives and displays the result of the determination as to whether calibration has been performed correctly, obtained by the processing of the in-vehicle device 16 described later.
The sensor 13 includes at least a sensor that detects the brightness around the vehicle 1, and supplies the detection result to the in-vehicle device 16. The sensor 13 may also include, or double as, sensors (not shown) that detect information necessary for other operations of the vehicle 1. For example, a sensor used for the vehicle lamps may double as the sensor 13 for detecting the brightness around the vehicle 1.
The light emitting devices 14-1 to 14-4 are provided at least on both side surfaces of the vehicle 1, facing the ground outside the vehicle 1, and irradiate a linear beam pattern onto the ground parallel to the side surfaces of the vehicle 1. As a specific example, as shown in fig. 2, some of the light emitting devices 14-1 to 14-4 may be installed under the door mirrors 41 provided on both sides of the vehicle 1. The light emitting devices 14-1 to 14-4 irradiate a linear beam pattern 42 onto the ground parallel to the side surfaces of the vehicle 1 under the control of the in-vehicle device 16. It suffices for the light emitting devices 14-1 to 14-4 to be provided at least on both side surfaces of the vehicle 1; however, it is preferable that light emitting devices also be provided at either or both of the front and rear of the vehicle 1, irradiating a linear beam pattern 42 onto the ground at the front or rear of the vehicle 1. The color of the beam pattern 42 emitted by the light emitting devices 14-1 to 14-4 is the complementary color of the ground, or a color close to it, and the color and luminance (color temperature, etc.) are controlled by the light emission control unit 25 of the in-vehicle device 16, described later. Hereinafter, when the light emitting devices 14-1 to 14-4 need not be distinguished, they are collectively referred to as the light emitting devices 14.
The shape of the beam pattern 42 emitted by the light emitting devices 14 may be any shape, as long as the pattern is irradiated at least at all boundary portions of the synthesized image (for example, the positions corresponding to the rectangles α-1 to α-4 in fig. 3) so as to include a straight line crossing each boundary line. The beam pattern 42 may be composed of a plurality of straight lines or rectangles. For example, as shown in fig. 3, the beam pattern 42 may be a rectangle a surrounding the vehicle 1 at a constant distance, straight lines b-1 and b-2 parallel to the side surfaces of the vehicle 1 and at least as long as the vehicle, or straight lines c-1 to c-4 crossing the boundary lines at all boundary portions.
Returning to fig. 1, the control device 15 controls the operation of the parts of the vehicle 1 (including parts not shown). When the operation of parts of the vehicle 1 is required in the process of determining whether calibration has been performed correctly, such as when the light emitting devices 14 are installed under the door mirrors 41 on both sides as described with reference to fig. 2, the control device 15 executes the necessary processing, such as controlling the opening and closing of the door mirrors 41, based on control from the in-vehicle device 16.
The in-vehicle device 16 includes an image acquisition unit 21, an operation input unit 22, a control unit 23, an image processing unit 24, a light emission control unit 25, a superimposed image generation unit 26, an image determination processing unit 27, and a display control unit 28. In addition, the in-vehicle device 16 may include various known display systems, car navigation systems, car audio systems, or the like.
The image acquisition unit 21 acquires an image captured by the camera 11 and supplies the image to the image processing unit 24.
The operation input unit 22 is configured by, for example, a receiving unit that receives a signal from a touch panel, a button, a key, or a remote controller, and acquires an operation input from a user and supplies the operation input to the control unit 23.
The control unit 23 controls the operation of the in-vehicle device 16 based on the user operation input supplied from the operation input unit 22. The control unit 23 controls, for example, the image processing by the image processing unit 24 described later, the light emission control of the light emitting devices 14 by the light emission control unit 25, the generation of the superimposed image by the superimposed image generation unit 26, the synthesis of the superimposed image by the image processing unit 24, and the determination processing by the image determination processing unit 27 as to whether calibration has been performed correctly. Details of the confirmation processing of the calibration result will be described later with reference to the flowchart of fig. 10.
The control unit 23 also controls the processing necessary for the known calibration that corrects the camera parameters used in generating the overhead image, based on user operation input supplied from the operation input unit 22. Specifically, a predetermined mark or calibration chart set precisely at a predetermined position is photographed, and based on user operation input supplied from the operation input unit 22, the control unit 23 corrects the camera parameters used when the image processing unit 24 generates the overhead image, so that the actually captured images match those that would be captured by cameras 11 conforming to the design values.
Under the control of the control unit 23, the image processing unit 24 generates images of the front, rear, and sides of the vehicle 1 that assist in checking the portions that become the driver's blind spots, for example by removing image distortion from the ultra-wide-angle, high-resolution images supplied from the image acquisition unit 21 using a known distortion function. The image processing unit 24 also generates an overhead image, that is, an image as if the ground were viewed from above the vehicle 1, by applying viewpoint conversion processing to all the ultra-wide-angle, high-resolution images of the front, rear, and both sides of the vehicle 1 supplied from the image acquisition unit 21, converting them to images from a virtual overhead viewpoint based on the known design values of the camera mounting and the like, and then applying synthesis processing and the like. Such viewpoint conversion processing and synthesis processing are realized by a known technique: the luminance value of each specific pixel of the image captured by each camera 11 is assigned to the corresponding specific pixel of the image from the virtual overhead viewpoint using a known geometric transformation formula of the camera.
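As a concrete illustration of the pixel-mapping technique just described, the following is a minimal sketch of the per-camera viewpoint conversion using OpenCV. The point correspondences, function names, and output size are hypothetical placeholders for illustration, not values or interfaces from the patent.

```python
# A minimal sketch of per-camera viewpoint conversion to the ground plane,
# assuming lens distortion has already been removed. The four point
# correspondences are hypothetical; in practice they follow from the
# calibrated camera parameters.
import cv2
import numpy as np

def to_birds_eye(frame, src_pts, dst_pts, out_size):
    """Warp one camera frame onto the ground plane of the overhead image.

    src_pts: four ground-plane points in the distortion-corrected camera image.
    dst_pts: the same four points in overhead-image pixel coordinates.
    out_size: (width, height) of the overhead image.
    """
    H = cv2.getPerspectiveTransform(np.float32(src_pts), np.float32(dst_pts))
    # Each overhead pixel receives the luminance of the corresponding camera
    # pixel, as in the geometric transformation described above.
    return cv2.warpPerspective(frame, H, out_size)

# The full overhead image would then be composed by stitching the four
# warped views at fixed image boundaries (cf. boundaries 93 to 96 in fig. 11).
```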
Then, when performing the calibration of each parameter for generating the overhead image, the image processing unit 24 executes known processing necessary for the calibration based on the control of the control unit 23.
When the verification processing of the calibration result is performed, the image processing unit 24 applies viewpoint conversion processing, synthesis processing, and the like to the images supplied from the image acquisition unit 21 under the control of the control unit 23 to generate an overhead image. The image processing unit 24 supplies the generated overhead image to the light emission control unit 25 and the superimposed image generation unit 26, and also superimposes the superimposed image generated by the superimposed image generation unit 26, described later, on the overhead image and supplies the result to the image determination processing unit 27 and the display control unit 28.
The light emission control unit 25 controls the color and luminance of the beam pattern 42 emitted by the light emitting devices 14 under the control of the control unit 23. When the verification processing of the calibration result is performed, the light emission control unit 25 receives a control signal instructing light emission from the control unit 23, acquires the synthesized overhead image from the image processing unit 24, detects the color of the ground (or floor) around the vehicle 1, and causes the light emitting devices 14 to emit light in the complementary color of that color, or in a color as close to the complementary color as possible. Since this makes the beam pattern 42 irradiated on the ground (or floor) easy to recognize in the overhead image, recognition of the beam pattern 42 in the verification processing of the calibration result, described later, becomes stable. The light emission control unit 25 also controls the emission intensity based on the brightness information of the surroundings of the vehicle 1 supplied from the sensor 13, or based on the luminance information of the overhead image supplied from the image processing unit 24. In this case, it is preferable to control the emission intensity of the beam pattern 42 based on the luminance information in the vicinity of the image boundaries of the synthesized overhead image. This prevents the beam pattern 42 from becoming difficult to recognize in the synthesized overhead image because the emission intensity is too weak or too strong.
When the light emitting devices 14 are installed, for example, under the door mirrors 41 on both sides as described with reference to fig. 2, and the operation of parts of the vehicle 1 is required for the process of determining whether calibration has been performed correctly, the light emission control unit 25 controls the control device 15 to operate those parts of the vehicle 1 (for example, the door mirrors 41) so that the beam pattern 42 is irradiated to the predetermined positions described with reference to fig. 3.
Even while the light emitting devices 14 are irradiating the beam pattern 42, the light emission control unit 25 acquires the synthesized overhead image from the image processing unit 24, determines whether the beam pattern 42 irradiated onto the ground (or floor) around the vehicle 1 is easy to recognize against the ground (or floor), and controls the color and brightness of the beam pattern 42 based on the determination result.
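The color and intensity control described above can be pictured with a short sketch. This is a minimal illustration assuming NumPy/OpenCV-style BGR images and an illustrative lux-to-intensity mapping; none of the helper names or threshold values come from the patent.

```python
# A sketch, under assumed helper names, of how the light emission control
# unit 25 might pick the beam colour: sample the ground colour near the
# synthesis boundaries of the overhead image and take its RGB complement.
import numpy as np

def beam_color_for_ground(overhead_bgr, boundary_mask):
    """Return a BGR colour complementary to the mean ground colour sampled
    where boundary_mask is True (pixels near the image boundaries)."""
    ground = overhead_bgr[boundary_mask]          # shape (N, 3)
    mean_bgr = ground.mean(axis=0)
    return tuple(int(255 - c) for c in mean_bgr)  # complementary colour

def beam_intensity(ambient_lux, lo=50.0, hi=10000.0):
    """Map ambient brightness (e.g. from sensor 13) to a 0..1 drive level so
    the pattern is neither washed out nor glaring; thresholds are assumptions."""
    x = np.clip((ambient_lux - lo) / (hi - lo), 0.0, 1.0)
    return 0.3 + 0.7 * x   # brighter surroundings need a stronger beam
```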
The superimposed image generation unit 26 generates a superimposed image 62 composed of straight lines or rectangles positioned so that, if the camera parameters for overhead image generation have been calibrated correctly, they overlap the beam pattern 42 emitted by the light emitting devices 14 and described with reference to fig. 3 by a predetermined width, offset from it by a predetermined amount. The color and transparency of the straight lines or rectangles constituting the superimposed image 62 are set so that, as shown in fig. 4 for example, the overlapping portion 63 is easy to recognize when the superimposed image 62 is superimposed on the image 61 of the beam pattern included in the overhead image. As a specific example, when the color of the beam pattern 42 is red, setting the color of the superimposed image 62 to a somewhat transparent blue causes the overlapping portion 63 to be displayed as purple, which is easy to recognize.
If the transparency of the superimposed image 62 is too low, or if it has no transparency, only the superimposed image 62 can be recognized at the portion where the image 61 of the beam pattern should be recognized as the overlapping portion 63, as shown in fig. 5. Conversely, if the transparency of the superimposed image 62 is too high, only the image 61 of the beam pattern can be recognized even at the portion that should be recognized as the overlapping portion 63, as shown in fig. 6. To avoid this, the superimposed image generation unit 26 controls the transparency of the superimposed image 62 so that the overlapping portion 63 where the image 61 of the beam pattern and the superimposed image 62 overlap can be clearly recognized as a mixture of the colors of the two, as described with reference to fig. 4.
As described above, since the color of the beam pattern 42 changes according to the color of the ground (or floor) and the surrounding brightness, the superimposed image generation unit 26 queries the light emission control unit 25 for the details of the light emission control of the beam pattern 42 and determines the color and transparency of the superimposed image 62 according to the result.
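The transparency control discussed in the last three paragraphs amounts to alpha blending. Below is a minimal sketch, assuming a boolean mask marking where the superimposed image 62 should be drawn; the function name and the default alpha are illustrative assumptions.

```python
# A sketch of the superimposition: the expected-position image 62 is drawn
# semi-transparently over the overhead image so the overlap 63 shows as a
# clear colour mixture (e.g. red beam + blue overlay -> purple), per fig. 4.
import numpy as np

def superimpose(overhead, overlay_mask, overlay_color, alpha=0.5):
    """Alpha-blend a flat-coloured overlay onto the overhead image.

    alpha is tuned so neither layer hides the other; figs. 5 and 6 show the
    failure modes of too little and too much transparency.
    """
    out = overhead.astype(np.float32)
    color = np.float32(overlay_color)
    out[overlay_mask] = (1 - alpha) * out[overlay_mask] + alpha * color
    return out.astype(np.uint8)
```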
Returning to fig. 1, the image determination processing unit 27 acquires from the image processing unit 24 the video on which the superimposed image 62 generated by the superimposed image generation unit 26 is superimposed, and determines whether calibration has been performed correctly by performing image recognition on the overlapping portion 63 where the image 61 of the beam pattern and the superimposed image 62 overlap.
Specifically, when calibration has been performed correctly, the overlapping portion 63 where the image 61 of the beam pattern and the superimposed image 62 overlap is clearly recognized, as described with reference to fig. 4, as a region of the predetermined width determined by the design values, in a color that is a mixture of the color of the image 61 of the beam pattern and the color of the superimposed image 62. In contrast, when calibration has not been performed correctly and the image is distorted, the width of the overlapping portion 63 is not constant, as shown in fig. 7. When calibration has not been performed correctly and the image is shifted, as shown in fig. 8, the overlapping portion 63 either does not exist or, if it exists, its width is narrower than the theoretical value (the predetermined number of pixels determined from the design values) or wider than the theoretical value.
Fig. 9 shows the overlapping portion 63 at a portion corresponding to an image synthesis boundary, indicated by the rectangle α in fig. 3. In fig. 9, the image 61 of the beam pattern is not displayed correctly and continuously at the synthesis boundary. Above the boundary portion on the display screen, the width of the overlapping portion 63 is narrower than the theoretical value, so it can be recognized that a shift has occurred in the image of the corresponding portion. Below the boundary portion, the width of the overlapping portion 63 is not constant, so it can be recognized that distortion has occurred in the image of the corresponding portion. The image determination processing unit 27 supplies these determination results to the display control unit 28. The determination result of the image determination processing unit 27 may be a highlighted display of the portions where the width of the overlapping portion 63 differs from the theoretical value due to shift or deformation of the overhead image, and may include a text error message, numerical data indicating the deviation from the theoretical value, and the like. In confirming the calibration result, a warning may be issued, for example, when the width of the overlapping portion differs from the theoretical value by a predetermined number of pixels or more.
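The width-based judgment described above can be sketched as follows, assuming the overlapping portion 63 has already been segmented into a boolean mask (for example by detecting the mixed color) and that the overlap stripe runs horizontally; the tolerance value and scan direction are assumptions for illustration.

```python
# A sketch of the determination by the image determination processing unit 27:
# scan the overlap region 63 column by column and compare its width against
# the design value.
import numpy as np

def check_overlap(overlap_mask, expected_px, tol_px=1):
    """Return (ok, per-column widths) for a horizontal overlap stripe.

    overlap_mask: boolean image, True where beam image 61 and superimposed
    image 62 mix (detected e.g. by the blended colour).
    """
    widths = overlap_mask.sum(axis=0)      # stripe thickness per column
    cols = widths[widths > 0]
    if cols.size == 0:
        return False, widths               # no overlap at all: large shift
    too_thin  = (cols < expected_px - tol_px).any()   # shift, cf. fig. 8
    too_thick = (cols > expected_px + tol_px).any()   # shift the other way
    uneven    = np.ptp(cols) > tol_px                 # distortion, cf. fig. 7
    return not (too_thin or too_thick or uneven), widths
```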
The display control unit 28 synthesizes the plural videos and images supplied from the image processing unit 24 as necessary, adds required messages and the like to generate a display screen, and supplies it to the display device 12 to control its display. When the verification processing of the calibration result is performed, the display control unit 28, under the control of the control unit 23, supplies to the display device 12, together with the overhead image on which the superimposed image 62 is superimposed, the determination result from the image determination processing unit 27 indicating whether calibration has been performed correctly and, if not, where shift or distortion has occurred, and causes them to be displayed.
That is, the in-vehicle device 16 includes: an image acquisition unit 21 that acquires the images captured by the cameras 11; a light emission control unit 25 that controls the light emission of the beam pattern 42 of the light emitting devices 14, which irradiate the beam pattern 42 onto the ground or floor at predetermined positions with respect to the vehicle 1; an image processing unit 24 that generates an overhead image including the beam pattern 42 whose light emission is controlled by the light emission control unit 25, using the images acquired by the image acquisition unit 21; a superimposed image generation unit 26 that generates a superimposed image 62 for the position corresponding to the image 61 of the beam pattern included in the overhead image generated by the image processing unit 24; and an image determination processing unit 27 that determines whether the camera parameters for generating the overhead image have been calibrated correctly by detecting the overlapping portion 63 of the superimposed image 62 and the image 61 of the beam pattern in the overhead image on which the superimposed image 62 generated by the superimposed image generation unit 26 is superimposed. Thus, during an inspection by a dealer or the like, the calibration result can be confirmed easily and accurately by processing an overhead image including the image 61 of the beam pattern 42 irradiated to predetermined positions, without troublesome work such as drawing white lines at predetermined positions around the vehicle 1 after the prescribed calibration using a mark, a calibration chart, or the like.
Further, the above configuration can be realized at low cost because many of the mechanisms already provided in the conventional vehicle 1 as components for generating the images around the vehicle 1 for driver assistance, the overhead image obtained by synthesis processing, and the like can be shared.
Next, the verification process of the calibration result will be described with reference to the flowchart of fig. 10.
In step S1, the control unit 23 determines whether or not a confirmation process of the calibration result is instructed from the operation input unit 22. In step S1, if it is determined that the confirmation process of the calibration result is not instructed, the process of step S1 is repeated until it is determined that the confirmation process of the calibration result is instructed.
When it is determined in step S1 that confirmation processing of the calibration result has been instructed, in step S2 the control unit 23 instructs the image processing unit 24, the light emission control unit 25, the superimposed image generation unit 26, the image determination processing unit 27, and the display control unit 28 to start the confirmation processing of the calibration result. The image processing unit 24 applies viewpoint conversion processing to the images acquired by the image acquisition unit 21 to obtain an image viewed from a virtual overhead viewpoint, synthesizes the results to generate an overhead image, and supplies it to the light emission control unit 25 and the superimposed image generation unit 26. The light emission control unit 25 calculates the color and brightness of the beam pattern 42 from either or both of the ambient brightness information supplied from the sensor 13 and the overhead image supplied from the image processing unit 24.
In step S3, the light emission control unit 25 causes the beam pattern 42 described with reference to figs. 2 and 3 to be irradiated onto the ground (or floor) based on the calculation result of step S2, and the image processing unit 24 generates an overhead image including the image 61 of the beam pattern. At this time, if the light emitting devices 14 are installed, for example, under the door mirrors 41 on both sides as described with reference to fig. 2, and the operation of parts of the vehicle 1 is required in the process of determining whether calibration has been performed correctly, the light emission control unit 25 controls the control device 15 to operate those parts of the vehicle 1 so that the beam pattern 42 is irradiated to the predetermined positions.
In step S4, the light emission control unit 25 determines, from the overhead image including the image 61 of the beam pattern supplied from the image processing unit 24, whether the image 61 of the beam pattern is sufficiently recognizable in the captured image. If it is determined in step S4 that the image 61 of the beam pattern cannot be sufficiently recognized, the process returns to step S2 and the subsequent processing is repeated.
If it is determined in step S4 that the image 61 of the beam pattern can be sufficiently recognized, then in step S5 the superimposed image generation unit 26 calculates the color and transparency of the superimposed image 62 from the captured image 61 of the beam pattern based on the overhead image supplied from the image processing unit 24, for example as described with reference to figs. 4 to 6.
In step S6, the superimposed image generating section 26 generates the superimposed image 62 based on the calculation result in the processing of step S5, and supplies it to the image processing section 24. The image processing unit 24 superimposes the superimposed image 62 on the generated overhead view image, and supplies the resultant image to the image determination processing unit 27 and the display control unit 28.
In step S7, the image determination processing unit 27 performs image recognition on the overlapping portion 63 where the image 61 of the beam pattern and the superimposed image 62 overlap, as described with reference to fig. 7 and 8.
In step S8, the image determination processing unit 27 determines whether the calibration result is correct based on the width and shape of the overlapping portion 63 obtained by the image recognition.
If it is determined in step S8 that the calibration result is correct, in step S9 the image determination processing unit 27 supplies information notifying that calibration has been performed correctly to the display control unit 28. The display control unit 28 generates a message, an image, and the like notifying that calibration has been performed correctly, supplies them to the display device 12 together with the overhead image for display, and the processing ends.
If it is determined in step S8 that the calibration result is not correct, in step S10 the image determination processing unit 27 supplies information notifying that calibration is incorrect to the display control unit 28. The display control unit 28 generates a warning message notifying that calibration has not been performed correctly, an image indicating the portions where deformation or shift has occurred, and the like, supplies them to the display device 12 together with the overhead image for display, and the processing ends.
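For orientation, the whole flow of fig. 10 can be condensed into a few lines using the sketches above. The `device` object and all of its methods are hypothetical stand-ins for the units of the in-vehicle device 16, not interfaces from the patent.

```python
# A condensed sketch tying steps S2-S10 of fig. 10 to the helper functions
# sketched above; everything here is illustrative.
def verify_calibration(device, max_retries=3):
    for _ in range(max_retries):                              # S2-S4 loop
        color = beam_color_for_ground(device.overhead(), device.boundary_mask())
        device.emit_beam(color, beam_intensity(device.ambient_lux()))   # S3
        if device.beam_visible():                             # S4
            break
    view = superimpose(device.overhead(), device.expected_mask(),
                       overlay_color=(255, 0, 0), alpha=0.5)  # S5, S6
    ok, _ = check_overlap(device.overlap_mask(view),
                          expected_px=device.design_overlap_px())  # S7, S8
    device.display(view, "calibration OK" if ok else "recalibrate")   # S9/S10
    return ok
```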
That is, the calibration result determination method executed by the in-vehicle device 16 includes: a light emission control step of controlling the light emission of the beam pattern 42; an image generation step of generating an overhead image including the beam pattern 42 whose light emission is controlled by the processing of the light emission control step, using the images captured by the cameras 11; a superimposed image generation step of generating a superimposed image 62 at the position corresponding to the image 61 of the beam pattern included in the overhead image generated by the processing of the image generation step; and an image determination step of determining whether the camera parameters for generating the overhead image have been calibrated correctly by detecting the overlapping portion 63 of the superimposed image 62 and the image 61 of the beam pattern in the overhead image on which the superimposed image 62 generated by the processing of the superimposed image generation step is superimposed. By the processing comprising these steps, the calibration result can be confirmed without troublesome work such as drawing white lines (or applying tape) at predetermined positions around the vehicle 1 after the prescribed calibration using a mark, a calibration chart, or the like, during an inspection by a dealer or the like.
Further, since the color and intensity of the beam pattern 42 used for confirming the calibration result are set based on the color of the ground or floor and the surrounding brightness, there is no need to spend time selecting materials for forming linear structures such as white lines according to the surrounding brightness and the color of the ground or floor. Moreover, whether the calibration result is correct is determined not visually but by image processing, and the color and transparency of the superimposed image 62 used for the determination are set so that the overlapping portion 63 can be clearly distinguished against the color and intensity of the image 61 of the beam pattern, so a more reliable determination result can be obtained than with the conventional visual confirmation of the overhead image.
In addition, when the user of the vehicle 1 feels that something is wrong with the overhead image, the calibration state of the camera parameters for generating the overhead image can be checked by a simple process. This makes it possible to avoid continuing to use an overhead image in an incorrectly calibrated state, and to easily judge whether the vehicle needs to be taken to a dealer for recalibration.
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a recording medium onto a computer built into dedicated hardware, or onto a general-purpose personal computer or the like capable of executing various functions by installing various programs.
The program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timings, such as when a call is made.
The embodiments of the present invention are not limited to the above-described embodiments, and various modifications can be made without departing from the scope of the present invention.
Description of the symbols
1 … vehicle, 11-1 to 11-4 … cameras, 12 … display device, 13 … sensor, 14-1 to 14-4 … light emitting devices, 15 … control device, 16 … in-vehicle device, 21 … image acquisition unit, 22 … operation input unit, 23 … control unit, 24 … image processing unit, 25 … light emission control unit, 26 … superimposed image generation unit, 27 … image determination processing unit, 28 … display control unit, 41 … door mirror, 42 … beam pattern, 61 … image of beam pattern, 62 … superimposed image, 63 … overlapping portion.

Claims (8)

1. An apparatus for a vehicle, comprising:
an image acquisition unit that acquires an image captured by a camera that captures the surroundings of a vehicle;
a light emission control unit that controls light emission of a light beam pattern of a light emitting device that emits the light beam pattern to the ground or floor at a predetermined position with respect to the vehicle;
an image processing unit that generates an overhead image including the beam pattern whose light emission is controlled by the light emission control unit, using the image acquired by the image acquisition unit;
a superimposed image generating unit that generates a superimposed image for a position corresponding to the beam pattern included in the overhead image generated by the image processing unit; and
an image determination processing unit that determines whether or not a camera parameter for generating the overhead image is correctly calibrated by detecting a portion where the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the superimposed image generation unit is superimposed.
2. The device for a vehicle according to claim 1,
the light emission control unit controls the light emission color of the beam pattern to be the complementary color of the ground or floor, or a color close to the complementary color, based on the overhead image generated by the image processing unit.
3. The device for a vehicle according to claim 1 or 2,
the light emission control section controls the light emission intensity of the light beam pattern based on the overhead image generated by the image processing section.
4. The device for a vehicle according to claim 1 or 2,
the light emission control portion acquires brightness information of the surroundings of the vehicle, and controls the light emission intensity of the light beam pattern based on the brightness information.
5. The device for a vehicle according to any one of claims 1 to 4,
the superimposed image generation unit determines the color and transparency of the lines constituting the superimposed image based on the color of the beam pattern included in the overhead image generated by the image processing unit.
6. A calibration result determination system, comprising:
the device for a vehicle of any one of claims 1 to 5;
a camera that photographs surroundings of a vehicle; and
a light emitting device that irradiates a beam pattern onto the ground or floor at a predetermined position with respect to the vehicle.
7. A calibration result determination method for determining the calibration result of camera parameters used to generate an overhead image in a vehicle, the method comprising:
a light emission control step of controlling light emission of a light emitting device that irradiates a beam pattern onto the ground or floor at a predetermined position with respect to the vehicle;
an image generation step of generating an overhead image including the beam pattern whose light emission is controlled by the processing of the light emission control step, using an image captured by a camera that captures the surroundings of the vehicle;
a superimposed image generation step of generating a superimposed image to be superimposed at a position corresponding to the beam pattern included in the overhead image generated by the processing of the image generation step; and
an image determination step of determining whether or not the camera parameters used to generate the overhead image are correctly calibrated by detecting the portion where the lines of the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the processing of the superimposed image generation step is superimposed.
8. A program for causing a computer mounted on a vehicle to execute processing, the processing comprising:
a light emission control step of controlling light emission of a light emitting device that irradiates a beam pattern onto the ground or floor at a predetermined position with respect to the vehicle;
an image generation step of generating an overhead image including the beam pattern whose light emission is controlled by the processing of the light emission control step, using an image captured by a camera that captures the surroundings of the vehicle;
a superimposed image generation step of generating a superimposed image to be superimposed at a position corresponding to the beam pattern included in the overhead image generated by the processing of the image generation step; and
an image determination step of determining whether or not the camera parameters used to generate the overhead image are correctly calibrated by detecting the portion where the lines of the superimposed image and the beam pattern overlap in the overhead image on which the superimposed image generated by the processing of the superimposed image generation step is superimposed.
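Claim 2's complementary-color control can be pictured as averaging the ground color visible in the overhead image and inverting it. A minimal sketch, assuming an 8-bit RGB overhead image as a NumPy array and a hand-picked ground patch; the function and parameter names are illustrative assumptions, not from the patent.

```python
import numpy as np

def complementary_beam_color(overhead: np.ndarray,
                             ground_patch: tuple) -> tuple:
    """Choose a beam color that contrasts with the ground.

    overhead     -- H x W x 3 RGB overhead image (uint8)
    ground_patch -- (y0, y1, x0, x1) region known to show bare ground
    """
    y0, y1, x0, x1 = ground_patch
    # Average ground color over the patch.
    mean_rgb = overhead[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    # Complement each channel about the 8-bit range; a color at or near
    # this value stands out against the ground, which helps the beam
    # pattern be detected reliably in the overhead image.
    return tuple(int(255 - c) for c in mean_rgb)
```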
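Claim 4's intensity control maps an ambient-brightness reading (for example, from the sensor 13) to an emission intensity: the brighter the surroundings, the stronger the beam must be to remain visible. A minimal sketch under those assumptions; the lux scale and duty-cycle range are illustrative, not specified by the patent.

```python
def beam_intensity(ambient_lux: float,
                   min_duty: float = 0.2,
                   max_duty: float = 1.0,
                   full_sun_lux: float = 100_000.0) -> float:
    """Map ambient brightness to a light-source duty cycle in [min_duty, max_duty]."""
    # Clamp the reading to the assumed full-sunlight level, then
    # interpolate linearly: dim surroundings -> weak beam,
    # bright surroundings -> full-power beam.
    scale = max(0.0, min(ambient_lux / full_sun_lux, 1.0))
    return min_duty + (max_duty - min_duty) * scale
```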
CN201880051609.8A 2017-09-12 2018-03-22 Vehicle device, calibration result determination system, calibration result determination method, and computer-readable storage medium storing program Active CN111052179B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-175195 2017-09-12
JP2017175195A JP6897442B2 (en) 2017-09-12 2017-09-12 Vehicle equipment, calibration result determination system, calibration result determination method, and program
PCT/JP2018/011473 WO2019053929A1 (en) 2017-09-12 2018-03-22 Vehicle device, calibration result determination system, calibration result determination method, and program

Publications (2)

Publication Number Publication Date
CN111052179A true CN111052179A (en) 2020-04-21
CN111052179B CN111052179B (en) 2023-10-24

Family

ID=65722529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880051609.8A Active CN111052179B (en) 2017-09-12 2018-03-22 Vehicle device, calibration result determination system, calibration result determination method, and computer-readable storage medium storing program

Country Status (5)

Country Link
US (1) US10861192B2 (en)
EP (1) EP3654280B1 (en)
JP (1) JP6897442B2 (en)
CN (1) CN111052179B (en)
WO (1) WO2019053929A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6958163B2 (en) * 2017-09-20 2021-11-02 株式会社アイシン Display control device
DE112019005747T5 (en) * 2018-11-15 2021-08-19 Panasonic Intellectual Property Management Co., Ltd. Camera system and vehicle
FR3105542B1 (en) 2019-12-20 2022-08-12 Valeo Vision Method for assisting the maneuvering of a motor vehicle and motor vehicle lighting device
US20230419536A1 (en) * 2022-06-27 2023-12-28 Inuitive Ltd. Determination of Changes in Autonomous Vehicle Location Under Adverse Weather Conditions

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080106591A1 (en) * 2006-11-02 2008-05-08 Border John N Two way communication system
US7714923B2 (en) * 2006-11-02 2010-05-11 Eastman Kodak Company Integrated display and capture apparatus
US7697053B2 (en) * 2006-11-02 2010-04-13 Eastman Kodak Company Integrated display having multiple capture devices
US7808540B2 (en) * 2007-01-09 2010-10-05 Eastman Kodak Company Image capture and integrated display apparatus
WO2010099416A1 (en) * 2009-02-27 2010-09-02 Magna Electronics Alert system for vehicle
US8463035B2 (en) * 2009-05-28 2013-06-11 Gentex Corporation Digital image processing for calculating a missing color value
US8456327B2 (en) * 2010-02-26 2013-06-04 Gentex Corporation Automatic vehicle equipment monitoring, warning, and control system
US9800794B2 (en) * 2013-06-03 2017-10-24 Magna Electronics Inc. Vehicle vision system with enhanced low light capabilities
JP6232994B2 (en) * 2013-12-16 2017-11-22 ソニー株式会社 Image processing apparatus, image processing method, and program
WO2015129280A1 (en) * 2014-02-26 2015-09-03 京セラ株式会社 Image processing device and image processing method
JP6371185B2 (en) 2014-09-30 2018-08-08 クラリオン株式会社 Camera calibration device and camera calibration system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008210359A (en) * 2007-01-30 2008-09-11 Toyota Motor Corp Operation device
CN101911127A (en) * 2008-01-11 2010-12-08 科乐美数码娱乐株式会社 Image processing device, image processing method, information recording medium, and program
US20090268943A1 (en) * 2008-04-25 2009-10-29 Sony Corporation Composition determination device, composition determination method, and program
CN103988499A (en) * 2011-09-27 2014-08-13 爱信精机株式会社 Vehicle surroundings monitoring device
WO2013074604A2 (en) * 2011-11-15 2013-05-23 Magna Electronics, Inc. Calibration system and method for vehicular surround vision system
CN104271406A (en) * 2012-05-08 2015-01-07 丰田自动车株式会社 Overhead view image display device
JP2014225819A (en) * 2013-05-17 2014-12-04 京セラ株式会社 Calibration processor, camera calibrator, and camera calibration method
CN104898894A (en) * 2014-03-03 2015-09-09 精工爱普生株式会社 Position detecting device and position detecting method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIAOGANG WANG: "Infrared thermography coupled with digital image correlation in studying plastic deformation on the mesoscale level", Optics and Lasers in Engineering, p. 264
NA TIAN: "Research on Distance Detection of the Vehicle Ahead Based on Image Pixels" (in Chinese), Agricultural Equipment & Vehicle Engineering, no. 7, p. 15

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112257711A (en) * 2020-10-26 2021-01-22 哈尔滨市科佳通用机电股份有限公司 Method for detecting damage fault of railway wagon floor

Also Published As

Publication number Publication date
JP2019053361A (en) 2019-04-04
EP3654280B1 (en) 2021-07-14
EP3654280A1 (en) 2020-05-20
CN111052179B (en) 2023-10-24
US20200175722A1 (en) 2020-06-04
US10861192B2 (en) 2020-12-08
EP3654280A4 (en) 2020-08-05
JP6897442B2 (en) 2021-06-30
WO2019053929A1 (en) 2019-03-21

Similar Documents

Publication Publication Date Title
CN111052179B (en) Vehicle device, calibration result determination system, calibration result determination method, and computer-readable storage medium storing program
US10192309B2 (en) Camera calibration device
EP3368373B1 (en) Filling in surround view areas blocked by mirrors or other vehicle parts
JP5339124B2 (en) Car camera calibration system
EP2045132B1 (en) Driving support device, driving support method, and computer program
US7974444B2 (en) Image processor and vehicle surrounding visual field support device
US20100194886A1 (en) Camera Calibration Device And Method, And Vehicle
US8498770B2 (en) Vehicle parking assist system and method
EP1954063A2 (en) Apparatus and method for camera calibration, and vehicle
CN107848417B (en) Display device for vehicle
JP2007261463A (en) Calibration system of vehicle-mounted camera
US20090179916A1 (en) Method and apparatus for calibrating a video display overlay
CN109941277A (zh) Method and apparatus for displaying an image of the A-pillar blind area of an automobile, and vehicle
CN110706282A (en) Automatic calibration method and device for panoramic system, readable storage medium and electronic equipment
CN110796711B (en) Panoramic system calibration method and device, computer readable storage medium and vehicle
KR101547415B1 (en) Monitoring the close vicinity around a commercial vehicle
US20170028917A1 (en) Driving assistance device and driving assistance method
JP2008222153A (en) Merging support device
US20130215280A1 (en) Camera calibration device, camera and camera calibration method
CN104517096A (en) Image processing method and system of around view monitoring system
JP6855254B2 (en) Image processing device, image processing system, and image processing method
JP6624383B2 (en) Calibration device and in-vehicle camera system
CN114022450A (en) Splicing effect judgment method for vehicle panoramic all-round inspection test
JP2005039599A (en) Vehicle periphery surveillance apparatus
KR20170140017A (en) A Parking Guide System and Method based on Physical Measurement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant