WO2019233330A1 - Vehicle-mounted camera self-calibration method and device, and vehicle driving method and device - Google Patents
Vehicle-mounted camera self-calibration method and device, and vehicle driving method and device
- Publication number
- WO2019233330A1 (PCT/CN2019/089033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- camera
- calibration
- self
- lane line
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/04—Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/101—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using cameras with adjustable capturing direction
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/402—Image calibration
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/602—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
- B60R2300/605—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/804—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present disclosure relates to the field of image processing technologies, and in particular, to a method and device for self-calibration of a vehicle camera and a method and device for driving a vehicle.
- traditional vehicle camera calibration requires manually calibrating specific reference objects under a preset camera model, then performing image processing, calculation, and optimization with a series of mathematical transformation formulas, finally obtaining the camera model parameters with which the camera is calibrated.
- the present disclosure proposes a technical solution for self-calibration of a vehicle camera.
- a method for self-calibration of an on-board camera includes: starting the self-calibration of the on-board camera so that a vehicle on which the on-board camera is installed is in a running state; collecting, via the on-board camera while the vehicle is running, information required for self-calibration of the on-board camera; and self-calibrating the on-board camera based on the collected information.
- a vehicle camera self-calibration device includes: a self-calibration starting module for starting the vehicle camera self-calibration so that a vehicle installed with the vehicle camera is in a running state; an information acquisition module configured to collect, via the on-board camera while the vehicle is driving, information required for self-calibration of the on-board camera; and a self-calibration calculation module for self-calibrating the on-board camera based on the collected information.
- a vehicle driving device configured to drive a vehicle using a vehicle camera self-calibrated by any of the above vehicle camera self-calibration methods.
- an electronic device including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: execute the above-mentioned vehicle camera self-calibration method.
- a computer-readable storage medium on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the method for self-calibrating a vehicle-mounted camera is implemented.
- a computer program including computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes the vehicle-mounted camera self-calibration method described above.
- the on-vehicle camera can perform self-calibration according to the information it collects while the vehicle is driving.
- the self-calibration process of the vehicle camera can be conveniently completed in the actual use environment of the vehicle camera without affecting the use of the vehicle.
- the calibration result is accurate, the calibration efficiency is high, and the application range is wide.
- FIG. 1 shows a flowchart of a self-calibration method for a vehicle camera according to an embodiment of the present disclosure
- FIG. 2 shows a flowchart of a self-calibration method for a vehicle camera according to an embodiment of the present disclosure
- FIG. 3 shows a flowchart of a self-calibration method for a vehicle camera according to an embodiment of the present disclosure
- FIG. 4 shows a flowchart of a self-calibration method for a vehicle camera according to an embodiment of the present disclosure
- FIG. 5 shows a schematic diagram of an intersection point in a self-calibration method of a vehicle camera according to an embodiment of the present disclosure
- FIG. 6 shows a schematic diagram of a horizon in a self-calibration method of a vehicle camera according to an embodiment of the present disclosure
- FIG. 7 is a schematic diagram of key points in a self-calibration method of a vehicle camera according to an embodiment of the present disclosure
- FIG. 8 illustrates a block diagram of a self-calibration device for a vehicle camera according to an embodiment of the present disclosure
- FIG. 9 illustrates a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
- exemplary means “serving as an example, embodiment, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as superior or better than other embodiments.
- FIG. 1 shows a flowchart of a method for self-calibration of a vehicle camera according to an embodiment of the present disclosure. As shown in FIG. 1, the method for self-calibration of a vehicle camera includes:
- step S10: the self-calibration of the on-board camera is started, so that the vehicle on which the on-board camera is installed is in a running state.
- the vehicle is a device with a driving function.
- the vehicle may include one or any combination of the following devices: a motor vehicle, a non-motor vehicle, a train, a toy car, or a robot.
- Motor vehicles may include large vehicles, trams, battery cars, motorcycles, tractors, and other vehicles that are equipped with a power unit and can be driven by the power unit.
- Non-motor vehicles may include bicycles, tricycles, scooters, animal-powered vehicles and other vehicles that need to be driven by human or animal power.
- the toy vehicle may include a vehicle-shaped toy such as a remote-control toy vehicle and an electric toy vehicle.
- the robot may include a humanoid traveling robot and a non-humanoid traveling robot.
- the non-humanoid traveling robot may include a sweeping robot, a carrying robot, and the like.
- the on-board camera can be a camera configured by the vehicle itself, or a camera outside the vehicle.
- the vehicle camera may include various types of cameras. For example, it may include a visible light camera, an infrared camera, a binocular camera, and the like. The disclosure does not limit the type of the vehicle camera.
- vehicle camera self-calibration means that the vehicle camera itself can complete the actual work of self-calibration.
- the self-calibration process of the on-board camera does not require human involvement.
- the on-board camera can calculate the calibration parameters based on set self-calibration start conditions, using the position information of a target object in images taken by the on-board camera while the vehicle is running together with saved initial calibration information, and complete the calibration according to the calculated parameters.
- the entire calibration process does not require manual operation of the on-board camera, nor does it require manual input of calibration parameters or location information.
- the self-calibration of the vehicle camera may be started by a manually input instruction, or automatically according to a preset starting condition.
- the preset startup conditions may include that the driving conditions of the vehicle meet the set driving conditions, or that the shooting conditions of the on-board camera meet the set shooting conditions.
- the startup condition may include a combination of multiple startup conditions.
- the starting conditions and starting methods of the self-calibration process of the vehicle camera can be determined according to the requirements, and the implementation method is very flexible.
- the vehicle with the on-board camera needs to be in a driving state.
- a prompt message can be sent to request that the vehicle be put in a driving state, improving the user experience.
- the on-board camera is on and can capture images.
- the car camera can capture still images or video streams.
- the shooting mode of the vehicle camera and the format of the captured image can be determined according to requirements to obtain an image that meets the requirements.
- step S20: during the running of the vehicle, information required for self-calibration of the vehicle camera is collected via the vehicle camera.
- the information required for the self-calibration of the on-board camera can be collected according to the image taken by the on-board camera.
- the information required for the self-calibration of the vehicle camera may include images captured by the vehicle camera, or may include processed images obtained by processing the images captured by the vehicle camera. It may also include information of the target object detected in the image captured by the vehicle camera.
- the target object may include various types of objects such as a building, a vehicle, or a pedestrian.
- the vehicle can continuously collect the information required for the self-calibration of the on-board camera during the driving process, and can also collect the information required for the self-calibration of the on-board camera according to the set collection cycle.
- the implementation method is very flexible to meet different application requirements.
- step S30: the on-board camera is self-calibrated based on the collected information.
- the on-board camera needs to be re-calibrated, and the images taken by the on-board camera can be used for its accurate self-calibration.
- the vehicle camera can be self-calibrated based on the information collected by the vehicle camera without manual calibration by the user.
- the vehicle camera can perform self-calibration according to the information collected by the vehicle camera during the implementation of the vehicle.
- the self-calibration process of the vehicle camera can be conveniently completed in the actual use environment of the vehicle camera without affecting the use of the vehicle.
- the calibration result is accurate, the calibration efficiency is high, and the application range is wide.
- starting the self-calibration of the vehicle camera includes:
- the angle of view of the on-board camera is the angle formed by the lines connecting the lens center point of the on-board camera to the two ends of the diagonal of the imaging plane.
- the focal length of the vehicle camera is the distance from the optical center of the lens to the point where incoming parallel rays converge.
- the angle of view of a vehicle camera varies inversely with its focal length: for the same imaging area, the shorter the focal length of the lens, the larger the angle of view.
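The inverse relation between angle of view and focal length can be sketched numerically; the sensor-diagonal and focal-length values below are illustrative, not taken from the disclosure:

```python
import math

def field_of_view_deg(sensor_diagonal_mm: float, focal_length_mm: float) -> float:
    """Diagonal angle of view: the angle subtended at the lens center
    by the two ends of the imaging plane's diagonal."""
    return math.degrees(2.0 * math.atan(sensor_diagonal_mm / (2.0 * focal_length_mm)))

# For the same imaging area, a shorter focal length gives a wider angle of view.
wide = field_of_view_deg(6.0, 2.8)    # hypothetical short-focal-length lens
narrow = field_of_view_deg(6.0, 8.0)  # hypothetical longer lens
assert wide > narrow
```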
- when the angle of view or focal length of the vehicle camera changes, the self-calibration of the vehicle camera is started.
- the vehicle camera can perform self-calibration in time.
- the self-calibration of the vehicle camera does not affect the normal driving and normal use of the vehicle.
- the on-board camera can maintain accurate calibration status at all times.
- starting the self-calibration of the vehicle camera includes:
- the installation position of the vehicle-mounted camera may include an installation position of the vehicle-mounted camera on the vehicle.
- the in-vehicle camera can be installed at any part of the vehicle that can capture the road surface.
- the shooting angle of the vehicle camera may include the angle between the lens plane and the ground plane of the vehicle camera. The installation position and shooting angle of the vehicle camera can be determined according to the requirements and the use environment.
- when the installation position or shooting angle of the vehicle camera changes, the self-calibration of the vehicle camera is started.
- the vehicle camera can perform self-calibration in time.
- the self-calibration process of the vehicle camera does not affect the normal driving and normal use of the vehicle, and the vehicle camera can maintain accurate calibration status at all times.
- starting the self-calibration of the vehicle camera includes:
- the cumulative mileage of a vehicle can be determined by reading the mileage of the vehicle. For example, when it is read that the mileage of the vehicle changes more than M kilometers, the self-calibration of the on-board camera can be started.
- when the cumulative mileage of the vehicle exceeds a mileage threshold, the self-calibration of the on-board camera is started. After the vehicle has been used for some time, vibration during actual use may change the shooting angle or installation position of the vehicle camera; starting self-calibration once the cumulative mileage exceeds the threshold therefore lets the vehicle camera re-calibrate in time.
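The mileage trigger can be sketched as a simple threshold check; the disclosure leaves the threshold as "M kilometers", so the default value below is purely illustrative:

```python
def should_start_self_calibration(current_odometer_km: float,
                                  last_calibration_odometer_km: float,
                                  mileage_threshold_km: float = 5000.0) -> bool:
    """Start self-calibration once the mileage accumulated since the last
    calibration exceeds the threshold M (the default is a hypothetical value)."""
    return current_odometer_km - last_calibration_odometer_km > mileage_threshold_km

assert should_start_self_calibration(12050.0, 7000.0)      # 5050 km > threshold
assert not should_start_self_calibration(12050.0, 8000.0)  # 4050 km <= threshold
```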
- the self-calibration process of the vehicle camera does not affect the normal driving and normal use of the vehicle, and the vehicle camera can maintain accurate calibration status at all times.
- starting the self-calibration of the vehicle camera includes:
- upon receiving a self-calibration start instruction, the self-calibration of the on-board camera is started.
- the embodiment of the present disclosure may start a self-calibration of a vehicle camera when receiving a self-calibration start instruction.
- the received self-calibration start instruction may be a self-calibration start instruction input by a human, or a self-calibration start instruction automatically sent by a preset execution program.
- the self-calibration start instruction may be input via a dedicated button, or the user may be prompted to input it by a prompt message; upon receiving the instruction, the self-calibration of the on-vehicle camera is started.
- the self-calibration of the on-board camera can be started in time according to the use requirements.
- the self-calibration of the on-board camera can also be applied to different use environments in time.
- FIG. 2 shows a flowchart of a method for self-calibration of a vehicle camera according to an embodiment of the present disclosure. As shown in FIG. 2, the method for self-calibration of a vehicle camera further includes:
- step S40: collection progress information is provided, indicating the progress of the vehicle-mounted camera in collecting the information required for calibration.
- the vehicle-mounted camera needs to collect sufficient information.
- the user of the vehicle can be notified of the collection progress of the on-board camera to improve the user experience.
- the collection progress information can be provided through one or any combination of: voice prompts, text prompts, or image prompts, either proactively or in response to a prompt instruction.
- the progress information can be provided by displaying a progress bar on the vehicle's central control screen, or by a voice broadcast such as "current collection progress: 20% collected".
- the present disclosure does not limit the form and the manner of providing the progress information.
- Step S30 includes:
- step S31: when it is determined, based on the collection progress information, that the information required for self-calibration has been collected, the vehicle-mounted camera is self-calibrated based on the collected information.
- the vehicle-mounted camera can collect sufficient information required for self-calibration.
- the vehicle camera can be calibrated based on the collected information.
- this makes the self-calibration process of the on-board camera clearer, lets the vehicle user more easily track its progress, and can improve the success rate and accuracy of self-calibration.
- FIG. 3 shows a flowchart of a method for self-calibration of a vehicle camera according to an embodiment of the present disclosure. As shown in FIG. 3, the method for self-calibration of a vehicle camera further includes:
- step S50: prompt information about the collection conditions is provided, indicating whether the vehicle camera satisfies the conditions for collecting the information required for calibration.
- the vehicle camera must satisfy the collection conditions before it can collect the information required for self-calibration.
- the user of the vehicle can be prompted to check whether the on-board camera meets the collection conditions by providing prompt information of the collection conditions.
- the collection condition prompt information can be provided through one or any combination of: voice prompts, text prompts, or image prompts, either proactively or in response to a prompt instruction.
- the present disclosure does not limit the form and providing manner of the collection condition prompt information.
- Step S20 includes:
- step S21: when it is determined from the collection condition prompt information that the in-vehicle camera satisfies the collection conditions, the vehicle camera collects, during the running of the vehicle, the information required for self-calibration.
- when the collection conditions are not satisfied, the vehicle user may adjust the on-vehicle camera accordingly, or perform self-calibration of the on-vehicle camera after the environment changes.
- the vehicle-mounted camera is used to collect information required for the self-calibration of the on-board camera.
- by providing the collection condition prompt information, the vehicle camera can collect the information needed for accurate self-calibration, improving the success rate and accuracy of the self-calibration.
- step S20 includes:
- when the elevation angle of the lens of the vehicle-mounted camera is within a shooting elevation angle range, the vehicle-mounted camera collects, during the running of the vehicle, the information required for self-calibration.
- the shooting elevation angle range is a set of elevation angles with a minimum elevation angle as its lower limit and a maximum elevation angle as its upper limit; a suitable range can be set according to the installation position, shooting angle, and use environment of the vehicle camera.
- when the shooting elevation angle of the in-vehicle camera is within this range, the camera can collect the information required for self-calibration and obtain an accurate self-calibration result.
- the information includes a lane line of a road on which the vehicle travels
- step S20 includes:
- when the image captured by the vehicle-mounted camera includes a lane line of the road on which the vehicle travels, the vehicle-mounted camera collects the information required for self-calibration.
- the lane line may include a white lane line or a yellow lane line, may include a solid line or a dotted line, and may include a single line or a double line.
- the lane line may be a white dotted line, a solid white line, a yellow solid line, a yellow dotted line, a double white dotted line, a double yellow solid line, and the like.
- the lane line may also be the shoulder of the road surface on which the motor vehicle travels; in images taken by the on-board camera, lane lines are distinct targets with uniform shapes.
- the image captured by the on-board camera needs to include the lane line.
- the vehicle-mounted camera collects information required by the on-vehicle camera for self-calibration. According to the collected information, the on-board camera can perform self-calibration.
- ensuring that the on-board camera captures the lane lines of the road on which the vehicle is traveling allows the camera to be self-calibrated based on the collected lane lines; this makes the self-calibration widely applicable, and the calibration process simple and reliable.
- collecting information required by the vehicle camera for self-calibration via the vehicle camera includes:
- when the vehicle-mounted camera captures the horizon or the lane line vanishing point of the road on which the vehicle travels, the vehicle-mounted camera collects, during the driving of the vehicle, the information required for self-calibration.
- when the vehicle-mounted camera captures the horizon or the lane line vanishing point of the road on which the vehicle travels, it can be determined that there is no obstruction in front of the camera.
- in that case, the lane lines captured by the vehicle camera are complete and clear, allowing accurate self-calibration based on them.
- thus, when the vehicle-mounted camera captures the horizon or lane line vanishing point of the road on which the vehicle is traveling, it can obtain a more accurate self-calibration result from the complete and clear lane lines it captures.
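The lane line vanishing point can be sketched as the intersection of the two detected lane lines in the image plane (the horizon is the horizontal line through it); the pixel coordinates below are hypothetical:

```python
def line_through(p, q):
    """Line a*x + b*y = c through two image points."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def vanishing_point(left_lane, right_lane):
    """Intersection of two lane lines, each given by two image points."""
    a1, b1, c1 = line_through(*left_lane)
    a2, b2, c2 = line_through(*right_lane)
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel in the image: no finite vanishing point
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Two hypothetical lane lines converging toward the middle of the frame:
vp = vanishing_point(((100, 700), (400, 400)), ((1100, 700), (800, 400)))  # -> (600.0, 200.0)
```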
- step S20 includes:
- during the driving of the vehicle, the vehicle-mounted camera collects, within a collection duration range, the information required for self-calibration.
- the collection duration range may include a duration range with a minimum collection duration as a lower limit and a maximum collection duration as an upper limit.
- when the collection time is less than the minimum collection duration, the collected information is not enough to support the self-calibration calculation.
- when the collection time exceeds the maximum collection duration, information collected after that point does not participate in the self-calibration calculation.
- the preset collection duration range can be set according to requirements, for example, the collection duration range can be 10 minutes to 25 minutes.
- the vehicle camera can then stop collecting the information required for self-calibration, avoiding unnecessary waste of system resources.
- the information collected by the on-board camera can be sufficient to support the calculation of self-calibration without causing waste of system resources.
- collecting information required by the vehicle camera for self-calibration via the vehicle camera includes:
- during the driving of the vehicle, while the driving distance is within a driving distance range, the vehicle-mounted camera collects the information required for self-calibration.
- the driving distance range may include a distance range in which a vehicle has traveled with a minimum driving distance as a lower limit and a maximum driving distance as an upper limit.
- the greater the distance traveled by the vehicle, the more information the vehicle camera collects.
- when the vehicle's driving distance is less than the minimum driving distance, the information collected by the vehicle camera is insufficient to support the self-calibration calculation.
- when the vehicle's driving distance exceeds the maximum driving distance, information collected after that point does not participate in the self-calibration calculation.
- the driving distance range can be determined according to the requirements, the settings of the vehicle camera, and the actual use environment of the vehicle camera. For example, the driving distance can range from 5 km to 8 km.
- the vehicle-mounted camera can then stop collecting the information required for self-calibration, avoiding unnecessary waste of system resources.
- the information collected by the on-board camera can be sufficient to support the calculation of self-calibration without causing waste of system resources.
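Both the duration window (10 to 25 minutes in the example) and the driving-distance window (5 to 8 km) reduce to the same lower/upper-bound check; the function name is illustrative:

```python
def collection_state(value: float, lower: float, upper: float) -> str:
    """Classify collection progress against a [lower, upper] window:
    below the lower bound the data cannot yet support the calibration
    computation; above the upper bound further samples are discarded."""
    if value < lower:
        return "insufficient"
    if value > upper:
        return "stop"  # terminate collection to avoid wasting system resources
    return "sufficient"

# Duration window from the text: 10 to 25 minutes.
assert collection_state(6.0, 10.0, 25.0) == "insufficient"
assert collection_state(12.0, 10.0, 25.0) == "sufficient"
# Distance window from the text: 5 to 8 kilometers.
assert collection_state(9.0, 5.0, 8.0) == "stop"
```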
- FIG. 4 shows a flowchart of a method for self-calibration of a vehicle camera according to an embodiment of the present disclosure.
- step S30 in the method for self-calibration of a vehicle camera includes:
- step S32: the homography matrix of the vehicle camera is updated based on the collected information, where the homography matrix of the vehicle camera reflects the pose of the vehicle camera.
- the pose of the vehicle camera may include a rotation parameter and a translation parameter of the vehicle camera.
- the homography matrix of the vehicle camera can be established according to the pose of the vehicle camera.
- the homography matrix of the vehicle camera may include a conversion parameter or a conversion matrix of the vehicle camera.
- the process of establishing the homography matrix of the vehicle camera may include: using a camera configured on the vehicle to capture a real road image, then constructing the homography matrix from a point set on the road image and the corresponding point set on the real road.
- the specific method may include: 1. establishing a coordinate system: the vehicle body coordinate system takes the vehicle's left front wheel as the origin, the driver's right as the positive X axis, and the forward direction as the positive Y axis; 2. selecting points in the vehicle body coordinate system to obtain a selected point set, for example (0,5), (0,10), (0,15), (1.85,5), (1.85,10), (1.85,15), with each point in meters.
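The point selection and homography construction above can be sketched with a standard Direct Linear Transform; the pixel correspondences below are hypothetical placeholders, and a production system might instead call OpenCV's `cv2.findHomography`:

```python
import numpy as np

def estimate_homography(world_pts, image_pts):
    """Direct Linear Transform: find H (up to scale) such that
    [u, v, 1]^T ~ H [X, Y, 1]^T for each world<->image correspondence.
    Requires at least 4 non-degenerate point pairs."""
    rows = []
    for (X, Y), (u, v) in zip(world_pts, image_pts):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    # Null vector of the stacked constraints = smallest right singular vector.
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Selected point set in the body frame from the text (meters): origin at the
# left front wheel, X to the driver's right, Y forward.
world = [(0, 5), (0, 10), (0, 15), (1.85, 5), (1.85, 10), (1.85, 15)]
# The matching pixel coordinates are purely illustrative.
pixels = [(520, 600), (560, 430), (575, 360), (760, 600), (700, 430), (678, 360)]
H = estimate_homography(world, pixels)
```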
- with the homography matrix of the vehicle camera, the coordinates of a target object in an image captured at a known angle of view can be converted between the image coordinate system and the world coordinate system of that image.
- the homography matrix of the vehicle camera can be updated.
- the homography matrix of the vehicle camera may be updated according to the lane lines captured by the vehicle camera during driving.
- the coordinates of the target object in the image captured by the vehicle camera at the driving shooting angle can be converted between the image coordinate system of the image captured at the driving shooting angle and the world coordinate system.
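- The mutual conversion described above can be sketched as follows: given a homography H taking world (road-plane) coordinates to image pixels, H converts world to image and its inverse converts image back to world. The matrix values here are hypothetical.

```python
# Minimal sketch of image <-> world conversion through a homography.
# H below is a hypothetical world -> image matrix; its inverse goes back.
import numpy as np

def apply_homography(H, x, y):
    """Map a 2-D point through the 3x3 homography H (homogeneous division)."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

H = np.array([[100.0,  20.0, 300.0],
              [  5.0, -80.0, 600.0],
              [  0.0,   0.01,  1.0]])   # hypothetical world -> image

u, v = apply_homography(H, 1.85, 10.0)            # world point -> pixel
x, y = apply_homography(np.linalg.inv(H), u, v)   # pixel -> world point
```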
- Step S33: The vehicle camera is self-calibrated according to the homography matrices of the vehicle camera before and after the update.
- the homography matrix of the vehicle camera before the update may include a conversion relationship between an image coordinate system of an image captured by the vehicle camera under a known perspective and a world coordinate system.
- the homography matrix of the vehicle camera before the update may include known parameters or known matrices of the vehicle camera.
- the updated homography matrix of the vehicle camera may include the conversion relationship between the image coordinate system of the image captured by the vehicle camera under the driving shooting angle and the world coordinate system.
- the updated homography matrix of the vehicle camera may include the transformation parameters or transformation matrix of the vehicle camera.
- the first image coordinates of the target object in the first image taken by the vehicle camera at a known angle of view and the second image coordinates of the target object in the second image taken by the vehicle camera at the driving shooting angle can be converted into each other.
- the on-board camera can be self-calibrated.
- the homography matrix of the vehicle camera is updated, and the vehicle camera is self-calibrated according to the homography matrices of the vehicle camera before and after the update.
- the vehicle camera can be calibrated accurately and quickly.
- step S32 includes:
- a lane line is detected in an image captured by the on-board camera, and detection position information of the lane line is obtained.
- an image recognition technology may be used to detect lane lines in an image captured by a vehicle-mounted camera.
- the image captured by the vehicle camera may also be input to a neural network, and the lane lines may be detected in the image captured by the vehicle camera based on the output of the neural network.
- the images taken by the on-board camera include images taken by the on-board camera from a driving shooting angle when the vehicle is running.
- the detection position information of the lane line includes position information in an image coordinate system in an image captured by the lane line at a driving shooting angle.
- the homography matrix of the on-vehicle camera can be updated according to the position information of the lane line at the driving shooting angle.
- the updated homography matrix can convert between the position information of the lane line in the image taken at the driving shooting angle and the position information of the lane line in the image taken at the known angle of view.
- the homography matrix of the on-board camera is updated based on the detected position of the lane.
- the accurate lane line can be detected in the image, and the accurate detection position information of the lane line can be obtained.
- An accurate updated homography matrix can be obtained according to the accurate detection position information of the lane line, and the updated homography matrix can be used for self-calibration of the vehicle camera, and accurate self-calibration results can be obtained.
- step S33 includes:
- Step S331: Obtain the known position information of the lane line according to the homography matrix of the on-board camera before the update.
- the homography matrix of the in-vehicle camera before the update may include a homography matrix constructed for the in-vehicle camera at a known angle when the in-vehicle camera is first used, first installed, or shipped from the factory.
- using the homography matrix of the vehicle camera built before the update, the coordinates of a target object in an image captured by the vehicle camera at a known angle of view can be converted between the image coordinate system of that image and the world coordinate system.
- the known position information of the lane line can be obtained, including the position information of the lane line in the image coordinate system of the image taken at a known perspective and the position information of the lane line in the world coordinate system.
- Step S332: Determine calibration parameters of the vehicle-mounted camera according to the detected position information of the lane line and the known position information of the lane line.
- the parameters of the vehicle camera may include internal parameters and external parameters.
- the internal parameters may include parameters related to the characteristics of the vehicle camera, such as the focal length and pixel size of the vehicle camera, and the internal parameters of each vehicle camera are unique.
- the external parameters include the position parameters and rotation direction parameters of the vehicle camera in the world coordinate system.
- the calibration parameters of the on-board camera may include a rotation direction parameter of the on-board camera. Using the calibration parameters, a mapping relationship between the world coordinate system and the image coordinate system of the image captured by the vehicle camera can be constructed. Using the mapping relationship constructed by the calibration parameters, the position information of the target object in the image coordinate system and the position information in the world coordinate system can be converted to each other.
- the known parameters of the on-vehicle camera can be obtained from the known position information of the lane line in the world coordinate system and its known position information in the image coordinate system of the image taken by the vehicle camera at a known perspective. Then, using the known parameters of the vehicle camera and the conversion parameters between the driving shooting angle and the known viewing angle, the calibration parameters of the vehicle camera at the driving shooting angle can be obtained.
- Step S333: Self-calibrate the on-vehicle camera according to the calibration parameters.
- the coordinate information of the target object in the image captured by the vehicle camera at the driving shooting angle can be converted between the image coordinate system of the image captured at the driving shooting angle and the world coordinate system.
- calibration parameters of the vehicle camera are obtained according to the detected position information and known position information of the lane line, and the vehicle camera is self-calibrated according to the calibration parameters.
- the on-board camera is calibrated according to the lane line, so that the on-board camera can easily complete self-calibration, the calibration efficiency is high, and the application range is wide.
- detecting a lane line in an image captured by the on-board camera, and obtaining detection position information of the lane line, includes: detecting a lane line in the image captured by the on-board camera, determining a key point on the detected lane line, and obtaining the detection coordinates of the key point.
- Determining calibration parameters of the on-vehicle camera according to the detected position information of the lane line and the known position information of the lane line includes: determining the calibration parameters of the on-board camera according to the detected coordinates of the key point and the known coordinates of the key point.
- the key point may include a point at a specified position on a lane line, or may include a point with a set characteristic on the lane line.
- Key points on the lane line can be determined based on demand.
- the key points on the lane line may include one or more, and the number of key points may be determined according to requirements.
- the detection coordinates of the key points may include position information of the key points in an image coordinate system of an image captured under the driving angle of the vehicle camera.
- the detection coordinates of key point 1 on the lane line are (X_1, Y_1)
- the detection coordinates of key point 2 are (X_2, Y_2).
- the known coordinates of the key points include the known coordinates of the key points in the world coordinate system and the known coordinates in the image coordinate system of the image captured by the vehicle camera at a known perspective.
- the conversion parameters of the vehicle camera can be obtained using the known coordinates and detection coordinates of the key points, and the known parameters of the vehicle camera can be obtained using the known coordinates of the key points. Finally, the known parameters and conversion parameters are used together to obtain the calibration parameters of the vehicle camera.
- the calibration parameters of the vehicle-mounted camera can be obtained using the detection coordinates and the known coordinates of the key points. Computing the calibration parameters from the key points' detection coordinates and known coordinates requires only a small amount of calculation.
- the on-board camera can complete self-calibration quickly and accurately, with high calibration efficiency and wide application range.
- detecting a lane line in an image captured by the on-board camera, and obtaining detection position information of the lane line includes:
- the lane lines to be fitted in each image are fitted to obtain lane lines and detection position information of the lane lines.
- lane line detection may be performed in multiple images captured by a vehicle-mounted camera.
- the number and location of lane lines detected can be determined based on demand. For example, a lane line on the left side of the vehicle may be detected, or a lane line on the right side of the vehicle may be detected.
- when the road surface has multiple lanes, the lane lines on the left and right sides closest to the vehicle itself can be detected, or only the two lane lines on the right side of the vehicle can be detected.
- the position of the lane line in the image is relatively fixed.
- the lane line to be fitted can be detected in each image, and then the lane line to be fitted in multiple images is fitted to obtain the lane line, and the detection position information of the lane line is obtained.
- when the lane lines to be detected are the two nearest lane lines on the left and right sides of the vehicle, the left and right lane lines of the lane in which the vehicle is located can be fitted from the lane lines to be fitted that are detected in multiple images taken by the vehicle camera, and the detection position information of the left and right lane lines can be obtained.
- lane lines to be fitted are detected in 100 images taken by a vehicle-mounted camera, which are lane lines within a distance of 5 meters forward of the front end of the vehicle in each image.
- the lane line to be fitted detected in 100 images can be fitted to obtain the lane line of the road on which the vehicle is traveling, and the detection position information of the lane line can be obtained.
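- The fitting of one lane line from detections pooled over many frames can be sketched as below. The point model and noise are made up for illustration; a straight line x = a*y + b is fitted by least squares, with x as a function of the image row y, since lane lines are near-vertical in the image.

```python
# Hypothetical sketch: fit a lane line from candidate points detected
# across 100 frames, using least squares on the pooled points.
import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(400, 700, size=100)            # image rows of detections
x = 0.35 * y + 120 + rng.normal(0, 1.5, 100)   # simulated noisy lane points

a, b = np.polyfit(y, x, 1)   # fitted lane line: x = a*y + b
```

- Pooling points over many frames averages out per-frame detection noise, which is why fitting over e.g. 100 images gives a more stable lane line than any single frame.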
- the calibration parameters of the vehicle camera can be obtained according to the known position information and detected position information of the lane line.
- the lane line includes a first lane line and a second lane line
- determining a key point on the detected lane line to obtain detection coordinates of the key point includes:
- a key point is determined according to the horizon, the first lane line, and the second lane line, and detection coordinates of the key point are obtained.
- the first lane line and the second lane line may be lane lines on the left or right side of the vehicle, respectively.
- the first lane line may be the left lane line nearest to the vehicle
- the second lane line may be the right lane line nearest to the vehicle.
- the first lane line and the second lane line may be two parallel lines on a real road.
- the first lane line and the second lane line may have an intersection in front of the motor vehicle, or the extension line of the first lane line and the extension of the second lane line may have an intersection point in front of the vehicle.
- the first lane line and the second lane line can be detected in each image captured by the vehicle camera, and the intersection of the first lane line and the second lane line can be determined in each image.
- FIG. 5 shows a schematic diagram of intersection points in a self-calibration method for a vehicle camera according to an embodiment of the present disclosure.
- the coordinate system in FIG. 5 is an image coordinate system
- the first lane line is a left lane line
- the second lane line is the right lane line
- the extension of the left lane line and the extension of the right lane line intersect in front of the vehicle.
- the first lane line and the second lane line may intersect on the horizon. Therefore, based on the intersection points in each image, the position of the horizon in front of the vehicle can be fitted so that the sum of the distances from the intersection points in the images to the fitted horizon is minimized.
- FIG. 6 shows a schematic diagram of a horizon in a self-calibration method for a vehicle camera according to an embodiment of the present disclosure.
- the coordinate system in FIG. 6 is an image coordinate system.
- the intersection points define the horizon.
- the sum of the distances from the intersections to the determined horizon is the smallest.
- the position of the horizon determined in each image is relatively fixed.
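- The horizon fitting described above can be sketched as follows, with hypothetical per-frame lane-line fits. Each frame's left and right lane lines (as x = a*y + b) are intersected, and the horizon is then fitted through the intersection points; least squares is used here as a simple stand-in for the minimal-distance criterion.

```python
# Illustrative sketch: intersect the two lane lines in each frame, then
# fit the horizon through the per-frame intersection points.
import numpy as np

def line_intersection(a1, b1, a2, b2):
    """Intersection of x = a1*y + b1 and x = a2*y + b2 (image coordinates)."""
    y = (b2 - b1) / (a1 - a2)
    return a1 * y + b1, y

# Hypothetical (left, right) lane-line fits from three frames.
frames = [((-1.05, 955.0), (1.05, 325.0)),
          ((-1.00, 945.0), (1.10, 310.0)),
          ((-1.10, 975.0), (1.00, 345.0))]
pts = np.array([line_intersection(a1, b1, a2, b2)
                for (a1, b1), (a2, b2) in frames])

# Horizon modelled as the near-horizontal line y = c*x + d.
c, d = np.polyfit(pts[:, 0], pts[:, 1], 1)
```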
- the lane line includes the first lane line and the second lane line
- the determined horizon is used as a reference, and key points are determined on the first lane line and the second lane line.
- at a distance M from the horizon, key point 1 and key point 2 are determined on the first lane line and the second lane line respectively
- at a distance N from the horizon, key point 3 and key point 4 are determined on the first lane line and the second lane line respectively
- M and N have the same unit but different values.
- the detection coordinates of each key point can be obtained according to the determined key points.
- the detection coordinates of each key point include the coordinates in the image coordinate system of the image captured by the key point at the driving shooting angle.
- the position of the horizon is obtained from the two lane lines, and the positions of the key points are determined according to the position of the horizon to obtain the detection coordinates of the key points. Based on the horizon, accurate key-point positions can be obtained in each image, thereby obtaining more accurate calibration parameters.
- determining a key point according to the horizon, the first lane line, and the second lane line, and obtaining detection coordinates of each of the key points include:
- intersection of the detection line and the first lane line and the intersection of the detection line and the second lane line are determined as key points, and detection coordinates of each of the key points are obtained.
- FIG. 7 shows a schematic diagram of key points in a self-calibration method for a vehicle camera according to an embodiment of the present disclosure.
- the coordinate system in the figure is an image coordinate system, and there are two detection lines parallel to the horizon below the horizon. The four intersections of the two detection lines with the left lane line and the right lane line are determined as key points.
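- The key-point construction of FIG. 7 can be sketched as below: two detection lines parallel to the horizon and below it are intersected with the left and right lane lines, giving four key points. All line coefficients are hypothetical image-coordinate values.

```python
# Hypothetical sketch of FIG. 7: four key points as the intersections of
# two horizon-parallel detection lines with the left/right lane lines.

def intersect(lane, det):
    """Intersect lane line x = a*y + b with detection line y = c*x + d."""
    a, b = lane
    c, d = det
    y = (c * b + d) / (1.0 - c * a)   # solve y = c*(a*y + b) + d
    return a * y + b, y

horizon = (0.0, 300.0)                         # horizon: y = 0*x + 300
left, right = (-1.05, 955.0), (1.05, 325.0)    # lane lines: x = a*y + b
det1 = (horizon[0], horizon[1] + 150)          # detection line at y = 450
det2 = (horizon[0], horizon[1] + 300)          # detection line at y = 600

key_points = [intersect(lane, det)
              for det in (det1, det2) for lane in (left, right)]
```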
- determining calibration parameters of the on-vehicle camera according to the detection coordinates of the key points and known coordinates of the key points includes:
- the conversion parameters of the camera are determined according to the detected coordinates of the key point and the known coordinates of the key point, where the detected coordinates of the key point include the coordinates of the key point in the image taken at the driving shooting angle, and the known coordinates include the coordinates of the key point in the image taken at the known angle of view; then a calibration parameter of the camera is determined according to the conversion parameters and the known parameters.
- the known coordinates of the key points include known coordinates of the key points in an image coordinate system of an image captured by the vehicle camera under a known perspective.
- the known coordinates of the key points in the image taken by the in-vehicle camera at a known perspective A are (X_A, Y_A).
- the known parameters H_A of the on-vehicle camera at the known perspective A can be obtained.
- the known parameters H_A may include parameters in the form of a matrix.
- the known parameter H_A may include a rotation direction parameter of the vehicle camera.
- the known parameter H_A can transform the coordinates of key points in the image coordinate system of the image taken at the known angle of view A to the world coordinate system.
- the detection coordinates of the key points include the detection coordinates of the key points in the image coordinate system of the image captured by the vehicle camera at the driving shooting angle.
- the detection coordinates of the key points in the image of the vehicle camera at the driving shooting angle B are (X_B, Y_B).
- the conversion parameter H_AB of the vehicle camera from the known angle A to the driving shooting angle B can be obtained.
- the conversion parameter H_AB may include parameters in the form of a matrix.
- the conversion parameter H_AB may include a rotation direction parameter of the vehicle camera.
- the conversion parameter H_AB can convert the coordinates of the key points between the image coordinate system of the image captured at the known angle of view A and the image coordinate system of the image captured at the driving shooting angle B.
- a detection parameter H_B of the vehicle camera at the driving shooting angle B can be obtained.
- the detection parameter H_B may include a parameter in the form of a matrix.
- the detection parameter H_B may include a rotation direction parameter of the vehicle camera.
- the detection parameter H_B can convert the coordinates of key points in the image coordinate system of the image captured at the driving shooting angle B to the world coordinate system.
- the detection parameter H_B is a calibration parameter of the vehicle camera at the driving shooting angle B.
- the known coordinates of the key point include: a first known coordinate of the key point in the image coordinate system under a known perspective, and a second known coordinate of the key point in the world coordinate system under a known perspective.
- Determining the calibration parameters of the vehicle camera according to the conversion parameters and the known parameters may include:
- determining the known parameters according to the first known coordinate and the second known coordinate, and determining the calibration parameters of the camera according to the conversion parameters and the known parameters.
- the known coordinates of the key point include the known coordinates (X_A, Y_A) in the image coordinate system of the image captured by the vehicle camera at the known angle of view A, and the known world coordinates (X, Y, 1) of the key point in the world coordinate system.
- a known parameter H_A of the vehicle camera at the known viewing angle A can be obtained.
- the detection parameter H_B at the driving shooting angle B can be obtained from the known parameter H_A and the conversion parameter H_AB.
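- The composition of H_B from H_A and H_AB can be sketched as follows. The convention assumed here (the document does not fix one) is that H_A maps image coordinates at the known view A to world coordinates and H_AB maps image coordinates from view A to view B, so H_B = H_A * inv(H_AB) maps view-B image coordinates to world coordinates. All matrix values are hypothetical.

```python
# Sketch (assumed convention): H_A : image A -> world, H_AB : image A -> image B,
# so the calibration matrix at the driving view B is H_B = H_A @ inv(H_AB).
import numpy as np

H_A = np.array([[0.02,  0.0,  -6.0],
                [0.0,  -0.05, 30.0],
                [0.0,   0.001, 1.0]])   # hypothetical image A -> world
H_AB = np.array([[1.0, 0.1,   5.0],
                 [0.0, 1.2, -20.0],
                 [0.0, 0.0,   1.0]])    # hypothetical image A -> image B

H_B = H_A @ np.linalg.inv(H_AB)         # image B -> world

def to_world(H, u, v):
    """Convert an image point (u, v) to world coordinates via H."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

- A consistency check of this convention: mapping a view-A point to world via H_A should agree with first mapping it to view B via H_AB and then to world via H_B.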
- the calibration parameters of the vehicle camera are determined based on the known coordinates of the key points at the known angle of view and their detection coordinates at the driving shooting angle. Calibrating the on-board camera from the known coordinates and detection coordinates of the key points is convenient; the calculation process is simple, and the calculation result is accurate.
- the method further includes:
- the calibration parameters are corrected by using the perspective principle or the triangle principle.
- the calibration parameters of the on-vehicle camera may include calibration parameters calculated according to a focal length and a unit pixel of the on-vehicle camera.
- each vehicle camera has a different focal length and unit pixel size, and the calibration parameters can be calculated from the camera's own parameters. The perspective or triangle principle can then be used to calibrate the conversion parameters of the vehicle camera more accurately.
- f is the focal length (mm) of the vehicle camera
- pm is the number of pixels per millimeter of the vehicle camera.
- the calibrated H'_BA is:
- k ′ is a first calibration coefficient
- k is a second calibration coefficient
- b is a third calibration coefficient
- An embodiment of the present disclosure also provides a method for driving a vehicle.
- the method includes:
- driving the vehicle may include active driving of the vehicle and assisted driving of the vehicle.
- the self-calibrated on-board camera can provide accurate positioning information for the vehicle, making the active driving and assisted driving of the vehicle safer and more reliable.
- the calibration parameters can be stored in the driving assistance system using the on-board camera as a sensor to provide effective calibration parameters for the subsequent image processing of assisted driving.
- the driving assistance system may include a system that assists driving according to the position information of a specific target object.
- the assisted driving system may include a lane keeping assist system, a brake assist system, an automatic parking assist system, and a reverse assist system.
- the lane keeping assist system can assist driving according to the lane line where the vehicle is traveling, so that the vehicle is kept driving in the current lane.
- the brake assist system can send a braking instruction to the vehicle according to the set distance of the target object, so that the vehicle and the target object maintain a safe distance.
- the automatic parking assist system can park the vehicle into the parking space based on the detected parking line.
- the reversing assistance system can send reversing instructions to the vehicle based on the distance between the vehicle and obstacles behind it, so that the vehicle avoids hitting the obstacles while reversing.
- the vehicle can obtain the accurate position information of the target object in the image captured by the on-board camera.
- the assisted driving system can obtain accurate assisted driving instructions.
- the on-board camera after self-calibration can be used without affecting the actual use of the vehicle, and the driving of the vehicle is safer and more reliable.
- the present disclosure also provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program.
- the foregoing can be used to implement any one of the methods provided by the present disclosure; for the corresponding technical solutions and descriptions, refer to the corresponding descriptions of the methods, which are not repeated here.
- FIG. 8 shows a block diagram of a vehicle camera self-calibration device according to an embodiment of the present disclosure.
- the vehicle camera self-calibration device includes: a self-calibration starting module 10, configured to start the self-calibration of the on-board camera, where the vehicle on which the on-board camera is installed is in a driving state; an information collection module 20, configured to collect the information required for on-board camera self-calibration via the on-board camera while the vehicle is driving; and a self-calibration operation module 30, configured to self-calibrate the on-board camera based on the collected information.
- the vehicle camera can perform self-calibration according to the information collected by the vehicle camera while the vehicle is driving.
- the self-calibration process of the vehicle camera can be conveniently completed in the actual use environment of the vehicle camera without affecting the use of the vehicle.
- the calibration result is accurate, the calibration efficiency is high, and the application range is wide.
- the self-calibration startup module 10 includes: a first self-calibration startup sub-module, which is configured to start the on-board camera when a change in a viewing angle or a focal length of the on-board camera is detected. Self-calibrating. This enables the vehicle camera to maintain accurate calibration status at all times.
- the self-calibration starting module 10 includes: a second self-calibration starting sub-module, configured to start the self-calibration of the on-board camera when a change in the installation position and/or shooting angle of the on-board camera is detected. This enables the vehicle camera to maintain an accurate calibration status at all times.
- the self-calibration starting module 10 includes: a third self-calibration starting sub-module for determining a cumulative mileage of a vehicle on which the vehicle-mounted camera is installed; when the cumulative mileage is greater than When the mileage threshold is set, self-calibration of the on-board camera is started. This enables the vehicle camera to maintain accurate calibration status at all times.
- the self-calibration startup module 10 includes: a fourth self-calibration startup sub-module, configured to start a self-calibration of the vehicle camera according to a self-calibration startup instruction.
- the self-calibration of the on-board camera can be started in time according to the use requirements.
- the self-calibration of the on-board camera can also be applied to different use environments in time.
- the device further includes: a progress information providing module, configured to provide collection progress information, where the collection progress information includes the progress of the vehicle camera in collecting the information required for calibration;
- the calibration operation module 30 includes: a first self-calibration operation sub-module, configured to self-calibrate the vehicle-mounted camera based on the collected information when it is determined that the information required for self-calibration is collected according to the collection progress information.
- the device further includes: a collection condition prompt module, configured to provide collection condition prompt information, where the collection condition prompt information includes prompt information on whether the vehicle camera satisfies the collection condition, and the collection condition is the condition under which the vehicle camera collects the information required for calibration;
- the information collection module 20 includes: a first information collection sub-module, configured to, when it is determined according to the collection condition prompt information that the vehicle camera satisfies the collection condition, collect the information required for on-board camera self-calibration via the vehicle camera.
- the on-board camera captures the lane lines of the road on which the vehicle is traveling, and the on-board camera can be self-calibrated based on the collected lane lines. This makes the self-calibration of the vehicle camera widely applicable, and the calibration process simple and reliable.
- the information acquisition module 20 includes: a second information acquisition submodule, configured to, when the lens elevation angle of the vehicle-mounted camera is within the shooting elevation angle range, during the driving process of the vehicle In the process, information required for self-calibration of the vehicle camera is collected via the vehicle camera.
- the information includes a lane line of a road on which the vehicle travels
- the information collection module 20 includes: a third information collection sub-module, configured to, when the vehicle-mounted camera captures the lane line of the road on which the vehicle travels, collect the information required for vehicle-mounted camera self-calibration via the vehicle-mounted camera during the driving of the vehicle.
- the information collection module 20 includes: a fourth information collection submodule, configured to: when the on-board camera captures a horizon or a vanishing point of a lane on which the vehicle travels, During the running of the vehicle, information required for self-calibration of the vehicle camera is collected via the vehicle camera.
- the information collection module 20 includes: a fifth information collection sub-module, configured to collect, via the vehicle-mounted camera, the information required for vehicle-mounted camera self-calibration during a collection period while the vehicle is driving.
- the information collection module 20 includes: a sixth information collection sub-module, configured to, when the vehicle travels within a travel distance range during the driving of the vehicle, collect the information required for on-vehicle camera self-calibration via the on-vehicle camera.
- the self-calibration operation module 30 includes: a homography matrix update sub-module, configured to update the homography matrix of the vehicle camera based on the collected information, where the homography matrix of the vehicle camera reflects the pose of the on-board camera; and a second self-calibration operation sub-module, configured to self-calibrate the on-board camera according to the homography matrices of the on-board camera before and after the update.
- the information includes lane lines of the road on which the vehicle travels
- the homography matrix update sub-module includes: a lane line detection position information acquisition unit, configured to detect a lane line in an image captured by the on-board camera to obtain the detection position information of the lane line; and a homography matrix update unit, configured to update the homography matrix of the vehicle camera based on the detected position information of the lane line.
- the second self-calibration operation sub-module includes: a lane line known position information obtaining unit, configured to obtain the known position information of the lane line according to the homography matrix of the on-board camera before the update; a calibration parameter acquisition unit, configured to determine the calibration parameters of the vehicle camera based on the detected position information of the lane line and the known position information of the lane line; and a self-calibration unit, configured to self-calibrate the on-board camera according to the calibration parameters.
- the lane line detection position information acquisition unit is configured to: detect a lane line in an image captured by the vehicle camera; determine a key point on the detected lane line, and obtain a key point. Detection coordinates; the calibration parameter acquisition unit is configured to determine calibration parameters of the on-board camera according to the detection coordinates of the key points and known coordinates of the key points.
- calibration parameters of the vehicle camera are obtained according to the detected position information and known position information of the lane line, and the vehicle camera is self-calibrated according to the calibration parameters.
- the on-board camera is calibrated according to the lane line, so that the on-board camera can easily complete self-calibration, the calibration efficiency is high, and the application range is wide.
- the lane line detection position information obtaining unit is configured to: perform lane line detection in each image captured by the on-board camera to obtain lane lines to be fitted in each image; The lane line to be fitted in the image is fitted to obtain a lane line and detection position information of the lane line.
- the lane line includes a first lane line and a second lane line; determining key points on the detected lane line and obtaining detection coordinates of the key points includes: determining, according to the first lane line and the second lane line detected in each image, the intersection of the first lane line and the second lane line in each image; determining the horizon according to the intersections in the images; and determining key points according to the horizon, the first lane line, and the second lane line to obtain the detection coordinates of the key points.
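A minimal sketch of this step: each frame's lane-line pair is intersected to give a vanishing point, and a line fitted through the per-frame vanishing points approximates the horizon. Lines are represented in homogeneous form (a, b, c) with ax + by + c = 0; the function names are illustrative assumptions, not the patented procedure.

```python
import numpy as np

def line_intersection(l1, l2):
    """Intersection of two lines in homogeneous form (a, b, c), ax + by + c = 0."""
    p = np.cross(l1, l2)          # homogeneous intersection point
    return p[:2] / p[2]           # back to inhomogeneous (x, y)

def fit_horizon(vanishing_points):
    """Least-squares line y = m*x + b through the per-frame vanishing points."""
    pts = np.asarray(vanishing_points, dtype=float)
    m, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return m, b

# Lane lines y = x and y = 200 - x meet at the vanishing point (100, 100).
vp = line_intersection(np.array([1.0, -1.0, 0.0]), np.array([-1.0, -1.0, 200.0]))
```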
- determining key points according to the horizon, the first lane line, and the second lane line, and obtaining the detection coordinates of each key point, includes: determining a detection line that is parallel to the horizon and intersects the first lane line and the second lane line respectively; and determining the intersection of the detection line with the first lane line and the intersection of the detection line with the second lane line as key points to obtain the detection coordinates of each key point.
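The detection-line construction can be illustrated as below: a line parallel to the (possibly tilted) horizon is intersected with each lane line, and the two intersections become key points. The (slope, intercept) line parameterization and the function name are illustrative assumptions.

```python
def keypoints_on_detection_line(horizon_slope, offset, lane1, lane2):
    """Key points where the detection line y = horizon_slope * x + offset
    crosses two lane lines, each given as (slope, intercept)."""
    pts = []
    for slope, intercept in (lane1, lane2):
        # Solve slope*x + intercept == horizon_slope*x + offset for x.
        x = (offset - intercept) / (slope - horizon_slope)
        pts.append((x, horizon_slope * x + offset))
    return pts

# Level horizon: detection line y = 300 crossing lanes y = x and y = 400 - x.
pts = keypoints_on_detection_line(0.0, 300.0, (1.0, 0.0), (-1.0, 400.0))
```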
- determining the calibration parameters of the vehicle-mounted camera according to the detection coordinates of the key points and the known coordinates of the key points includes: determining conversion parameters of the camera according to the detection coordinates of the key points and the known coordinates of the key points, where the detection coordinates of the key points include the coordinates of the key points at the driving shooting angle of view, and the known coordinates of the key points include the coordinates of the key points at a known angle of view; and determining the calibration parameters of the camera according to the conversion parameters and known parameters.
- the known coordinates of the key points include: first known coordinates of the key points in the image coordinate system at a known angle of view, and second known coordinates of the key points in the world coordinate system at a known angle of view; determining the calibration parameters of the camera according to the conversion parameters and the known parameters includes: determining the known parameters according to the first known coordinates and the second known coordinates; and determining the calibration parameters of the camera according to the conversion parameters and the known parameters.
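One common way to realize a conversion between the driving view and a known view is a planar homography estimated from the key-point correspondences via the direct linear transform (DLT). The NumPy sketch below is an assumption about how such a conversion parameter could be computed; it is not asserted to be the patented method.

```python
import numpy as np

def homography_dlt(src, dst):
    """3x3 homography H mapping src points to dst points (>= 4 pairs), via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)       # null vector of A, up to scale
    return H / H[2, 2]             # normalize so H[2, 2] == 1

# Unit square mapped to a square twice the size: H should be diag(2, 2, 1).
H = homography_dlt([(0, 0), (1, 0), (1, 1), (0, 1)],
                   [(0, 0), (2, 0), (2, 2), (0, 2)])
```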
- the apparatus further includes: a calibration module, configured to calibrate the calibration parameters using the perspective principle or the triangle principle according to calibration reference parameters of the vehicle-mounted camera.
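The "triangle principle" mentioned here is commonly realized as similar-triangles ground-plane ranging, which yields an independent distance estimate for checking calibration parameters. The formula d = f·h/Δy below, with hypothetical parameter names, is a standard pinhole-model sketch, not a detail taken from the disclosure.

```python
def ground_distance(focal_px, camera_height_m, dy_px):
    """Similar-triangles range to a ground point: d = f * h / dy, where dy is
    the pixel offset of the ground point below the horizon line."""
    if dy_px <= 0:
        raise ValueError("ground point must lie below the horizon")
    return focal_px * camera_height_m / dy_px

# f = 1000 px, camera 1.5 m above the road, point 50 px below the horizon.
d = ground_distance(1000.0, 1.5, 50.0)
```

A landmark at a known range can then be compared against `d` to validate or refine the calibration.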
- the vehicle includes one or any combination of the following devices: a motor vehicle, a non-motor vehicle, a train, a toy vehicle, and a robot.
- An embodiment of the present disclosure also provides a computer-readable storage medium having computer program instructions stored thereon, and the computer program instructions implement any of the foregoing method embodiments when executed by a processor.
- the computer-readable storage medium may be a non-volatile computer-readable storage medium or a volatile computer-readable storage medium.
- An embodiment of the present disclosure further provides an electronic device including a processor and a memory for storing processor-executable instructions, wherein the processor implements any method embodiment of the present disclosure by calling the executable instructions; for the specific working process and configuration, reference may be made to the detailed descriptions of the corresponding method embodiments of the present disclosure above, which are not repeated here for brevity.
- FIG. 9 illustrates a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
- the electronic device is provided as a terminal, a server, or other forms of equipment.
- the electronic device may include a vehicle-mounted camera self-calibration device, and the vehicle-mounted camera self-calibration device 800 may be a terminal such as a mobile phone, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, or a personal digital assistant.
- the device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input / output (I / O) interface 812, a sensor component 814, And communication component 816.
- the processing component 802 generally controls the overall operations of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations.
- the processing component 802 may include one or more processors 820 to execute instructions to complete all or part of the steps of the method described above.
- the processing component 802 may include one or more modules to facilitate the interaction between the processing component 802 and other components.
- the processing component 802 may include a multimedia module to facilitate the interaction between the multimedia component 808 and the processing component 802.
- the memory 804 is configured to store various types of data to support operation at the device 800. Examples of such data include instructions for any application or method for operating on the device 800, contact data, phone book data, messages, pictures, videos, and the like.
- the memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
- the power component 806 provides power to various components of the device 800.
- the power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
- the multimedia component 808 includes a screen that provides an output interface between the device 800 and a user.
- the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user.
- the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure related to the touch or slide operation.
- the multimedia component 808 includes a front camera and / or a rear camera. When the device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and / or the rear camera can receive external multimedia data. Each front camera and rear camera can be a fixed optical lens system or have focal length and optical zoom capabilities.
- the audio component 810 is configured to output and / or input audio signals.
- the audio component 810 includes a microphone (MIC) that is configured to receive an external audio signal when the device 800 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode.
- the received audio signal may be further stored in the memory 804 or transmitted via the communication component 816.
- the audio component 810 further includes a speaker for outputting audio signals.
- the I / O interface 812 provides an interface between the processing component 802 and a peripheral interface module.
- the peripheral interface module may be a keyboard, a click wheel, a button, or the like. These buttons can include, but are not limited to: a home button, a volume button, a start button, and a lock button.
- the sensor component 814 includes one or more sensors for providing status assessment of various aspects of the device 800.
- the sensor component 814 can detect the on / off state of the device 800 and the relative positioning of the components, such as the display and keypad of the device 800.
- the sensor component 814 can also detect a change in the position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and temperature changes of the device 800.
- the sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
- the sensor component 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
- the sensor component 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
- the communication component 816 is configured to facilitate wired or wireless communication between the device 800 and other devices.
- the device 800 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof.
- the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel.
- the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication.
- the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
- the device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, to perform the above method.
- a non-volatile computer-readable storage medium such as a memory 804 including computer program instructions, and the computer program instructions may be executed by the processor 820 of the device 800 to complete the foregoing method.
- each block in the flowchart or block diagram may represent a module, a program segment, or part of an instruction that contains one or more executable instructions for implementing the specified logical function.
- the functions noted in the blocks may also occur in an order different from that marked in the drawings; for example, two consecutive blocks may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved.
- each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Mechanical Engineering (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Studio Devices (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Description
Claims (51)
- A vehicle-mounted camera self-calibration method, characterized in that the method comprises: starting self-calibration of a vehicle-mounted camera, with the vehicle on which the vehicle-mounted camera is installed being in a driving state; during driving of the vehicle, collecting, via the vehicle-mounted camera, information required for self-calibration of the vehicle-mounted camera; and self-calibrating the vehicle-mounted camera based on the collected information.
- The method according to claim 1, characterized in that starting self-calibration of the vehicle-mounted camera comprises: starting self-calibration of the vehicle-mounted camera when a change in the angle of view or focal length of the vehicle-mounted camera is detected.
- The method according to claim 1 or 2, characterized in that starting self-calibration of the vehicle-mounted camera comprises: starting self-calibration of the vehicle-mounted camera when a change in the installation position and/or shooting angle of the vehicle-mounted camera is detected.
- The method according to any one of claims 1 to 3, characterized in that starting self-calibration of the vehicle-mounted camera comprises: determining the accumulated mileage of the vehicle on which the vehicle-mounted camera is installed; and starting self-calibration of the vehicle-mounted camera when the accumulated mileage exceeds a mileage threshold.
- The method according to any one of claims 1 to 4, characterized in that starting self-calibration of the vehicle-mounted camera comprises: starting self-calibration of the vehicle-mounted camera according to a self-calibration start instruction.
- The method according to any one of claims 1 to 5, characterized in that the method further comprises: providing collection progress information, the collection progress information comprising progress information of the vehicle-mounted camera collecting the information required for self-calibration; and self-calibrating the vehicle-mounted camera based on the collected information comprises: self-calibrating the vehicle-mounted camera based on the collected information when it is determined, according to the collection progress information, that the information required for self-calibration has been fully collected.
- The method according to any one of claims 1 to 6, characterized in that the method further comprises: providing collection condition prompt information, the collection condition prompt information indicating whether the vehicle-mounted camera satisfies a collection condition, the collection condition comprising the condition under which the vehicle-mounted camera can collect the information required for self-calibration; and collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: when it is determined, according to the collection condition prompt information, that the vehicle-mounted camera satisfies the collection condition, collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The method according to any one of claims 1 to 7, characterized in that collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: when the lens pitch angle of the vehicle-mounted camera is within a shooting pitch angle range, collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The method according to any one of claims 1 to 8, characterized in that the information comprises a lane line of the road on which the vehicle travels, and collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: when the vehicle-mounted camera captures a lane line of the road on which the vehicle travels, collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The method according to claim 9, characterized in that collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: when the vehicle-mounted camera captures the horizon or a lane-line vanishing point of the road on which the vehicle travels, collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The method according to any one of claims 1 to 10, characterized in that collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: collecting, via the vehicle-mounted camera within a collection duration range during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The method according to any one of claims 1 to 11, characterized in that collecting, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration comprises: during driving of the vehicle, when the driving distance of the vehicle is within a driving distance range, collecting, via the vehicle-mounted camera, the information required for self-calibration of the vehicle-mounted camera.
- The method according to any one of claims 1 to 12, characterized in that self-calibrating the vehicle-mounted camera based on the collected information comprises: updating the homography matrix of the vehicle-mounted camera based on the collected information, the homography matrix of the vehicle-mounted camera reflecting the pose of the vehicle-mounted camera; and self-calibrating the vehicle-mounted camera according to the homography matrices of the vehicle-mounted camera before and after the update.
- The method according to claim 13, characterized in that the information comprises a lane line of the road on which the vehicle travels, and updating the homography matrix of the vehicle-mounted camera based on the collected information comprises: detecting a lane line in an image captured by the vehicle-mounted camera to obtain detected position information of the lane line; and updating the homography matrix of the vehicle-mounted camera based on the detected position information of the lane line.
- The method according to claim 14, characterized in that self-calibrating the vehicle-mounted camera according to the homography matrices of the vehicle-mounted camera before and after the update comprises: obtaining known position information of the lane line according to the homography matrix of the vehicle-mounted camera before the update; determining calibration parameters of the vehicle-mounted camera according to the detected position information of the lane line and the known position information of the lane line; and self-calibrating the vehicle-mounted camera according to the calibration parameters.
- The method according to claim 15, characterized in that detecting a lane line in the image captured by the vehicle-mounted camera to obtain the detected position information of the lane line comprises: detecting a lane line in the image captured by the vehicle-mounted camera; and determining key points on the detected lane line to obtain detection coordinates of the key points; and determining the calibration parameters of the vehicle-mounted camera according to the detected position information of the lane line and the known position information of the lane line comprises: determining the calibration parameters of the vehicle-mounted camera according to the detection coordinates of the key points and known coordinates of the key points.
- The method according to any one of claims 14 to 16, characterized in that detecting a lane line in the image captured by the vehicle-mounted camera to obtain the detected position information of the lane line comprises: performing lane line detection in each image captured by the vehicle-mounted camera to obtain a lane line to be fitted in each image; and fitting the lane lines to be fitted in the images to obtain a lane line and the detected position information of the lane line.
- The method according to claim 16 or 17, characterized in that the lane line comprises a first lane line and a second lane line, and determining key points on the detected lane line to obtain the detection coordinates of the key points comprises: determining, according to the first lane line and the second lane line detected in each image, the intersection of the first lane line and the second lane line in each image; determining the horizon according to the intersections in the images; and determining key points according to the horizon, the first lane line and the second lane line to obtain the detection coordinates of the key points.
- The method according to claim 18, characterized in that determining key points according to the horizon, the first lane line and the second lane line to obtain the detection coordinates of each key point comprises: determining a detection line that is parallel to the horizon and intersects the first lane line and the second lane line respectively; and determining the intersection of the detection line with the first lane line and the intersection of the detection line with the second lane line as key points to obtain the detection coordinates of each key point.
- The method according to any one of claims 16 to 19, characterized in that determining the calibration parameters of the vehicle-mounted camera according to the detection coordinates of the key points and the known coordinates of the key points comprises: determining conversion parameters of the camera according to the detection coordinates of the key points and the known coordinates of the key points, the detection coordinates of the key points comprising coordinates of the key points at the driving shooting angle of view, and the known coordinates of the key points comprising coordinates of the key points at a known angle of view; and determining the calibration parameters of the camera according to the conversion parameters and known parameters.
- The method according to claim 20, characterized in that the known coordinates of the key points comprise: first known coordinates of the key points in the image coordinate system at a known angle of view, and second known coordinates of the key points in the world coordinate system at a known angle of view; and determining the calibration parameters of the camera according to the conversion parameters and the known parameters comprises: determining the known parameters according to the first known coordinates and the second known coordinates; and determining the calibration parameters of the camera according to the conversion parameters and the known parameters.
- The method according to any one of claims 15 to 21, characterized in that the method further comprises: calibrating the calibration parameters by using the perspective principle or the triangle principle according to calibration reference parameters of the vehicle-mounted camera.
- The method according to any one of claims 1 to 22, characterized in that the vehicle comprises one or any combination of the following: a motor vehicle, a non-motor vehicle, a train, a toy vehicle, and a robot.
- A vehicle driving method, characterized in that the method comprises: driving a vehicle by using the vehicle-mounted camera self-calibrated according to any one of claims 1 to 23.
- A vehicle-mounted camera self-calibration apparatus, characterized in that the apparatus comprises: a self-calibration start module, configured to start self-calibration of a vehicle-mounted camera, with the vehicle on which the vehicle-mounted camera is installed being in a driving state; an information collection module, configured to collect, via the vehicle-mounted camera during driving of the vehicle, information required for self-calibration of the vehicle-mounted camera; and a self-calibration operation module, configured to self-calibrate the vehicle-mounted camera based on the collected information.
- The apparatus according to claim 25, characterized in that the self-calibration start module comprises: a first self-calibration start sub-module, configured to start self-calibration of the vehicle-mounted camera when a change in the angle of view or focal length of the vehicle-mounted camera is detected.
- The apparatus according to claim 25 or 26, characterized in that the self-calibration start module comprises: a second self-calibration start sub-module, configured to start self-calibration of the vehicle-mounted camera when a change in the installation position and/or shooting angle of the vehicle-mounted camera is detected.
- The apparatus according to any one of claims 25 to 27, characterized in that the self-calibration start module comprises: a third self-calibration start sub-module, configured to determine the accumulated mileage of the vehicle on which the vehicle-mounted camera is installed, and start self-calibration of the vehicle-mounted camera when the accumulated mileage exceeds a mileage threshold.
- The apparatus according to any one of claims 25 to 28, characterized in that the self-calibration start module comprises: a fourth self-calibration start sub-module, configured to start self-calibration of the vehicle-mounted camera according to a self-calibration start instruction.
- The apparatus according to any one of claims 25 to 29, characterized in that the apparatus further comprises: a progress information providing module, configured to provide collection progress information, the collection progress information comprising progress information of the vehicle-mounted camera collecting the information required for self-calibration; and the self-calibration operation module comprises: a first self-calibration operation sub-module, configured to self-calibrate the vehicle-mounted camera based on the collected information when it is determined, according to the collection progress information, that the information required for self-calibration has been fully collected.
- The apparatus according to any one of claims 25 to 30, characterized in that the apparatus further comprises: a collection condition prompt module, configured to provide collection condition prompt information, the collection condition prompt information indicating whether the vehicle-mounted camera satisfies a collection condition, the collection condition comprising the condition under which the vehicle-mounted camera can collect the information required for self-calibration; and the information collection module comprises: a first information collection sub-module, configured to collect, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration when it is determined, according to the collection condition prompt information, that the vehicle-mounted camera satisfies the collection condition.
- The apparatus according to any one of claims 25 to 31, characterized in that the information collection module comprises: a second information collection sub-module, configured to collect, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration when the lens pitch angle of the vehicle-mounted camera is within a shooting pitch angle range.
- The apparatus according to any one of claims 25 to 32, characterized in that the information comprises a lane line of the road on which the vehicle travels, and the information collection module comprises: a third information collection sub-module, configured to collect, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration when the vehicle-mounted camera captures a lane line of the road on which the vehicle travels.
- The apparatus according to claim 33, characterized in that the information collection module comprises: a fourth information collection sub-module, configured to collect, via the vehicle-mounted camera during driving of the vehicle, the information required for self-calibration when the vehicle-mounted camera captures the horizon or a lane-line vanishing point of the road on which the vehicle travels.
- The apparatus according to any one of claims 25 to 34, characterized in that the information collection module comprises: a fifth information collection sub-module, configured to collect, via the vehicle-mounted camera within a collection duration range during driving of the vehicle, the information required for self-calibration of the vehicle-mounted camera.
- The apparatus according to any one of claims 25 to 35, characterized in that the information collection module comprises: a sixth information collection sub-module, configured to collect, via the vehicle-mounted camera, the information required for self-calibration when the driving distance of the vehicle is within a driving distance range during driving of the vehicle.
- The apparatus according to any one of claims 25 to 36, characterized in that the self-calibration operation module comprises: a homography matrix update sub-module, configured to update the homography matrix of the vehicle-mounted camera based on the collected information, the homography matrix of the vehicle-mounted camera reflecting the pose of the vehicle-mounted camera; and a second self-calibration operation sub-module, configured to self-calibrate the vehicle-mounted camera according to the homography matrices of the vehicle-mounted camera before and after the update.
- The apparatus according to claim 37, characterized in that the information comprises a lane line of the road on which the vehicle travels, and the homography matrix update sub-module comprises: a lane line detection position information acquisition unit, configured to detect a lane line in an image captured by the vehicle-mounted camera to obtain detected position information of the lane line; and a homography matrix update unit, configured to update the homography matrix of the vehicle-mounted camera based on the detected position information of the lane line.
- The apparatus according to claim 38, characterized in that the second self-calibration operation sub-module comprises: a lane line known position information obtaining unit, configured to obtain known position information of the lane line according to the homography matrix of the vehicle-mounted camera before the update; a calibration parameter acquisition unit, configured to determine calibration parameters of the vehicle-mounted camera according to the detected position information of the lane line and the known position information of the lane line; and a self-calibration unit, configured to self-calibrate the vehicle-mounted camera according to the calibration parameters.
- The apparatus according to claim 39, characterized in that the lane line detection position information acquisition unit is configured to: detect a lane line in the image captured by the vehicle-mounted camera; and determine key points on the detected lane line to obtain detection coordinates of the key points; and the calibration parameter acquisition unit is configured to: determine the calibration parameters of the vehicle-mounted camera according to the detection coordinates of the key points and known coordinates of the key points.
- The apparatus according to any one of claims 38 to 40, characterized in that the lane line detection position information acquisition unit is configured to: perform lane line detection in each image captured by the vehicle-mounted camera to obtain a lane line to be fitted in each image; and fit the lane lines to be fitted in the images to obtain a lane line and the detected position information of the lane line.
- The apparatus according to claim 40 or 41, characterized in that the lane line comprises a first lane line and a second lane line, and determining key points on the detected lane line to obtain the detection coordinates of the key points comprises: determining, according to the first lane line and the second lane line detected in each image, the intersection of the first lane line and the second lane line in each image; determining the horizon according to the intersections in the images; and determining key points according to the horizon, the first lane line and the second lane line to obtain the detection coordinates of the key points.
- The apparatus according to claim 42, characterized in that determining key points according to the horizon, the first lane line and the second lane line to obtain the detection coordinates of each key point comprises: determining a detection line that is parallel to the horizon and intersects the first lane line and the second lane line respectively; and determining the intersection of the detection line with the first lane line and the intersection of the detection line with the second lane line as key points to obtain the detection coordinates of each key point.
- The apparatus according to any one of claims 40 to 43, characterized in that determining the calibration parameters of the vehicle-mounted camera according to the detection coordinates of the key points and the known coordinates of the key points comprises: determining conversion parameters of the camera according to the detection coordinates of the key points and the known coordinates of the key points, the detection coordinates of the key points comprising coordinates of the key points at the driving shooting angle of view, and the known coordinates of the key points comprising coordinates of the key points at a known angle of view; and determining the calibration parameters of the camera according to the conversion parameters and known parameters.
- The apparatus according to claim 44, characterized in that the known coordinates of the key points comprise: first known coordinates of the key points in the image coordinate system at a known angle of view, and second known coordinates of the key points in the world coordinate system at a known angle of view; and determining the calibration parameters of the camera according to the conversion parameters and the known parameters comprises: determining the known parameters according to the first known coordinates and the second known coordinates; and determining the calibration parameters of the camera according to the conversion parameters and the known parameters.
- The apparatus according to any one of claims 39 to 45, characterized in that the apparatus further comprises: a calibration module, configured to calibrate the calibration parameters by using the perspective principle or the triangle principle according to calibration reference parameters of the vehicle-mounted camera.
- The apparatus according to any one of claims 25 to 46, characterized in that the vehicle comprises one or any combination of the following: a motor vehicle, a non-motor vehicle, a train, a toy vehicle, and a robot.
- A vehicle driving apparatus, characterized in that the apparatus is configured to: drive a vehicle by using the vehicle-mounted camera self-calibrated according to any one of claims 25 to 47.
- An electronic device, characterized by comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor implements the method according to any one of claims 1 to 24 by calling the executable instructions.
- A computer-readable storage medium having computer program instructions stored thereon, characterized in that the computer program instructions, when executed by a processor, implement the method according to any one of claims 1 to 24.
- A computer program, characterized by comprising computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes instructions for implementing the method according to any one of claims 1 to 24.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020541881A JP7082671B2 (ja) | 2018-06-05 | 2019-05-29 | 車載カメラ自己校正方法、車載カメラ自己校正装置、電子機器および記憶媒体 |
KR1020207027687A KR20200125667A (ko) | 2018-06-05 | 2019-05-29 | 차재 카메라 자기 교정 방법과 장치 및 차량 운전 방법과 장치 |
SG11202007195TA SG11202007195TA (en) | 2018-06-05 | 2019-05-29 | Vehicle-mounted camera self-calibration method and apparatus, and vehicle driving method and apparatus |
US16/942,965 US20200357138A1 (en) | 2018-06-05 | 2020-07-30 | Vehicle-Mounted Camera Self-Calibration Method and Apparatus, and Storage Medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810578736.5A CN110570475A (zh) | 2018-06-05 | 2018-06-05 | 车载摄像头自标定方法及装置和车辆驾驶方法及装置 |
CN201810578736.5 | 2018-06-05 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/942,965 Continuation US20200357138A1 (en) | 2018-06-05 | 2020-07-30 | Vehicle-Mounted Camera Self-Calibration Method and Apparatus, and Storage Medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019233330A1 true WO2019233330A1 (zh) | 2019-12-12 |
Family
ID=68769241
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/089033 WO2019233330A1 (zh) | 2018-06-05 | 2019-05-29 | 车载摄像头自标定方法及装置和车辆驾驶方法及装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20200357138A1 (zh) |
JP (1) | JP7082671B2 (zh) |
KR (1) | KR20200125667A (zh) |
CN (1) | CN110570475A (zh) |
SG (1) | SG11202007195TA (zh) |
WO (1) | WO2019233330A1 (zh) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112541952A (zh) * | 2020-12-08 | 2021-03-23 | 北京精英路通科技有限公司 | 停车场景相机标定方法、装置、计算机设备及存储介质 |
WO2021215672A1 (en) * | 2020-04-24 | 2021-10-28 | StradVision, Inc. | Method and device for calibrating pitch of camera on vehicle and method and device for continual learning of vanishing point estimation model to be used for calibrating the pitch |
CN113824880A (zh) * | 2021-08-26 | 2021-12-21 | 国网浙江省电力有限公司双创中心 | 一种基于目标检测和uwb定位的车辆跟踪方法 |
CN113822944A (zh) * | 2021-09-26 | 2021-12-21 | 中汽创智科技有限公司 | 一种外参标定方法、装置、电子设备及存储介质 |
CN114347917A (zh) * | 2021-12-28 | 2022-04-15 | 华人运通(江苏)技术有限公司 | 一种车辆、车载摄像系统的校准方法和装置 |
CN114622469A (zh) * | 2022-01-28 | 2022-06-14 | 南通威而多专用汽车制造有限公司 | 一种自动敷旧线控制系统及其控制方法 |
CN115550555A (zh) * | 2022-11-28 | 2022-12-30 | 杭州华橙软件技术有限公司 | 云台校准方法及相关装置、摄像器件和存储介质 |
US11842623B1 (en) | 2022-05-17 | 2023-12-12 | Ford Global Technologies, Llc | Contextual calibration of connected security device |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102628012B1 (ko) * | 2018-10-23 | 2024-01-22 | 삼성전자주식회사 | 캘리브레이션 방법 및 장치 |
CN111141311B (zh) * | 2019-12-31 | 2022-04-08 | 武汉中海庭数据技术有限公司 | 一种高精度地图定位模块的评估方法及系统 |
DE102020202964A1 (de) * | 2020-03-09 | 2021-09-09 | Continental Automotive Gmbh | Die Erfindung betrifft ein Verfahren zur Erhöhung der Sicherheit von Fahrfunktionen. |
CN112115968B (zh) * | 2020-08-10 | 2024-04-19 | 北京智行者科技股份有限公司 | 一种智能清扫车垃圾识别方法及系统 |
US11341683B2 (en) * | 2020-09-22 | 2022-05-24 | AiFi Corp | Smart self calibrating camera system |
CN112419423A (zh) * | 2020-10-30 | 2021-02-26 | 上海商汤临港智能科技有限公司 | 一种标定方法、装置、电子设备及存储介质 |
CN112529968A (zh) * | 2020-12-22 | 2021-03-19 | 上海商汤临港智能科技有限公司 | 摄像设备标定方法、装置、电子设备及存储介质 |
CN112614192B (zh) * | 2020-12-24 | 2022-05-17 | 亿咖通(湖北)技术有限公司 | 一种车载相机的在线标定方法和车载信息娱乐系统 |
CN112785653A (zh) * | 2020-12-30 | 2021-05-11 | 中山联合汽车技术有限公司 | 车载相机姿态角标定方法 |
CN113284186B (zh) * | 2021-04-13 | 2022-04-15 | 武汉光庭信息技术股份有限公司 | 一种基于惯导姿态和灭点的相机标定方法及系统 |
CN112991433B (zh) * | 2021-04-26 | 2022-08-02 | 吉林大学 | 基于双目深度感知和车辆位置的货车外廓尺寸测量方法 |
CN113766211B (zh) * | 2021-08-24 | 2023-07-25 | 武汉极目智能技术有限公司 | 一种adas设备的摄像头安装角度检测系统及方法 |
CN115214694B (zh) * | 2021-09-13 | 2023-09-08 | 广州汽车集团股份有限公司 | 摄像头标定触发控制方法、车载控制器和智能驾驶系统 |
US11875580B2 (en) * | 2021-10-04 | 2024-01-16 | Motive Technologies, Inc. | Camera initialization for lane detection and distance estimation using single-view geometry |
CN113838149B (zh) * | 2021-10-09 | 2023-08-18 | 智道网联科技(北京)有限公司 | 自动驾驶车辆的相机内参标定方法、服务器及系统 |
US11750791B2 (en) * | 2021-10-19 | 2023-09-05 | Mineral Earth Sciences Llc | Automatically determining extrinsic parameters of modular edge computing devices |
FR3129511A1 (fr) * | 2021-11-23 | 2023-05-26 | Multitec Innovation | Système d’assistance au stationnement d’un véhicule équipé d’une plateforme mobile, telle un hayon élévateur arrière ou latéral, et procédé correspondant |
CN114170324A (zh) * | 2021-12-09 | 2022-03-11 | 深圳市商汤科技有限公司 | 标定方法及装置、电子设备和存储介质 |
CN114663524B (zh) * | 2022-03-09 | 2023-04-07 | 禾多科技(北京)有限公司 | 多相机在线标定方法、装置、电子设备和计算机可读介质 |
CN114882115B (zh) * | 2022-06-10 | 2023-08-25 | 国汽智控(北京)科技有限公司 | 车辆位姿的预测方法和装置、电子设备和存储介质 |
JP7394934B1 (ja) | 2022-08-16 | 2023-12-08 | 株式会社デンソーテン | 情報処理装置、情報処理方法、およびプログラム |
CN116704046B (zh) * | 2023-08-01 | 2023-11-10 | 北京积加科技有限公司 | 一种跨镜图像匹配方法及装置 |
CN117036505B (zh) * | 2023-08-23 | 2024-03-29 | 长和有盈电子科技(深圳)有限公司 | 车载摄像头在线标定方法及系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105976377A (zh) * | 2016-05-09 | 2016-09-28 | 西安电子科技大学 | 车载鱼眼摄像头自标定的方法 |
CN106651963A (zh) * | 2016-12-29 | 2017-05-10 | 清华大学苏州汽车研究院(吴江) | 一种用于驾驶辅助系统的车载摄像头的安装参数标定方法 |
CN106981082A (zh) * | 2017-03-08 | 2017-07-25 | 驭势科技(北京)有限公司 | 车载摄像头标定方法、装置及车载设备 |
CN107133985A (zh) * | 2017-04-20 | 2017-09-05 | 常州智行科技有限公司 | 一种基于车道线消逝点的车载摄像机自动标定方法 |
US20170278270A1 (en) * | 2016-03-24 | 2017-09-28 | Magna Electronics Inc. | Targetless vehicle camera calibration system |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002135765A (ja) | 1998-07-31 | 2002-05-10 | Matsushita Electric Ind Co Ltd | カメラキャリブレーション指示装置及びカメラキャリブレーション装置 |
JP2008187566A (ja) * | 2007-01-31 | 2008-08-14 | Sanyo Electric Co Ltd | カメラ校正装置及び方法並びに車両 |
JP5124147B2 (ja) | 2007-02-01 | 2013-01-23 | 三洋電機株式会社 | カメラ校正装置及び方法並びに車両 |
JP4863922B2 (ja) | 2007-04-18 | 2012-01-25 | 三洋電機株式会社 | 運転支援システム並びに車両 |
WO2012143036A1 (en) | 2011-04-18 | 2012-10-26 | Connaught Electronics Limited | Online vehicle camera calibration based on continuity of features |
JP5971939B2 (ja) | 2011-12-21 | 2016-08-17 | アルパイン株式会社 | 画像表示装置、画像表示装置における撮像カメラのキャリブレーション方法およびキャリブレーションプログラム |
JP6141601B2 (ja) | 2012-05-15 | 2017-06-07 | 東芝アルパイン・オートモティブテクノロジー株式会社 | 車載カメラ自動キャリブレーション装置 |
CN103927754B (zh) * | 2014-04-21 | 2016-08-31 | 大连理工大学 | 一种车载摄像机的标定方法 |
JP6371185B2 (ja) | 2014-09-30 | 2018-08-08 | クラリオン株式会社 | カメラキャリブレーション装置及びカメラキャリブレーションシステム |
JP6682767B2 (ja) | 2015-03-23 | 2020-04-15 | 株式会社リコー | 情報処理装置、情報処理方法、プログラムおよびシステム |
JP6694281B2 (ja) | 2016-01-26 | 2020-05-13 | 株式会社日立製作所 | ステレオカメラおよび撮像システム |
CN107145825A (zh) * | 2017-03-31 | 2017-09-08 | 纵目科技(上海)股份有限公司 | 地平面拟合、摄像头标定方法及系统、车载终端 |
CN108052908A (zh) * | 2017-12-15 | 2018-05-18 | 郑州日产汽车有限公司 | 车道保持方法 |
-
2018
- 2018-06-05 CN CN201810578736.5A patent/CN110570475A/zh active Pending
-
2019
- 2019-05-29 KR KR1020207027687A patent/KR20200125667A/ko not_active Application Discontinuation
- 2019-05-29 SG SG11202007195TA patent/SG11202007195TA/en unknown
- 2019-05-29 JP JP2020541881A patent/JP7082671B2/ja active Active
- 2019-05-29 WO PCT/CN2019/089033 patent/WO2019233330A1/zh active Application Filing
-
2020
- 2020-07-30 US US16/942,965 patent/US20200357138A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170278270A1 (en) * | 2016-03-24 | 2017-09-28 | Magna Electronics Inc. | Targetless vehicle camera calibration system |
CN105976377A (zh) * | 2016-05-09 | 2016-09-28 | 西安电子科技大学 | 车载鱼眼摄像头自标定的方法 |
CN106651963A (zh) * | 2016-12-29 | 2017-05-10 | 清华大学苏州汽车研究院(吴江) | 一种用于驾驶辅助系统的车载摄像头的安装参数标定方法 |
CN106981082A (zh) * | 2017-03-08 | 2017-07-25 | 驭势科技(北京)有限公司 | 车载摄像头标定方法、装置及车载设备 |
CN107133985A (zh) * | 2017-04-20 | 2017-09-05 | 常州智行科技有限公司 | 一种基于车道线消逝点的车载摄像机自动标定方法 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021215672A1 (en) * | 2020-04-24 | 2021-10-28 | StradVision, Inc. | Method and device for calibrating pitch of camera on vehicle and method and device for continual learning of vanishing point estimation model to be used for calibrating the pitch |
CN112541952A (zh) * | 2020-12-08 | 2021-03-23 | 北京精英路通科技有限公司 | 停车场景相机标定方法、装置、计算机设备及存储介质 |
CN113824880A (zh) * | 2021-08-26 | 2021-12-21 | 国网浙江省电力有限公司双创中心 | 一种基于目标检测和uwb定位的车辆跟踪方法 |
CN113822944A (zh) * | 2021-09-26 | 2021-12-21 | 中汽创智科技有限公司 | 一种外参标定方法、装置、电子设备及存储介质 |
CN113822944B (zh) * | 2021-09-26 | 2023-10-31 | 中汽创智科技有限公司 | 一种外参标定方法、装置、电子设备及存储介质 |
CN114347917A (zh) * | 2021-12-28 | 2022-04-15 | 华人运通(江苏)技术有限公司 | 一种车辆、车载摄像系统的校准方法和装置 |
CN114347917B (zh) * | 2021-12-28 | 2023-11-10 | 华人运通(江苏)技术有限公司 | 一种车辆、车载摄像系统的校准方法和装置 |
CN114622469A (zh) * | 2022-01-28 | 2022-06-14 | 南通威而多专用汽车制造有限公司 | 一种自动敷旧线控制系统及其控制方法 |
US11842623B1 (en) | 2022-05-17 | 2023-12-12 | Ford Global Technologies, Llc | Contextual calibration of connected security device |
CN115550555A (zh) * | 2022-11-28 | 2022-12-30 | 杭州华橙软件技术有限公司 | 云台校准方法及相关装置、摄像器件和存储介质 |
Also Published As
Publication number | Publication date |
---|---|
US20200357138A1 (en) | 2020-11-12 |
JP7082671B2 (ja) | 2022-06-08 |
SG11202007195TA (en) | 2020-08-28 |
JP2021513247A (ja) | 2021-05-20 |
KR20200125667A (ko) | 2020-11-04 |
CN110570475A (zh) | 2019-12-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019233330A1 (zh) | 车载摄像头自标定方法及装置和车辆驾驶方法及装置 | |
JP7007497B2 (ja) | 測距方法、知能制御方法及び装置、電子機器ならびに記憶媒体 | |
US20200344421A1 (en) | Image pickup apparatus, image pickup control method, and program | |
JP4380550B2 (ja) | 車載用撮影装置 | |
WO2020038118A1 (zh) | 车载摄像头的姿态估计方法、装置和系统及电子设备 | |
CN109002754A (zh) | 车辆远程停车系统和方法 | |
WO2020168787A1 (zh) | 确定车体位姿的方法及装置、制图方法 | |
JP2020043400A (ja) | 周辺監視装置 | |
JP2013154730A (ja) | 画像処理装置、画像処理方法、及び、駐車支援システム | |
JP2019089476A (ja) | 駐車支援装置及びコンピュータプログラム | |
WO2022110653A1 (zh) | 一种位姿确定方法及装置、电子设备和计算机可读存储介质 | |
JP6375633B2 (ja) | 車両周辺画像表示装置、車両周辺画像表示方法 | |
JP2016149613A (ja) | カメラパラメータ調整装置 | |
KR20170057684A (ko) | 전방 카메라를 이용한 주차 지원 방법 | |
KR20150128140A (ko) | 어라운드 뷰 시스템 | |
CN116385528B (zh) | 标注信息的生成方法、装置、电子设备、车辆及存储介质 | |
CN110301133B (zh) | 信息处理装置、信息处理方法和计算机可读记录介质 | |
CN115170630B (zh) | 地图生成方法、装置、电子设备、车辆和存储介质 | |
CN114608591B (zh) | 车辆定位方法、装置、存储介质、电子设备、车辆及芯片 | |
CN107886472B (zh) | 全景泊车系统的图像拼接校准方法和图像拼接校准装置 | |
JP2020166689A (ja) | 車両遠隔監視システム、車両遠隔監視装置、及び車両遠隔監視方法 | |
CN116883496B (zh) | 交通元素的坐标重建方法、装置、电子设备及存储介质 | |
JP2019117581A (ja) | 車両 | |
JP6869452B2 (ja) | 距離測定装置及び距離測定方法 | |
JP4975568B2 (ja) | 地物識別装置、地物識別方法及び地物識別プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19814898 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020541881 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20207027687 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19814898 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.03.2021) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19814898 Country of ref document: EP Kind code of ref document: A1 |