CN114255280A - Data processing method and computing device - Google Patents

Data processing method and computing device

Info

Publication number
CN114255280A
Authority
CN
China
Prior art keywords
camera
vehicle
coordinate system
calibration characteristic
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011022321.3A
Other languages
Chinese (zh)
Inventor
单国航
贾双成
朱磊
李倩
李成军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mushroom Car Union Information Technology Co Ltd
Original Assignee
Mushroom Car Union Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mushroom Car Union Information Technology Co Ltd filed Critical Mushroom Car Union Information Technology Co Ltd
Priority to CN202011022321.3A priority Critical patent/CN114255280A/en
Publication of CN114255280A publication Critical patent/CN114255280A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Abstract

The present application relates to a data processing method and a computing device. The method includes: obtaining pixel coordinate data of a plurality of calibration feature points in a current frame image captured by a vehicle-mounted camera; and obtaining, through least-squares optimization, the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system and the camera installation height that minimize an overall error, according to the pixel coordinate data of the plurality of calibration feature points, the camera internal parameter data, a preset functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship, and the camera internal parameters, and a preset constraint relationship among the calibration feature points.

Description

Data processing method and computing device
Technical Field
The present application relates to computer application technologies, and in particular, to a data processing method and a computing device.
Background
Camera calibration is one of the foundations of computer vision. Through camera calibration, the correspondence between the three-dimensional geometric position of a point on the surface of an object in space and its corresponding point in the image can be determined. The parameters to be calibrated usually include internal parameters and external parameters. The internal parameters are related to the characteristics of the camera itself, such as its focal length and pixel size. The external parameters determine the position and orientation of the camera in a given three-dimensional space.
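As a concrete illustration of the two parameter groups, a minimal pinhole-projection sketch in NumPy follows; all numeric values here are arbitrary examples, not data from this application:

```python
import numpy as np

# Internal parameter matrix A: focal lengths fx, fy (in pixels) and
# principal point (cx, cy)
fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0
A = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# External parameters: a rotation R and a translation t place the
# camera in the chosen world coordinate system
R, t = np.eye(3), np.zeros(3)

# Project a 3-D point: homogeneous pixel coordinates, then normalize
P = np.array([0.5, 0.25, 2.0])            # 2 m in front of the camera
uvw = A @ (R @ P + t)
u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]   # u = 840.0, v = 460.0
```

When the external parameters change (for example, because the camera has shifted), the same 3-D point lands on a different pixel, which is why they must be recalibrated in time.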
In a vehicle-mounted camera system, the vehicle may vibrate during travel due to road conditions and the like, so the external parameters of the vehicle-mounted camera may change. If three-dimensional scene reconstruction is then performed from pictures taken by the vehicle-mounted camera without recalibrating the external parameters in time, the reliability and accuracy of the final result will suffer.
Disclosure of Invention
The present application provides a data processing method and a computing device that enable timely calibration of the external parameters of a vehicle-mounted camera.
One aspect of the present application provides a data processing method, including:
acquiring pixel coordinate data of a plurality of calibration feature points in a current frame image acquired by a vehicle-mounted camera;
and obtaining the current external parameters of the camera according to the pixel coordinate data of the calibration characteristic points.
In some embodiments, obtaining the current camera external parameter from the pixel coordinate data of the plurality of calibration feature points includes:
obtaining the rotation transformation relationship and the camera installation height according to the pixel coordinate data of the plurality of calibration feature points, the camera internal parameter data, and a preset functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system, and the camera internal parameters.
In some embodiments, obtaining the rotation transformation relationship is further based on a constraint relationship between preset calibration feature points.
In some embodiments, obtaining the rotational transformation relationship and camera mounting height comprises:
and obtaining the rotation transformation relation and the camera installation height which enable the preset overall error to be minimum through least square optimization.
In some embodiments, the functional relationship comprises:
the vehicle coordinates of a calibration feature point are equal to the product of the inverse of the rotation transformation matrix between the camera coordinate system and the vehicle coordinate system, the inverse of the camera internal parameter matrix, and the pixel coordinates of the calibration feature point.
In some embodiments, the plurality of calibration feature points are located on at least three lane lines, and the constraint relationship comprises:
the z coordinates of the calibration characteristic points under the vehicle coordinate system are equal;
x coordinates of all the calibration characteristic points on the same lane line under a vehicle coordinate system are equal;
the distance between each pair of adjacent lane lines is equal to a preset lane line distance value.
In some embodiments, the overall error comprises the sum of the absolute values of the following error terms:
the mean square error Di of the x coordinates of the calibration feature points on each lane line in the vehicle coordinate system, where i = 1, 2, …, K; and
the maximum error E(j, j+1) in the spacing of each pair of adjacent lane lines, where j = 1, 2, …, K-1;
and K is the number of lane lines in the current frame image.
In some embodiments, further comprising:
and obtaining the vehicle coordinate data of the calibration characteristic points according to the pixel coordinate data of the calibration characteristic points, the camera internal reference data, the functional relation and the camera installation height.
In some embodiments, the onboard camera is a camera of a vehicle event recorder.
Another aspect of the present application provides a computing device comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method as described above.
A third aspect of the application provides a non-transitory machine-readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform a method as described above.
In some embodiments of the present application, the corresponding camera external parameter can be obtained according to the pixel coordinate data of the plurality of calibration feature points in different frame images acquired by the vehicle-mounted camera, so as to realize the timely calibration of the vehicle-mounted camera external parameter.
In some embodiments of the present application, only the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system needs to be obtained, and the functional relationship among the pixel coordinates of the calibration feature points, their vehicle coordinates, the rotation transformation relationship, and the camera internal parameters, as well as the constraint relationship among the calibration feature points, are relatively simple. The amount of calculation is therefore relatively small, which improves efficiency and saves computing resources.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular descriptions of exemplary embodiments of the application, as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the application.
FIG. 1 is a schematic flow chart diagram illustrating a data processing method according to an embodiment of the present application;
FIG. 2 illustrates a selected target picture from a video captured by an onboard camera;
FIG. 3 is a schematic flow chart diagram of a data processing method according to another embodiment of the present application;
fig. 4 is a schematic structural diagram of a computing device according to an embodiment of the present application.
Detailed Description
Preferred embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present application are shown in the drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms "first," "second," "third," etc. may be used herein to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
The technical solutions of the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present application. Referring to fig. 1, the method includes:
in S101, pixel coordinate data of a plurality of calibration feature points in a current frame image acquired by an onboard camera is obtained.
In the present application, the vehicle-mounted camera may be, for example, the camera of a driving recorder, but it is not limited thereto; it may also be the camera of another device mounted on the vehicle.
While the vehicle is traveling, the vehicle-mounted camera can capture video of the road and of the environment on both sides, and a target picture meeting preset conditions can be selected from this video as the current frame image to be processed.
In this embodiment, as shown in fig. 2, the selected target picture may show a relatively flat road surface with relatively clear lane lines. The picture in fig. 2 has four lane lines, with a lane formed between each pair of adjacent lane lines, so the four lane lines form three lanes. It is understood that the number of lane lines in the selected target picture is not limited to four; it may be three, or more than four.
Points on the lane lines in the target picture are selected as the calibration feature points in the current frame image. Several points may be selected on each lane line. Each selected calibration feature point is preferably located at the center of the width of its lane line.
In one implementation, the target picture may be opened in a picture processing tool, the calibration feature points are specified on the target picture by an operator, and the picture processing tool outputs pixel coordinates of the specified calibration feature points. It is understood that the pixel coordinates of the calibration feature point may be obtained by other methods, and is not limited thereto.
It will be appreciated that in other embodiments, calibration feature points on other static objects, such as buildings, etc., may also be selected.
In S102, the current camera external parameters are obtained according to the pixel coordinate data of the plurality of calibration feature points.
The vehicle coordinate system is a special coordinate system used to describe the motion of the vehicle, and its unit may be meters (m). When the vehicle is stationary on a horizontal road surface, the x-axis of the vehicle coordinate system points to the driver's right, the y-axis points forward, parallel to the ground, and the z-axis points upward. The coordinates of a point in space in the vehicle coordinate system (referred to in this application as vehicle coordinates) can be denoted Pcar(x, y, z), i.e. a position x meters to the right of the origin of the vehicle coordinate system, y meters ahead of it, and z meters above it.
The pixel coordinate system takes pixels as its unit, and its origin is the upper left corner of the image. The coordinates of a point in the pixel coordinate system (referred to in this application simply as pixel coordinates) may be denoted Puv(u, v).
In the present application, the camera external reference refers to a transformation relationship between a camera coordinate system and a vehicle coordinate system, including a rotation transformation relationship and a translation transformation relationship. In this embodiment, it is assumed that the origin of the vehicle coordinate system coincides with the origin of the camera coordinate system, so that in the transformation relationship between the camera coordinate system and the vehicle coordinate system, the translation vector is a zero vector, and only the rotation transformation relationship needs to be determined.
In one implementation, obtaining the current camera external parameter according to the pixel coordinate data of the plurality of calibration feature points includes:
obtaining the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system and the camera installation height according to the pixel coordinate data of the plurality of calibration feature points, the camera internal parameter data, and a preset functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship, and the camera internal parameters.
In another implementation, obtaining the current camera external parameter according to the pixel coordinate data of the plurality of calibration feature points includes:
obtaining the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system and the camera installation height according to the pixel coordinate data of the plurality of calibration feature points, the camera internal parameter data, a preset functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship, and the camera internal parameters, and a preset constraint relationship among the calibration feature points.
In this embodiment, the corresponding camera external parameter can be obtained according to the pixel coordinate data of the plurality of calibration feature points in different frame images acquired by the vehicle-mounted camera, so that the vehicle-mounted camera external parameter can be calibrated in time.
Fig. 3 is a schematic flow chart of a data processing method according to another embodiment of the present application. Referring to fig. 3, the method of the present embodiment includes:
in S301, pixel coordinate data of a plurality of calibration feature points in the current frame image acquired by the onboard camera is obtained.
It is understood that S301 can be implemented with reference to the related description in S101, and is not described herein again.
In S302, the rotation transformation relationship and the camera installation height that minimize an overall error are obtained through least-squares optimization, according to the pixel coordinate data of the plurality of calibration feature points, the camera internal parameter data, a preset functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system, and the camera internal parameters, and a preset constraint relationship among the calibration feature points.
In this embodiment, it is assumed that the origin of the vehicle coordinate system coincides with the origin of the camera coordinate system. Thus, in the transformation relationship between the camera coordinate system and the vehicle coordinate system, the translation vector cam_T_car is [0, 0, 0], and only the rotation part cam_R_car needs to be determined.
At this time, the camera coordinates Pcam of any calibration feature point in the current frame image are equal to the product of the rotation transformation matrix cam_R_car between the camera coordinate system and the vehicle coordinate system and the vehicle coordinates Pcar of the calibration feature point, that is:
Pcam = cam_R_car × Pcar    (formula 1)
From the projection equation of the camera and formula (1) above, the pixel coordinates Puv of any calibration feature point in the current frame image are equal to the product of the camera internal parameter matrix A, the rotation transformation matrix cam_R_car between the camera coordinate system and the vehicle coordinate system, and the vehicle coordinates Pcar of the calibration feature point, that is:
Puv = A × Pcam = A × cam_R_car × Pcar    (formula 2)
In this application, the camera internal parameter matrix A and the pixel coordinates Puv of the calibration feature points are known. The functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system, and the camera internal parameters is obtained by rearranging formula (2): the vehicle coordinates Pcar of any calibration feature point in the current frame image are equal to the product of the inverse cam_R_car.inv of the rotation transformation matrix, the inverse A.inv of the camera internal parameter matrix, and the pixel coordinates of the calibration feature point, namely:
Pcar = cam_R_car.inv × A.inv × Puv    (formula 3)
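Since pixel coordinates are homogeneous, formula (3) recovers the vehicle coordinates only up to scale; the scale can be fixed by requiring the point to lie on the ground plane below the camera. A minimal NumPy sketch follows; the function name, the sign convention for the ground plane (z = -h, with h the camera installation height), and the example numbers are assumptions, not taken from this application:

```python
import numpy as np

def backproject_to_ground(puv, A, cam_R_car, h):
    """Recover the vehicle coordinates of a ground point from its pixel
    coordinates via formula (3): Pcar is proportional to
    cam_R_car.inv @ A.inv @ Puv.  The pinhole model only gives the
    viewing ray up to scale, so the scale is fixed by forcing the point
    onto the assumed ground plane z = -h (h = camera installation
    height)."""
    uv1 = np.array([puv[0], puv[1], 1.0])          # homogeneous pixel coords
    ray = np.linalg.inv(cam_R_car) @ np.linalg.inv(A) @ uv1
    scale = -h / ray[2]                            # put the point at z = -h
    return scale * ray
```

For instance, with the example internal parameters used earlier, a hypothetical rotation that maps vehicle axes (x right, y forward, z up) to camera axes (x right, y down, z forward), and h = 1.5 m, the pixel (640, 480) back-projects to a point 10 m ahead on the road.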
In this embodiment, the rotation transformation matrix cam_R_car that minimizes the overall error can be obtained by solving, through least-squares optimization, the joint system of equations formed by the functional relationship of formula (3) and the constraint relationships among the calibration feature points.
It will be appreciated that in other embodiments, the rotational transformation matrix may be solved by other methods, not limited to least squares optimization.
In one implementation, the constraint relationship between calibration feature points includes the following:
(a) The z coordinates of all the calibration feature points in the vehicle coordinate system are equal.
The calibration feature points selected from the current frame image are all points on the ground lane lines, so their z coordinates in the vehicle coordinate system are equal.
(b) The x coordinates of the calibration characteristic points on the same lane line under the vehicle coordinate system are equal;
each lane line in the current frame image is parallel to the y axis of the vehicle coordinate system, and the calibration feature points selected from the current frame image are all points on the lane lines, so that the x coordinates of the calibration feature points on the same lane line in the vehicle coordinate system are equal.
(c) The distance between each pair of adjacent lane lines is equal to the preset lane line spacing d.
The lane line spacing d may be obtained in advance, for example, by measuring the distance between the lane lines shown in the target picture.
In one implementation, the overall error Cost function may comprise the sum of the absolute values of the following error terms:
the mean square error Di of the x coordinates of the calibration feature points on each lane line in the vehicle coordinate system, where i = 1, 2, …, K; and
the maximum error E(j, j+1) in the spacing of each pair of adjacent lane lines, where j = 1, 2, …, K-1;
and K is the number of lane lines in the current frame image.
Namely:
Cost = |D1| + |D2| + … + |DK| + |E(1, 2)| + |E(2, 3)| + … + |E(K-1, K)|
In this embodiment, the maximum error E(j, j+1) of the spacing between adjacent lane lines can be obtained by the following formula:
E(j, j+1) = max(|Dmax - d|, |Dmin - d|)
the method comprises the following steps of obtaining a calibration characteristic point on a jth lane line and a jth +1 th lane line, wherein Dmax is the maximum distance of the calibration characteristic point on the jth and j +1 th adjacent lane lines in the x direction of a vehicle coordinate system, Dmin is the minimum distance of the calibration characteristic point on the jth and j +1 th adjacent lane lines in the x direction of the vehicle coordinate system, and d is the preset lane line distance.
In this embodiment, through least-squares optimization, the rotation transformation matrix cam_R_car that minimizes the overall error is obtained, and the z coordinates of the calibration feature points in the vehicle coordinate system are obtained as well. Since the origin of the vehicle coordinate system is assumed to coincide with the origin of the camera coordinate system, the z coordinate of a calibration feature point in the vehicle coordinate system corresponds to the camera installation height. The camera installation height is therefore also obtained from the least-squares optimization, and the vehicle coordinate data of the calibration feature points can then be obtained from their pixel coordinate data, the camera internal parameter data, the functional relationship of formula (3), and the camera installation height.
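Putting the pieces together, the optimization of S302 can be sketched as follows. The Euler-angle parametrization of cam_R_car, the use of SciPy's derivative-free Nelder-Mead simplex search (substituted here because the cost mixes absolute values and maxima, so a plain least-squares solver does not apply directly), and all variable names are implementation assumptions, not details given in this application:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def calibrate(puv_per_lane, A, d, x0):
    """Jointly estimate the rotation cam_R_car (parametrized by three
    Euler angles rx, ry, rz) and the camera installation height h by
    minimizing the overall error.  puv_per_lane: one (N, 2) array of
    pixel coordinates per lane line, ordered left to right; A: camera
    internal parameter matrix; d: preset lane line spacing; x0: initial
    guess (rx, ry, rz, h)."""
    A_inv = np.linalg.inv(A)

    def cost(params):
        rx, ry, rz, h = params
        R_inv = Rotation.from_euler('xyz', [rx, ry, rz]).as_matrix().T
        lane_xs = []
        for puv in puv_per_lane:
            uv1 = np.column_stack([puv, np.ones(len(puv))])
            rays = (R_inv @ A_inv @ uv1.T).T    # formula (3), up to scale
            pts = rays * (-h / rays[:, 2:3])    # scale onto ground plane
            lane_xs.append(pts[:, 0])
        # Di terms: x coordinates on one lane line should be constant
        c = sum(np.mean((xs - xs.mean()) ** 2) for xs in lane_xs)
        # E(j, j+1) terms: adjacent spacing should equal d
        for left, right in zip(lane_xs[:-1], lane_xs[1:]):
            dmax = right.max() - left.min()
            dmin = right.min() - left.max()
            c += max(abs(dmax - d), abs(dmin - d))
        return c

    res = minimize(cost, x0, method='Nelder-Mead')
    return res.x   # rx, ry, rz, h
```

On synthetic data generated with a known rotation and height, starting the search near the truth recovers both to within a few percent.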
In this embodiment, only the rotation transformation relationship between the camera coordinate system and the vehicle coordinate system needs to be obtained, and the functional relationship among the pixel coordinates of the calibration feature points, their vehicle coordinates, the rotation transformation relationship, and the camera internal parameters, as well as the constraint relationship among the calibration feature points, are relatively simple. The amount of calculation is therefore relatively small, which improves efficiency and saves computing resources.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 4 is a schematic structural diagram of a computing device according to an embodiment of the present application.
Referring to fig. 4, computing device 50 includes memory 501 and processor 502.
The processor 502 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 501 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions needed by the processor 502 or other modules of the computer. The permanent storage device may be a readable and writable storage device, and may be non-volatile, so that stored instructions and data are not lost even after the computer is powered off. In some embodiments, the permanent storage is a mass storage device (e.g. a magnetic or optical disk, or flash memory); in other embodiments, it may be a removable storage device (e.g. a floppy disk or an optical drive). The system memory may be a readable and writable memory device, or a volatile readable and writable memory device such as dynamic random access memory, and may store instructions and data that some or all of the processors require at runtime. Furthermore, the memory 501 may comprise any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash, programmable read-only memory) and magnetic and/or optical disks. In some embodiments, the memory 501 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g. DVD-ROM or dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g. SD card, mini SD card, Micro-SD card, etc.), a magnetic floppy disk, or the like. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 501 has stored thereon executable code that, when processed by the processor 502, may cause the processor 502 to perform some or all of the methods described above.
The aspects of the present application have been described in detail hereinabove with reference to the accompanying drawings. In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments. Those skilled in the art should also appreciate that the acts and modules referred to in the specification are not necessarily required in the present application. In addition, it can be understood that the steps in the method of the embodiment of the present application may be sequentially adjusted, combined, and deleted according to actual needs, and the modules in the device of the embodiment of the present application may be combined, divided, and deleted according to actual needs.
Furthermore, the method according to the present application may also be implemented as a computer program or computer program product comprising computer program code instructions for performing some or all of the steps of the above-described method of the present application.
Alternatively, the present application may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or electronic device, server, etc.), causes the processor to perform part or all of the various steps of the above-described method according to the present application.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the applications disclosed herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present application, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A data processing method, comprising:
acquiring pixel coordinate data of a plurality of calibration feature points in a current frame image captured by a vehicle-mounted camera; and
obtaining current extrinsic parameters of the camera according to the pixel coordinate data of the plurality of calibration feature points.
2. The method of claim 1, wherein obtaining the current extrinsic parameters of the camera according to the pixel coordinate data of the plurality of calibration feature points comprises:
obtaining the rotation transformation relation and the camera mounting height according to the pixel coordinate data of the calibration feature points, the camera intrinsic parameter data, and a predetermined functional relationship among the pixel coordinates of a calibration feature point, its vehicle coordinates, the rotation transformation relation between the camera coordinate system and the vehicle coordinate system, and the camera intrinsic parameters.
3. The method of claim 2, wherein obtaining the rotation transformation relation is further based on a predetermined constraint relationship among the calibration feature points.
4. The method of claim 3, wherein obtaining the rotation transformation relation and the camera mounting height comprises:
obtaining, through least-squares optimization, the rotation transformation relation and the camera mounting height that minimize a predetermined overall error.
5. The method of any one of claims 2 to 4, wherein the functional relationship comprises:
the vehicle coordinates of a calibration feature point are equal to the inverse of the rotation transformation matrix between the camera coordinate system and the vehicle coordinate system, multiplied by the inverse of the camera intrinsic matrix, multiplied by the pixel coordinates of the calibration feature point.
6. The method of claim 5, wherein:
the plurality of calibration feature points lie on at least three lane lines; and
the constraint relationship comprises:
the z coordinates of all the calibration feature points in the vehicle coordinate system are equal;
the x coordinates, in the vehicle coordinate system, of all the calibration feature points on a same lane line are equal; and
the distance between each pair of adjacent lane lines is equal to a preset lane spacing value.
7. The method of claim 6, wherein the overall error comprises a sum of absolute values of the following error terms:
the mean square error Di of the x coordinates, in the vehicle coordinate system, of the calibration feature points on the i-th lane line, where i = 1, 2, …, K; and
the maximum error E(j, j+1) in the spacing of each pair of adjacent lane lines, where j = 1, 2, …, K-1;
where K is the number of lane lines in the current frame image.
8. The method of claim 4, further comprising:
obtaining the vehicle coordinate data of the calibration feature points according to their pixel coordinate data, the camera intrinsic parameter data, the functional relationship, and the camera mounting height.
9. The method of claim 1, wherein the vehicle-mounted camera is a camera of a dashcam (driving recorder).
10. A computing device, comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any one of claims 1-9.
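The pixel-to-vehicle relationship of claim 5 and the lane-line error of claims 6–7 can be sketched in code. The following is a minimal illustration in Python/NumPy and not the patented implementation: the pinhole model, the axis conventions (vehicle x lateral, y forward, z up, with the origin at the camera), the use of mean-x lane spacing as a stand-in for the claimed spacing error, and all numeric values are assumptions chosen for the example.

```python
import numpy as np

def backproject_to_ground(uv, K, R, h):
    """Back-project a pixel (u, v) to the ground plane in the vehicle frame.

    Claim 5's functional relationship: the vehicle-frame ray of a pixel is
    R^-1 @ K^-1 @ [u, v, 1], where K is the camera intrinsic matrix and R is
    the rotation from the vehicle frame to the camera frame. The unknown
    scale is fixed by the camera mounting height h: with the vehicle-frame
    origin at the camera, the ground plane is z = -h.
    """
    ray = np.linalg.inv(R) @ np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    s = -h / ray[2]          # scale so the point lands on the ground plane
    return s * ray           # vehicle coordinates (x, y, z), with z == -h

def overall_error(points_by_lane, lane_spacing):
    """Simplified overall error in the spirit of claims 6-7.

    Sums (a) the mean-square spread of the x coordinates of the feature
    points on each lane line (which the constraints say should be equal),
    and (b) the deviation of each adjacent pair's mean-x spacing from the
    preset lane spacing. A calibration that satisfies the constraints of
    claim 6 drives this error to zero.
    """
    mean_x = [np.mean([p[0] for p in lane]) for lane in points_by_lane]
    err = sum(np.mean([(p[0] - mx) ** 2 for p in lane])
              for lane, mx in zip(points_by_lane, mean_x))
    err += sum(abs(abs(mean_x[j + 1] - mean_x[j]) - lane_spacing)
               for j in range(len(mean_x) - 1))
    return err

# Example values (assumed, not from the patent):
K = np.array([[800.0, 0.0, 640.0],    # focal length 800 px, center (640, 360)
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])
R = np.array([[1.0, 0.0,  0.0],       # vehicle (x lat, y fwd, z up) ->
              [0.0, 0.0, -1.0],       # camera (x right, y down, z forward)
              [0.0, 1.0,  0.0]])
pt = backproject_to_ground((780.0, 456.0), K, R, h=1.2)
```

With these numbers, pixel (780, 456) back-projects to the vehicle-frame ground point (1.75, 10.0, -1.2): a point 1.75 m to the right and 10 m ahead, on a ground plane 1.2 m below the camera. A least-squares optimizer as in claim 4 would search over R and h to minimize `overall_error` on the back-projected lane-line points.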

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011022321.3A CN114255280A (en) 2020-09-25 2020-09-25 Data processing method and computing device

Publications (1)

Publication Number Publication Date
CN114255280A (en) 2022-03-29

Family

ID=80790352


Country Status (1)

Country Link
CN (1) CN114255280A (en)

Similar Documents

Publication Publication Date Title
US9727793B2 (en) System and method for image based vehicle localization
CN111862179B (en) Three-dimensional object modeling method and apparatus, image processing device, and medium
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
US20120308114A1 (en) Voting strategy for visual ego-motion from stereo
CN112118537B (en) Method and related device for estimating movement track by using picture
CN111930877B (en) Map guideboard generation method and electronic equipment
US20190339705A1 (en) Transition map between lidar and high-definition map
CN114241062A (en) Camera external parameter determination method and device for automatic driving and computer readable storage medium
CN113284194A (en) Calibration method, device and equipment for multiple RS (remote sensing) equipment
CN114255280A (en) Data processing method and computing device
CN112348903B (en) Method and device for calibrating external parameters of automobile data recorder and electronic equipment
CN114897987B (en) Method, device, equipment and medium for determining vehicle ground projection
CN113284193B (en) Calibration method, device and equipment of RS equipment
CN114255281A (en) Data processing method and computing device
Obdržálek et al. A voting strategy for visual ego-motion from stereo
CN112556702A (en) Height correction method for vehicle moving track and related device
CN112183378A (en) Road slope estimation method and device based on color and depth image
CN115523929B (en) SLAM-based vehicle-mounted integrated navigation method, device, equipment and medium
CN113009533A (en) Vehicle positioning method and device based on visual SLAM and cloud server
CN113781661B (en) Immersion scene-oriented multi-projection space layout evaluation method and system
CN115311370A (en) Camera external parameter calibration and evaluation method and device, electronic equipment and storage medium
CN113538546B (en) Target detection method, device and equipment for automatic driving
CN116304142B (en) Point cloud data acquisition method, device, equipment and storage medium
Abi Farraj et al. Non-iterative planar visual odometry using a monocular camera
US11948327B2 (en) Method and system for joint object location and ground plane estimation in computer vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination