CN112184822B - Camera pitch angle adjusting method and device, storage medium and electronic equipment - Google Patents


Info

Publication number: CN112184822B
Application number: CN201910585360.5A
Authority: CN (China)
Prior art keywords: vehicle, camera, vanishing point, coordinate, determining
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN112184822A
Inventors: 陈波, 宋巍
Original and current assignee: Beijing Horizon Robotics Technology Research and Development Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Application filed by Beijing Horizon Robotics Technology Research and Development Co Ltd
Priority to CN201910585360.5A
Publication of CN112184822A; application granted and published as CN112184822B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The embodiments of the disclosure disclose a camera pitch angle adjusting method and device, a storage medium, and electronic equipment. The method comprises the following steps: obtaining at least one frame of image from a camera mounted on a vehicle, and determining a first vanishing point coordinate of the camera based on the at least one frame of image, wherein the image includes at least one front vehicle located in front of the vehicle; determining a second vanishing point coordinate of the camera according to a preset pitch angle of the camera; determining a coordinate difference based on the first vanishing point coordinate and the second vanishing point coordinate; and adjusting the preset pitch angle of the camera according to the coordinate difference. According to the embodiments of the disclosure, the real vanishing point is calculated through structured perception of the vehicles ahead, so that dynamic real-time calibration of the pitch angle can be conveniently performed in straight-road scenes.

Description

Camera pitch angle adjusting method and device, storage medium and electronic equipment
Technical Field
The disclosure relates to camera extrinsic parameter calibration technology, and in particular to a camera pitch angle adjusting method and device, a storage medium, and electronic equipment.
Background
A camera mounted on a vehicle inevitably carries installation errors, and estimating these errors is an important part of camera extrinsic calibration. Because calibration strongly affects the accuracy of visual perception results, and the pitch angle (Pitch) is the most important extrinsic camera parameter, a stable and reliable Pitch calibration scheme is essential.
In the prior art, the pitch angle is adjusted by a static offline calibration method: the vehicle must be stopped still in a specific scene, and the camera extrinsic parameters are calculated from reference objects arranged on site. Because the requirements on the relative positions of the vehicle and the reference objects are very strict, the calibration process is time-consuming and labor-intensive.
Disclosure of Invention
The present disclosure has been made in order to solve the above technical problems. The embodiment of the disclosure provides a method and a device for adjusting a camera pitch angle, a storage medium and electronic equipment.
According to an aspect of the embodiments of the present disclosure, there is provided a method for adjusting a pitch angle of a camera, including:
obtaining at least one frame of image according to a camera arranged on a vehicle, and determining first vanishing point coordinates of the camera based on the at least one frame of image; wherein the image includes at least one front vehicle located in front of the vehicle;
Determining a second vanishing point coordinate of the camera according to a preset pitch angle of the camera;
determining a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate and the second vanishing point coordinate;
and adjusting a preset pitch angle of the camera according to the coordinate difference value.
According to another aspect of the embodiments of the present disclosure, there is provided an adjusting apparatus for a camera pitch angle, including:
a first coordinate determining module for obtaining at least one frame of image according to a camera arranged on a vehicle, and determining a first vanishing point coordinate of the camera based on the at least one frame of image; wherein the image includes at least one front vehicle located in front of the vehicle;
the second coordinate determining module is used for determining a second vanishing point coordinate of the camera according to a preset pitch angle of the camera;
a coordinate difference determining module, configured to determine a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate determined by the first coordinate determining module and the second vanishing point coordinate determined by the second coordinate determining module;
and the pitch angle adjusting module is used for adjusting the preset pitch angle of the camera according to the coordinate difference value determined by the coordinate difference value determining module.
According to still another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium storing a computer program for executing the method for adjusting a camera pitch angle provided by the above embodiments.
According to still another aspect of the embodiments of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to read the executable instruction from the memory, and execute the instruction to implement the method for adjusting a camera pitch angle provided in the foregoing embodiment.
According to the camera pitch angle adjusting method and device, storage medium, and electronic equipment provided by the embodiments of the disclosure, at least one frame of image is obtained from a camera mounted on the vehicle, and a first vanishing point coordinate of the camera is determined based on the at least one frame of image, wherein the image includes at least one front vehicle located in front of the vehicle; a second vanishing point coordinate of the camera is determined according to a preset pitch angle of the camera; a coordinate difference is determined based on the first and second vanishing point coordinates; and the preset pitch angle of the camera is adjusted according to the coordinate difference. Because the real vanishing point is calculated through structured perception of the vehicles ahead, the pitch angle can be conveniently calibrated dynamically and in real time in straight-road scenes; compared with static calibration, this adjusting method computes faster and offers better real-time performance.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing embodiments thereof in more detail with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the disclosure, and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure, without limitation to the disclosure. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 is a block diagram of a system architecture provided by an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of vanishing point calculation according to an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of a calculation flow of vanishing points according to an embodiment of the disclosure.
Fig. 4 is a schematic flow chart of step 301 in the embodiment shown in fig. 3 of the present disclosure.
Fig. 5 is a logic diagram for determining straight running of a vehicle according to an embodiment of the present disclosure.
Fig. 6 is a flowchart illustrating a method for adjusting a pitch angle of a camera according to an exemplary embodiment of the present disclosure.
Fig. 7 is a schematic flow chart of step 601 in the embodiment shown in fig. 6 of the present disclosure.
Fig. 8 is a schematic flow chart of step 6012 in the embodiment of fig. 7 of the disclosure.
Fig. 9 is a schematic flow chart of step 6013 in the embodiment of fig. 7 of the disclosure.
Fig. 10 is a schematic flow chart of step 604 in the embodiment shown in fig. 6 of the present disclosure.
Fig. 11 is a schematic structural view of a camera pitch angle adjusting device according to an exemplary embodiment of the present disclosure.
Fig. 12 is a schematic structural view of an adjusting apparatus for a camera pitch angle according to another exemplary embodiment of the present disclosure.
Fig. 13 is a block diagram of an electronic device provided in an exemplary embodiment of the present disclosure.
Detailed Description
Hereinafter, example embodiments according to the present disclosure will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present disclosure and not all of the embodiments of the present disclosure, and that the present disclosure is not limited by the example embodiments described herein.
It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
It will be appreciated by those of skill in the art that the terms "first," "second," etc. in embodiments of the present disclosure are used merely to distinguish between different steps, devices or modules, etc., and do not represent any particular technical meaning nor necessarily logical order between them.
It should also be understood that in embodiments of the present disclosure, "plurality" may refer to two or more, and "at least one" may refer to one, two or more.
It should also be appreciated that any component, data, or structure referred to in the presently disclosed embodiments may be generally understood as one or more without explicit limitation or the contrary in the context.
In addition, the term "and/or" in this disclosure is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the front and rear association objects are an or relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and that the same or similar features may be referred to each other, and for brevity, will not be described in detail.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
Embodiments of the present disclosure may be applicable to electronic devices such as terminal devices, computer systems, servers, etc., which may operate with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with the terminal device, computer system, server, or other electronic device include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network personal computers, minicomputer systems, mainframe computer systems, and distributed cloud computing technology environments that include any of the above systems, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc., that perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in a distributed cloud computing environment in which tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computing system storage media including memory storage devices.
Summary of the application
In the process of implementing the present disclosure, the inventors found that the existing camera pitch angle calibration scheme is static (offline) calibration, which has at least the following problems: the vehicle must be stopped still in a specific scene, the camera extrinsic parameters are calculated from reference objects arranged on site, and the very strict requirements on the relative positions of the vehicle and the reference objects make the calibration process time-consuming and labor-intensive.
Exemplary System
Fig. 1 is a block diagram of a system architecture provided by an embodiment of the present disclosure. As shown in fig. 1, the system includes:
The PID control submodule 101 uses a classical PID control algorithm to calculate a pitch angle increment scaling factor (scale) from the real-time deviation, i.e. the difference between the vanishing point y coordinate estimated from the vehicle ahead and the vanishing point y coordinate calculated from the current image using the current pitch angle.
The increment step calculation submodule 102 selects a suitable basic increment value (step) according to the current real-time deviation: when the deviation (error) is large, a larger step value is selected; when the deviation is small, a smaller step value is selected.
The increment direction determination submodule 103 determines, based on the deviation (error) or other characteristics, whether the controlled quantity should be increased or decreased. Specifically, the controlled quantity is the difference between the vanishing point y coordinate estimated from the vehicle ahead and the vanishing point y coordinate calculated using the Pitch of the current image. As for the increment direction: when the difference between the y coordinates of the "true" and "false" vanishing points is positive, Pitch needs to be decreased; when the difference is negative, Pitch needs to be increased.
The limiting submodule 104 clamps the increment scale output by the PID control submodule 101 to maximum and minimum values (a scale above the maximum is limited to the maximum, and a scale below the minimum is limited to the minimum), preventing the Pitch value from growing or shrinking abnormally and disturbing subsequent control.
The increment calculation submodule 105 calculates the increment from the scale output by the limiting submodule 104, the step output by the increment step calculation submodule 102, and the increment direction output by the increment direction determination submodule 103, as shown in formula (1):
increment delta = scale * step (the increment equals the increment scaling factor scale multiplied by the basic increment value step)    formula (1)
Moreover, different increment directions have different step values: by direction there are two steps, a positive step representing an increase and a negative step representing a decrease, with the direction calculated by submodule 103; within each direction there are two levels of step value, a large step and a small step, calculated by submodule 102.
The filtering submodule 106 filters the calculation result of the increment calculation submodule 105 to ensure a smooth output and obtain the final increment value; a simple first-order filter may be used here, for example: Y_n = (1 - a) * Y_(n-1) + a * X_n.
Finally, the Pitch of the camera is adjusted by the increment value output by the filtering submodule 106, in the increment direction determined by the increment direction determination submodule 103.
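The control pipeline of submodules 101 to 106 above can be sketched as follows. This is a minimal illustrative implementation: all gains, step sizes, limits, thresholds, and the filter coefficient are assumed example values, not parameters specified in this disclosure.

```python
class PitchCorrector:
    """Sketch of the pitch-correction pipeline of fig. 1 (all parameters are assumptions)."""

    def __init__(self, kp=0.8, ki=0.05, kd=0.1,
                 big_step=0.01, small_step=0.002, error_threshold=5.0,
                 scale_min=0.1, scale_max=2.0, alpha=0.3):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.big_step, self.small_step = big_step, small_step
        self.error_threshold = error_threshold
        self.scale_min, self.scale_max = scale_min, scale_max
        self.alpha = alpha          # first-order filter coefficient a
        self.integral = 0.0
        self.prev_error = 0.0
        self.prev_delta = 0.0

    def update(self, true_vp_y, estimated_vp_y):
        # Deviation: "true" vanishing point y (from the vehicle ahead) minus
        # the vanishing point y implied by the current Pitch.
        error = true_vp_y - estimated_vp_y

        # 101: PID control -> increment scaling factor (scale)
        self.integral += error
        scale = (self.kp * abs(error) + self.ki * abs(self.integral)
                 + self.kd * abs(error - self.prev_error))
        self.prev_error = error

        # 104: limiter clamps scale to [scale_min, scale_max]
        scale = max(self.scale_min, min(self.scale_max, scale))

        # 102: step selection -- larger step for larger deviation
        step = self.big_step if abs(error) > self.error_threshold else self.small_step

        # 103: direction -- a positive difference means Pitch must decrease
        direction = -1.0 if error > 0 else 1.0

        # 105: increment = scale * step, signed by direction (formula (1))
        delta = direction * scale * step

        # 106: first-order filtering: Y_n = (1 - a) * Y_(n-1) + a * X_n
        delta = (1 - self.alpha) * self.prev_delta + self.alpha * delta
        self.prev_delta = delta
        return delta
```

Calling `update()` once per frame yields a smoothed, clamped Pitch increment whose sign follows the direction rule of submodule 103.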
Optionally, after the increment direction and increment value of the camera pitch angle are determined as in the above embodiments, feedback adjustment is implemented on the basis of the image processing flow. That is, the lane line perception result is extracted from an image that has completed the perception process, and the increment direction and increment value of the camera pitch angle are determined based on the difference of the lane line spacing before and after the change, so that the Pitch used by new input images is updated until the increment no longer needs to be calculated.
The key of the present disclosure is how to calculate the y coordinate of the real vanishing point from the vehicle ahead. Fig. 2 is a schematic diagram of vanishing point calculation according to an embodiment of the present disclosure. As shown in fig. 2, the projection analysis of the vanishing point is performed with the ground plane as the horizontal axis and the vertical y direction of the image plane as the vertical axis. It can be seen that if the height of the camera in image space can be determined, the y coordinate of the vanishing point in the image plane can be calculated from a reference point on the ground plane, by formula (2):
vanishing point y coordinate = ground plane reference point y coordinate - camera pixel height (note that the y axis of the image plane points vertically downward)    formula (2)
Because vehicles always travel on the ground plane, a point at the bottom of a vehicle directly ahead in the camera's field of view can be selected as the reference point on the ground plane.
The specific flow is shown in fig. 3, a schematic diagram of the vanishing point calculation flow according to an embodiment of the disclosure. It comprises the following steps: step 301, filter the vehicles ahead by a certain method and select suitable reference vehicles; step 302, calculate the actual height of the vehicle: assuming that the actual widths of all vehicles are a certain default value (for example, an empirical value), the actual vehicle height is calculated according to formula (3):
vehicle actual height / vehicle actual width = vehicle pixel height / vehicle pixel width    formula (3);
step 303, calculating the pixel height of the camera: the camera actual mounting height is a known parameter, so the camera pixel height can be calculated according to equation (4):
camera pixel height/camera actual height = vehicle pixel height/vehicle actual height formula (4);
step 304, calculating the y-coordinate of the real vanishing point based on equation (5):
vanishing point y coordinate = y coordinate of the lower right corner of the vehicle bounding box (ground plane reference point) - camera pixel height    formula (5)
The coordinates of the lower right corner of the vehicle bounding box are known inputs; specifically, they can be obtained from the structured perception result of the vehicle, which includes the lower right corner coordinates;
step 305, calculate a vanishing point y coordinate for each filtered vehicle by the above method, and then take the average;
step 306, to ensure a smooth output, the result needs to be filtered, considering that the positions of the vehicles ahead may change frequently under actual road conditions. That is, the average obtained in step 305 is a single-frame result, and the vehicles selected in step 301 may change frequently, so the results of multiple consecutive frames are filtered with a moving average filter to reduce jitter in the final value.
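Steps 302 to 306 can be sketched as follows. The default actual vehicle width and the camera mounting height are assumed example values, not values given in this disclosure; bounding boxes are (x_left, y_top, x_right, y_bottom) in pixels with the y axis pointing down.

```python
from collections import deque

DEFAULT_VEHICLE_WIDTH_M = 1.8   # assumed default actual vehicle width (empirical)
CAMERA_HEIGHT_M = 1.3           # known camera mounting height (example value)

def vanishing_point_y(bbox, camera_height_m=CAMERA_HEIGHT_M,
                      vehicle_width_m=DEFAULT_VEHICLE_WIDTH_M):
    """Per-vehicle vanishing point y coordinate via formulas (3)-(5)."""
    x_left, y_top, x_right, y_bottom = bbox
    pixel_width = x_right - x_left
    pixel_height = y_bottom - y_top

    # (3) vehicle actual height / actual width = pixel height / pixel width
    vehicle_height_m = vehicle_width_m * pixel_height / pixel_width

    # (4) camera pixel height / camera actual height
    #     = vehicle pixel height / vehicle actual height
    camera_pixel_height = camera_height_m * pixel_height / vehicle_height_m

    # (5) vanishing point y = bbox lower corner y (ground reference) - camera pixel height
    return y_bottom - camera_pixel_height

def averaged_vanishing_point_y(bboxes):
    # Step 305: average the per-vehicle estimates within one frame.
    ys = [vanishing_point_y(b) for b in bboxes]
    return sum(ys) / len(ys)

class MovingAverage:
    # Step 306: moving-average filter over consecutive single-frame results.
    def __init__(self, window=10):
        self.values = deque(maxlen=window)

    def update(self, value):
        self.values.append(value)
        return sum(self.values) / len(self.values)
```

For example, a 180x135-pixel tail box with its bottom at y = 235 yields a camera pixel height of 130 and hence a vanishing point y of 105 under the assumed width and height above.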
Fig. 4 is an alternative example flow chart of step 301 in the embodiment shown in fig. 3. As shown in fig. 4, the vehicle filtering process includes:
step 3011, for the same vehicle, the 2D perception result contains two models, a head model and a tail model; considering that a uniform default actual vehicle width must be chosen, the tail model is more suitable;
step 3012, an observed vehicle ahead may not be on the same ground plane as the current vehicle, which violates the theoretical basis of the vanishing point calculation in fig. 2, so filtering is needed. The filtering method is to judge whether the vehicle is within the drivable region: in image space, if a drivable region boundary point (of type vehicle) falls within the bounding box of the vehicle, the vehicle is considered to be on the same ground plane as the current vehicle;
step 3013, the distance between other vehicles and the current vehicle is judged using the pixel area of the vehicle's bounding box; for Pitch, a distant reference object is preferable for calibration, and the bounding box pixel area serves as the criterion of distance;
step 3014, taking passenger cars as an example, a vehicle whose bounding box pixel height-to-width ratio is smaller than a certain threshold is judged to be of the desired type (i.e. vehicles with a relatively large width, such as ordinary sedans and SUVs, are considered passenger cars);
step 3015, the N farthest vehicles in the filtered result are selected as the final reference vehicles, for example, N is 1 to 3.
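The filtering of steps 3011 to 3015 can be sketched as follows. The perception-result fields (`view`, `bbox`) and the aspect ratio threshold are illustrative assumptions about the data format, not part of this disclosure.

```python
def point_in_bbox(point, bbox):
    # Step 3012: a drivable-region boundary point (of type vehicle) falling
    # inside the vehicle's bounding box marks it as on the same ground plane.
    x, y = point
    x_left, y_top, x_right, y_bottom = bbox
    return x_left <= x <= x_right and y_top <= y <= y_bottom

def select_reference_vehicles(detections, boundary_points, n=3,
                              max_aspect_ratio=1.2):
    """Pick the N farthest suitable reference vehicles (fields are assumed)."""
    candidates = []
    for det in detections:
        # 3011: only rear-view (tail) detections, for a uniform default width
        if det["view"] != "tail":
            continue
        # 3012: same ground plane as the ego vehicle
        if not any(point_in_bbox(p, det["bbox"]) for p in boundary_points):
            continue
        # 3014: passenger cars only -- pixel height/width ratio below threshold
        x_l, y_t, x_r, y_b = det["bbox"]
        if (y_b - y_t) / (x_r - x_l) >= max_aspect_ratio:
            continue
        # 3013: smaller bounding-box pixel area ~ farther vehicle
        area = (x_r - x_l) * (y_b - y_t)
        candidates.append((area, det))
    # 3015: keep the N farthest (smallest-area) candidates
    candidates.sort(key=lambda t: t[0])
    return [det for _, det in candidates[:n]]
```

Each surviving detection then feeds the per-vehicle vanishing point calculation of fig. 3.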
The pitch angle adjusting method provided by the disclosure can in theory also adjust the pitch angle when the vehicle is not driving straight. However, because the tail model is used, i.e. the complete vehicle tail should be observed as much as possible, the observation is best during straight driving, and in actual tests the calibration result during straight driving is more stable than otherwise. Pitch angle adjustment may therefore be limited to straight driving, and by judging the current driving condition the system avoids calibrating in curves and similar scenes. The vehicle state is calculated mainly from the vehicle speed and yaw rate signals in the vehicle chassis CAN messages; the logic is shown in fig. 5, a judgment logic diagram of straight-road driving according to an embodiment of the disclosure. Specifically, when the vehicle speed is sufficiently high (greater than or equal to a speed threshold) and the steering wheel is essentially stable (the yaw rate is below a yaw rate threshold), the vehicle is considered to be driving on a straight road; otherwise it is considered not to be, and the scenario is unsuitable for calibration.
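The straight-road gate of fig. 5 amounts to two threshold checks on chassis CAN signals; in this minimal sketch the threshold values are placeholders, not values specified in this disclosure.

```python
SPEED_THRESHOLD_KMH = 30.0       # assumed minimum speed for calibration
YAW_RATE_THRESHOLD_DEG_S = 1.0   # assumed maximum yaw rate ("stable steering wheel")

def is_driving_straight(speed_kmh, yaw_rate_deg_s):
    """CAN-signal gate: calibrate only when the vehicle is fast enough and
    the yaw rate indicates essentially stable steering."""
    return (speed_kmh >= SPEED_THRESHOLD_KMH
            and abs(yaw_rate_deg_s) < YAW_RATE_THRESHOLD_DEG_S)
```

The calibration loop would simply skip frames for which this gate returns False.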
Exemplary method
Fig. 6 is a flowchart illustrating a method for adjusting a pitch angle of a camera according to an exemplary embodiment of the present disclosure. The embodiment can be applied to an electronic device, as shown in fig. 6, and includes the following steps:
at step 601, at least one frame of image is obtained from a camera disposed on the vehicle, and first vanishing point coordinates of the camera are determined based on the at least one frame of image.
Wherein the image includes at least one vehicle in front of the ego vehicle, and the first vanishing point coordinate may be the y-axis coordinate of the real vanishing point. The vanishing point is the point at which the projections of any group of parallel straight lines not parallel to the projection plane converge on the projection plane.
Step 602, determining the second vanishing point coordinates of the camera according to the preset pitch angle of the camera.
In an embodiment, the second vanishing point coordinate is a y-axis coordinate of the vanishing point corresponding to the current pitch angle.
Step 603, determining a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate and the second vanishing point coordinate.
For example, the coordinate difference refers to a coordinate difference in the y-axis direction.
And step 604, adjusting the preset pitch angle of the camera according to the coordinate difference value.
Optionally, the preset pitch angle of the camera is increased or decreased according to the coordinate difference.
According to the camera pitch angle adjusting method of this embodiment, at least one frame of image is obtained from a camera mounted on the vehicle, and a first vanishing point coordinate of the camera is determined based on the at least one frame of image, wherein the image includes at least one front vehicle located in front of the vehicle; a second vanishing point coordinate of the camera is determined according to a preset pitch angle of the camera; a coordinate difference is determined based on the first and second vanishing point coordinates; and the preset pitch angle of the camera is adjusted according to the coordinate difference. Because the real vanishing point is calculated through structured perception of the vehicles ahead, the pitch angle can be conveniently calibrated dynamically and in real time in straight-road scenes; compared with static calibration, this adjusting method computes faster and offers better real-time performance.
As shown in fig. 7, on the basis of the embodiment shown in fig. 6, step 601 may include the following steps:
step 6011, filtering from a plurality of front vehicles included in at least one frame of image to obtain a preset number of reference vehicles.
Alternatively, a preset number of reference vehicles may be obtained by filtering through step 301 in the embodiment shown in fig. 3, wherein the preset number may be set to different values according to different situations.
Step 6012, for each frame of image in the at least one frame of image, determining a vanishing point of the camera for each reference vehicle in the preset number of reference vehicles, respectively, to obtain at least one vanishing point coordinate.
Alternatively, one vanishing point coordinate may be determined for each reference vehicle through steps 302, 303 and 304 in the embodiment shown in fig. 3; specifically, the vanishing point coordinates, which may be the vanishing point y coordinates, can be obtained through formulas (3), (4) and (5).
Step 6013, determining a first vanishing point coordinate of the camera based on the at least one vanishing point coordinate.
At least one vanishing point coordinate (one for each reference vehicle) may be obtained through the above steps. To determine a single first vanishing point coordinate of the camera, the at least one vanishing point coordinate may optionally be averaged by the method provided in step 305 of the embodiment shown in fig. 3, and the average taken as the first vanishing point coordinate of the camera. Determining the first vanishing point coordinate from a set number of reference vehicles improves the accuracy of the obtained vanishing point coordinate.
In some alternative embodiments, step 6011 may comprise: and filtering the plurality of front vehicles based on the set conditions to obtain a preset number of reference vehicles.
Wherein the setting conditions include, but are not limited to, at least one of: whether the front vehicle is at the same ground level as the vehicle, the distance between the front vehicle and the vehicle, the types of a plurality of front vehicles, whether the front vehicle is a tail in the image, and the like.
Alternatively, considering that the default actual width of the vehicle needs to be selected uniformly, the vehicle tail model is more suitable, and the width of the vehicle head cannot be uniform, so that the front vehicle is the vehicle tail in the image as one of the set conditions. Since the vehicle width of some vehicle types is difficult to determine, for example, a commercial vehicle; therefore, the vehicle is regarded as one of the setting conditions, and the vehicle type is limited to a vehicle type in which the vehicle width can be determined, for example, a passenger vehicle. Since the preceding vehicle and the vehicle have different vehicle speeds, respectively, when the distance between the preceding vehicle and the vehicle is too short, it may occur that the preceding vehicle disappears after several frames of images (e.g., is overtaken by the own vehicle), and therefore, the distance between the preceding vehicle and the vehicle is taken as one of the setting conditions.
Optionally, filtering the plurality of front vehicles may be implemented by the embodiment provided in fig. 4. If all front vehicles were used as reference vehicles, the calculation amount would increase, and unsuitable reference vehicles could make the deviation of the finally calculated first vanishing point coordinate relatively large. The present embodiment therefore filters the front vehicles by the set conditions and takes a preset number of them as reference vehicles, reducing both the calculation amount and the deviation.
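The filtering by set conditions can be sketched as follows. The field names, the minimum-distance rule, and the thresholds are illustrative assumptions, not the patent's exact implementation:

```python
def filter_reference_vehicles(front_vehicles, preset_count=3, min_distance=10.0):
    """Keep front vehicles that satisfy the set conditions: rear view
    in the image, a vehicle type whose width can be determined, the
    same ground plane as the ego vehicle, and not too close. Field
    names and thresholds are assumptions for illustration."""
    refs = [v for v in front_vehicles
            if v["is_rear_view"]
            and v["type"] == "passenger"
            and v["same_ground_plane"]
            and v["distance"] >= min_distance]
    # Cap at the preset number of reference vehicles to bound cost.
    return refs[:preset_count]
```

In this sketch the set conditions are applied jointly; the embodiment allows any subset of them to be used.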
In some alternative embodiments, filtering the plurality of front vehicles based on the set condition to obtain a preset number of reference vehicles includes:
determining whether the front vehicle is on the same ground plane as the vehicle according to whether at least one point of the drivable area included in the image is within a detection frame of the front vehicle; and in response to the front vehicle being on the same ground plane as the vehicle, determining the front vehicle as a reference vehicle.
The observed front vehicle may not be on the same ground plane as the current vehicle, in which case the theoretical basis for calculating the vanishing point no longer matches fig. 2, so such vehicles must be filtered out. The filtering method is to judge, in image space, whether a drivable-area boundary point (a point whose type is vehicle) falls within the bounding box of the front vehicle; if so, the observed front vehicle is considered to be on the same ground plane as the current vehicle. Vehicles not on the same ground plane as the vehicle are thereby filtered out, i.e., points that do not conform to the vanishing-point theory are removed, which improves the accuracy of the calculated first vanishing point coordinate.
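This point-in-box check can be sketched as below; the (x1, y1, x2, y2) bounding-box format is an assumption:

```python
def on_same_ground_plane(drivable_boundary_points, bbox):
    """Return True if any drivable-area boundary point (of type
    'vehicle') falls inside the front vehicle's bounding box,
    given as (x1, y1, x2, y2) in image coordinates."""
    x1, y1, x2, y2 = bbox
    return any(x1 <= x <= x2 and y1 <= y <= y2
               for x, y in drivable_boundary_points)
```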
As shown in fig. 8, on the basis of the embodiment shown in fig. 7, step 6012 may include the steps of:
step 801, for each reference vehicle, determines the actual vehicle height of the reference vehicle based on the actual vehicle width, the vehicle pixel width, and the vehicle pixel height.
Alternatively, the actual vehicle height is determined by equation (3) in the embodiment shown in fig. 3 described above.
Step 802, determining a pixel height of the camera based on the actual height of the camera, the actual height of the vehicle, and the vehicle pixel height.
Alternatively, the pixel height of the camera is determined by equation (4) in the embodiment shown in fig. 3 above, where the camera actual mounting height is a known parameter (e.g., vehicle height, etc.).
Step 803, determining a vanishing point of the camera based on the lower right corner coordinates of the detection frame of the vehicle and the pixel height of the camera, and obtaining at least one vanishing point coordinate.
Optionally, the vanishing point coordinate is determined by equation (5) in the embodiment shown in fig. 3 above, in which the lower right corner coordinates of the vehicle bounding box are known inputs (for example, the structured perception result obtained by image processing may include the lower right corner coordinates of the vehicle detection frame). By calculating the vanishing point coordinate (y-axis coordinate) from known data and the relations between them, the present embodiment improves the accuracy of the obtained vanishing point coordinate.
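Equations (3)–(5) are referenced but not reproduced in this section. Under a pinhole-camera, similar-triangles reading of steps 801–803, they may be sketched as follows; the exact formulas in fig. 3 could differ:

```python
def vanishing_point_y(actual_width, pixel_width, pixel_height,
                      camera_height, bbox_bottom_y):
    # Step 801 / eq. (3): actual vehicle height from the known actual
    # width and the pixel aspect ratio of the detection frame.
    actual_height = actual_width * pixel_height / pixel_width
    # Step 802 / eq. (4): camera mounting height converted to pixels
    # at the reference vehicle's depth (same metres-per-pixel scale).
    camera_pixel_height = camera_height * pixel_height / actual_height
    # Step 803 / eq. (5): the vanishing point lies one camera height
    # (in pixels) above the vehicle's ground-contact row.
    return bbox_bottom_y - camera_pixel_height
```

For example, a 1.8 m-wide vehicle imaged 180 px wide and 120 px tall, seen by a camera mounted 1.5 m high, with the detection frame bottom at row 500, would place the vanishing point at row 350.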
As shown in fig. 9, on the basis of the embodiment shown in fig. 7, step 6013 may include the steps of:
step 901, averaging at least one vanishing point coordinate corresponding to each frame of image to obtain an average coordinate.
And step 902, smoothing at least one average coordinate corresponding to at least one frame of image to obtain a first vanishing point coordinate of the camera.
After obtaining the at least one vanishing point coordinate, this embodiment may optionally average the at least one vanishing point coordinate through step 305 provided in fig. 3, and filter the average coordinate through step 306 provided in fig. 3 to ensure a smooth output. Because the position of the reference vehicle may change frequently, filtering the average coordinates of a plurality of consecutive single-frame images reduces jitter in the final value.
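A minimal sketch of the per-frame averaging and cross-frame smoothing; the moving-average window is an assumption, and step 306 of fig. 3 may use a different filter:

```python
def first_vanishing_point(per_frame_vps, window=5):
    """per_frame_vps: list of per-frame lists of vanishing point
    y-coordinates, one value per reference vehicle. Average within
    each frame (step 901), then smooth the frame averages with a
    moving average over the last `window` frames (step 902) to
    damp jitter as reference vehicles enter and leave the scene."""
    frame_means = [sum(vps) / len(vps) for vps in per_frame_vps]
    recent = frame_means[-window:]
    return sum(recent) / len(recent)
```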
As shown in fig. 10, step 604 may include the following steps, based on the embodiment shown in fig. 6, described above:
step 6041, determining an increment value and an increment direction of the pitch angle of the camera according to the coordinate difference value.
Optionally, an increment scaling factor and a base increment value of the pitch angle of the camera are determined according to the coordinate difference; the increment value is then determined based on the increment scaling factor and the base increment value.
Wherein, in response to the absolute value of the coordinate difference being greater than or equal to a set threshold, a first set value is determined as the base increment value; and in response to the absolute value of the coordinate difference being less than the set threshold, a second set value is determined as the base increment value, the first set value being greater than the second set value. Two base increment values are selected via the set threshold, and taking a set value as the base increment value ensures that each adjustment of the camera pitch angle remains within a controllable range.
Wherein, in response to the increment scaling factor being greater than a set maximum value, the set maximum value is used as the increment scaling factor; and in response to the increment scaling factor being less than a set minimum value, the set minimum value is used as the increment scaling factor.
Optionally, in response to the coordinate difference being a positive value, the increment direction of the pitch angle of the camera is to decrease; in response to the coordinate difference being a negative value, the increment direction of the pitch angle of the camera is to increase.
Step 6042 adjusts the pitch angle of the camera based on the delta direction and the delta value.
The method for determining the increment value in this embodiment may be implemented by the embodiment provided in fig. 1, where the increment scaling factor and the base increment value are obtained by different modules. The base increment value may be preset; for example, two base increment values (a large step and a small step) are set as in the embodiment provided in fig. 1. The increment value can then be determined by the formula Delta = scale × step, i.e., the increment value (Delta) equals the increment scaling factor (scale) multiplied by the base increment value (step). Determining the increment value by combining the scaling factor with the base increment value makes the increment value flexibly adjustable, since the scaling factor yields regular multiples of the base increment value that are easy to compute. With the obtained increment value, this embodiment achieves rapid adjustment of the pitch angle in the determined increment direction.
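The Delta = scale × step rule, with the threshold-selected base step, clamped scaling factor, and sign convention described above, may be sketched as follows. The threshold, step sizes, and the particular formula for the scaling factor are illustrative assumptions:

```python
def pitch_increment(coord_diff, threshold=10.0,
                    large_step=0.5, small_step=0.1,
                    scale_min=1.0, scale_max=5.0):
    """Return a signed pitch increment: Delta = scale * step.
    A large base step is chosen when the coordinate difference
    reaches the threshold, the scaling factor is clamped to
    [scale_min, scale_max], and a positive coordinate difference
    means the pitch angle should decrease."""
    step = large_step if abs(coord_diff) >= threshold else small_step
    scale = min(max(abs(coord_diff) / threshold, scale_min), scale_max)
    direction = -1.0 if coord_diff > 0 else 1.0
    return direction * scale * step
```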
In some alternative embodiments, prior to step 601, further comprising:
acquiring vehicle speed information and yaw rate information of the vehicle, and determining whether the vehicle is in a straight running state.
Alternatively, determining whether the vehicle is in a straight running state may be accomplished by the judgment logic diagram shown in fig. 5.
Step 601 in this embodiment comprises: in response to the vehicle being in a straight-ahead state, at least one frame of image is obtained from a camera disposed on the vehicle, and first vanishing point coordinates of the camera are determined based on the at least one frame of image.
In the method provided by this embodiment, the observation effect when determining the first vanishing point coordinate is best in the straight running state, and actual tests show that the calibration result is more stable when driving straight than otherwise. This embodiment therefore restricts pitch angle adjustment to the straight running state, which may be realized specifically by judging the current driving condition of the vehicle, preventing the system from calibrating in scenarios such as curves. For example, when the vehicle speed is sufficiently high (greater than or equal to a speed threshold) and the steering wheel is substantially steady (the yaw rate is less than a yaw rate threshold), the vehicle is considered to be traveling on a straight road; otherwise it is not, and the scenario is unsuitable for calibration.
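The straight-driving gate can be sketched as below; the threshold values are placeholders, not figures from the patent:

```python
def is_driving_straight(speed_mps, yaw_rate_rps,
                        speed_threshold=8.0, yaw_threshold=0.02):
    """The vehicle is treated as driving straight (and thus suitable
    for vanishing-point calibration) only when it is fast enough and
    barely yawing; both thresholds are illustrative."""
    return speed_mps >= speed_threshold and abs(yaw_rate_rps) < yaw_threshold
```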
Any of the camera pitch angle adjusting methods provided by the embodiments of the present disclosure may be performed by any suitable device having data processing capability, including, but not limited to, terminal equipment, a server, and the like. Alternatively, any of the camera pitch angle adjusting methods provided by the embodiments of the present disclosure may be executed by a processor; for example, the processor executes any of the camera pitch angle adjusting methods mentioned in the embodiments of the present disclosure by calling corresponding instructions stored in a memory. This is not elaborated further below.
Exemplary apparatus
Fig. 11 is a schematic structural view of a camera pitch angle adjusting device according to an exemplary embodiment of the present disclosure. The device provided by the embodiment comprises:
the first coordinate determining module 111 is configured to obtain at least one frame of image from a camera disposed on the vehicle, and determine first vanishing point coordinates of the camera based on the at least one frame of image.
Wherein the image includes at least one vehicle in front of the vehicle.
The second coordinate determining module 112 is configured to determine a second vanishing point coordinate of the camera according to a preset pitch angle of the camera.
A coordinate difference determining module 113 for determining a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate determined by the first coordinate determining module 111 and the second vanishing point coordinate determined by the second coordinate determining module 112.
The pitch angle adjusting module 114 is configured to adjust a preset pitch angle of the camera according to the coordinate difference value determined by the coordinate difference value determining module 113.
The camera pitch angle adjusting device of this embodiment obtains at least one frame of image from a camera arranged on the vehicle and determines the first vanishing point coordinate of the camera based on the at least one frame of image, wherein the image includes at least one front vehicle located in front of the vehicle; determines a second vanishing point coordinate of the camera according to a preset pitch angle of the camera; determines the coordinate difference based on the first vanishing point coordinate and the second vanishing point coordinate; and adjusts the preset pitch angle of the camera according to the coordinate difference. By calculating the real vanishing point through structured perception of vehicles, dynamic real-time calibration of the pitch angle can be performed conveniently in straight-driving scenes; compared with static calibration, this adjustment method is faster and offers better real-time performance.
Fig. 12 is a schematic structural view of an adjusting apparatus for a camera pitch angle according to another exemplary embodiment of the present disclosure. The device provided by the embodiment comprises:
The first coordinate determination module 111 includes:
the vehicle filtering unit 1111 is configured to filter a plurality of front vehicles included in the at least one frame of image to obtain a preset number of reference vehicles.
In an embodiment, the vehicle filtering unit 1111 is specifically configured to filter the plurality of front vehicles based on set conditions to obtain a preset number of reference vehicles; the set conditions include, but are not limited to, at least one of: whether the front vehicle is on the same ground plane as the vehicle, the distance between the front vehicle and the vehicle, the vehicle types of the plurality of front vehicles, and whether the front vehicle shows its rear (tail) in the image.
Optionally, the vehicle filtering unit 1111 is specifically configured to determine whether the front vehicle is at the same ground plane as the vehicle according to whether at least one point in the drivable area included in the image is within a detection frame of the front vehicle; in response to the preceding vehicle being at the same ground level as the vehicle, the preceding vehicle is determined to be a reference vehicle.
A vanishing point determining unit 1112, configured to determine, for each frame of the at least one frame of image, a vanishing point of the camera for each reference vehicle in the preset number of reference vehicles obtained by the vehicle filtering unit 1111, respectively, obtaining at least one vanishing point coordinate.
Alternatively, the vanishing point determining unit 1112 is specifically configured to determine, for each reference vehicle, a vehicle actual height of the reference vehicle based on the vehicle actual width, the vehicle pixel width and the vehicle pixel height; determining a pixel height of the camera based on the actual height of the camera, the actual height of the vehicle, and the vehicle pixel height; and determining a vanishing point of the camera based on the lower right corner coordinates of the detection frame of the vehicle and the pixel height of the camera, and obtaining at least one vanishing point coordinate.
A coordinate determination unit 1113 for determining a first vanishing point coordinate of the camera based on at least one vanishing point coordinate obtained by the vanishing point determination unit 1112.
Alternatively, the coordinate determining unit 1113 is specifically configured to average at least one vanishing point coordinate corresponding to each frame of image, to obtain an average coordinate; and smoothing at least one average coordinate corresponding to at least one frame of image to obtain a first vanishing point coordinate of the camera.
In this embodiment, the pitch angle adjustment module 114 is specifically configured to determine an increment value and an increment direction of the pitch angle of the camera according to the coordinate difference value; the pitch angle of the camera is adjusted based on the delta direction and the delta value.
Optionally, the apparatus provided by this embodiment further includes:
The straight running determining module 121 is configured to obtain vehicle speed information and yaw rate information of the vehicle, and determine whether the vehicle is in a straight running state.
At this time, the first coordinate determination module 111 is configured to obtain at least one frame of image from a camera provided on the vehicle in response to the vehicle being in a straight-going state, and determine first vanishing point coordinates of the camera based on the at least one frame of image.
Exemplary electronic device
Next, an electronic device according to an embodiment of the present disclosure is described with reference to fig. 13. The electronic device may be either or both of the first device 100 and the second device 200, or a stand-alone device independent thereof, which may communicate with the first device and the second device to receive the acquired input signals therefrom.
Fig. 13 illustrates a block diagram of an electronic device according to an embodiment of the disclosure.
As shown in fig. 13, the electronic device 130 includes one or more processors 131 and memory 132.
Processor 131 may be a Central Processing Unit (CPU) or other form of processing unit having data processing and/or instruction execution capabilities and may control other components in electronic device 130 to perform desired functions.
Memory 132 may comprise one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory (cache), and the like. The non-volatile memory may include, for example, Read-Only Memory (ROM), hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and may be executed by the processor 131 to implement the methods of adjusting camera pitch angle and/or other desired functions of the various embodiments of the present disclosure described above. Various contents such as an input signal, a signal component, a noise component, and the like may also be stored in the computer-readable storage medium.
In one example, electronic device 130 may further include: an input device 133 and an output device 134, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
For example, when the electronic device is the first device 100 or the second device 200, the input means 133 may be a microphone or a microphone array as described above for capturing an input signal of a sound source. When the electronic device is a stand-alone device, the input means 133 may be a communication network connector for receiving the acquired input signals from the first device 100 and the second device 200.
In addition, the input device 133 may also include, for example, a keyboard, a mouse, and the like.
The output device 134 may output various information to the outside, including the determined distance information, direction information, and the like. The output device 134 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, etc.
Of course, only some of the components of the electronic device 130 that are relevant to the present disclosure are shown in fig. 13 for simplicity, components such as buses, input/output interfaces, etc. are omitted. In addition, the electronic device 130 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer readable storage Medium
In addition to the methods and apparatus described above, embodiments of the present disclosure may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform steps in a method of adjusting a camera pitch angle according to various embodiments of the present disclosure described in the "exemplary methods" section of the present description.
The computer program product may write program code for performing the operations of embodiments of the present disclosure in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present disclosure may also be a computer-readable storage medium, having stored thereon computer program instructions, which when executed by a processor, cause the processor to perform the steps in a method of adjusting a camera pitch angle according to various embodiments of the present disclosure described in the above "exemplary method" section of the present disclosure.
The computer readable storage medium may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The basic principles of the present disclosure have been described above in connection with specific embodiments, however, it should be noted that the advantages, benefits, effects, etc. mentioned in the present disclosure are merely examples and not limiting, and these advantages, benefits, effects, etc. are not to be considered as necessarily possessed by the various embodiments of the present disclosure. Furthermore, the specific details disclosed herein are for purposes of illustration and understanding only, and are not intended to be limiting, since the disclosure is not necessarily limited to practice with the specific details described.
In this specification, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for the same or similar parts, the embodiments may be referred to one another. Since the system embodiments essentially correspond to the method embodiments, their description is relatively brief, and the relevant points can be found in the description of the method embodiments.
The block diagrams of the devices, apparatuses, equipment, and systems referred to in this disclosure are merely illustrative examples and are not intended to require or imply that connections, arrangements, or configurations must be made in the manner shown in the block diagrams. As will be appreciated by one of skill in the art, such devices, apparatuses, equipment, and systems may be connected, arranged, or configured in any manner. Words such as "including," "comprising," "having," and the like are open words meaning "including but not limited to," and are used interchangeably therewith. The term "or" as used herein refers to, and is used interchangeably with, the term "and/or" unless the context clearly indicates otherwise. The term "such as" as used herein refers to, and is used interchangeably with, the phrase "such as, but not limited to."
The methods and apparatus of the present disclosure may be implemented in a number of ways. For example, the methods and apparatus of the present disclosure may be implemented by software, hardware, firmware, or any combination of software, hardware, firmware. The above-described sequence of steps for the method is for illustration only, and the steps of the method of the present disclosure are not limited to the sequence specifically described above unless specifically stated otherwise. Furthermore, in some embodiments, the present disclosure may also be implemented as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for executing the method according to the present disclosure.
It is also noted that in the apparatus, devices and methods of the present disclosure, components or steps may be disassembled and/or assembled. Such decomposition and/or recombination should be considered equivalent to the present disclosure.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the disclosure. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, this description is not intended to limit the embodiments of the disclosure to the form disclosed herein. Although a number of example aspects and embodiments have been discussed above, a person of ordinary skill in the art will recognize certain variations, modifications, alterations, additions, and subcombinations thereof.

Claims (11)

1. A method for adjusting a pitch angle of a camera comprises the following steps:
obtaining at least one frame of image according to a camera arranged on a vehicle, and determining first vanishing point coordinates of the camera based on the at least one frame of image; wherein the image includes at least one front vehicle located in front of the vehicle;
determining a second vanishing point coordinate of the camera according to a preset pitch angle of the camera;
determining a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate and the second vanishing point coordinate;
and adjusting a preset pitch angle of the camera according to the coordinate difference value.
2. The method of claim 1, wherein the obtaining at least one frame of image from a camera disposed on a vehicle, determining first vanishing point coordinates for the camera based on the at least one frame of image, comprises:
Filtering a plurality of front vehicles included in the at least one frame of image to obtain a preset number of reference vehicles;
for each frame of image in the at least one frame of image, determining a vanishing point of the camera for each reference vehicle in the preset number of reference vehicles respectively, and obtaining at least one vanishing point coordinate;
first vanishing point coordinates of the camera are determined based on the at least one vanishing point coordinate.
3. The method of claim 2, wherein the filtering the plurality of front vehicles included in the at least one frame of images to obtain a preset number of reference vehicles includes:
filtering the plurality of front vehicles based on set conditions to obtain a preset number of reference vehicles; the set conditions including at least one of: whether the front vehicle is on the same ground plane as the vehicle, a distance between the front vehicle and the vehicle, vehicle types of the plurality of front vehicles, and whether the front vehicle shows its tail in the image.
4. The method of claim 3, wherein the filtering the plurality of front vehicles based on the set condition to obtain a preset number of reference vehicles comprises:
Determining whether the preceding vehicle is at the same ground plane as the vehicle according to whether at least one point in a drivable area included in the image is within a detection frame of the preceding vehicle;
and determining the front vehicle as a reference vehicle in response to the front vehicle being at the same ground level as the vehicle.
5. The method of claim 2, wherein the determining the vanishing point of the camera for each of the at least one reference vehicle, respectively, obtains at least one vanishing point coordinate, comprises:
determining, for each of the reference vehicles, a vehicle actual height of the reference vehicle based on a vehicle actual width, a vehicle pixel width, and a vehicle pixel height;
determining a pixel height of the camera based on an actual height of the camera, the vehicle actual height, and the vehicle pixel height;
and determining a vanishing point of the camera based on the lower right corner coordinates of the detection frame of the vehicle and the pixel height of the camera, and obtaining at least one vanishing point coordinate.
6. The method of claim 2, wherein the determining the first vanishing point coordinates of the camera based on the at least one vanishing point coordinate comprises:
Averaging at least one vanishing point coordinate corresponding to each frame of image to obtain an average coordinate;
and smoothing at least one average coordinate corresponding to the at least one frame of image to obtain a first vanishing point coordinate of the camera.
7. The method according to any one of claims 1-6, wherein said adjusting the preset pitch angle of the camera according to the coordinate difference comprises:
determining an increment value and an increment direction of a pitch angle of the camera according to the coordinate difference value;
and adjusting the pitch angle of the camera based on the increment direction and the increment value.
8. The method of claim 1, further comprising, prior to obtaining at least one frame of image from a camera disposed on a vehicle, determining first vanishing point coordinates for the camera based on the at least one frame of image:
acquiring speed information and yaw rate information of the vehicle, and determining whether the vehicle is in a straight running state;
the obtaining at least one frame of image according to a camera arranged on a vehicle, and determining first vanishing point coordinates of the camera based on the at least one frame of image includes:
in response to the vehicle being in a straight-ahead state, at least one frame of image is obtained from a camera disposed on the vehicle, and first vanishing point coordinates of the camera are determined based on the at least one frame of image.
9. An adjusting device for a camera pitch angle, comprising:
a first coordinate determining module for obtaining at least one frame of image according to a camera arranged on a vehicle, and determining a first vanishing point coordinate of the camera based on the at least one frame of image; wherein the image includes at least one front vehicle located in front of the vehicle;
the second coordinate determining module is used for determining a second vanishing point coordinate of the camera according to a preset pitch angle of the camera;
a coordinate difference determining module, configured to determine a coordinate difference between the first vanishing point coordinate and the second vanishing point coordinate based on the first vanishing point coordinate determined by the first coordinate determining module and the second vanishing point coordinate determined by the second coordinate determining module;
and the pitch angle adjusting module is used for adjusting the preset pitch angle of the camera according to the coordinate difference value determined by the coordinate difference value determining module.
10. A computer-readable storage medium storing a computer program for executing the method of adjusting a camera pitch angle according to any one of the preceding claims 1-8.
11. An electronic device, the electronic device comprising:
A processor;
a memory for storing the processor-executable instructions;
the processor is configured to execute the method for adjusting a pitch angle of a camera according to any one of claims 1 to 8.
CN201910585360.5A 2019-07-01 2019-07-01 Camera pitch angle adjusting method and device, storage medium and electronic equipment Active CN112184822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910585360.5A CN112184822B (en) 2019-07-01 2019-07-01 Camera pitch angle adjusting method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112184822A CN112184822A (en) 2021-01-05
CN112184822B true CN112184822B (en) 2024-01-30

Family

ID=73914749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910585360.5A Active CN112184822B (en) 2019-07-01 2019-07-01 Camera pitch angle adjusting method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112184822B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113160322A (en) * 2021-03-17 2021-07-23 地平线(上海)人工智能技术有限公司 Camera calibration method and device, medium and electronic device

Citations (3)

Publication number Priority date Publication date Assignee Title
JP2014059793A (en) * 2012-09-19 2014-04-03 Nissan Motor Co Ltd Camera mounting error correction device and camera mounting error correction method
CN109345593A (en) * 2018-09-04 2019-02-15 海信集团有限公司 A kind of detection method and device of video camera posture
CN109685858A (en) * 2018-12-29 2019-04-26 北京茵沃汽车科技有限公司 A kind of monocular cam online calibration method

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JP6354425B2 (en) * 2014-07-30 2018-07-11 株式会社デンソー In-vehicle camera mounting attitude detection method and apparatus
EP3174007A1 (en) * 2015-11-30 2017-05-31 Delphi Technologies, Inc. Method for calibrating the orientation of a camera mounted to a vehicle

Non-Patent Citations (1)

Title
Automatic camera calibration method for automatic vehicle speed detection in video; Chen Ke; Computer Applications (08); full text *

Also Published As

Publication number Publication date
CN112184822A (en) 2021-01-05

Similar Documents

Publication Publication Date Title
CN112818778B (en) Lane line fitting method, lane line fitting device, lane line fitting medium and electronic equipment
KR102240197B1 (en) Tracking objects in bowl-shaped imaging systems
JP4416039B2 (en) Striped pattern detection system, striped pattern detection method, and striped pattern detection program
JP6465982B2 (en) Image processing apparatus, image processing method, and program
JP6547452B2 (en) Lane deviation avoidance device
CN111985036A Floor plan frame line drawing method and device, storage medium and electronic equipment
CN112184822B (en) Camera pitch angle adjusting method and device, storage medium and electronic equipment
CN112184799A (en) Lane line space coordinate determination method and device, storage medium and electronic equipment
CN114092913A (en) Lane line determination method and apparatus, electronic device, and storage medium
CN111627066A (en) Method and device for adjusting external parameters of camera
EP4068220A1 (en) Image processing device, image processing method, moving device, and storage medium
CN109947101A (en) Path smooth processing method and processing device
CN115147683A (en) Pose estimation network model training method, pose estimation method and device
CN114170826B (en) Automatic driving control method and device, electronic device and storage medium
CN112308923A (en) Lane line-based camera pose adjusting method and device, storage medium and equipment
CN113160322A (en) Camera calibration method and device, medium and electronic device
JP2021529370A (en) How to determine the orientation of the target, smart operation control methods and devices and equipment
CN112184821B (en) Method and device for adjusting roll angle of camera, storage medium and electronic equipment
CN112406884B (en) Vehicle driving state recognition method and device, storage medium and electronic equipment
CN112132902B (en) Vehicle-mounted camera external parameter adjusting method and device, electronic equipment and medium
CN115205388A (en) Vehicle-mounted camera posture correction method and device, storage medium and electronic equipment
CN115965940A (en) Vehicle state determination method and device, electronic equipment and storage medium
WO2023093306A1 (en) Vehicle lane change control method and apparatus, electronic device, and storage medium
CN114620040A (en) Vehicle control method and device, electronic equipment and storage medium
CN111080792A (en) Model simplification processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant