CN111965383B - Vehicle speed information generation method and device, electronic equipment and computer readable medium - Google Patents

Vehicle speed information generation method and device, electronic equipment and computer readable medium

Info

Publication number
CN111965383B
Authority
CN
China
Prior art keywords
vehicle
coordinate
coordinate data
speed
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010740460.3A
Other languages
Chinese (zh)
Other versions
CN111965383A (en)
Inventor
戴震
倪凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN202010740460.3A priority Critical patent/CN111965383B/en
Publication of CN111965383A publication Critical patent/CN111965383A/en
Application granted granted Critical
Publication of CN111965383B publication Critical patent/CN111965383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00 Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64 Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/68 Devices characterised by the determination of the time taken to traverse a fixed distance using optical means, i.e. using infrared, visible, or ultraviolet light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the disclosure provide a vehicle speed information generation method. One embodiment of the method comprises: acquiring an image set shot by a vehicle-mounted monocular camera, a speed value set of a monitored vehicle, and camera parameter information of the vehicle-mounted monocular camera; determining the coordinates of the detected vehicle displayed in each image of the image set to generate coordinate data, and converting each coordinate data into alternative coordinate data in the world coordinate system; determining the distance between each alternative coordinate data and the coordinate origin to obtain a distance set; determining the distance difference between every two adjacent distances in the distance set; performing instantaneous speed processing on each distance difference to generate an instantaneous speed value; generating a speed value set of the detected vehicle; selecting speed values greater than a predetermined threshold from the speed value set as candidate speed values; and averaging the candidate speed values to obtain a mean speed value, which is taken as the vehicle speed information. The embodiment realizes the generation of vehicle speed information, enlarges the vehicle speed monitoring area, and improves road traffic safety.

Description

Vehicle speed information generation method and device, electronic equipment and computer readable medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a vehicle speed information generation method, a vehicle speed information generation device, electronic equipment and a computer readable medium.
Background
Vehicle speed information refers to the speed of a vehicle while it is travelling. The commonly used way to generate vehicle speed information is to measure the speed of the detected vehicle with a speed-detecting monitoring probe or a radar. Such methods can only detect speed information within a fixed area and cannot detect vehicle overspeed in the blind spots of the detection area.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a vehicle speed information generation method, apparatus, electronic device, and computer readable medium to solve the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a vehicle speed information generation method, including: acquiring an image set shot by a vehicle-mounted monocular camera, a speed value set of a monitored vehicle and camera parameter information of the vehicle-mounted monocular camera; determining coordinates of a detected vehicle displayed in each image in the image set to generate coordinate data and obtain a coordinate data set, wherein the coordinates of the detected vehicle are coordinates in an image coordinate system; performing coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate alternative coordinate data and obtain an alternative coordinate data set; determining the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin to obtain a distance set, wherein the coordinate origin refers to the coordinate origin of a world coordinate system, and the world coordinate system is a coordinate system established by taking the center of the rear axle of the monitored vehicle as the coordinate origin, a line parallel to the traveling direction of the monitored vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis; determining the distance difference between every two adjacent distances in the distance set to obtain a distance difference set; performing instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value and obtain an instantaneous speed value set; generating a speed value set of the detected vehicle based on the speed value set of the monitored vehicle and the instantaneous speed value set; selecting speed values greater than a predetermined threshold from the speed value set as candidate speed values to generate a candidate speed value set; and averaging the candidate speed values in the candidate speed value set to obtain a mean speed value, and taking the mean speed value as the vehicle speed information.
In a second aspect, some embodiments of the present disclosure provide a vehicle speed information generating device, the device including: an acquisition unit configured to acquire a set of images shot by a vehicle-mounted monocular camera, a set of speed values of a monitored vehicle, and camera parameter information of the vehicle-mounted monocular camera; a first generation unit configured to determine coordinates of a detected vehicle displayed in each image in the set of images to generate coordinate data, resulting in a coordinate data set, wherein the coordinates of the detected vehicle are coordinates in an image coordinate system; a coordinate conversion unit configured to perform coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate alternative coordinate data, resulting in an alternative coordinate data set; a second generation unit configured to determine the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin to obtain a distance set, wherein the coordinate origin refers to the coordinate origin of a world coordinate system, and the world coordinate system is a coordinate system established by taking the center of the rear axle of the monitored vehicle as the coordinate origin, a line parallel to the traveling direction of the monitored vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis; a third generation unit configured to determine the distance difference between every two adjacent distances in the distance set, resulting in a distance difference set; a fourth generation unit configured to perform instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value, resulting in an instantaneous speed value set; a fifth generation unit configured to generate a speed value set of the detected vehicle based on the speed value set of the monitored vehicle and the instantaneous speed value set; a sixth generation unit configured to select speed values greater than a predetermined threshold from the speed value set as candidate speed values and generate a candidate speed value set; and an average processing unit configured to average the candidate speed values in the candidate speed value set to obtain a mean speed value, the mean speed value being used as the vehicle speed information.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon which, when executed by one or more processors, cause the one or more processors to implement the method as described in the first aspect.
In a fourth aspect, some embodiments of the disclosure provide a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method as described in the first aspect.
One of the above embodiments of the present disclosure has the following beneficial effects. First, an image set shot by the vehicle-mounted monocular camera, a speed value set of the monitored vehicle, and camera parameter information of the vehicle-mounted monocular camera are obtained. Second, the coordinates of the detected vehicle displayed in each image of the image set are determined to generate coordinate data, resulting in a coordinate data set, where the coordinates of the detected vehicle are coordinates in the image coordinate system; this prepares the data for the subsequent speed calculations. Then, based on the camera parameter information of the vehicle-mounted monocular camera, coordinate conversion processing is performed on each coordinate data in the coordinate data set to generate alternative coordinate data, resulting in an alternative coordinate data set. Converting coordinate data from the image coordinate system into the world coordinate system unifies the coordinate data and facilitates the calculations in the following steps. Next, the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin is determined to obtain a distance set, where the coordinate origin refers to the origin of the world coordinate system, which is established by taking the center of the rear axle of the monitored vehicle as the origin, a line parallel to the traveling direction of the vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis. The distance difference between every two adjacent distances in the distance set is then determined to obtain a distance difference set. Determining the distance of the detected vehicle from the coordinate origin in each image, and the difference between adjacent distances, prepares the data for the instantaneous speed calculation. Each distance difference in the distance difference set is then subjected to instantaneous speed processing to generate an instantaneous speed value, resulting in an instantaneous speed value set; because the shooting interval between images is short, the obtained speed values are accurate. Next, a speed value set of the detected vehicle is obtained based on the speed value set of the monitored vehicle and the instantaneous speed value set; since the speed of the detected vehicle is measured relative to the monitored vehicle, this speed conversion is required. Then, speed values greater than a predetermined threshold are selected from the speed value set as candidate speed values to generate a candidate speed value set. Finally, the candidate speed value set is averaged to obtain a mean speed value, which is taken as the vehicle speed information of the detected vehicle. Because the monitored vehicle can travel flexibly, vehicle speed can be detected in areas that speed-detecting monitoring probes or radar speed measurement cannot cover, which enlarges the vehicle speed monitoring area.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a schematic diagram of an application scenario of a vehicle speed information generation method according to some embodiments of the present disclosure;
FIG. 2 is a flow chart of some embodiments of a vehicle speed information generation method according to the present disclosure;
FIG. 3 is a schematic structural diagram of some embodiments of a vehicle speed information generating device according to the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device of a vehicle speed information generation method according to the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a" or "an" in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will appreciate that references to "one or more" are intended to be exemplary and not limiting unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic view of an application scenario of a vehicle speed information generation method according to some embodiments of the present disclosure.
In the application scenario of fig. 1, the computing device 101 may first acquire the camera parameter information 102 of the vehicle-mounted monocular camera, the speed value set 103 of the monitored vehicle, and the image set 104 shot by the vehicle-mounted monocular camera. Then, the coordinates of the detected vehicle in each image of the image set 104 are determined to generate coordinate data, resulting in a coordinate data set 105. Next, based on the camera parameter information 102 of the vehicle-mounted monocular camera, coordinate conversion processing is performed on each coordinate data in the coordinate data set 105 to generate candidate coordinate data, resulting in a candidate coordinate data set 106. Further, the distance of each candidate coordinate data in the candidate coordinate data set 106 from the origin of coordinates is determined to obtain a distance set 107, where the origin of coordinates refers to the origin of a world coordinate system established with the center of the rear axle of the monitored vehicle as the origin, the line parallel to the traveling direction of the monitored vehicle as the horizontal axis, the line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and the line perpendicular to the ground as the vertical axis. In addition, the distance difference between every two adjacent distances in the distance set 107 is determined, resulting in a distance difference set 108. Each distance difference in the distance difference set 108 is then subjected to instantaneous speed processing to generate an instantaneous speed value, resulting in an instantaneous speed value set 109. A speed value set 110 of the detected vehicle is then obtained based on the speed value set 103 of the monitored vehicle and the instantaneous speed value set 109. Speed values greater than a predetermined threshold are selected from the speed value set 110 as candidate speed values to generate a candidate speed value set 111. Finally, the candidate speed value set 111 is averaged to obtain a mean speed value, which is used as the vehicle speed information 112.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or may be implemented as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices enumerated above. It may be implemented, for example, as multiple pieces of software and software modules used to provide distributed services, or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
With continued reference to FIG. 2, a flow chart 200 of some embodiments of a vehicle speed information generation method according to the present disclosure is shown. The method may be performed by the computing device 101 of fig. 1. The vehicle speed information generation method comprises the following steps:
step 201, acquiring an image set shot by the vehicle-mounted monocular camera, a speed value set of a monitored vehicle and camera parameter information of the vehicle-mounted monocular camera.
In some embodiments, the execution subject of the vehicle speed information generation method (for example, the computing device 101 shown in fig. 1) may obtain, through a wired or wireless connection, the set of images (for example, 104 in fig. 1), the set of speed values of the monitored vehicle (for example, 103 in fig. 1), and the camera parameter information of the vehicle-mounted monocular camera (for example, 102 in fig. 1), where each image in the image set has a one-to-one correspondence with a speed value in the speed value set of the monitored vehicle. The monocular camera parameters include an external camera parameter set and an internal camera parameter; the external camera parameter set includes a two-tuple consisting of a first camera parameter and a second camera parameter, where the first camera parameter is a rotation matrix and the second camera parameter is a translation vector.
As an example, the first camera parameter may be a particular 3 x 3 rotation matrix and the second camera parameter a particular 3 x 1 translation vector.
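For concreteness, the parameter structure described above can be held in a small container such as the following Python sketch; the class and field names are illustrative assumptions and are not part of the patent.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class MonocularCameraParameters:
    """Hypothetical container mirroring the camera parameter information above."""
    K: np.ndarray  # internal (intrinsic) camera parameters, 3 x 3 matrix
    R: np.ndarray  # first external camera parameter: rotation matrix, 3 x 3
    t: np.ndarray  # second external camera parameter: translation vector, length 3
```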
Step 202, determining coordinates of the detected vehicle displayed in each image in the image set to generate coordinate data, so as to obtain a coordinate data set.
In some embodiments, the execution subject determines the coordinates of the detected vehicle displayed in each image of the image set with a pre-trained convolutional neural network to generate coordinate data, resulting in a coordinate data set (for example, 105 in fig. 1). Specifically, the pre-trained convolutional neural network may include a feature extraction layer, a feature summarization layer, and a coordinate determination layer. The feature extraction layer identifies the detected vehicle in the image and extracts features. The feature summarization layer summarizes the extracted features. The coordinate determination layer obtains the coordinate data of the detected vehicle from the summarized features. A minimal sketch of this step follows.
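The patent does not fix a concrete network architecture or API, so the sketch below only shows the shape of step 202, with a hypothetical `detect_vehicle` callable standing in for the pre-trained convolutional neural network (its feature extraction, feature summarization, and coordinate determination layers).

```python
from typing import Callable, List, Tuple

import numpy as np


def build_coordinate_data_set(
    images: List[np.ndarray],
    detect_vehicle: Callable[[np.ndarray], Tuple[float, float]],
) -> List[Tuple[float, float]]:
    """Return one (u, v) pixel coordinate of the detected vehicle per image.

    `detect_vehicle` is an assumed stand-in for the pre-trained CNN; the
    returned coordinates are in the image coordinate system.
    """
    return [detect_vehicle(image) for image in images]
```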
Step 203, performing coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate alternative coordinate data, so as to obtain an alternative coordinate data set.
In some embodiments, the execution subject may perform coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate alternative coordinate data, resulting in an alternative coordinate data set (for example, 106 in fig. 1). The coordinate conversion transforms the coordinate data from the image coordinate system into the world coordinate system.
In some optional implementations of some embodiments, the execution subject may perform the coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate the alternative coordinate data and obtain the alternative coordinate data set as follows:
the first step is converting the coordinate data into corresponding first coordinate data in a camera coordinate system by a first coordinate conversion formula, wherein the camera coordinate system is a coordinate system established by using a focusing center of the on-board monocular camera as a coordinate origin, using an optical axis of the on-board monocular camera as a vertical axis, using a line parallel to a horizontal axis of the image coordinate system as a horizontal axis, and using a line parallel to a vertical axis of the image coordinate system as a vertical axis:
Z_C * [u, v, 1]^T = K * [X_C, Y_C, Z_C]^T,

where u represents the abscissa in the coordinate data, v represents the ordinate in the coordinate data, K represents the internal parameters of the vehicle-mounted monocular camera, X_C represents the abscissa of the coordinate data in the camera coordinate system, Y_C represents the ordinate of the coordinate data in the camera coordinate system, and Z_C represents the vertical coordinate (the depth along the optical axis) of the coordinate data in the camera coordinate system.
As an example, with the internal parameter of the vehicle-mounted monocular camera taken as 1 and Z_C taken as 2, the coordinate data can be converted by the first coordinate conversion formula into the corresponding first coordinate data in the camera coordinate system.
In the second step, the first coordinate data is converted by the following second coordinate conversion formula to obtain the alternative coordinate data:
[X_w, Y_w, Z_w, 1]^T = [[R, t], [0^T, 1]] * [X_C, Y_C, Z_C, 1]^T,

where R represents the first external parameter (the rotation matrix), t represents the second external parameter (the translation vector), 0^T represents the transpose of the zero vector, X_w represents the abscissa in the alternative coordinate data, Y_w represents the ordinate in the alternative coordinate data, Z_w represents the vertical coordinate in the alternative coordinate data, X_C represents the abscissa of the first coordinate data, Y_C represents the ordinate of the first coordinate data, and Z_C represents the vertical coordinate of the first coordinate data.
As an example, with a particular rotation matrix as the first external parameter, a particular translation vector as the second external parameter, Z_w taken as 1 and 0^T taken as [0 0 0], the alternative coordinate data can be obtained from the first coordinate data by the second coordinate conversion formula.
The coordinates of the detected vehicle are converted from the image coordinate system into the world coordinate system through the first coordinate conversion formula and the second coordinate conversion formula, which unifies the coordinates and facilitates the calculations in the following steps. A code sketch of both conversions follows.
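Below is a minimal numpy sketch of the two conversion formulas above, assuming K is the 3 x 3 internal parameter matrix, R and t the external rotation and translation, and z_c the depth of the point along the optical axis; the function names are illustrative only.

```python
import numpy as np


def image_to_camera(u: float, v: float, K: np.ndarray, z_c: float) -> np.ndarray:
    """First conversion: Z_C * [u, v, 1]^T = K * [X_C, Y_C, Z_C]^T."""
    uv1 = np.array([u, v, 1.0])
    return z_c * np.linalg.solve(K, uv1)  # [X_C, Y_C, Z_C]


def camera_to_world(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Second conversion: [X_w, Y_w, Z_w, 1]^T = [[R, t], [0^T, 1]] * [X_C, Y_C, Z_C, 1]^T."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    p_world = T @ np.append(p_cam, 1.0)
    return p_world[:3]  # alternative (world) coordinate data
```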
Step 204, determining the distance between each candidate coordinate data in the candidate coordinate data set and the coordinate origin to obtain a distance set.
In some embodiments, the execution subject may determine the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin, resulting in a distance set (e.g., 107 in fig. 1). The world coordinate system is a coordinate system established by taking the center of the rear axle of the monitored vehicle as the coordinate origin, a line parallel to the traveling direction of the monitored vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis.
In some optional implementations of some embodiments, the execution subject determining the distance between each candidate coordinate data in the candidate coordinate data set and the coordinate origin to obtain the distance set may include:
firstly, determining the distance of the candidate coordinate data from the coordinate origin by the following distance formula:
D = sqrt((X - X_0)^2 + (Y - Y_0)^2 + (Z - Z_0)^2),

where X represents the abscissa in the candidate coordinate data, Y represents the ordinate in the candidate coordinate data, Z represents the vertical coordinate in the candidate coordinate data, X_0 represents the abscissa of the coordinate origin, Y_0 represents the ordinate of the coordinate origin, Z_0 represents the vertical coordinate of the coordinate origin, and D represents the distance in kilometers.
As an example, the candidate coordinate data may be [0.06, 0.18, 0.21], and the distance computed by the above formula is sqrt(0.06^2 + 0.18^2 + 0.21^2), which is approximately 0.28 km.
Step 205, determining the distance difference between every two adjacent distances in the distance set to obtain a distance difference set.
In some embodiments, the execution subject may determine the distance difference between every two adjacent distances in the distance set, resulting in a distance difference set (e.g., 108 in fig. 1), where each distance difference is the difference between two adjacent distances in the distance set. A combined sketch of steps 204 and 205 follows.
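A minimal sketch of steps 204 and 205, assuming the alternative coordinates and the origin are expressed in kilometers in the world coordinate system; the function names are illustrative.

```python
from typing import List, Sequence, Tuple

import numpy as np


def distances_from_origin(
    world_points: Sequence[Tuple[float, float, float]],
    origin: Tuple[float, float, float] = (0.0, 0.0, 0.0),
) -> List[float]:
    """Step 204: D = sqrt((X - X0)^2 + (Y - Y0)^2 + (Z - Z0)^2) for each point."""
    o = np.asarray(origin)
    return [float(np.linalg.norm(np.asarray(p) - o)) for p in world_points]


def adjacent_distance_differences(distances: List[float]) -> List[float]:
    """Step 205: difference between every two adjacent distances."""
    return [later - earlier for earlier, later in zip(distances, distances[1:])]


# Example from the text: [0.06, 0.18, 0.21] is about 0.28 km from the origin.
print(round(distances_from_origin([(0.06, 0.18, 0.21)])[0], 2))  # 0.28
```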
Step 206, performing an instantaneous speed process on each distance difference in the distance difference set to generate an instantaneous speed value, so as to obtain an instantaneous speed value set.
In some embodiments, the execution subject may perform instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value, resulting in an instantaneous speed value set (e.g., 109 in fig. 1). The instantaneous speed processing means processing the distance difference by the following speed formula to generate an instantaneous speed value:

V = S / T,

where V represents the instantaneous speed value, S represents the distance difference, and T represents the time between two adjacent images (the camera shutter interval).
In some optional implementations of some embodiments, the execution subject performing the instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value may include:
First, each distance difference in the distance difference set is subjected to instantaneous speed processing to generate an instantaneous speed value by the following instantaneous speed calculation formula:

V = (S / t) * (w + (k_1 + k_2) / 2) / w,

where S represents the distance difference in kilometers, t represents a time threshold of 0.0000278 in hours, k_1 represents the first distance corresponding to the distance difference, k_2 represents the second distance corresponding to the distance difference, w represents a preset distance threshold, and V represents the instantaneous speed value.
As an example, the distance difference may be 0.003 km, the first distance corresponding to the distance difference may be 0.2 km, the second distance corresponding to the distance difference may be 0.203 km, and the preset distance threshold may be 0.7 km, which yields an instantaneous speed value of about 138.98 km/h for the distance difference of 0.003 km.
Distortion in the pictures taken by the camera causes the distance obtained from a picture to deviate from the actual distance. Introducing the factor (w + (k_1 + k_2) / 2) / w as an expansion coefficient enlarges the speed value and reduces the error, caused by picture distortion and similar problems, between the distance obtained from the picture and the actual distance, where w represents the preset distance threshold, k_1 represents the first distance corresponding to the distance difference, and k_2 represents the second distance corresponding to the distance difference, so that the calculated instantaneous speed value is closer to the actual speed value. In addition, current camera shutter speeds are high, that is, many pictures can be taken per second and the time interval between every two pictures is small, so the speed obtained by the speed formula can be used as the instantaneous speed.
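A minimal sketch of the instantaneous speed calculation, using the formula as reconstructed above (an assumption where the original figure is not legible); the worked example from the text is reproduced as a check.

```python
def instantaneous_speed(s_km: float, k1_km: float, k2_km: float,
                        w_km: float, t_hours: float = 0.0000278) -> float:
    """V = (S / t) * (w + (k1 + k2) / 2) / w.

    The factor (w + (k1 + k2) / 2) / w is the expansion coefficient that
    compensates for distance error caused by picture distortion.
    """
    return (s_km / t_hours) * (w_km + (k1_km + k2_km) / 2.0) / w_km


# Worked example from the text: S = 0.003 km, k1 = 0.2 km, k2 = 0.203 km and
# w = 0.7 km give roughly 138.98 km/h.
print(round(instantaneous_speed(0.003, 0.2, 0.203, 0.7), 2))  # 138.98
```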
Step 207, generating a speed value set of the detected vehicle based on the speed value set of the monitored vehicle and the instantaneous speed value set.
In some embodiments, the execution subject may generate the set of speed values (110 shown in fig. 1) of the detected vehicle based on the set of speed values of the monitored vehicle and the set of instantaneous speed values. Since each of the instantaneous speed values is determined relative to the corresponding monitored vehicle speed value in the set of monitored vehicle speed values, it is necessary to process each of the instantaneous speed values in the set of instantaneous speed values such that each of the obtained speed values in the set of speed values is a speed relative to the ground.
In some optional implementations of some embodiments, the executing subject may generate the set of speed values of the detected vehicle based on the set of speed values of the monitoring vehicle and the set of instantaneous speed values, and may perform the following steps:
In the first step, each speed value of the monitored vehicle in the speed value set of the monitored vehicle, together with the corresponding instantaneous speed value in the instantaneous speed value set, is substituted into the following speed calculation formula to obtain the speed value set of the detected vehicle:
FV=V 0 +V。
wherein, V 0 Representing the speed value of the monitored vehicle. V represents an instantaneous speed value corresponding to the speed value of the monitored vehicle in the instantaneous speed value set. FV represents the velocity value of the detected vehicle.
As an example, the monitored vehicle speed may be "35 km/h", the instantaneous speed value corresponding to the monitored vehicle in the above-mentioned instantaneous speed value set may be "120 km/h", and the speed value of the detected vehicle is "155 km/h".
Step 208, selecting the speed value greater than the predetermined threshold from the speed value set as a candidate speed value, and generating a candidate speed value set.
In some embodiments, the execution subject may select a speed value greater than a predetermined threshold from the speed value set as a candidate speed value, and generate a candidate speed value set (111 shown in fig. 1). The preset threshold value refers to a speed limit value of a current running road section of the detected vehicle.
Step 209, performing an averaging process on the candidate speed values in the candidate speed value set to obtain a mean speed value, and using the mean speed value as the vehicle speed information.
In some embodiments, the execution subject may average the candidate speed values in the candidate speed value set to obtain a mean speed value and use the mean speed value as the vehicle speed information (e.g., 112 in fig. 1). Averaging the candidate speed values in the candidate speed value set yields the specific speed information of the detected vehicle. A combined sketch of steps 207 to 209 follows.
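A minimal sketch of steps 207 to 209, assuming speed values are in km/h and the predetermined threshold is the speed limit of the road section; returning None when no speed exceeds the threshold is an assumption, since the text does not state what happens in that case.

```python
from typing import List, Optional


def vehicle_speed_information(
    monitored_speeds: List[float],
    instantaneous_speeds: List[float],
    speed_limit: float,
) -> Optional[float]:
    # Step 207: FV = V_0 + V, converting each relative speed to a speed over ground.
    detected_speeds = [v0 + v for v0, v in zip(monitored_speeds, instantaneous_speeds)]
    # Step 208: keep only speed values greater than the predetermined threshold.
    candidates = [fv for fv in detected_speeds if fv > speed_limit]
    # Step 209: the mean of the candidate speed values is the vehicle speed information.
    return sum(candidates) / len(candidates) if candidates else None
```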
In some optional implementations of some embodiments, the execution subject may send the vehicle speed information to a display terminal of the monitoring vehicle so as to display the speed data of the detected vehicle, and may send the vehicle speed information to a storage unit that stores the speed data of the detected vehicle.
One of the above embodiments of the present disclosure has the following advantageous effects. First, an image set shot by the vehicle-mounted monocular camera, a speed value set of the monitored vehicle, and camera parameter information of the vehicle-mounted monocular camera are obtained. Second, the coordinates of the detected vehicle displayed in each image of the image set are determined to generate coordinate data, resulting in a coordinate data set, where the coordinates of the detected vehicle are coordinates in the image coordinate system; this prepares the data for the subsequent speed calculations. Then, based on the camera parameter information of the vehicle-mounted monocular camera, coordinate conversion processing is performed on each coordinate data in the coordinate data set to generate alternative coordinate data, resulting in an alternative coordinate data set. Converting coordinate data from the image coordinate system into the world coordinate system unifies the coordinate data and facilitates the calculations in the following steps. Next, the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin is determined to obtain a distance set, where the coordinate origin refers to the origin of the world coordinate system, which is established by taking the center of the rear axle of the monitored vehicle as the origin, a line parallel to the traveling direction of the vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis. The distance difference between every two adjacent distances in the distance set is then determined to obtain a distance difference set. Determining the distance of the detected vehicle from the coordinate origin in each image, and the difference between adjacent distances, prepares the data for the instantaneous speed calculation. Each distance difference in the distance difference set is then subjected to instantaneous speed processing to generate an instantaneous speed value, resulting in an instantaneous speed value set; because the shooting interval between images is short, the obtained speed values are accurate. Next, a speed value set of the detected vehicle is obtained based on the speed value set of the monitored vehicle and the instantaneous speed value set; since the speed of the detected vehicle is measured relative to the monitored vehicle, this speed conversion is required. Then, speed values greater than a predetermined threshold are selected from the speed value set as candidate speed values to generate a candidate speed value set. Finally, the candidate speed value set is averaged to obtain a mean speed value, which is taken as the vehicle speed information of the detected vehicle. Because the monitored vehicle can travel flexibly, vehicle speed can be detected in areas that speed-detecting monitoring probes or radar speed measurement cannot cover, which enlarges the vehicle speed monitoring area.
With further reference to fig. 3, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a vehicle speed information generation apparatus. These apparatus embodiments correspond to the method embodiments described above with reference to fig. 2, and the apparatus may be applied to various electronic devices.
As shown in fig. 3, the vehicle speed information generation device 300 of some embodiments includes: an acquisition unit 301, a first generation unit 302, a coordinate conversion unit 303, a second generation unit 304, a third generation unit 305, a fourth generation unit 306, a fifth generation unit 307, a sixth generation unit 308, and an average processing unit 309. The acquisition unit 301 is configured to acquire the set of images shot by the vehicle-mounted monocular camera, the set of speed values of the monitored vehicle, and the camera parameter information of the vehicle-mounted monocular camera; the first generation unit 302 is configured to determine the coordinates of the detected vehicle displayed in each image in the image set to generate coordinate data, resulting in a coordinate data set, where the coordinates of the detected vehicle are coordinates in an image coordinate system; the coordinate conversion unit 303 is configured to perform coordinate conversion processing on each coordinate data in the coordinate data set, based on the camera parameter information of the vehicle-mounted monocular camera, to generate alternative coordinate data, resulting in an alternative coordinate data set; the second generation unit 304 is configured to determine the distance between each alternative coordinate data in the alternative coordinate data set and the coordinate origin to obtain a distance set, where the coordinate origin refers to the coordinate origin of a world coordinate system, and the world coordinate system is a coordinate system established by taking the center of the rear axle of the monitored vehicle as the coordinate origin, a line parallel to the traveling direction of the monitored vehicle as the horizontal axis, a line parallel to the rear axle of the monitored vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis; the third generation unit 305 is configured to determine the distance difference between every two adjacent distances in the distance set, resulting in a distance difference set; the fourth generation unit 306 is configured to perform instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value, resulting in an instantaneous speed value set; the fifth generation unit 307 is configured to generate a speed value set of the detected vehicle based on the speed value set of the monitored vehicle and the instantaneous speed value set; the sixth generation unit 308 is configured to select speed values greater than a predetermined threshold from the speed value set as candidate speed values and generate a candidate speed value set; and the average processing unit 309 is configured to average the candidate speed values in the candidate speed value set to obtain a mean speed value, the mean speed value being used as the vehicle speed information.
It will be understood that the units described in the apparatus 300 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 300 and the units included therein, and are not described herein again.
Referring now to FIG. 4, a block diagram of an electronic device 400 (e.g., the computing device 101 of FIG. 1) suitable for implementing some embodiments of the present disclosure is shown. The server shown in fig. 4 is only an example and should not impose any limitation on the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 4, the electronic device 400 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 401 that may perform various appropriate actions and processes in accordance with a program stored in a read only memory (ROM) 402 or a program loaded from a storage device 408 into a random access memory (RAM) 403. Various programs and data necessary for the operation of the electronic device 400 are also stored in the RAM 403. The processing device 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a liquid crystal display (LCD), a speaker, a vibrator, and the like; a storage device 408 including, for example, a magnetic tape, a hard disk, etc.; and a communication device 409. The communication device 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing device 401, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the apparatus; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring an image set shot by a vehicle-mounted monocular camera, a speed value set of a monitored vehicle and camera parameter information of the vehicle-mounted monocular camera; determining the coordinates of the detected vehicle displayed in each image in the image set to generate coordinate data to obtain a coordinate data set, wherein the coordinates of the detected vehicle are the coordinates in an image coordinate system; based on the camera parameter information of the vehicle-mounted monocular camera, performing coordinate conversion processing on each coordinate data in the coordinate data set to generate alternative coordinate data to obtain an alternative coordinate data set; determining the distance between each alternative coordinate data in the alternative coordinate data set and a coordinate origin to obtain a distance set, wherein the coordinate origin refers to the coordinate origin of a world coordinate system, the world coordinate system is a coordinate system established by taking the center of a rear shaft of the monitored vehicle as the coordinate origin, taking a line parallel to the traveling direction of the monitored vehicle as a horizontal axis, taking a line parallel to the rear shaft of the monitored vehicle as a longitudinal axis and taking a line vertical to the ground as a vertical axis; determining a distance difference value between every two adjacent distances in the distance set to obtain a distance difference value set; performing instantaneous speed processing on each distance difference value in the distance difference value set to generate an instantaneous speed value to obtain an instantaneous speed value set; generating a set of speed values of the detected vehicle based on the set of speed values of the monitored vehicle and the set of instantaneous speed values; selecting a speed value larger than a preset threshold value from the speed value set as a candidate speed value, and generating a candidate speed value set; and carrying out average processing on the candidate speed values in the candidate speed value set to obtain an average speed value, and taking the average speed value as the vehicle speed information.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor including an acquisition unit, a first generation unit, a coordinate conversion unit, a second generation unit, a third generation unit, a fourth generation unit, a fifth generation unit, a sixth generation unit, and an average processing unit. The names of these units do not in some cases constitute a limitation on the units themselves; for example, the acquisition unit may also be described as "a unit that acquires a set of images shot by the vehicle-mounted monocular camera, a set of speed values of the monitoring vehicle, and camera parameter information of the vehicle-mounted monocular camera".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description covers only preferred embodiments of the present disclosure and the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure.

Claims (8)

1. A vehicle speed information generation method, comprising:
acquiring an image set shot by a vehicle-mounted monocular camera, a speed value set of a monitored vehicle and camera parameter information of the vehicle-mounted monocular camera;
determining coordinates of a detected vehicle displayed in each image in the image set to generate coordinate data, and obtaining a coordinate data set, wherein the coordinates of the detected vehicle are coordinates in an image coordinate system;
based on the camera parameter information of the vehicle-mounted monocular camera, performing coordinate conversion processing on each coordinate data in the coordinate data set to generate alternative coordinate data to obtain an alternative coordinate data set;
determining the distance between each candidate coordinate data in the candidate coordinate data set and a coordinate origin to obtain a distance set, wherein the coordinate origin is the origin of a world coordinate system established by taking the center of the rear axle of the monitoring vehicle as the coordinate origin, a line parallel to the traveling direction of the monitoring vehicle as the transverse axis, a line parallel to the rear axle of the monitoring vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis;
determining a distance difference value between every two adjacent distances in the distance set to obtain a distance difference value set;
performing instantaneous speed processing on each distance difference value in the distance difference value set to generate an instantaneous speed value to obtain an instantaneous speed value set;
generating a set of speed values for the detected vehicle based on the set of speed values for the monitored vehicle and the set of instantaneous speed values;
selecting speed values larger than a preset threshold value from the speed value set as candidate speed values to generate a candidate speed value set;
performing mean processing on the candidate speed values in the candidate speed value set to obtain a mean speed value, and using the mean speed value as the vehicle speed information, wherein the determining coordinates of the detected vehicle displayed in each image in the image set to generate coordinate data comprises:
determining, by a pre-trained convolutional neural network, coordinates of the detected vehicle displayed in the image to generate coordinate data, wherein the pre-trained convolutional neural network comprises a feature extraction layer, a feature summarizing layer and a coordinate determination layer; the feature extraction layer is used for identifying the detected vehicle in an image and extracting features, the feature summarizing layer is used for summarizing the extracted features, and the coordinate determination layer is used for obtaining the coordinate data of the detected vehicle from the summarized features; and wherein the performing instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value comprises:
obtaining an instantaneous speed value corresponding to the distance difference through the following instantaneous speed calculation formula:
[instantaneous speed calculation formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the distance difference, in kilometers; a time threshold with a value of 0.0000278, in hours; a first distance corresponding to the distance difference; a second distance corresponding to the distance difference; a preset distance threshold; and the instantaneous speed value.
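As an informal illustration of the claim 1 pipeline (not the claimed formulas themselves), the following Python sketch assumes the world coordinates of the detected vehicle have already been generated for each image, combines the monitoring vehicle's speed with the relative instantaneous speed by simple addition (an assumption, since the claimed combination formula is shown only as an image), and uses an arbitrary value for the preset speed threshold:

import numpy as np

def vehicle_speed_info(world_coords_km, ego_speeds_kmh, dt_h=0.0000278, speed_threshold_kmh=1.0):
    # claim 1 sketch: distances -> adjacent differences -> instantaneous speeds
    # -> combination with the monitoring vehicle's speeds -> threshold filter -> mean
    coords = np.asarray(world_coords_km, dtype=float)           # (N, 3) candidate coordinates, km
    distances = np.linalg.norm(coords, axis=1)                  # distance of each detection from the origin
    diffs = np.diff(distances)                                  # difference of every two adjacent distances
    instantaneous = diffs / dt_h                                # relative speed per frame interval, km/h
    detected = np.asarray(ego_speeds_kmh[1:], dtype=float) + instantaneous  # assumed additive combination
    candidates = detected[detected > speed_threshold_kmh]       # keep values above the preset threshold
    return float(candidates.mean()) if candidates.size else None

# toy usage: a vehicle pulling away from the monitoring vehicle at roughly constant speed
coords = [(0.010 + 0.0005 * i, 0.0, 0.0) for i in range(10)]    # positions along the travel direction, km
ego_speeds = [30.0] * 10                                        # monitoring-vehicle speed values, km/h
print(vehicle_speed_info(coords, ego_speeds))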
2. The method of claim 1, wherein the method further comprises:
sending the vehicle speed information to a display terminal of the monitoring vehicle to display the vehicle speed of the detected vehicle;
and sending the vehicle speed information to a storage unit for storage, the storage unit being used for storing the vehicle speed data of the detected vehicle.
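For claim 2, a minimal sketch of fanning the resulting vehicle speed information out to a display channel and a storage location might look as follows; the queue standing in for the display terminal and the log-file path are invented for the example:

import json
import queue

display_queue = queue.Queue()   # stands in for the monitoring vehicle's display terminal

def publish_speed_info(speed_kmh, store_path="detected_speed_log.jsonl"):
    record = {"detected_vehicle_speed_kmh": speed_kmh}
    display_queue.put(record)                          # send to the display terminal
    with open(store_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")             # send to the storage unit

publish_speed_info(47.9)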
3. The method of claim 1, wherein the camera parameter information includes camera extrinsic parameter information and camera intrinsic parameters, the camera extrinsic parameter information including a tuple consisting of a first extrinsic parameter and a second extrinsic parameter; and
the performing coordinate conversion processing on each coordinate data in the coordinate data set based on the camera parameter information of the vehicle-mounted monocular camera to generate candidate coordinate data comprises:
converting the coordinate data into corresponding first coordinate data in a camera coordinate system by a first coordinate conversion formula, wherein the camera coordinate system is a coordinate system established by taking the focusing center of the vehicle-mounted monocular camera as the coordinate origin, the optical axis of the vehicle-mounted monocular camera as the vertical axis, a line parallel to the horizontal axis of the image coordinate system as the horizontal axis, and a line parallel to the longitudinal axis of the image coordinate system as the longitudinal axis:
[first coordinate conversion formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the abscissa in the coordinate data; the ordinate in the coordinate data; the camera intrinsic parameters; the abscissa of the first coordinate data; the ordinate of the first coordinate data; and the vertical coordinate of the first coordinate data;
converting the first coordinate data by a second coordinate conversion formula to obtain the candidate coordinate data:
[second coordinate conversion formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the first extrinsic parameter; the second extrinsic parameter; the transpose of a zero matrix; the abscissa in the candidate coordinate data; the ordinate in the candidate coordinate data; the vertical coordinate in the candidate coordinate data; the abscissa of the first coordinate data; the ordinate of the first coordinate data; and the vertical coordinate of the first coordinate data.
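Because the two conversion formulas of claim 3 appear only as images, the sketch below follows the conventional pinhole-camera reading of the surrounding text: back-projection of the image coordinates through the intrinsic matrix, followed by a rigid transform built from the extrinsic rotation and translation. The intrinsic matrix, depth value and extrinsic parameters are illustrative placeholders, and the code should not be read as the exact claimed formulas:

import numpy as np

K = np.array([[1000.0, 0.0, 640.0],      # illustrative camera intrinsic matrix
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # first extrinsic parameter (rotation), illustrative
T = np.array([1.5, 0.0, 1.2])            # second extrinsic parameter (translation), illustrative

def image_to_camera(u, v, depth):
    # first conversion: image coordinates -> first coordinate data (camera coordinate system)
    pixel = np.array([u, v, 1.0])
    return depth * np.linalg.inv(K) @ pixel

def camera_to_world(p_cam):
    # second conversion: first coordinate data -> candidate coordinate data (world coordinate system),
    # via the homogeneous transform [[R, T], [0^T, 1]]
    M = np.block([[R, T.reshape(3, 1)], [np.zeros((1, 3)), np.ones((1, 1))]])
    p_h = np.append(p_cam, 1.0)
    return (M @ p_h)[:3]

p_cam = image_to_camera(700.0, 400.0, depth=20.0)
print(camera_to_world(p_cam))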
4. The method of claim 3, wherein the determining a distance of each candidate coordinate data in the candidate coordinate data set from a coordinate origin comprises:
determining a distance of the candidate coordinate data from a coordinate origin by a distance formula:
[distance calculation formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the abscissa in the candidate coordinate data; the ordinate in the candidate coordinate data; the vertical coordinate in the candidate coordinate data; the abscissa of the coordinate origin; the ordinate of the coordinate origin; the vertical coordinate of the coordinate origin; and the distance, in kilometers.
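The distance formula of claim 4 is likewise shown only as an image; given the listed quantities, it is naturally read as a Euclidean distance between the candidate coordinate data and the coordinate origin, which the following snippet illustrates (the unit handling is an assumption):

import math

def distance_km(candidate, origin=(0.0, 0.0, 0.0)):
    # Euclidean distance between the candidate coordinate data and the coordinate origin;
    # assumes the coordinates are already expressed in kilometres
    return math.sqrt(sum((c - o) ** 2 for c, o in zip(candidate, origin)))

print(distance_km((0.012, 0.003, 0.0)))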
5. The method of claim 4, wherein said generating a set of speed values for the detected vehicle based on the set of speed values for the monitoring vehicle and the set of instantaneous speed values comprises:
inputting each speed value in the set of speed values of the monitoring vehicle, together with the corresponding instantaneous speed value in the set of instantaneous speed values, into the following speed calculation formula to generate a speed value of the detected vehicle, obtaining the set of speed values of the detected vehicle:
[speed calculation formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the speed value of the monitoring vehicle; the instantaneous speed value, in the set of instantaneous speed values, corresponding to that speed value of the monitoring vehicle; and the speed value of the detected vehicle.
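The claim 5 formula is also shown only as an image; since the instantaneous speed values are derived from distances measured relative to the monitoring vehicle, one natural reading is a simple sum of the monitoring vehicle's speed value and the corresponding instantaneous (relative) speed value. The sketch below encodes that assumption only:

def detected_speed_kmh(monitoring_speed_kmh, instantaneous_kmh):
    # assumed combination: detected-vehicle speed = monitoring-vehicle speed + relative speed
    return monitoring_speed_kmh + instantaneous_kmh

ego_speeds = [30.0, 30.5, 31.0]
relative_speeds = [18.0, 17.5, 18.2]
print([detected_speed_kmh(v, r) for v, r in zip(ego_speeds, relative_speeds)])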
6. A vehicle speed information generation device comprising:
an acquisition unit configured to acquire a set of images shot by a vehicle-mounted monocular camera, a set of speed values of a monitoring vehicle, and camera parameter information of the vehicle-mounted monocular camera;
a first generation unit configured to determine coordinates of a detected vehicle displayed in each image of the set of images to generate coordinate data, resulting in a coordinate data set, wherein the coordinates of the detected vehicle are coordinates in an image coordinate system;
a coordinate conversion unit configured to perform coordinate conversion processing on each coordinate data in the coordinate data set based on the camera parameter information of the vehicle-mounted monocular camera to generate candidate coordinate data, obtaining a candidate coordinate data set;
a second generating unit configured to determine a distance from a coordinate origin to each candidate coordinate data in the candidate coordinate data set, obtaining a distance set, wherein the coordinate origin is the origin of a world coordinate system established by taking the center of the rear axle of the monitoring vehicle as the coordinate origin, a line parallel to the traveling direction of the monitoring vehicle as the transverse axis, a line parallel to the rear axle of the monitoring vehicle as the longitudinal axis, and a line perpendicular to the ground as the vertical axis;
a third generating unit configured to determine a distance difference between every two adjacent distances in the distance set, resulting in a distance difference set;
a fourth generating unit configured to perform instantaneous speed processing on each distance difference value in the distance difference value set to generate an instantaneous speed value, resulting in an instantaneous speed value set;
a fifth generating unit configured to generate a set of speed values of the detected vehicle based on the set of speed values of the monitoring vehicle and the set of instantaneous speed values;
a sixth generating unit configured to select speed values greater than a predetermined threshold value from the speed value set as candidate speed values, generating a candidate speed value set;
a mean processing unit configured to perform mean processing on candidate speed values in the candidate speed value set to obtain a mean speed value, and use the mean speed value as vehicle speed information, wherein the determining coordinates of the detected vehicle displayed in each image in the image set to generate coordinate data includes:
determining, by a pre-trained convolutional neural network, coordinates of the detected vehicle displayed in the image to generate coordinate data, wherein the pre-trained convolutional neural network comprises a feature extraction layer, a feature summarizing layer and a coordinate determination layer; the feature extraction layer is used for identifying the detected vehicle in an image and extracting features, the feature summarizing layer is used for summarizing the extracted features, and the coordinate determination layer is used for obtaining the coordinate data of the detected vehicle from the summarized features; and wherein the performing instantaneous speed processing on each distance difference in the distance difference set to generate an instantaneous speed value comprises:
obtaining an instantaneous speed value corresponding to the distance difference through the following instantaneous speed calculation formula:
[instantaneous speed calculation formula, presented as an image in the original publication]
wherein the quantities appearing in the formula are: the distance difference, in kilometers; a time threshold with a value of 0.0000278, in hours; a first distance corresponding to the distance difference; a second distance corresponding to the distance difference; a preset distance threshold; and the instantaneous speed value.
7. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
8. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN202010740460.3A 2020-07-28 2020-07-28 Vehicle speed information generation method and device, electronic equipment and computer readable medium Active CN111965383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010740460.3A CN111965383B (en) 2020-07-28 2020-07-28 Vehicle speed information generation method and device, electronic equipment and computer readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010740460.3A CN111965383B (en) 2020-07-28 2020-07-28 Vehicle speed information generation method and device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN111965383A CN111965383A (en) 2020-11-20
CN111965383B true CN111965383B (en) 2022-07-29

Family

ID=73363953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010740460.3A Active CN111965383B (en) 2020-07-28 2020-07-28 Vehicle speed information generation method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN111965383B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112232326A (en) * 2020-12-15 2021-01-15 北京每日优鲜电子商务有限公司 Driving information generation method and device, electronic equipment and computer readable medium
CN113484530A (en) * 2021-05-26 2021-10-08 深圳市二郎神视觉科技有限公司 Vehicle speed detection method, system and computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108121941A (en) * 2016-11-30 2018-06-05 上海联合道路交通安全科学研究中心 A kind of object speed calculation method based on monitoring device
US20190347808A1 (en) * 2018-05-09 2019-11-14 Ford Global Technologies, Llc Monocular Visual Odometry: Speed And Yaw Rate Of Vehicle From Rear-View Camera
CN109917359B (en) * 2019-03-19 2022-10-14 福州大学 Robust vehicle distance estimation method based on vehicle-mounted monocular vision
CN110780287A (en) * 2019-11-08 2020-02-11 芜湖酷哇机器人产业技术研究院有限公司 Distance measurement method and distance measurement system based on monocular camera

Also Published As

Publication number Publication date
CN111965383A (en) 2020-11-20

Similar Documents

Publication Publication Date Title
CN112598762B (en) Three-dimensional lane line information generation method, device, electronic device, and medium
CN112348029B (en) Local map adjusting method, device, equipment and computer readable medium
CN112598731B (en) Vehicle positioning method and device, electronic equipment and computer readable medium
CN112328731B (en) Vehicle lane level positioning method and device, electronic equipment and computer readable medium
CN113607185B (en) Lane line information display method, lane line information display device, electronic device, and computer-readable medium
CN111965383B (en) Vehicle speed information generation method and device, electronic equipment and computer readable medium
CN114399589B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN113674357B (en) Camera external reference calibration method and device, electronic equipment and computer readable medium
CN114399588B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN115326099B (en) Local path planning method and device, electronic equipment and computer readable medium
CN113327318B (en) Image display method, image display device, electronic equipment and computer readable medium
CN114993328B (en) Vehicle positioning evaluation method, device, equipment and computer readable medium
CN113190613A (en) Vehicle route information display method and device, electronic equipment and readable medium
CN114445597B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN112232326A (en) Driving information generation method and device, electronic equipment and computer readable medium
CN113269168A (en) Obstacle data processing method and device, electronic equipment and computer readable medium
CN113780247B (en) Traffic light detection method and device, electronic equipment and computer readable medium
CN113808134B (en) Oil tank layout information generation method, oil tank layout information generation device, electronic apparatus, and medium
CN111950238B (en) Automatic driving fault scoring table generation method and device and electronic equipment
CN112597788B (en) Target measuring method, target measuring device, electronic apparatus, and computer-readable medium
CN112528970A (en) Guideboard detection method, device, equipment and computer readable medium
CN112815959B (en) Vehicle lane level positioning system, method and device and electronic equipment
CN114399555B (en) Data online calibration method and device, electronic equipment and computer readable medium
CN114863025B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN114842448B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Vehicle speed information generation method, device, electronic device and computer readable medium

Effective date of registration: 20230228

Granted publication date: 20220729

Pledgee: Bank of Shanghai Co.,Ltd. Beijing Branch

Pledgor: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.

Registration number: Y2023980033668

PE01 Entry into force of the registration of the contract for pledge of patent right
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100022 301, block B, halibut Pioneer Park, shunbai Road, Chaoyang District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.

CP03 Change of name, title or address