CN114333390B - Method, device and system for detecting shared vehicle parking event - Google Patents
- Publication number: CN114333390B (application CN202111649234.5A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Traffic Control Systems (AREA)
Abstract
The disclosure provides a method, a device and a system for detecting a shared-vehicle parking event, relating to intelligent parking in the technical fields of artificial intelligence and image processing. The method comprises: obtaining a parking image captured when a shared vehicle is parked, the parking image containing the shared vehicle and a parking line used to guide its parking; identifying key points of the shared vehicle in the parking image; and determining, from the key points and the parking line, a detection result indicating whether the shared vehicle is a compliantly parked vehicle. This avoids the drawbacks of the related art, in which extra equipment (such as sensors) mounted on the shared vehicle raises cost and yields lower accuracy owing to environmental influences and the equipment's own parameters; cost is thereby saved, and the accuracy and reliability of detection are improved.
Description
Technical Field
The disclosure relates to intelligent parking in the technical field of artificial intelligence and image processing, in particular to a method, a device and a system for detecting a shared vehicle parking event.
Background
Shared vehicles (such as shared automobiles, shared bicycles and shared electric vehicles) make travel more convenient, but their irregular parking creates problems for city and road management. Detecting shared-vehicle parking events is therefore important for providing guidance for the correct parking of shared vehicles.
In the prior art, sensors are typically mounted on the shared vehicle so that its parking events can be detected through them. For example, a global positioning system (GPS) receiver on the shared vehicle provides the vehicle's position information, and from that position information it is determined whether the shared vehicle is a compliantly or non-compliantly parked vehicle.
However, this approach requires fitting a sensor to each shared vehicle, and deviations in the information the sensor collects lead to the technical problem of low detection accuracy.
Disclosure of Invention
The disclosure provides a method, a device and a system for detecting a shared vehicle parking event, which are used for improving detection accuracy.
According to a first aspect of the present disclosure, there is provided a method of detecting a shared vehicle parking event, comprising:
Obtaining a parking image when a shared vehicle is parked, wherein the parking image comprises the shared vehicle and a parking line for guiding the shared vehicle to park;
identifying key points of the shared vehicle in the parking image;
and determining a detection result according to the key points and the parking line, wherein the detection result indicates whether the shared vehicle is a compliantly parked vehicle.
According to a second aspect of the present disclosure, there is provided a detection apparatus for a shared vehicle parking event, comprising:
an acquisition unit configured to acquire a parking image when a shared vehicle is parked, wherein the parking image includes the shared vehicle and a parking line for guiding the shared vehicle to park;
an identification unit configured to identify key points of the shared vehicle in the parking image;
and a determining unit configured to determine a detection result according to the key points and the parking line, wherein the detection result indicates whether the shared vehicle is a compliantly parked vehicle.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method according to the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program stored in a readable storage medium; at least one processor of an electronic device can read the computer program from the storage medium, and execution of the computer program by the at least one processor causes the electronic device to perform the method of the first aspect.
According to a sixth aspect of the present disclosure, there is provided a detection system for shared-vehicle parking events, comprising: an image acquisition device, and the shared-vehicle parking-event detection device of the second aspect, wherein
the image acquisition device is used for acquiring a parking image when the shared vehicle is parked.
Drawings
The drawings are provided for a better understanding of the present solution and are not to be construed as limiting the present disclosure. In the drawings:
FIG. 1 is a schematic diagram according to a first embodiment of the present disclosure;
FIG. 2 is a schematic illustration of a scenario of a method of detecting a shared vehicle parking event according to an embodiment of the disclosure;
FIG. 3 is a schematic diagram according to a second embodiment of the present disclosure;
FIG. 4 is a schematic diagram of key points of a method of detecting a shared vehicle parking event according to one embodiment of the disclosure;
FIG. 5 is a schematic diagram of key points of a method of detecting a shared vehicle parking event in accordance with another embodiment of the disclosure;
FIG. 6 is a schematic diagram of key points of a method of detecting a shared vehicle parking event according to yet another embodiment of the disclosure;
FIG. 7 is a schematic diagram according to a third embodiment of the present disclosure;
FIG. 8 is a schematic diagram according to a fourth embodiment of the present disclosure;
FIG. 9 is a schematic diagram according to a fifth embodiment of the present disclosure;
FIG. 10 is a schematic diagram according to a sixth embodiment of the present disclosure;
fig. 11 is a block diagram of an electronic device for implementing a method of detecting a shared vehicle parking event in accordance with an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Types of shared vehicles include at least shared automobiles, shared bicycles and shared electric vehicles. Shared vehicles make travel more convenient, but their irregular parking causes inconvenience for city and road management; for example, an irregularly parked shared vehicle may prevent other vehicles from passing normally. Detecting parking events of shared vehicles, so as to determine whether a shared vehicle is parked compliantly and thereby provide guidance for its correct parking, is therefore an urgent problem to be solved.
In the related art, a sensor is generally mounted on the shared vehicle; sensing information is collected by the sensor, and whether the shared vehicle is a compliantly parked vehicle is determined from that information. For example, sensing information is acquired by a global positioning system (GPS) mounted on the shared vehicle, and from it whether the shared vehicle is a compliantly parked vehicle is determined.
On this basis, corresponding improved schemes have been considered for determining whether the shared vehicle is a compliantly parked vehicle.
In one example, a GPS receiver and a gyroscope are provided on the shared vehicle, and whether the shared vehicle is a compliantly parked vehicle is determined based on the sensing information acquired by each.
For example, the sensing information collected by the GPS receiver includes the position information of the shared vehicle, and the sensing information collected by the gyroscope includes motion information of the shared vehicle (such as speed and trajectory); whether the shared vehicle is parked compliantly is determined from the position information and the motion information.
In another example, a direction sensor is provided on the shared vehicle to collect direction information of the shared vehicle, and whether the shared vehicle is parked compliantly is determined based on that direction information.
In still another example, a Bluetooth detection module is provided on the shared vehicle. Parking-margin information of a shared-vehicle parking area, that is, information about how many shared vehicles can still be parked in that area, may be determined by the Bluetooth detection module, and whether the shared vehicle is parked compliantly may be determined from that information.
In yet another example, if the shared vehicle is a shared bicycle, a lock may be provided on the shared bicycle, and whether the shared bicycle is a compliantly parked vehicle may be determined based on the lock.
However, in all of the above schemes, the sensor mounted on the shared vehicle collects sensing data while the vehicle is in use, and whether the vehicle is parked compliantly is determined from that data. Adding extra devices (i.e. sensors) to the shared vehicle raises its cost accordingly. Moreover, on the one hand, sensors are easily affected by factors such as weather, so the sensing information collected differs across application scenarios; on the other hand, a sensor's own parameters or structure may make the collected sensing information inaccurate.
That is, detecting shared-vehicle parking events by mounting sensors on the shared vehicles suffers from the technical problems of relatively high cost and relatively low accuracy.
To avoid at least one of the above technical problems, the inventors of the present disclosure arrived, through creative effort, at the inventive concept of the present disclosure: acquire an image containing the shared vehicle and the parking line, identify key points of the shared vehicle in the acquired image, and determine from the key points and the parking line whether the shared vehicle is a compliantly parked vehicle.
Based on the above inventive concept, the present disclosure provides a method, a device and a system for detecting shared-vehicle parking events, applied to intelligent parking in the technical fields of artificial intelligence and image processing, so as to improve the reliability and accuracy of detection.
Fig. 1 is a schematic diagram of a first embodiment of the present disclosure. As shown in fig. 1, the method for detecting a shared-vehicle parking event according to this embodiment comprises:
S101: a parking image is acquired when the shared vehicle is parked.
The parking image comprises a shared vehicle and a parking line for guiding the shared vehicle to park.
For example, the execution body of this embodiment may be a detection device for shared-vehicle parking events (hereinafter simply "the detection device"). The detection device may be a server (such as a cloud server or a local server; the server may be a cloud control platform, a vehicle-road collaborative management platform, a central subsystem, an edge computing platform, a cloud computing platform, or the like), a roadside device, a terminal device, a processor, a chip, or the like.
In the system architecture of intelligent vehicle-road cooperation, the roadside device includes roadside sensing devices and roadside computing devices. A roadside sensing device (such as a roadside camera) is connected to a roadside computing device (such as a roadside computing unit, RSCU), which in turn is connected to a server; the server can communicate with autonomous or assisted-driving vehicles in various ways. Alternatively, the roadside sensing device itself includes computing capability and is connected directly to the server. These connections may be wired or wireless.
For example, the method for detecting a shared vehicle parking event of the present embodiment may be applied to an application scenario as shown in fig. 2, as shown in fig. 2:
the image capturing device 201 may capture a parking event in which the user 202 parks the shared vehicle 203, obtaining a parking image. As shown in fig. 2, the parking image includes the parked shared vehicle 203 and also a parking line 204.
The parking line is used to guide the user in parking the shared vehicle, and is also used subsequently to determine whether the shared vehicle is a compliantly parked vehicle.
The image capturing apparatus 201 establishes a communication connection with the server 205 and transmits the parking image to the server 205, and thus the server 205 acquires the parking image.
In connection with the above analysis, in some embodiments the detection device may be a server as shown in fig. 2, while in other embodiments the detection device may integrate the image acquisition device and the server in a single device.
It should be understood that fig. 2 and the description above with respect to fig. 2 are only exemplary for illustrating application scenarios to which the present embodiment may be applied, and are not to be construed as limiting the application scenarios of the present embodiment.
For example, in the application scenario shown in fig. 2, the shared vehicle is a shared bicycle, and in combination with the above analysis, the shared vehicle may also be a shared automobile or a shared electric vehicle.
S102: key points of the shared vehicle in the parking image are identified.
The key points are points for characterizing structural features of the shared vehicle.
For example, in the application scenario shown in fig. 2, the key points of the shared vehicle (the shared bicycle in fig. 2) may include a point characterizing the handlebar, points characterizing the wheel axle centers (such as the front-wheel axle center and the rear-wheel axle center), points characterizing where the shared bicycle contacts the ground, and so on; they are not exhaustively listed here.
It should be noted that the number of key points and similar details can be determined based on requirements, historical records, experiments and the like; this embodiment does not limit them, nor does it limit the manner of recognition.
For example, the key points may be determined by an image recognition algorithm, or by a deep learning network.
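As a minimal sketch of one common deep-learning approach (the patent does not prescribe a specific architecture, and the array shapes and values below are hypothetical), key-point coordinates can be decoded from heatmap-style network outputs, where each output channel peaks at the location of one key point:

```python
import numpy as np

def decode_keypoints(heatmaps):
    """Decode (K, H, W) heatmaps into K key points as (x, y) image coordinates,
    taking the argmax of each channel as that key point's location."""
    coords = []
    for hm in heatmaps:
        idx = np.argmax(hm)
        y, x = np.unravel_index(idx, hm.shape)  # row = y, column = x
        coords.append((int(x), int(y)))
    return coords

# Toy heatmaps standing in for a network's output on the parking image.
heatmaps = np.zeros((2, 8, 8))
heatmaps[0, 3, 5] = 1.0   # e.g. front-wheel axle center at (x=5, y=3)
heatmaps[1, 6, 2] = 1.0   # e.g. rear-wheel axle center at (x=2, y=6)

keypoints = decode_keypoints(heatmaps)
```

In practice the heatmaps would come from a trained key-point detection model; the decoding step is independent of the model used.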
S103: and determining a detection result according to the key points and the parking line.
The detection result indicates whether the shared vehicle is a compliantly parked vehicle.
For example, the positional relationship between the shared vehicle and the parking line can be determined from the key points and the parking line, and the detection result determined from that relationship; as another example, the detection result may be determined from the positional relationship between the key points themselves and the parking line; other variants are not listed here.
Based on the above analysis, an embodiment of the present disclosure provides a method for detecting a shared-vehicle parking event, comprising: acquiring a parking image when a shared vehicle is parked, the parking image containing the shared vehicle and a parking line used to guide its parking; identifying key points of the shared vehicle in the parking image; and determining, from the key points and the parking line, a detection result indicating whether the shared vehicle is a compliantly parked vehicle.
Fig. 3 is a schematic diagram of a second embodiment of the present disclosure. As shown in fig. 3, the method for detecting a shared-vehicle parking event according to this embodiment comprises:
S301: a parking image is acquired when the shared vehicle is parked.
The parking image comprises a shared vehicle and a parking line for guiding the shared vehicle to park.
It should be understood that, in order to avoid repetitive description, the same features as those of the above embodiment are not repeated in this embodiment.
S302: key points of the shared vehicle in the parking image are identified.
S303: a canonical parking area of the shared vehicle is determined according to the parking line, and an actual parking area of the shared vehicle is determined according to the key points.
The canonical parking area is the area in which a shared vehicle is parked compliantly; correspondingly, the area outside it is a non-compliant or illegal parking area. That is, shared vehicles should be parked within the canonical parking area so as not to block traffic or disturb traffic order.
The actual parking area is the area in which the shared vehicle is actually parked.
In some embodiments, determining the canonical parking area of the shared vehicle from the parking line may include the following steps:
Step 1: acquire the image coordinate information of the parking line in the image coordinate system.
Step 2: convert the image coordinate information of the parking line into physical coordinate information in a physical coordinate system.
Step 3: determine the canonical parking area from the physical coordinate information of the parking line.
For example, when the detection method of this embodiment is applied to the scene shown in fig. 2, there are two parking lines, which may be denoted parking line AD and parking line BC for ease of distinction.
As shown in fig. 2, points a and D are two points on the parking line AD, and points B and C are two points on the parking line BC.
The image coordinate information of points A, D, B and C is determined in the image coordinate system (i.e. the coordinate system of the parking image) and is then converted into physical coordinate information in the physical coordinate system (i.e. the coordinate system of the physical world in which the parking line is located), yielding the physical coordinate information of each of points A, D, B and C. The area framed by these four physical-coordinate points may then be taken as the canonical parking area; the rectangular area ABCD shown in fig. 2 may be determined as the canonical parking area.
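One common way to realize this image-to-physical conversion for points on a ground plane is a planar homography; the sketch below is an illustration under that assumption, with a hypothetical pre-calibrated matrix `H` (in practice it would be obtained from camera calibration against known reference points):

```python
import numpy as np

# Hypothetical 3x3 homography mapping image pixels to ground-plane
# (physical) coordinates in metres.
H = np.array([[0.02, 0.0, -5.0],
              [0.0, 0.02, -3.0],
              [0.0,  0.0,  1.0]])

def image_to_physical(points_px, H):
    """Apply a homography to Nx2 image points, returning Nx2 physical points."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])  # homogeneous
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]   # divide by the scale component

# Hypothetical pixel coordinates of the parking-line corner points.
corners_px = np.array([[100.0, 400.0],   # A
                       [100.0, 600.0],   # D
                       [500.0, 600.0],   # C
                       [500.0, 400.0]])  # B

# Physical-coordinate polygon framing the canonical parking area ABCD.
canonical_area = image_to_physical(corners_px, H)
```

The same routine converts any other points selected on the parking line.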
Of course, in other embodiments a different set of points on the parking line may be selected and the canonical parking area determined from them on the same principle as above: the points are first coordinate-converted, the area they frame is determined, and that area is taken as the canonical parking area; this is not repeated here.
Similarly, in some embodiments the actual parking area may be determined on the same principle as the canonical parking area: the key points are coordinate-converted, so that their image coordinate information (referenced to the image coordinate system) becomes physical coordinate information (referenced to the physical coordinate system), and the actual parking area is determined from the physical coordinate information of the key points.
For example, key points that determine the outline of the shared vehicle may be selected from among the key points, and the actual parking area determined from those outline key points.
In other embodiments, the actual parking area may instead be determined by deep learning, such as by training a base network model on sample data including sample key points to generate an area detection model; the key points are then input into the area detection model, which outputs the actual parking area.
In still other embodiments, the canonical parking area may be determined first, and the positional relationship of the key points relative to it used: for example, the coordinate deviation between a key point and the center point of the canonical parking area may be determined, and the actual parking area calculated from that deviation and the canonical parking area.
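A minimal sketch of this last variant (all numbers hypothetical): translate the canonical parking area by the deviation between a reference key point and the canonical area's center point to estimate the actual parking area.

```python
import numpy as np

# Canonical parking area ABCD in physical coordinates (metres), hypothetical.
canonical = np.array([(0.0, 0.0), (4.0, 0.0), (4.0, 2.0), (0.0, 2.0)])

centre = canonical.mean(axis=0)      # center point of the canonical area
keypoint = np.array([2.5, 1.2])      # e.g. a vehicle-center key point
deviation = keypoint - centre        # coordinate deviation from the center

# The actual parking area as the canonical area shifted by the deviation.
actual_area = canonical + deviation
```

This assumes a translation-only relationship between the two areas; richer models could also account for rotation.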
It should be noted that, in some embodiments, when converting image coordinate information (referenced to the image coordinate system) into physical coordinate information (referenced to the physical coordinate system), the image coordinate information may first be converted into coordinate information in the image-acquisition-device coordinate system (which can be understood as the camera coordinate system); the physical coordinate information is then determined from the installation information of the image acquisition device together with the camera-coordinate-system information.
The installation information of the image acquisition device refers to the installation position of the image acquisition device, expressed as physical coordinate information in the physical coordinate system.
That is, the positional association between the image coordinate system and the image-acquisition-device coordinate system can be determined, as can the positional association between the image-acquisition-device coordinate system and the physical coordinate system; with the image-acquisition-device coordinate system as the intermediate factor, the positional association between the image coordinate system and the physical coordinate system is obtained, achieving the coordinate conversion from image coordinate information to physical coordinate information.
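The two-stage conversion can be sketched as a composition of transforms; both matrices below are hypothetical (for a flat ground plane, so that each stage reduces to a 2-D homogeneous transform), with `T_img2cam` mapping image coordinates into the camera coordinate system and `T_cam2phys` encoding the camera's installation pose in the physical coordinate system:

```python
import numpy as np

# Hypothetical image-to-camera transform (scale and principal-point shift).
T_img2cam = np.array([[0.01, 0.0, -2.0],
                      [0.0, 0.01, -1.5],
                      [0.0,  0.0,  1.0]])

# Hypothetical camera-to-physical transform derived from installation info
# (here a pure translation of the camera within the physical frame).
T_cam2phys = np.array([[1.0, 0.0, 10.0],
                       [0.0, 1.0, 20.0],
                       [0.0, 0.0,  1.0]])

# Composing the two gives the direct image-to-physical conversion, with the
# camera coordinate system as the intermediate factor.
T_img2phys = T_cam2phys @ T_img2cam

def convert(pt_px, T):
    """Convert one 2-D point through a 3x3 homogeneous transform."""
    x, y, w = T @ np.array([pt_px[0], pt_px[1], 1.0])
    return (x / w, y / w)

phys = convert((300.0, 250.0), T_img2phys)
```

A full 3-D treatment would use the camera intrinsics and a rotation-plus-translation extrinsic matrix, but the composition principle is the same.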
S304: a detection result is determined according to the canonical parking area and the actual parking area.
The detection result indicates whether the shared vehicle is a compliantly parked vehicle.
For example, the canonical parking area may be compared with the actual parking area to determine whether the actual parking area lies within it. If the actual parking area is not within the canonical parking area, the shared vehicle is determined to be a non-compliantly parked vehicle; if it is within the canonical parking area, the shared vehicle may be a compliantly parked vehicle.
For example, in combination with the above analysis, the canonical parking area is determined from the physical coordinate information of several points, and the actual parking area from the physical coordinate information of the key points; the comparison between the two areas can then be achieved as follows:
determine whether the physical coordinate information of each key point corresponds to a point within the canonical parking area, i.e. whether the key points fall inside it; if so, the shared vehicle may be a compliantly parked vehicle, and if not, it is a non-compliantly parked vehicle.
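The fall-inside test above can be sketched as a point-in-convex-polygon check (a standard technique; the patent does not prescribe one, and the coordinates below are hypothetical):

```python
def inside_convex_polygon(point, polygon):
    """Return True if the point lies inside or on a convex polygon whose
    vertices are given in consistent winding order, by checking that the
    cross products against all edges share a sign."""
    px, py = point
    signs = []
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        signs.append(cross >= 0)
    return all(signs) or not any(signs)

# Hypothetical canonical parking area ABCD in physical coordinates (metres).
area = [(0.0, 0.0), (4.0, 0.0), (4.0, 2.0), (0.0, 2.0)]
# Hypothetical key points of the shared vehicle, e.g. wheel-ground contacts.
keypoints = [(1.0, 1.0), (3.5, 0.5)]

# The vehicle may be compliantly parked only if every key point falls inside.
is_compliant = all(inside_convex_polygon(kp, area) for kp in keypoints)
```

For non-convex parking areas a general point-in-polygon routine (e.g. ray casting) would be used instead.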
In this embodiment, the canonical parking area is determined from the parking line and the actual parking area from the key points, and the detection result is determined by combining the two. This makes detection convenient and fast, and because the result is not affected by sensor error, it improves the accuracy and reliability of detection.
In some embodiments, S304 may include the following steps:
Step 1: match the actual parking area against the canonical parking area to obtain a matching result.
Step 2: if the matching result indicates that the actual parking area is within the canonical parking area, determine the parking angle and/or the parking direction of the shared vehicle from the key points, and determine the detection result from the parking angle and/or the parking direction.
The parking angle is the angle at which the shared vehicle is parked, referenced to the physical coordinate system; similarly, the parking direction is the direction in which the shared vehicle is parked, referenced to the physical coordinate system.
By way of example, this embodiment may be understood as follows: the actual parking area is matched against the canonical parking area to judge whether it lies within the canonical parking area; if it does, the parking direction and parking angle are further determined, and the detection result is determined by combining these two dimensions.
In this embodiment, if the actual parking area is determined to lie within the canonical parking area, the detection result is further determined from the two dimensions of parking direction and parking angle; that is, whether the shared vehicle is a compliantly parked vehicle is additionally judged from these two dimensions. This allows a more precise determination of compliant versus non-compliant parking, improving the accuracy and reliability of detection.
In connection with the above analysis, the key points have image coordinate information with reference to an image coordinate system, and thus determining the parking angle and/or the parking direction of the shared vehicle from the key points includes: and converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system, and determining the parking angle and/or the parking direction according to the physical coordinate information of the key points.
That is, the coordinate information of the key points can be converted, namely, the image coordinate information of the key points is converted into physical coordinate information, so that the parking angle and/or the parking direction of the shared vehicle are determined based on the physical coordinate information of the key points, and whether the shared vehicle is parked in a standard manner is determined by combining the parking angle and/or the parking direction. The determined parking angle and/or parking direction thus conform to the physical coordinate system and the requirements of actual application scenarios, improving the accuracy and reliability of the detection result.
In combination with the above analysis, the shared vehicle may be a shared automobile, a shared bicycle, or a shared electric vehicle, and the key points are used for characterizing the structural features of the shared vehicle. By contrast, the shared automobile differs considerably in structural features from the shared bicycle and the shared electric vehicle, while the shared bicycle and the shared electric vehicle are structurally very similar. For example, in general, a shared automobile has four wheels, whereas a shared bicycle and a shared electric vehicle have only two wheels, and so on, which are not listed here.
In view of the above-described large differences in structural features between the shared automobile and the shared bicycle (or shared electric vehicle), and the large similarity in structural features between the shared bicycle and the shared electric vehicle, the method of determining the parking angle and the parking direction of the shared automobile may be different from the method of determining those of the shared bicycle (shared electric vehicle).
For example, if the shared vehicle is a shared bicycle or a shared electric vehicle, the key points at least include a first wheel axle center point and a second wheel axle center point of the shared vehicle, and the parking angle and the parking direction of the shared vehicle may be determined by the following method:
and generating a central axis of the shared vehicle according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point, and determining the parking angle and the parking direction according to the central axis.
Illustratively, taking a shared vehicle as a shared bicycle as an example, as shown in fig. 4, key points of the shared bicycle include a first wheel axle center point 401 and a second wheel axle center point 402 of the shared bicycle. In combination with the above analysis, the physical coordinate information of the first wheel center point 401 may be determined based on the image coordinate information of the first wheel center point 401, the physical coordinate information of the second wheel center point 402 may be determined based on the image coordinate information of the second wheel center point 402, and accordingly, a line connecting the first wheel center point 401 and the second wheel center point 402 may be referred to as a central axis of the shared bicycle, and thus, the central axis may be generated based on the first wheel center point 401 and the second wheel center point 402, and the parking angle and the parking direction may be determined in combination with the central axis.
The parking angle and the parking direction are determined by taking the central axis as a reference, so that the standard (namely the central axis) for determining the parking angle and the parking direction has stability, and the determined parking angle and the determined parking direction have the technical effect of higher reliability.
In some embodiments, the parking line may further include a parking direction guide line for guiding the parking direction of the shared vehicle, so that the user parks the shared vehicle based on the direction indicated by the parking direction guide line.
For example, the "arrow" 206 in fig. 2 indicates a parking direction guide line, the direction indicated by the "arrow" being the direction in which the user should park the head of the shared vehicle. Accordingly, determining the parking angle according to the central axis includes the following steps:
a first step of: acquiring physical coordinate information of the parking direction guide line under a physical coordinate system.
In one example, the image coordinate information of the parking direction guide line in the image coordinate system may be determined first, and then converted into physical coordinate information with the physical coordinate system as reference, so as to obtain the physical coordinate information of the guide line. The principle of conversion can be found in the above embodiments, and will not be described here again.
In another example, the physical coordinate information of the guide line may be collected and stored when the guide line is set, so that when the parking angle needs to be determined based on the guide line, the pre-stored physical coordinate information of the guide line is acquired.
That is, the physical coordinate information of the guide line may be determined based on the conversion manner, or may be determined based on the storage manner, thereby achieving the technical effects of flexibility and diversity in determining the physical coordinate information of the guide line.
And a second step of: calculating an included angle between the central axis and the parking direction guide line according to the physical coordinate information of the parking direction guide line, and determining the included angle as the parking angle.
With the physical coordinate information of the central axis known and the physical coordinate information of the parking direction guide line also known, the included angle between the two lines can be determined by combining the physical coordinate information corresponding to each of the two lines, and the determined included angle is taken as the parking angle. The implementation principle of determining the included angle between two lines can be found in calculation methods in the related art, and is not repeated here.
In this embodiment, the parking angle is determined by combining the parking direction guide line and the central axis, which is equivalent to calculating, with the parking direction guide line as reference, the angle by which the central axis deviates from the guide line. Since the guide line is the guiding standard for parking of the shared vehicle, the parking angle determined in this way is reliable.
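The angle computation described above can be sketched as follows, assuming each line is given by two physical-coordinate endpoints (function and parameter names are illustrative):

```python
import math

def parking_angle(axis_p1, axis_p2, guide_p1, guide_p2):
    """Angle in degrees (0 to 90) between the vehicle's central axis and
    the parking direction guide line, each given as two (x, y) points in
    physical coordinates."""
    ax, ay = axis_p2[0] - axis_p1[0], axis_p2[1] - axis_p1[1]
    gx, gy = guide_p2[0] - guide_p1[0], guide_p2[1] - guide_p1[1]
    cos_t = (ax * gx + ay * gy) / (math.hypot(ax, ay) * math.hypot(gx, gy))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
    # A line has no direction, so fold the result into [0, 90]
    return min(angle, 180.0 - angle)

# Central axis at 45 degrees to a guide line along the y-axis
print(round(parking_angle((0, 0), (1, 1), (0, 0), (0, 1)), 6))  # 45.0
```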
As can be seen in conjunction with fig. 4, the key points of the shared bicycle may include, in addition to the first wheel axle center point 401 and the second wheel axle center point 402 of the shared bicycle, a plurality of points on the handlebar of the shared bicycle, such as the key points 403, 404, and 405 shown in fig. 4.
It should be understood that the key points on the handlebar shown in fig. 4 are for exemplary purposes only and should not be construed as limiting: they limit neither the number of key points on the handlebar nor their positions.
Accordingly, determining the parking direction based on the central axis may include the steps of:
a first step of: and determining the suspected parking direction according to the central axis.
The suspected parking direction refers to a possible parking direction of the shared vehicle.
For example, two suspected parking directions may be determined according to the central axis, such as a suspected parking direction 1 and a suspected parking direction 2 shown in fig. 4.
And a second step of: the direction of the handlebar is determined based on the physical coordinate information of the plurality of points on the handlebar.
For example, the handlebar line in which the handlebar is located may be determined based on physical coordinate information of a plurality of points on the handlebar, and the direction of the handlebar may be determined based on the handlebar line.
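One possible realization of the line fit suggested above, not specified by the text, is a principal-axis fit over the handlebar key points; a sketch under that assumption (pure Python, illustrative names):

```python
import math

def handlebar_direction(points):
    """Direction of the handlebar line, fitted to the handlebar key points
    as the principal axis of their 2x2 covariance (physical coordinates).
    Returns a unit vector (dx, dy)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    syy = sum((p[1] - my) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis angle
    return math.cos(theta), math.sin(theta)

# Three collinear handlebar key points along the line y = x
dx, dy = handlebar_direction([(0, 0), (1, 1), (2, 2)])
print(round(dx, 3), round(dy, 3))  # 0.707 0.707
```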
And a third step of: and determining the parking direction according to the direction of the handle bar and the suspected parking direction.
Accordingly, the parking direction may be determined based on the suspected parking direction 1, the suspected parking direction 2, and the direction of the handle bar.
For example, among the suspected parking directions, the suspected parking direction consistent with the head direction indicated by the handlebar is determined as the parking direction, i.e., suspected parking direction 1 is determined as the parking direction.
That is, the direction of the head of the shared vehicle can be determined based on the direction of the handlebar, and the parking direction is selected from the suspected parking directions according to the head direction. This avoids misjudging the direction in which the shared vehicle is parked, improving the accuracy of determining the parking direction, and further improving the technical effects of accuracy and reliability of the detection result determined based on the parking direction.
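The selection between the two suspected directions can be sketched by noting that the handlebar sits over the front wheel, so the suspected direction pointing toward the axle center nearest the handlebar key points is kept. This nearest-axle heuristic is an assumption, not spelled out in the text; names and coordinates are illustrative:

```python
import math

def parking_direction(axle1, axle2, handlebar_pts):
    """Pick the actual parking direction from the two suspected directions
    along the central axis. The head is taken to point toward the axle
    center nearest the centroid of the handlebar key points. All inputs
    are (x, y) physical coordinates; returns a direction vector."""
    hx = sum(p[0] for p in handlebar_pts) / len(handlebar_pts)
    hy = sum(p[1] for p in handlebar_pts) / len(handlebar_pts)
    d1 = math.hypot(axle1[0] - hx, axle1[1] - hy)
    d2 = math.hypot(axle2[0] - hx, axle2[1] - hy)
    front, rear = (axle1, axle2) if d1 <= d2 else (axle2, axle1)
    return (front[0] - rear[0], front[1] - rear[1])

# Handlebar key points cluster near the wheel at (2, 0) -> head points +x
print(parking_direction((0, 0), (2, 0), [(1.8, 0.5), (1.9, 0.6)]))  # (2, 0)
```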
Based on the above analysis, the actual direction in which the shared vehicle is parked (i.e., the parking direction) may be determined in combination with the direction of the handlebar and the suspected parking direction, and in other embodiments, the parking direction may be determined based on the direction of the handlebar.
For example, after determining the direction of the handle bar, a vertical direction perpendicular to the direction of the handle bar may be determined, and the parking direction may be determined based on the direction of the handle bar and the vertical direction.
For example, the head direction of the shared vehicle is determined according to the direction of the handlebar; two opposite directions exist perpendicular to the handlebar, and the one of the two that points in the head direction is determined as the parking direction.
In this embodiment, determining the parking direction in combination with the direction of the handlebar improves the convenience of determining the parking direction, and supporting both the manner of this embodiment and the manners of the embodiments described above improves the flexibility and diversity of determining the parking direction.
It should be noted that the key points described in the above embodiments are merely exemplary to illustrate possible key points of the present embodiment and the uses of the corresponding key points, but should not be construed as limiting the key points.
For example, in other embodiments, as shown in fig. 4, the key points may further include points 406 and 407 of contact between the shared bicycle and the ground, points 408 on the shared bicycle cushion, center points 409 of rotation of the shared bicycle, and so on, which are not illustrated herein.
Accordingly, on the basis of the above embodiments, the parking direction and the parking angle may be further determined in combination with the above key points: the parking angle and the parking direction may be determined in combination with all the key points in fig. 4, or a subset of the key points in fig. 4 may be selected, for example, determining the parking angle based on the center point 409, the contact point 406, and the contact point 407. The implementation principle can be found in the above embodiments and is not repeated here.
In connection with the above analysis, the keypoints may be determined by means of a deep learning network, and, for example, a keypoint identification model for identifying the keypoints may be trained based on the requirements of the scene.
For example, a sample key point data set is collected and input to an initial network model (the type and structure of the initial network model are not limited in this embodiment), so that the initial network model is trained based on the sample key point data set. For example, in the training process, a training value is obtained and compared with a pre-labeled calibration value to determine the loss information between them; the initial network model is adjusted based on the loss information (for example, parameters of its convolution layers are adjusted), and this is iterated until the number of iterations meets a preset requirement, or the loss information falls within a preset loss range, at which point training is complete and the key point identification model is obtained.
In some embodiments, when the key point identification model is generated through training, the category of each key point can be marked, such as an axle center point or a point in contact with the ground; in a more refined manner, an axle center point can be marked as a front axle center point or a rear axle center point, so that when the key points of the shared vehicle are identified based on the key point identification model, the categories of the key points are obtained as well.
The above embodiments exemplarily illustrate determining the parking angle and the parking direction when the shared vehicle is a shared bicycle or a shared electric vehicle; owing to the differences in structural features between the shared automobile and the shared bicycle (shared electric vehicle), the method for determining the parking angle and the parking direction of a shared automobile may be adaptively adjusted.
For example, if the shared vehicle is a shared automobile, the key points include at least the axle center points of the four wheels of the shared automobile (not shown in the figure), which are a third wheel axle center point, a fourth wheel axle center point, a fifth wheel axle center point, and a sixth wheel axle center point, respectively, and the parking angle and the parking direction of the shared automobile are determined in the following manner:
and generating the central axis of the shared automobile according to the physical coordinate information of the axle center points of the four wheels, and determining the parking angle and the parking direction according to the central axis.
In one example, a plurality of wheel axes between front wheels and rear wheels of the shared vehicle may be generated according to physical coordinate information of axle center points of the four wheels, and a central axis of the shared vehicle may be determined according to the plurality of wheel axes, and a parking angle and/or a parking direction may be determined according to the central axis.
For example, as shown in fig. 5, the key points of the shared automobile include a third wheel center point 501, a fourth wheel center point 502, a fifth wheel center point 503, and a sixth wheel center point 504, and accordingly, an axle line between the third wheel center point 501 and the fourth wheel center point 502 (for convenience of distinction, the axle line is referred to as a first wheel axis) may be determined according to the physical coordinate information of the third wheel center point 501 and the physical coordinate information of the fourth wheel center point 502, and an axle line between the fifth wheel center point 503 and the sixth wheel center point 504 (for convenience of distinction, the axle line is referred to as a second wheel axis) may be determined according to the physical coordinate information of the fifth wheel center point 503 and the physical coordinate information of the sixth wheel center point 504.
After the first wheel axis and the second wheel axis are determined, a central axis may be determined based on the first wheel axis and the second wheel axis. For example, the central physical coordinate information of the two wheel axes may be determined according to the physical coordinate information of the first wheel axis and the physical coordinate information of the second wheel axis, so as to determine the central axis, where the physical coordinate information of the central axis is the central physical coordinate information.
In another example, physical coordinate information of the center points of the two wheels on the same horizontal line with the physical coordinate system as a reference when the shared vehicle is parked horizontally is determined according to the physical coordinate information corresponding to the center points of the four wheels, and physical coordinate information of the center point between the center points of the two wheels on the same horizontal line is determined to obtain the physical coordinate information of the two center points; and determining the central axis of the shared vehicle according to the physical coordinate information of the two central points, and determining the parking angle and/or the parking direction according to the central axis.
For example, as shown in fig. 5, the physical coordinate information of the center point 505 between the third wheel axle center point 501 and the fifth wheel axle center point 503 may be determined according to their physical coordinate information; the physical coordinate information of the center point 506 between the fourth wheel axle center point 502 and the sixth wheel axle center point 504 is determined likewise; the central axis of the shared vehicle is then determined according to the physical coordinate information of the center point 505 and the center point 506, and the parking angle and/or the parking direction are determined according to the central axis.
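The midpoint construction in this example can be sketched as follows (illustrative names; coordinates are physical):

```python
def central_axis(front_left, front_right, rear_left, rear_right):
    """Central axis of a four-wheel vehicle: the line through the midpoint
    of the front pair and the midpoint of the rear pair of wheel axle
    center points, each given as (x, y) physical coordinates."""
    front_mid = ((front_left[0] + front_right[0]) / 2,
                 (front_left[1] + front_right[1]) / 2)
    rear_mid = ((rear_left[0] + rear_right[0]) / 2,
                (rear_left[1] + rear_right[1]) / 2)
    return front_mid, rear_mid

# Wheels of a vehicle aligned with the x-axis
print(central_axis((0, 1), (0, -1), (3, 1), (3, -1)))  # ((0.0, 0.0), (3.0, 0.0))
```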
That is, the parking angle and/or the parking direction may be determined based on the central axis either by first determining the two wheel axes between the four wheels and determining the central axis of the shared vehicle based on the two wheel axes, or by determining two center points from the four axle center points and determining the central axis according to the two center points. Since the central axis can be determined in different manners, the parking angle and/or the parking direction can likewise be determined in different manners, achieving the technical effects of flexibility and diversity in detecting whether the shared vehicle is parked in a standard manner.
It should be noted that, if the shared vehicle is a shared automobile, the implementation principle of determining the parking angle and/or the parking direction may be the same as that for the shared bicycle (or the shared electric vehicle), for example, determining the parking angle by means of the direction guide line, that is, by calculating the angle between the direction guide line and the central axis.
In other embodiments, considering the variability in structural features of the shared automobile and the shared bicycle (or the shared electric vehicle), the parking direction and parking angle of the shared automobile may also be determined based on the following method:
Determining, according to the physical coordinate information corresponding to the axle center points of the four wheels, the physical coordinate information of the axle center points of the two wheels on each same horizontal line with the physical coordinate system as reference when the shared automobile is parked horizontally; determining, according to that physical coordinate information, the distance between the two axle center points on each same horizontal line, obtaining two such distances; and determining the parking direction and the parking angle of the shared automobile according to the two distances.
For example, as shown in fig. 5, the distance between the third wheel axle center point 501 and the fifth wheel axle center point 503 may be determined according to their physical coordinate information (for convenience of distinction, this distance is referred to as the first wheel center distance); the distance between the fourth wheel axle center point 502 and the sixth wheel axle center point 504 (referred to as the second wheel center distance) may be determined according to their physical coordinate information; and the parking direction and the parking angle of the shared automobile may be determined according to the first wheel center distance and the second wheel center distance.
For example, the first wheel center distance and the second wheel center distance are compared; the smaller of the two is determined as the front wheel center distance and its two wheels as the front wheels, the larger is determined as the rear wheel center distance and its two wheels as the rear wheels, and the parking direction is determined from the front wheels and the rear wheels.
Specifically, as shown in fig. 5, if the first wheel center distance is smaller than the second wheel center distance, the wheels corresponding to the third wheel axle center point 501 and the fifth wheel axle center point 503 are determined as the front wheels, and the wheels corresponding to the fourth wheel axle center point 502 and the sixth wheel axle center point 504 are determined as the rear wheels, thereby determining the parking direction.
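A sketch of this comparison, assuming the four axle center points are given in physical coordinates and, per the text, the pair with the smaller center-to-center distance is taken as the front pair (illustrative names and coordinates):

```python
import math

def split_front_rear(pair_a, pair_b):
    """Each pair holds the two wheel axle center points on one same
    horizontal line, as (x, y) physical coordinates. The pair with the
    smaller center-to-center distance is taken as the front wheels, the
    other as the rear wheels. Returns (front_pair, rear_pair)."""
    def dist(pair):
        (x1, y1), (x2, y2) = pair
        return math.hypot(x1 - x2, y1 - y2)
    return (pair_a, pair_b) if dist(pair_a) < dist(pair_b) else (pair_b, pair_a)

front, rear = split_front_rear(((0, 0), (0, 1.4)), ((3, -0.1), (3, 1.5)))
print(front)  # ((0, 0), (0, 1.4))
```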
Based on the above analysis, this embodiment involves multiple places where image coordinate information needs to be converted into physical coordinate information, such as converting the image coordinate information of the key points into physical coordinate information. To facilitate such conversion, this embodiment proposes first determining a conversion matrix and then converting image coordinate information into physical coordinate information based on it. The conversion matrix characterizes the relationship that converts the image coordinate information of pixel points in the parking image, with the image coordinate system as reference, into physical coordinate information with the physical coordinate system as reference, thereby improving the reliability and effectiveness of the coordinate conversion.
The method for determining the conversion matrix is not limited in this embodiment, for example, the conversion matrix may be determined based on a "multi-point calibration" manner, such as a "three-point calibration" manner, or a "four-point calibration" manner; the conversion matrix can also be determined based on a self-learning mode, such as a mode of constructing a neural network model, a mode of determining the conversion matrix based on the neural network model, and the like, so that flexibility and diversity of obtaining the conversion matrix are improved.
Illustratively, the determination of the transformation matrix based on the "three-point calibration" approach is set forth below:
The method includes: obtaining an image; obtaining any three non-collinear points in the image; and obtaining the image coordinate information and the physical coordinate information corresponding to each of the three non-collinear points (which can be obtained manually or with a measuring tool, and is not limited in this embodiment), so as to construct the transformation matrix based on them. Based on the image coordinate information and the physical coordinate information corresponding to the three non-collinear points, a conversion relation that converts the image coordinate information of the three points into their physical coordinate information is determined, and this conversion relation is the conversion matrix.
The image may be a parking image as described in the above embodiment, or may be a sample image in the calibration stage.
That is, after the parking image is acquired, the parking image may be "three-point calibrated" to obtain the conversion matrix, or the calibration stage may be entered in advance before the detection, so that the conversion matrix is not required to be determined during the detection, and the image coordinate information may be converted into the physical coordinate information based on the conversion matrix determined in the calibration stage, thereby improving the efficiency in the detection.
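The "three-point calibration" described above can be sketched as a single linear solve for a 2x3 affine conversion matrix (NumPy; the point correspondences below are illustrative):

```python
import numpy as np

def affine_from_three_points(img_pts, phys_pts):
    """With three non-collinear image/physical correspondences the
    conversion is affine: solve for the 2x3 matrix M such that
    phys = M @ [u, v, 1] for each image point (u, v)."""
    A = np.array([[u, v, 1.0] for u, v in img_pts])
    # Solve A @ X = phys for X (shape 3x2), then transpose to 2x3
    M = np.linalg.solve(A, np.array(phys_pts, dtype=float)).T
    return M

M = affine_from_three_points([(0, 0), (1, 0), (0, 1)],
                             [(2, 3), (4, 3), (2, 6)])
# Image point (1, 1) maps to physical (4, 6) under this calibration
print(M @ np.array([1.0, 1.0, 1.0]))  # [4. 6.]
```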
The determination of the transformation matrix based on the "four-point calibration" approach is explained as follows:
The method includes: obtaining an image; obtaining four points forming an arbitrary quadrangle in the image; and obtaining the image coordinate information and the physical coordinate information corresponding to each of the four points (which can be obtained manually or with a measuring tool, and is not limited in this embodiment), so as to construct the transformation matrix based on them.
That is, based on a perspective transformation and the image coordinate information and the physical coordinate information corresponding to the four points of the quadrangle, a conversion relation that converts the image coordinate information of the four points into their physical coordinate information is determined, and this conversion relation is the conversion matrix.
Similarly, the image may be a parking image as described in the above embodiment, or may be a sample image of the calibration stage.
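The "four-point calibration" can be sketched as a perspective (homography) fit, solving the standard eight-equation system with the bottom-right matrix entry fixed to 1 (NumPy; the correspondences below are illustrative):

```python
import numpy as np

def homography_from_four_points(img_pts, phys_pts):
    """Perspective conversion matrix H from four image/physical
    correspondences (no three collinear): build the 8x8 linear system of
    the direct linear transform with h33 = 1 and solve it exactly."""
    A, b = [], []
    for (u, v), (x, y) in zip(img_pts, phys_pts):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def to_physical(H, u, v):
    """Apply H to an image point and dehomogenize."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

H = homography_from_four_points([(0, 0), (1, 0), (1, 1), (0, 1)],
                                [(0, 0), (2, 0), (2, 2), (0, 2)])
print(tuple(round(c, 6) for c in to_physical(H, 0.5, 0.5)))  # (1.0, 1.0)
```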
It should be noted that determining the conversion matrix in the "multi-point calibration" manner may further include calibration based on more than four points, which is described as follows:
The method includes: obtaining an image; selecting more than four points in the image; and obtaining the image coordinate information and the physical coordinate information corresponding to each point (which can be obtained manually or with a measuring tool, and is not limited in this embodiment), so as to determine the conversion matrix in an iterative manner.
For example, a random sample consensus algorithm (RANSAC) or a least-squares method is adopted to perform iterative calculation on the image coordinate information and the physical coordinate information corresponding to each point, thereby obtaining the conversion matrix.
Similarly, the image may be a parking image as described in the above embodiment, or may be a sample image of the calibration stage.
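With more than four correspondences the same perspective system becomes over-determined and can be fitted by least squares; a RANSAC variant would instead repeatedly fit random four-point subsets and keep the matrix with the most inliers under a reprojection-error threshold. A least-squares sketch (NumPy; the data below are illustrative):

```python
import numpy as np

def homography_least_squares(img_pts, phys_pts):
    """Fit the 3x3 conversion matrix (h33 fixed to 1) to more than four
    image/physical correspondences in the least-squares sense."""
    A, b = [], []
    for (u, v), (x, y) in zip(img_pts, phys_pts):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

# Five exact correspondences under a uniform scale of 2
pts = [(0, 0), (1, 0), (1, 1), (0, 1), (2, 3)]
H = homography_least_squares(pts, [(2 * u, 2 * v) for u, v in pts])
print(np.round(H, 6))
```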
Illustratively, the determination of the transformation matrix based on the "self-learning" approach is set forth below:
An image is acquired, and key points of the shared vehicle in the image are detected to obtain the projection coordinates of the shared vehicle, namely the coordinates of points of the shared vehicle in the image when projected into the physical coordinate system; the width (and/or length) of the shared vehicle is determined according to the projection coordinates, and iterative computation is carried out according to the width (and/or length) to obtain the optimal conversion matrix.
For example, the detected key points of the shared vehicle are a, b, c, and d as shown in fig. 6, and a pre-calibrated calibration point c' is obtained. The projection coordinates, namely the physical coordinate information, corresponding to each of a, b, c, d, and c' are determined; the width of the shared vehicle is then determined from this physical coordinate information, for example one width according to the physical coordinate information of c and d, and another width according to the physical coordinate information of c' and b; and iterative calculation is performed on the basic network model based on the two widths, so as to obtain the optimal transformation matrix.
Similarly, the image may be a parking image as described in the above embodiment, or may be a sample image of the calibration stage.
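In its simplest form, the self-learning idea reduces to tuning the conversion so that the projected vehicle width matches a known width. Below is a deliberately simplified stand-in that scans a single meters-per-pixel scale instead of a full conversion matrix; all names and numbers are illustrative, and a real implementation would optimize the matrix parameters iteratively as described above:

```python
def fit_scale(img_width_px, known_width_m, candidates):
    """Keep the candidate meters-per-pixel scale whose projected width
    best matches the known shared-vehicle width."""
    return min(candidates, key=lambda s: abs(img_width_px * s - known_width_m))

# 120 px between wheel key points, known vehicle width 0.6 m
print(fit_scale(120, 0.6, [k / 1000 for k in range(1, 20)]))  # 0.005
```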
It should be noted that, in this embodiment, the method described in each embodiment may be used to determine the conversion matrix, so as to achieve the technical effect of determining the diversity and flexibility of the conversion matrix.
Accordingly, in the detection process, any of the above embodiments may be used to determine the conversion matrix, or a method for determining the conversion matrix may be selected in conjunction with the scene, and the conversion matrix may be determined based on the selected method for determining the conversion matrix.
The mapping relationship may exist between the scene and the method for determining the transformation matrix, that is, one method for determining the transformation matrix is adopted in one scene, and another method for determining the transformation matrix is adopted in another scene.
In some embodiments, the detection result may be determined based on the parking angle, and illustratively, determining the detection result according to the parking angle may include the steps of:
a first step of: comparing the parking angle with a preset angle threshold; if the parking angle is larger than the angle threshold, executing the second step, and if the parking angle is smaller than or equal to the angle threshold, executing the third step.
And a second step of: and determining the detection result as a detection result for representing that the shared vehicle is a vehicle which is not parked normally.
And a third step of: determining the detection result as a detection result for representing that the shared vehicle is a vehicle parked in a standard manner.
In connection with the above analysis, this embodiment can be understood as: if the shared vehicle is determined to be in the standard parking area, further determining whether the parking angle of the shared vehicle is greater than the angle threshold; if so, determining that the shared vehicle is a non-standard parked vehicle (namely, an illegally parked vehicle); otherwise, determining that the shared vehicle is a standard parked vehicle.
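The two-step angle judgment above can be sketched as follows; the concrete threshold value and the string labels are illustrative assumptions, since the patent leaves them to configuration:

```python
ANGLE_THRESHOLD_DEG = 15.0  # assumed value; the patent does not fix the threshold

def classify_by_parking_angle(parking_angle_deg: float) -> str:
    """Return 'non-standard' if the parking angle exceeds the preset
    angle threshold, otherwise 'standard'."""
    if parking_angle_deg > ANGLE_THRESHOLD_DEG:
        return "non-standard"
    return "standard"
```

A vehicle whose angle equals the threshold exactly is still treated as standard parked, matching the "less than or equal to" branch of the first step.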
In other embodiments, the detection result may be determined according to the parking direction, and illustratively, determining the detection result according to the parking direction may include the steps of:
a first step of: judging whether the parking direction is the same as the preset standard direction, if not, executing the second step, and if not, executing the third step.
The standard direction refers to a parking direction of the shared vehicle when parking is standard.
And a second step of: and determining the detection result as a detection result for representing that the shared vehicle is a vehicle which is not parked normally.
And a third step of: and determining the detection result as a detection result for representing that the shared vehicle is a vehicle parked normally.
In connection with the above analysis, this embodiment can be understood as: if the shared vehicle is determined to be in the standard parking area, further determining whether the parking direction of the shared vehicle is the same as the standard direction; if not, determining that the shared vehicle is a non-standard parked vehicle (namely, an illegally parked vehicle); otherwise, determining that the shared vehicle is a standard parked vehicle.
In still other embodiments, the detection result may be determined by combining elements of the two dimensions of the parking angle and the parking direction. In this case, if the shared vehicle is in the standard parking area, the parking angle is less than or equal to the angle threshold, and the parking direction is the same as the standard direction, the shared vehicle is determined to be a standard parked vehicle; otherwise, if at least one of the three conditions is not satisfied (that is, one condition is not satisfied, for example the parking angle is greater than the angle threshold, or a plurality of conditions are not satisfied, for example the parking angle is greater than the angle threshold and the parking direction is different from the standard direction), the shared vehicle is determined to be a non-standard parked vehicle (namely, an illegally parked vehicle).
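A minimal sketch of this three-condition check follows; the plain-string direction representation, the default threshold value, and the result labels are illustrative assumptions:

```python
def classify_parking(in_standard_area: bool,
                     parking_angle_deg: float,
                     parking_direction: str,
                     angle_threshold_deg: float = 15.0,
                     standard_direction: str = "north") -> str:
    """All three conditions must hold for a standard parking event;
    failing any one of them marks the vehicle as non-standard parked."""
    if (in_standard_area
            and parking_angle_deg <= angle_threshold_deg
            and parking_direction == standard_direction):
        return "standard"
    return "non-standard"
```

Because the conditions are combined with a logical AND, it does not matter whether one or several of them fail: any failure yields the non-standard result, as the text describes.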
In the present embodiment, by determining the detection result of the shared vehicle in combination with one or more of the parking angle and the parking direction, flexibility and diversity of detection can be achieved, and the accuracy and reliability of the detection result can be further improved, especially when the detection result is determined by combining both the parking angle and the parking direction.
In connection with the above analysis, whether the actual parking area is within the standard parking area may be determined based on the matching result. The above embodiments mainly describe the method for detecting a shared vehicle parking event by taking, as an example, the case in which the actual parking area is within the standard parking area. In other embodiments, the actual parking area may be at least partially outside the standard parking area; in that case:
the parking distance between the actual parking area and the standard parking area may be determined from the key points, and a detection result including the parking distance may be generated. A detection result that includes the parking distance characterizes the shared vehicle as a vehicle that is not parked in a standard manner.
That is, if a partial area in the actual parking area does not belong to the normal parking area, or if all areas in the actual parking area do not belong to the normal parking area, a detection result for characterizing that the shared vehicle is a vehicle that is not normally parked is generated, and in the detection result, a parking distance between the actual parking area and the normal parking area is further included, so that the detection result has a richer content, thereby improving the technical effect of reliability of detection.
In some embodiments, if the shared vehicle is a shared bicycle or a shared electric vehicle, the key points include a first wheel center point and a second wheel center point of the shared vehicle, determining the parking distance between the actual parking area and the standard parking area may include the following steps:
a first step of: and converting the image coordinate information of the first wheel axle center point into physical coordinate information under a physical coordinate system, and converting the image coordinate information of the second wheel axle center point into physical coordinate information under the physical coordinate system.
For the principle of conversion between the image coordinate information and the physical coordinate information, reference may be made to the above embodiment, and the description thereof is omitted here.
And a second step of: and determining the physical coordinate information of the center point of the shared vehicle according to the physical coordinate information of the first wheel center point and the physical coordinate information of the second wheel center point.
And a third step of: and calculating the distance between the physical coordinate information of the center point and the standard parking area, and determining the distance between the physical coordinate information of the center point and the standard parking area as the parking distance.
In this embodiment, the parking distance is determined by combining the physical coordinate information of the center point, so that the determined parking distance has higher accuracy and reliability, and further, the detection result has the technical effects of higher accuracy and reliability.
By way of example, in connection with the application scenario shown in fig. 2: a vertical line passing through the center point and perpendicular to the standard parking area may be determined; a center line between the two parking lines may be determined according to the two parking lines; the physical coordinate information of the intersection point of the vertical line and the center line may be determined; the distance between the physical coordinate information of the center point and the physical coordinate information of the intersection point may be calculated; and that distance may be determined as the parking distance.
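The perpendicular-to-center-line construction described above can be sketched as follows, assuming each parking line is given by two endpoint coordinates in the physical coordinate system (a simplification of the fig. 2 scenario):

```python
import math

def parking_distance(front_axle, rear_axle, line_a, line_b):
    """front_axle, rear_axle: (x, y) physical coordinates of the two wheel
    axle center points of a shared bicycle; line_a, line_b: each a pair of
    (x, y) endpoints of one of the two parking lines. Returns the
    perpendicular distance from the vehicle center point to the center
    line between the two parking lines."""
    # Center point of the shared bicycle: midpoint of the two axle points.
    cx = (front_axle[0] + rear_axle[0]) / 2.0
    cy = (front_axle[1] + rear_axle[1]) / 2.0
    # Center line: midpoints of corresponding endpoints of the two lines.
    p0 = ((line_a[0][0] + line_b[0][0]) / 2.0,
          (line_a[0][1] + line_b[0][1]) / 2.0)
    p1 = ((line_a[1][0] + line_b[1][0]) / 2.0,
          (line_a[1][1] + line_b[1][1]) / 2.0)
    # Perpendicular distance from (cx, cy) to the center line.
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    return abs(dy * (cx - p0[0]) - dx * (cy - p0[1])) / math.hypot(dx, dy)
```

Pairing corresponding endpoints to form the center line assumes the two parking lines are roughly parallel, which is how the fig. 2 scenario is drawn.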
In some embodiments, if the shared vehicle is a shared vehicle, the key points include a third wheel center point, a fourth wheel center point, a fifth wheel center point, and a sixth wheel center point of the shared vehicle, determining the parking distance between the actual parking area and the standard parking area may include the following steps:
a first step of: and converting the image coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point into physical coordinate information under a physical coordinate system.
For the principle of conversion between the image coordinate information and the physical coordinate information, reference may be made to the above embodiment, and the description thereof is omitted here.
And a second step of: and determining physical coordinate information of a center point of the shared vehicle according to the physical coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point.
For example, as can be seen from the above embodiment and fig. 5, the physical coordinate information of each of the third, fourth, fifth and sixth wheel axle center points can be determined. The third and fifth wheel axle center points are the axle center points of the two front wheels of the shared vehicle, so the physical coordinate information of the wheel center point 505 between the two front-wheel axle center points can be calculated. Correspondingly, the fourth and sixth wheel axle center points are the axle center points of the two rear wheels of the shared vehicle, so the physical coordinate information of the wheel center point 506 between the two rear-wheel axle center points can be calculated. The center point between wheel center point 505 and wheel center point 506 (the point indicated by 507 in fig. 5) can then be determined; this point is the center point of the shared vehicle, and its physical coordinate information is the physical coordinate information of the center point of the shared vehicle.
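The midpoint chain described above (front pair to point 505, rear pair to point 506, then point 507 between them) can be written directly; representing physical coordinates as plain (x, y) tuples is an assumption for illustration:

```python
def vehicle_center(p3, p4, p5, p6):
    """p3, p5: the two front-wheel axle center points; p4, p6: the two
    rear-wheel axle center points, each (x, y) in physical coordinates.
    Returns the center point of the shared vehicle (point 507 in fig. 5)."""
    front = ((p3[0] + p5[0]) / 2.0, (p3[1] + p5[1]) / 2.0)  # point 505
    rear = ((p4[0] + p6[0]) / 2.0, (p4[1] + p6[1]) / 2.0)   # point 506
    return ((front[0] + rear[0]) / 2.0, (front[1] + rear[1]) / 2.0)
```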
And a third step of: and calculating the distance between the physical coordinate information of the center point and the standard parking area, and determining the distance between the physical coordinate information of the center point and the standard parking area as the parking distance.
Similarly, in this embodiment, the parking distance is determined by combining the physical coordinate information of the center point, so that the determined parking distance has higher accuracy and reliability, and further, the detection result has the technical effects of higher accuracy and reliability.
Fig. 7 is a schematic diagram of a third embodiment of the present disclosure, as shown in fig. 7, a method for detecting a shared vehicle parking event according to an embodiment of the present disclosure, including:
s701: a parking image is acquired when the shared vehicle is parked.
The parking image comprises a shared vehicle and a parking line for guiding the shared vehicle to park.
Similarly, in order to avoid repetitive description, the same features as those of the above embodiment are not repeated in this embodiment.
S702: key points of the shared vehicle in the parking image are identified.
S703: and determining a detection result according to the key points and the parking line.
The detection result represents whether the shared vehicle is a standard parked vehicle or not.
S704: sending alarm information to terminal equipment of an administrator user for managing the shared vehicle, wherein the alarm information is used for prompting the administrator user that the shared vehicle is a non-standard parked vehicle; and/or sending prompt information to the terminal equipment of the user using the shared vehicle, wherein the prompt information is used for prompting the user to park the non-standard parked shared vehicle again.
In one example, the detection device establishes a communication connection with the terminal device of an administrator user who manages the shared vehicle. If the shared vehicle is determined to be a non-standard parked vehicle according to the detection result, the detection device may send alert information to the terminal device of the administrator user, so that the administrator user learns that there is a non-standard parked shared vehicle.
Accordingly, the manager user may process the shared vehicle based on the warning information, such as adjusting the parking position of the shared vehicle (e.g., adjusting the shared vehicle parked outside the normal parking area to the normal parking area, or adjusting the parking angle of the shared vehicle parked in the normal parking area but with a parking angle greater than the angle threshold to be less than the angle threshold, etc.), so as to maintain traffic order, and improve travel safety of vehicles and pedestrians, etc.
In another example, the detection device establishes communication connection with a terminal device of a user of the shared vehicle, and if it is determined that the shared vehicle is a non-standard parked vehicle according to the detection result, the detection device may send a prompt message to the terminal device of the user of the shared vehicle, so that the user can park the non-standard parked shared vehicle again (e.g. adjust the shared vehicle parked outside the standard parking area to the standard parking area, or adjust the parking angle of the shared vehicle parked in the standard parking area but with a parking angle greater than the angle threshold to be less than the angle threshold, etc.), so as to maintain the traffic order, and improve the travel safety of the vehicle and the pedestrian.
In some embodiments, the warning information and the prompt information may include a parking image, physical coordinate information of the shared vehicle, and adjustment requirement information, so that an administrator user and a user can adaptively adjust the shared vehicle based on the parking image.
And optionally, the adjusted shared vehicle and the standard parked vehicle can be displayed in a superimposed manner on the terminal device of the administrator user and the terminal device of the user, so as to guide the administrator user and the user to adjust the shared vehicle.
The standard parked vehicle refers to a shared vehicle when parked in a standard manner, that is, a standard shared vehicle for displaying standard parking to a user.
The adjustment requirement information refers to information for adjusting the non-standard parked shared vehicle to meet the standard parking requirement, such as adjustment requirement distance, adjustment requirement angle, and adjustment requirement direction.
Illustratively, take adjusting the required distance as an example: the parking distance is determined as the adjustment demand distance.
Taking the adjustment of the demand angle as an example: and calculating an angle difference value between the parking angle and the angle threshold value, and determining the angle difference value as an adjustment demand angle.
Taking the adjustment of the direction of demand as an example: according to the parking direction and the standard direction, the direction when the shared vehicle is adjusted is determined, and the direction is determined as the adjustment demand direction.
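The three adjustment requirement values can be sketched together as below; representing directions as headings in degrees is an illustrative assumption, since the patent does not fix how the parking direction and standard direction are encoded:

```python
def adjustment_requirements(parking_distance: float,
                            parking_angle_deg: float,
                            angle_threshold_deg: float,
                            parking_direction_deg: float,
                            standard_direction_deg: float) -> dict:
    """Build the adjustment requirement information for a non-standard
    parked shared vehicle: the distance to move, the angle to rotate by,
    and the heading change (assumed here as a clockwise degree offset)."""
    return {
        "required_distance": parking_distance,
        "required_angle": max(0.0, parking_angle_deg - angle_threshold_deg),
        "required_direction": (standard_direction_deg - parking_direction_deg) % 360.0,
    }
```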
In some embodiments, to facilitate management of the non-standard parked shared vehicles by the administrator user, when the detected number of the non-standard parked shared vehicles reaches a preset number threshold, warning information may be sent to the terminal device of the administrator user, and the number of the non-standard parked shared vehicles may be carried in the warning information.
For example, the detection device may perform counting processing on the non-standard parked shared vehicles, and if a non-standard parked shared vehicle is detected, the counting operation of adding 1 is performed until the counted number reaches the number threshold, and a warning message is sent to the terminal device of the administrator user.
In other embodiments, when the detected parking area of the non-standard parked shared vehicle reaches the preset area threshold, the warning information may be sent to the terminal device of the administrator user, and the parking area of the non-standard parked shared vehicle may be carried in the warning information.
For example, after determining that the shared vehicle is a non-standard parked vehicle, the detection device may determine a parking area of the shared vehicle based on physical coordinate information of a key point of the shared vehicle, and may sequentially accumulate parking areas of the non-standard parked vehicles in an area accumulating manner until the accumulated parking areas reach an area threshold, and send warning information to a terminal device of an administrator user.
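The two alert triggers just described, the count threshold and the accumulated-area threshold, can be sketched as one small accumulator; the patent presents them as separate embodiments, so combining both triggers in a single object (and the class name itself) is an illustrative choice:

```python
class AlertAccumulator:
    """Tracks non-standard parking events and signals when either the
    count threshold or the accumulated parking-area threshold is reached.
    Threshold values are configuration inputs; the patent does not fix them."""

    def __init__(self, count_threshold: int, area_threshold: float):
        self.count_threshold = count_threshold
        self.area_threshold = area_threshold
        self.count = 0
        self.total_area = 0.0

    def record(self, parking_area: float) -> bool:
        """Register one non-standard parked vehicle (count + 1, accumulate
        its parking area) and return True if an alert should be sent."""
        self.count += 1
        self.total_area += parking_area
        return (self.count >= self.count_threshold
                or self.total_area >= self.area_threshold)
```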
Similarly, the number threshold and the area threshold may be determined based on a requirement, a history, and a test, which is not limited in this embodiment.
It should be noted that, in this embodiment, by sending alert information to the terminal device of the administrator user and/or sending prompt information to the terminal device of the user, the administrator user and/or the user is guided to adjust the non-standard parked shared vehicle so that it becomes a standard parked shared vehicle, thereby achieving the technical effects of maintaining traffic order and improving travel safety.
Fig. 8 is a schematic diagram of a fourth embodiment of the present disclosure, as shown in fig. 8, a shared-vehicle-park-event detection apparatus 800 of an embodiment of the present disclosure, comprising:
an obtaining unit 801 is configured to obtain a parking image when the shared vehicle is parked, where the parking image includes the shared vehicle and a parking line for guiding the shared vehicle to park.
And an identification unit 802 for identifying key points of the shared vehicle in the parking image.
And a determining unit 803, configured to determine a detection result according to the key point and the parking line, where the detection result characterizes whether the shared vehicle is a vehicle parked in a standard manner.
Fig. 9 is a schematic diagram of a fifth embodiment of the present disclosure, as shown in fig. 9, a shared-vehicle-park-event detection apparatus 900 of an embodiment of the present disclosure, including:
an acquisition unit 901 for acquiring a parking image when the shared vehicle is parked, wherein the parking image includes the shared vehicle and a parking line for guiding the shared vehicle to park.
And an identification unit 902 for identifying key points of the shared vehicle in the parking image.
A determining unit 903, configured to determine a detection result according to the key point and the parking line, where the detection result characterizes whether the shared vehicle is a vehicle parked in a standard manner.
As can be seen in connection with fig. 9, in some embodiments, the determining unit 903 comprises:
the first determining subunit 9031 is configured to determine a normal parking area of the shared vehicle according to the parking line.
A second determining subunit 9032 is configured to determine an actual parking area of the shared vehicle according to the keypoints.
The third determining subunit 9033 is configured to determine the detection result according to the standard parking area and the actual parking area.
In some embodiments, the third determination subunit 9033 comprises:
and the matching module is used for matching the actual parking area with the standard parking area to obtain a matching result.
And the first determining module is used for determining the parking angle and/or the parking direction of the shared vehicle according to the key points if the matching result indicates that the actual parking area is in the standard parking area.
In some embodiments, the keypoints have image coordinate information in the image coordinate system; a first determination module comprising:
the first conversion sub-module is used for converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system.
In some embodiments, the first conversion sub-module is configured to obtain a conversion matrix, where the conversion matrix is configured to characterize a conversion relationship that converts image coordinate information of a pixel point in the parking image that is referenced to an image coordinate system into physical coordinate information that is referenced to a physical coordinate system; and converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system according to the conversion matrix.
In some embodiments, the first conversion sub-module is configured to obtain image coordinate information and physical coordinate information corresponding to each of any three non-collinear points in the parking image, and construct a conversion matrix according to the image coordinate information and the physical coordinate information corresponding to each of any three non-collinear points; or,
The first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to any four points that form a quadrilateral in the parking image, and constructing the conversion matrix according to the image coordinate information and the physical coordinate information corresponding to those four points; or,
the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to any more than four points in the parking image, and carrying out iterative computation on the image coordinate information and the physical coordinate information corresponding to the any more than four points based on a random sampling consistency algorithm or a least square method to obtain a conversion matrix; or,
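As an illustration of the simplest of these constructions, the conversion matrix can be solved exactly from three non-collinear point pairs when it is modeled as a 2x3 affine matrix; with four or more points, RANSAC or least squares would be used instead, as the text describes. Using NumPy here, and the affine model itself, are assumptions for the sketch:

```python
import numpy as np

def affine_from_three_points(image_pts, physical_pts):
    """image_pts, physical_pts: three non-collinear (x, y) pairs. Solves
    the 2x3 affine conversion matrix M so that physical = M @ [x, y, 1]."""
    src = np.asarray(image_pts, dtype=float)      # shape (3, 2)
    dst = np.asarray(physical_pts, dtype=float)   # shape (3, 2)
    A = np.hstack([src, np.ones((3, 1))])         # rows of [x, y, 1]
    # Solve A @ X = dst; the conversion matrix is X transposed (2x3).
    return np.linalg.solve(A, dst).T

def to_physical(M, point):
    """Convert one image-coordinate point to physical coordinates."""
    x, y = point
    px, py = M @ np.array([x, y, 1.0])
    return float(px), float(py)
```

For example, three pairs related by a uniform scale of 2 recover that scale, and the resulting matrix then converts any other image point.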
the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to each of a plurality of points in the parking image, and carrying out iterative computation on the image coordinate information and the physical coordinate information corresponding to each of the plurality of points based on a preset network model to obtain a conversion matrix.
And the first determining submodule is used for determining the parking angle and/or the parking direction according to the physical coordinate information of the key points.
In some embodiments, the key points include axle center points of wheels of the shared vehicle, and the first determining submodule is configured to generate a central axis for parking the shared vehicle according to physical coordinate information of the axle center points of the wheels, and determine a parking angle and/or a parking direction according to the central axis.
In some embodiments, park direction index lines are included in the park lines; the first determination submodule is used for acquiring physical coordinate information of the parking direction guide wire under a physical coordinate system; and calculating an included angle between the central axis and the parking direction guide wire according to the physical coordinate information of the parking direction guide wire, and determining the included angle as a parking angle.
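The included angle between the central axis and the parking direction guide line can be computed from their direction vectors; folding the result into [0, 90] degrees is an assumption about how the parking angle is reported:

```python
import math

def parking_angle_deg(axis_p0, axis_p1, guide_p0, guide_p1):
    """Angle in degrees between the vehicle central axis (axis_p0 to
    axis_p1) and the parking direction guide line (guide_p0 to guide_p1),
    folded into [0, 90] so the axis orientation sign does not matter."""
    ax, ay = axis_p1[0] - axis_p0[0], axis_p1[1] - axis_p0[1]
    gx, gy = guide_p1[0] - guide_p0[0], guide_p1[1] - guide_p0[1]
    cos_t = abs(ax * gx + ay * gy) / (math.hypot(ax, ay) * math.hypot(gx, gy))
    return math.degrees(math.acos(min(1.0, cos_t)))
```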
In some embodiments, if the shared vehicle is a shared bicycle or a shared electric vehicle, the axle center point includes a first wheel center point and a second wheel center point of the shared vehicle; the first determining submodule is used for determining a line comprising the first wheel axle center point and the second wheel axle center point as a central axis of the parking shared vehicle according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point.
In some embodiments, the keypoints comprise at least a plurality of points on the handlebars of the shared vehicle; the first determination submodule is used for determining a suspected parking direction according to the central axis; determining the direction of the handlebar according to the physical coordinate information of a plurality of points on the handlebar; and determining the parking direction according to the direction of the handle bar and the suspected parking direction.
In some embodiments, if the shared vehicle is a shared bicycle or a shared electric vehicle, the key points include at least a plurality of points on the handlebars of the shared vehicle; the first determining submodule is used for determining the direction of the handlebar according to physical coordinate information of a plurality of points on the handlebar and determining the parking direction according to the direction of the handlebar.
In some embodiments, if the shared vehicle is a shared vehicle, the axle center points include a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; the first determining submodule is used for determining a plurality of wheel axes of the shared vehicle according to the physical coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point; and determining the central axis of the parked shared vehicle according to the wheel axes.
In some embodiments, if the shared vehicle is a shared vehicle, the axle center points include a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; the first determining submodule is used for determining a central point among a plurality of wheel axle center points of the shared vehicle according to physical coordinate information corresponding to each of the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point; and determining the central axis of the shared vehicle according to the central points among the wheel axis points of the shared vehicle.
And the second determining module is used for determining a detection result according to the parking angle and/or the parking direction.
In some embodiments, the second determining module is configured to determine that the detection result is a detection result indicating that the shared vehicle is a vehicle that is not parked in a standard if the parking angle is greater than a preset angle threshold and/or the parking direction is different from a preset standard direction.
In some embodiments, the third determining subunit 9033 further comprises:
and the third determining module is used for determining the parking distance between the actual parking area and the standard parking area according to the key points if the matching result indicates that the actual parking area is outside the standard parking area.
In some embodiments, the keypoints include at least the wheel axle center point of the shared vehicle; a third determination module, comprising:
the second conversion sub-module is used for converting the image coordinate information of the wheel axle center point into physical coordinate information under a physical coordinate system.
And the second determining submodule is used for determining the physical coordinate information of the center point of the shared vehicle according to the physical coordinate information of the wheel center point.
In some embodiments, if the shared vehicle is a shared bicycle or a shared electric vehicle, the axle center point includes a first wheel center point and a second wheel center point of the shared vehicle; the second determining sub-module is used for determining a center point between the first wheel axle center point and the second wheel axle center point according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point, and determining the physical coordinate information of the center point between the first wheel axle center point and the second wheel axle center point as the physical coordinate information of the center point of the shared vehicle.
And the calculating sub-module is used for calculating the distance between the physical coordinate information of the center point and the standard parking area.
And the third determination submodule is used for determining the distance between the physical coordinate information of the central point and the standard parking area as the parking distance.
And the generating module is used for generating a detection result comprising the parking distance.
A detection result that includes the parking distance characterizes the shared vehicle as a vehicle that is not parked in a standard manner.
The first output unit 904 is configured to send alert information to a terminal device of an administrator user managing the shared vehicle, where the alert information is configured to prompt the administrator user that the shared vehicle is a vehicle that is not parked in a standard manner. And/or the number of the groups of groups,
the second output unit 905 is configured to send a prompt message to a terminal device of a user using the shared vehicle, where the prompt message is configured to prompt the user to park the non-standard parked shared vehicle again.
According to another aspect of the present disclosure, there is also provided a system for detecting a shared vehicle parking event, the system comprising: an image acquisition device and the device for detecting a shared vehicle parking event according to any one of the above embodiments, wherein the image acquisition device is configured to acquire a parking image when the shared vehicle is parked.
In some embodiments, the detection system further comprises a storage device configured to receive detection results transmitted by the detection means of the shared vehicle parking event and store the detection results.
The storage device may be a local storage device or a cloud storage device. For example, the detection result may be stored in a local storage device, or may be stored in a cloud storage device, or may be stored in both the local storage device and the cloud storage device, which is not limited in this embodiment.
The image acquisition device can be an electronic device with an image acquisition function, such as a camera. The mounting height information of the image acquisition device may be determined based on the image acquisition range of the image acquisition device.
For example, the field-of-view angle of the image acquisition device may be determined based on its focal length; the image acquisition range of the device at each candidate installation height is then determined according to the field-of-view angle; the image acquisition range corresponding to the standard parking area, that is, the range that fully covers the standard parking area, is determined from those ranges; and the candidate installation height corresponding to that range is determined as the installation height information. This improves how comprehensively the image acquisition device captures parking images within the standard parking area, and thereby achieves the technical effects of detection accuracy and reliability.
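A rough sketch of choosing the installation height from the field-of-view angle; the nadir-view, flat-ground model (ground coverage = 2 * h * tan(fov / 2)) is an illustrative simplification of the geometry described above:

```python
import math

def required_height(area_depth_m: float, vertical_fov_deg: float) -> float:
    """Minimum installation height (meters) at which a downward-facing
    camera with the given vertical field of view covers a standard
    parking area of the given depth. Assumes flat ground and a camera
    pointed straight down; a tilted camera would need a fuller model."""
    half_fov_rad = math.radians(vertical_fov_deg) / 2.0
    return area_depth_m / (2.0 * math.tan(half_fov_rad))
```

With a 90-degree vertical field of view, the coverage at height h is 2h, so a 4 m deep parking area needs a height of at least 2 m under this model.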
In other embodiments, it may be determined, for each candidate installation height, whether any shared vehicle parked in the standard parking area would be blocked by other shared vehicles parked in the standard parking area, and the installation height information is determined from the candidate heights based on this occlusion information and the standard parking area, so as to avoid the defect that excessive occlusion between shared vehicles reduces detection accuracy, thereby further improving the technical effects of reliability, accuracy, and effectiveness of detection.
Fig. 10 is a schematic diagram according to a sixth embodiment of the present disclosure. As shown in fig. 10, an electronic device 1000 in the present disclosure may include: a processor 1001 and a memory 1002.
A memory 1002 for storing a program. The memory 1002 may include volatile memory, such as random-access memory (RAM), for example static random-access memory (SRAM) or double data rate synchronous dynamic random-access memory (DDR SDRAM); the memory may also include non-volatile memory, such as flash memory. The memory 1002 is used to store computer programs (e.g., application programs or functional modules that implement the methods described above), computer instructions, and the like, which may be stored in one or more memories 1002 in a partitioned manner. The above computer programs, computer instructions, data, and the like may be invoked by the processor 1001.
A processor 1001 for executing computer programs stored in a memory 1002 to implement the steps in the method according to the above embodiment.
Reference may be made in particular to the description of the embodiments of the method described above.
The processor 1001 and the memory 1002 may be separate structures or may be integrated into a single structure. When the processor 1001 and the memory 1002 are separate structures, the memory 1002 and the processor 1001 may be coupled by a bus 1003.
The electronic device in this embodiment may execute the technical scheme in the above method, and the specific implementation process and the technical principle are the same, which are not described herein again.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
According to an embodiment of the present disclosure, the present disclosure also provides a computer program product comprising: a computer program stored in a readable storage medium, from which at least one processor of an electronic device can read, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any one of the embodiments described above.
Fig. 11 illustrates a schematic block diagram of an example electronic device 1100 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 11, the apparatus 1100 includes a computing unit 1101 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1102 or a computer program loaded from a storage unit 1108 into a Random Access Memory (RAM) 1103. In the RAM 1103, various programs and data required for the operation of the device 1100 can also be stored. The computing unit 1101, ROM 1102, and RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to bus 1104.
Various components in device 1100 are connected to I/O interface 1105, including: an input unit 1106 such as a keyboard, a mouse, etc.; an output unit 1107 such as various types of displays, speakers, and the like; a storage unit 1108, such as a magnetic disk, optical disk, etc.; and a communication unit 1109 such as a network card, modem, wireless communication transceiver, or the like. The communication unit 1109 allows the device 1100 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 1101 may be a variety of general purpose and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 1101 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 1101 performs the various methods and processes described above, such as a method of detecting a shared vehicle parking event. For example, in some embodiments, the method of detecting a shared vehicle parking event may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1108. In some embodiments, some or all of the computer programs may be loaded and/or installed onto device 1100 via ROM 1102 and/or communication unit 1109. When the computer program is loaded into the RAM 1103 and executed by the computing unit 1101, one or more steps of the above-described method of detecting a shared vehicle parking event may be performed. Alternatively, in other embodiments, the computing unit 1101 may be configured to perform the method of detecting a shared vehicle parking event in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the difficulty of management and the weak service scalability of traditional physical hosts and VPS services ("Virtual Private Server", or simply "VPS"). The server may also be a server of a distributed system or a server that incorporates a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel or sequentially or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, and are not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.
Claims (32)
1. A method of detecting a shared vehicle parking event, comprising:
obtaining a parking image when a shared vehicle is parked, wherein the parking image comprises the shared vehicle and a parking line for guiding the shared vehicle to park;
identifying key points of the shared vehicle in the parking image, wherein the key points have image coordinate information in an image coordinate system, and the key points comprise wheel center points of wheels of the shared vehicle;
determining a standard parking area of the shared vehicle according to the parking line, and determining an actual parking area of the shared vehicle according to the key point;
matching the actual parking area with the standard parking area to obtain a matching result;
if the matching result indicates that the actual parking area is in the standard parking area, determining a parking angle and/or a parking direction of the shared vehicle according to the key points, and determining a detection result according to the parking angle and/or the parking direction, wherein the detection result indicates whether the shared vehicle is a standard parked vehicle or not;
determining the parking angle and/or the parking direction of the shared vehicle according to the key points, wherein the method comprises the following steps:
and converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system, generating a central axis for parking the shared vehicle according to the physical coordinate information of the wheel axle center point, and determining the parking angle and/or the parking direction according to the central axis.
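The matching of the actual parking area against the standard parking area in claim 1 can be sketched as a polygon containment test in the physical coordinate system. This is a minimal sketch assuming both areas are given as ordered corner lists; the ray-casting test and all names here are illustrative assumptions, not the claimed implementation.

```python
def point_in_polygon(pt, poly):
    # Ray-casting test: count how many polygon edges a horizontal ray
    # from pt crosses; an odd count means the point is inside.
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def actual_area_in_standard_area(actual_corners, standard_corners):
    # The matching result indicates "inside" when every corner of the
    # actual parking area lies within the standard parking area.
    return all(point_in_polygon(p, standard_corners) for p in actual_corners)
```

A full implementation would also need to handle points exactly on the boundary, which this sketch leaves unspecified.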
2. The method of claim 1, wherein the parking line includes a parking direction guide line; determining the parking angle from the central axis includes:
acquiring physical coordinate information of the parking direction guide line in the physical coordinate system;
and calculating an included angle between the central axis and the parking direction guide line according to the physical coordinate information of the parking direction guide line, and determining the included angle as the parking angle.
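The included-angle computation of claim 2 reduces to the angle between two direction vectors. The sketch below assumes the central axis is given by the two wheel-axle centre points and the guide line by two points on it, all in physical coordinates; the names and the fold of the result into [0, 90] degrees are assumptions of this sketch.

```python
import math

def parking_angle_deg(front_axle, rear_axle, guide_a, guide_b):
    # Central axis vector between the two wheel-axle centre points,
    # and direction vector of the parking direction guide line.
    ax, ay = front_axle[0] - rear_axle[0], front_axle[1] - rear_axle[1]
    gx, gy = guide_b[0] - guide_a[0], guide_b[1] - guide_a[1]
    cos_t = (ax * gx + ay * gy) / (math.hypot(ax, ay) * math.hypot(gx, gy))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
    # A line has no orientation, so fold the included angle into [0, 90].
    return min(angle, 180.0 - angle)
```

The detection result of claim 8 would then compare this angle against a preset angle threshold.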
3. The method of claim 1 or 2, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the wheel axle center points comprise a first wheel axle center point and a second wheel axle center point of the shared vehicle; generating a central axis for parking the shared vehicle according to the physical coordinate information of the wheel axle center points, including:
and determining a line comprising the first wheel axle center point and the second wheel axle center point as a central axis for parking the shared vehicle according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point.
4. A method according to claim 3, wherein the key points comprise at least a plurality of points on a handlebar of the shared vehicle; determining the parking direction according to the central axis comprises:
determining a suspected parking direction according to the central axis;
determining the direction of the handlebar according to physical coordinate information of a plurality of points on the handlebar;
And determining the parking direction according to the direction of the handle bar and the suspected parking direction.
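The disambiguation in claim 4 can be sketched as follows: the central axis yields two candidate (suspected) directions, and the handlebar, which sits over the front wheel of a bicycle or electric vehicle, selects one. The heuristic of comparing distances to the handlebar centroid, and all names here, are assumptions of this sketch rather than the claimed method.

```python
import math

def parking_direction(axle_a, axle_b, handlebar_pts):
    # Centroid of the identified handlebar key points.
    hx = sum(p[0] for p in handlebar_pts) / len(handlebar_pts)
    hy = sum(p[1] for p in handlebar_pts) / len(handlebar_pts)
    # The handlebar is nearer the front wheel, so point the central
    # axis from the farther axle centre toward the nearer one.
    d_a = math.hypot(axle_a[0] - hx, axle_a[1] - hy)
    d_b = math.hypot(axle_b[0] - hx, axle_b[1] - hy)
    rear, front = (axle_a, axle_b) if d_a > d_b else (axle_b, axle_a)
    return (front[0] - rear[0], front[1] - rear[1])
```

The returned vector can then be compared against a preset standard direction, as in claim 8.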
5. The method of claim 1, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the key points comprise at least a plurality of points on a handlebar of the shared vehicle; determining the parking direction according to the physical coordinate information of the key points comprises the following steps:
and determining the direction of the handlebar according to the physical coordinate information of a plurality of points on the handlebar, and determining the parking direction according to the direction of the handlebar.
6. The method of claim 1, wherein if the shared vehicle is a shared automobile, the axle center point comprises a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; generating a central axis for parking the shared vehicle according to the physical coordinate information of the wheel axle center point, including:
determining a plurality of wheel axes of the shared vehicle according to the physical coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point;
and determining the central axis for parking the shared vehicle according to the wheel axes.
7. The method of claim 1 or 2, wherein if the shared vehicle is a shared automobile, the axle center points include a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; generating a central axis for parking the shared vehicle according to the physical coordinate information of the wheel axle center point, including:
determining a center point among a plurality of wheel center points of the shared vehicle according to physical coordinate information corresponding to each of the third wheel center point, the fourth wheel center point, the fifth wheel center point and the sixth wheel center point;
and determining the central axis of the shared vehicle according to the central points among the wheel axis points of the shared vehicle.
8. The method according to any of claims 4-6, wherein determining the detection result from the parking angle and/or the parking direction comprises:
and if the parking angle is larger than a preset angle threshold value and/or the parking direction is different from a preset standard direction, determining that the detection result is a detection result representing that the shared vehicle is a vehicle which is not parked normally.
9. The method of claim 8, after matching the actual parking area with the standard parking area, further comprising:
if the matching result represents that the actual parking area is outside the standard parking area, determining the parking distance between the actual parking area and the standard parking area according to the key points, and generating a detection result comprising the parking distance;
the detection result comprising the parking distance is a detection result representing that the shared vehicle is a vehicle which is not parked normally.
10. The method of claim 9, wherein the key points comprise at least wheel axle center points of the shared vehicle; determining the parking distance between the actual parking area and the standard parking area according to the key points includes:
converting the image coordinate information of the wheel axle center point into physical coordinate information under a physical coordinate system, and determining the physical coordinate information of the center point of the shared vehicle according to the physical coordinate information of the wheel axle center point;
and calculating the distance between the physical coordinate information of the central point and the standard parking area, and determining the distance between the physical coordinate information of the central point and the standard parking area as the parking distance.
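The parking distance of claims 9 and 10 can be illustrated as the minimum distance from the vehicle centre point to the boundary of the standard parking area. The sketch assumes the area is an ordered corner list in physical coordinates and that the centre point lies outside it; the names are illustrative.

```python
import math

def point_segment_dist(p, a, b):
    # Distance from point p to the closed segment ab.
    dx, dy = b[0] - a[0], b[1] - a[1]
    if dx == 0 and dy == 0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    # Parameter of the projection of p onto ab, clamped to the segment.
    t = ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy))

def parking_distance(center, area_corners):
    # Minimum distance from the vehicle centre point to the boundary
    # of the standard parking area (corners given in order).
    n = len(area_corners)
    return min(point_segment_dist(center, area_corners[i], area_corners[(i + 1) % n])
               for i in range(n))
```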
11. The method of claim 10, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the axle center point comprises a first wheel center point and a second wheel center point of the shared vehicle; determining physical coordinate information of a center point of the shared vehicle according to the physical coordinate information of the wheel axle center point, including:
And determining a center point between the first wheel axle center point and the second wheel axle center point according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point, and determining the physical coordinate information of the center point between the first wheel axle center point and the second wheel axle center point as the physical coordinate information of the center point of the shared vehicle.
12. The method of any of claims 4-6, wherein converting the image coordinate information of the keypoint to physical coordinate information under a physical coordinate system comprises:
obtaining a conversion matrix, wherein the conversion matrix is used for representing a conversion relation of converting image coordinate information of pixel points taking an image coordinate system as a reference in the parking image into physical coordinate information taking a physical coordinate system as a reference;
and converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system according to the conversion matrix.
13. The method of claim 12, wherein obtaining a conversion matrix comprises:
acquiring image coordinate information and physical coordinate information corresponding to any three non-collinear points in the parking image, and constructing the conversion matrix according to the image coordinate information and the physical coordinate information corresponding to the three non-collinear points; or,
acquiring image coordinate information and physical coordinate information corresponding to points which form a quadrilateral in the parking image, and constructing the conversion matrix according to the image coordinate information and the physical coordinate information corresponding to the points which form the quadrilateral; or,
acquiring image coordinate information and physical coordinate information corresponding to any more than four points in the parking image, and performing iterative computation on the image coordinate information and the physical coordinate information corresponding to the more than four points based on a random sampling consistency algorithm or a least squares method to obtain the conversion matrix; or,
acquiring image coordinate information and physical coordinate information corresponding to each of a plurality of points in the parking image, and performing iterative computation on the image coordinate information and the physical coordinate information corresponding to each of the plurality of points based on a preset network model to obtain the conversion matrix.
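The first alternative of claim 13 (a conversion matrix from any three non-collinear points) corresponds to estimating a 2x3 affine transform, which the sketch below solves exactly by Cramer's rule. This is one possible reading of that alternative; the other alternatives (four-point, RANSAC/least-squares, network-based) are not shown, and all names here are illustrative assumptions.

```python
def affine_from_3pts(img_pts, phys_pts):
    # Each physical coordinate satisfies a*u + b*v + c = x (and likewise
    # for y); three point pairs give two independent 3x3 linear systems.
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    A = [[u, v, 1.0] for (u, v) in img_pts]
    d = det3(A)  # non-zero iff the three points are non-collinear
    def solve(rhs):
        coeffs = []
        for j in range(3):
            M = [row[:] for row in A]
            for i in range(3):
                M[i][j] = rhs[i]
            coeffs.append(det3(M) / d)
        return coeffs
    row_x = solve([p[0] for p in phys_pts])
    row_y = solve([p[1] for p in phys_pts])
    return [row_x, row_y]  # 2x3 conversion matrix

def apply_affine(T, pt):
    # Convert one image coordinate into a physical coordinate.
    u, v = pt
    return (T[0][0] * u + T[0][1] * v + T[0][2],
            T[1][0] * u + T[1][1] * v + T[1][2])
```

Note that three points determine only an affine map; the quadrilateral alternative of claim 13 would instead determine a full perspective homography.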
14. The method of claim 13, further comprising:
sending alarm information to terminal equipment of an administrator user for managing the shared vehicle, wherein the alarm information is used for prompting the administrator user that the shared vehicle is a non-standard parked vehicle;
And/or sending prompt information to terminal equipment of a user using the shared vehicle, wherein the prompt information is used for prompting the user to park the non-standard parked shared vehicle again.
15. A shared vehicle parking event detection apparatus, comprising:
an acquisition unit configured to acquire a parking image when a shared vehicle is parked, wherein the parking image includes the shared vehicle and a parking line for guiding the shared vehicle to park;
an identification unit configured to identify key points of the shared vehicle in the parking image;
the determining unit is used for determining a detection result according to the key points and the parking line, wherein the detection result represents whether the shared vehicle is a standard parked vehicle or not;
the determination unit includes:
a first determining subunit, configured to determine a standard parking area of the shared vehicle according to the parking line;
a second determining subunit, configured to determine an actual parking area of the shared vehicle according to the keypoints;
a third determining subunit, configured to determine the detection result according to the standard parking area and the actual parking area;
The third determination subunit includes:
the matching module is used for matching the actual parking area with the standard parking area to obtain a matching result;
the first determining module is used for determining the parking angle and/or the parking direction of the shared vehicle according to the key point if the matching result represents that the actual parking area is in the standard parking area;
the second determining module is used for determining the detection result according to the parking angle and/or the parking direction;
wherein the key points have image coordinate information under an image coordinate system; the first determining module includes:
the first conversion sub-module is used for converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system;
the first determining submodule is used for determining the parking angle and/or the parking direction according to the physical coordinate information of the key points;
the key points comprise wheel center points of wheels of the shared vehicle, and the first determination submodule is used for generating a central axis for parking the shared vehicle according to physical coordinate information of the wheel center points and determining the parking angle and/or the parking direction according to the central axis.
16. The apparatus of claim 15, wherein the parking line includes a parking direction guide line; the first determination submodule is used for acquiring physical coordinate information of the parking direction guide line in the physical coordinate system; and calculating an included angle between the central axis and the parking direction guide line according to the physical coordinate information of the parking direction guide line, and determining the included angle as the parking angle.
17. The apparatus of claim 15 or 16, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the axle center point comprises a first wheel center point and a second wheel center point of the shared vehicle; the first determining submodule is used for determining a line comprising the first wheel axle center point and the second wheel axle center point as a central axis for parking the shared vehicle according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point.
18. The apparatus of claim 17, wherein the keypoints comprise at least a plurality of points on a handlebar of the shared vehicle; the first determination submodule is used for determining a suspected parking direction according to the central axis; determining the direction of the handlebar according to physical coordinate information of a plurality of points on the handlebar; and determining the parking direction according to the direction of the handle bar and the suspected parking direction.
19. The apparatus of claim 15, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the key points comprise at least a plurality of points on a handlebar of the shared vehicle; the first determining submodule is used for determining the direction of the handlebar according to physical coordinate information of a plurality of points on the handlebar and determining the parking direction according to the direction of the handlebar.
20. The apparatus of claim 15, wherein if the shared vehicle is a shared automobile, the axle center point comprises a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; the first determining submodule is used for determining a plurality of wheel axes of the shared vehicle according to physical coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point; and determining the central axis for parking the shared vehicle according to the wheel axes.
21. The apparatus of claim 15 or 16, wherein if the shared vehicle is a shared automobile, the axle center point comprises a third axle center point, a fourth axle center point, a fifth axle center point, and a sixth axle center point of the shared vehicle; the first determining submodule is used for determining a center point among a plurality of wheel axle center points of the shared vehicle according to physical coordinate information corresponding to the third wheel axle center point, the fourth wheel axle center point, the fifth wheel axle center point and the sixth wheel axle center point; and determining the central axis of the shared vehicle according to the central points among the wheel axis points of the shared vehicle.
22. The apparatus according to any one of claims 18-20, wherein the second determining module is configured to determine the detection result as a detection result indicating that the shared vehicle is a vehicle that is not parked in a standard if the parking angle is greater than a preset angle threshold and/or the parking direction is different from a preset standard direction.
23. The apparatus of claim 22, wherein the third determining subunit further comprises:
a third determining module, configured to determine a parking distance between the actual parking area and the standard parking area according to the key point if the matching result indicates that the actual parking area is outside the standard parking area;
the generation module is used for generating a detection result comprising the parking distance;
the detection result comprising the parking distance is a detection result representing that the shared vehicle is a vehicle which is not parked normally.
24. The apparatus of claim 23, wherein the keypoints comprise at least axle center points of the shared vehicle; the third determining module includes:
the second conversion sub-module is used for converting the image coordinate information of the wheel axle center point into physical coordinate information under a physical coordinate system;
The second determining submodule is used for determining physical coordinate information of a center point of the shared vehicle according to the physical coordinate information of the wheel axle center point;
a calculating sub-module, configured to calculate a distance between the physical coordinate information of the center point and the standard parking area;
and a third determining sub-module, configured to determine a distance between the physical coordinate information of the center point and the standard parking area as the parking distance.
25. The apparatus of claim 24, wherein if the shared vehicle is a shared bicycle or a shared electric vehicle, the axle center point comprises a first wheel center point and a second wheel center point of the shared vehicle; the second determining submodule is used for determining a center point between the first wheel axle center point and the second wheel axle center point according to the physical coordinate information of the first wheel axle center point and the physical coordinate information of the second wheel axle center point, and determining the physical coordinate information of the center point between the first wheel axle center point and the second wheel axle center point as the physical coordinate information of the center point of the shared vehicle.
26. The apparatus of any one of claims 18-20, wherein the first conversion sub-module is configured to obtain a conversion matrix, wherein the conversion matrix is configured to characterize a conversion relationship that converts image coordinate information of pixels in the parking image that are referenced to an image coordinate system into physical coordinate information that is referenced to a physical coordinate system; and converting the image coordinate information of the key points into physical coordinate information under a physical coordinate system according to the conversion matrix.
27. The apparatus of claim 26, wherein the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to each of any three non-collinear points in the parking image, and constructing the conversion matrix according to the image coordinate information and the physical coordinate information corresponding to each of the three non-collinear points; or,
the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to points which form a quadrilateral arbitrarily in the parking image, and constructing the conversion matrix according to the image coordinate information and the physical coordinate information corresponding to the points which form the quadrilateral arbitrarily; or,
the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to any more than four points in the parking image, and carrying out iterative computation on the image coordinate information and the physical coordinate information corresponding to the any more than four points based on a random sampling consistency algorithm or a least square method to obtain the conversion matrix; or,
the first conversion sub-module is used for acquiring image coordinate information and physical coordinate information corresponding to each of a plurality of points in the parking image, and performing iterative computation on the image coordinate information and the physical coordinate information corresponding to each of the plurality of points based on a preset network model to obtain the conversion matrix.
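The branches of claim 27 are the standard ways of estimating a planar image-to-ground mapping: an affine transform from three non-collinear points, a full homography from four quadrilateral points, or a least-squares/RANSAC fit over more than four correspondences. The sketch below, with assumed names and toy values, shows the least-squares branch (direct linear transform) and a minimal RANSAC wrapper; it illustrates the general technique, not the patent's implementation:

```python
import numpy as np

def fit_homography(img_pts, phy_pts):
    """Least-squares DLT: solve the 3x3 conversion matrix H (with h22
    fixed to 1) mapping image points (u, v) to physical points (x, y)."""
    A, b = [], []
    for (u, v), (x, y) in zip(img_pts, phy_pts):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x]); b.append(x)
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y]); b.append(y)
    h, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                            rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def to_physical(H, u, v):
    """Claim 26: apply the conversion matrix to one image point."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

def fit_homography_ransac(img_pts, phy_pts, n_iters=200, tol=0.05, seed=0):
    """RANSAC branch of claim 27: fit on random 4-point samples, keep
    the model with the most inliers, then refit on all inliers."""
    rng = np.random.default_rng(seed)
    img_pts, phy_pts = list(img_pts), list(phy_pts)
    best = []
    for _ in range(n_iters):
        idx = rng.choice(len(img_pts), 4, replace=False)
        H = fit_homography([img_pts[i] for i in idx],
                           [phy_pts[i] for i in idx])
        inliers = [i for i, ((u, v), (x, y))
                   in enumerate(zip(img_pts, phy_pts))
                   if sum((a - c) ** 2 for a, c in
                          zip(to_physical(H, u, v), (x, y))) < tol ** 2]
        if len(inliers) > len(best):
            best = inliers
    return fit_homography([img_pts[i] for i in best],
                          [phy_pts[i] for i in best])
```

With exactly four non-degenerate correspondences the system is exactly determined, which corresponds to the quadrilateral branch; the three-point affine branch amounts to fixing the last row of H to (0, 0, 1).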
28. The apparatus of claim 27, further comprising:
the first output unit is used for sending warning information to terminal equipment of an administrator user for managing the shared vehicle, and the warning information is used for prompting the administrator user that the shared vehicle is a non-standard parked vehicle; and/or the number of the groups of groups,
and the second output unit is used for sending prompt information to the terminal equipment of the user using the shared vehicle, wherein the prompt information is used for prompting the user to park the non-standard parked shared vehicle again.
29. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-14.
30. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-14.
31. A computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of any of claims 1-14.
32. A system for detecting a shared vehicle parking event, comprising: an image acquisition device and a detection device for a shared vehicle parking event as claimed in any one of claims 15-28, wherein
the image acquisition device is used for acquiring a parking image when the shared vehicle is parked.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111649234.5A CN114333390B (en) | 2021-12-29 | 2021-12-29 | Method, device and system for detecting shared vehicle parking event |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114333390A CN114333390A (en) | 2022-04-12 |
CN114333390B (en) | 2023-08-08
Family ID: 81019666
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111649234.5A Active CN114333390B (en) | 2021-12-29 | 2021-12-29 | Method, device and system for detecting shared vehicle parking event |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114333390B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116504092B (en) * | 2023-04-24 | 2024-06-25 | 深圳市泰比特科技有限公司 | Method, device, equipment and storage medium for calibrating parking position of shared vehicle |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003242593A (en) * | 2002-02-18 | 2003-08-29 | Kawatetsu Galvanizing Co Ltd | Bicycle parking system |
CN107230346A (en) * | 2017-08-01 | 2017-10-03 | 何永安 | Confirmation method, device, server and the storage medium of shared bicycle parking specification |
TW201923693A (en) * | 2017-11-17 | 2019-06-16 | 愛飛凌科技股份有限公司 | Sharing bicycle parking management method and the management system of the same |
CN111242002A (en) * | 2020-01-10 | 2020-06-05 | 上海大学 | Shared bicycle standardized parking judgment method based on computer vision |
CN111754758A (en) * | 2017-06-16 | 2020-10-09 | 侯苏华 | Shared bicycle and parking method thereof |
CN112580477A (en) * | 2020-12-12 | 2021-03-30 | 江西洪都航空工业集团有限责任公司 | Shared bicycle random parking and random parking detection method |
CN112712723A (en) * | 2020-12-25 | 2021-04-27 | 永安行科技股份有限公司 | Shared vehicle standard parking system and method based on machine vision assistance |
CN113052141A (en) * | 2021-04-26 | 2021-06-29 | 超级视线科技有限公司 | Method and device for detecting parking position of vehicle |
CN113076896A (en) * | 2021-04-09 | 2021-07-06 | 北京骑胜科技有限公司 | Standard parking method, system, device and storage medium |
CN113780183A (en) * | 2021-09-13 | 2021-12-10 | 宁波小遛共享信息科技有限公司 | Standard parking determination method and device for shared vehicles and computer equipment |
- 2021-12-29: application CN202111649234.5A (CN), granted as CN114333390B, status: Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110163930B (en) | Lane line generation method, device, equipment, system and readable storage medium | |
EP3633539A2 (en) | Method for position detection, device, and storage medium | |
CN109271944B (en) | Obstacle detection method, obstacle detection device, electronic apparatus, vehicle, and storage medium | |
CN109931944B (en) | AR navigation method, AR navigation device, vehicle-side equipment, server side and medium | |
US11226200B2 (en) | Method and apparatus for measuring distance using vehicle-mounted camera, storage medium, and electronic device | |
JP2020047276A (en) | Method and device for calibrating sensor, computer device, medium, and vehicle | |
JP2020064046A (en) | Vehicle position determining method and vehicle position determining device | |
CN109974734A (en) | A kind of event report method, device, terminal and storage medium for AR navigation | |
CN110299028B (en) | Parking line crossing detection method, device, equipment and readable storage medium | |
CN112525147B (en) | Distance measurement method for automatic driving equipment and related device | |
KR101995223B1 (en) | System, module and method for detecting pedestrian, computer program | |
CN110909626A (en) | Vehicle line pressing detection method and device, mobile terminal and storage medium | |
EP4403879A1 (en) | Vehicle, vehicle positioning method and apparatus, device, and computer-readable storage medium | |
JP2020013573A (en) | Three-dimensional image reconstruction method of vehicle | |
CN112700486A (en) | Method and device for estimating depth of road lane line in image | |
WO2023184869A1 (en) | Semantic map construction and localization method and apparatus for indoor parking lot | |
CN114333390B (en) | Method, device and system for detecting shared vehicle parking event | |
CN114705121A (en) | Vehicle pose measuring method and device, electronic equipment and storage medium | |
EP4024084A2 (en) | Spatial parking place detection method and device, storage medium, and program product | |
CN114386481A (en) | Vehicle perception information fusion method, device, equipment and storage medium | |
CN110111018A (en) | Assess method, apparatus, electronic equipment and the storage medium of vehicle sensing function | |
JP2018073275A (en) | Image recognition device | |
CN117470258A (en) | Map construction method, device, equipment and medium | |
CN112902911B (en) | Ranging method, device, equipment and storage medium based on monocular camera | |
CN108489506B (en) | Method, device, terminal and medium for displaying enlarged intersection image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||