CN115705551A - Method, device and equipment for acquiring vehicle motion track and readable storage medium - Google Patents


Info

Publication number: CN115705551A
Authority: CN (China)
Prior art keywords: data, vehicle, park, target, platform
Application number: CN202110932737.7A
Other languages: Chinese (zh)
Inventors: 段淼然, 张晓威, 陈皓昕, 吴莹莹, 张玲, 陈慧, 张晓乐, 张小凤, 李玉波, 赵敏, 李冲, 孙天野, 郭子荣, 郑紫月
Current Assignee: Hanhai Information Technology Shanghai Co Ltd
Original Assignee: Hanhai Information Technology Shanghai Co Ltd
Application filed by Hanhai Information Technology Shanghai Co Ltd
Priority to CN202110932737.7A
Publication of CN115705551A
Legal status: Pending

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The application discloses a method, a device and equipment for acquiring a vehicle activity track, and a readable storage medium, belonging to the technical field of computers. The method comprises the following steps: acquiring park entry data of a target vehicle in a target time period; acquiring park exit data of the target vehicle in the target time period; acquiring platform data of the target vehicle in the target time period; and generating an activity track of the target vehicle based on the park entry data, the park exit data and the platform data of the target vehicle in the target time period. Because the acquired data is comprehensive, the generated activity track can cover all track points of the target vehicle, so the generated track has high accuracy, reliability and usability.

Description

Method, device and equipment for acquiring vehicle motion track and readable storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method, a device and equipment for acquiring a vehicle activity track and a readable storage medium.
Background
With the popularization of vehicles, vehicle-based transportation is used more and more widely. For example, a supplier transports goods to a central warehouse via a first transport vehicle, and a carrier transports the sold goods from the central warehouse to each delivery destination via a second transport vehicle within a target time after an order is placed. There are multiple platforms in the central warehouse; the first transport vehicle unloads goods at one platform and the second transport vehicle loads goods at another platform.
To help suppliers and/or carriers understand the loading and unloading efficiency of transport vehicles and the operating efficiency of the platforms, a method for acquiring the activity track of a transport vehicle is needed.
Disclosure of Invention
The embodiment of the application provides a method, a device and equipment for acquiring a vehicle motion track and a readable storage medium, which can be used for solving the problems in the related art. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a method for acquiring a vehicle activity track, where the method includes:
acquiring park entry data of a target vehicle in a target time period;
acquiring park exit data of the target vehicle in the target time period;
acquiring platform data of the target vehicle in the target time period;
and generating an activity track of the target vehicle based on the park entry data, the park exit data and the platform data of the target vehicle in the target time period.
In one possible implementation manner, the acquiring park entry data of a target vehicle in a target time period includes:
acquiring candidate park entry data, wherein the candidate park entry data includes park entry time and vehicle information of vehicles entering the park;
and taking, as the park entry data of the target vehicle, the park entry data in first park entry data whose vehicle information is consistent with the vehicle information of the target vehicle, wherein the first park entry data is the park entry data in the candidate park entry data whose park entry time is within the target time period.
In one possible implementation manner, the acquiring park exit data of the target vehicle in the target time period includes:
acquiring candidate park exit data, wherein the candidate park exit data includes park exit time and vehicle information of vehicles leaving the park;
and taking, as the park exit data of the target vehicle, the park exit data in first park exit data whose vehicle information is consistent with the vehicle information of the target vehicle, wherein the first park exit data is the park exit data in the candidate park exit data whose park exit time is within the target time period.
In one possible implementation manner, the acquiring platform data of the target vehicle in the target time period includes:
acquiring candidate platform data, wherein the candidate platform data includes platform codes and recording times;
taking the platform data in the candidate platform data whose recording time is within the target time period as first platform data;
taking the platform data in the first platform data that contains vehicle information as second platform data;
and taking the platform data in the second platform data whose vehicle information is consistent with the vehicle information of the target vehicle as the platform data of the target vehicle.
In one possible implementation, the method further includes:
in response to third platform data whose vehicle information is incompletely displayed existing in the second platform data, determining a matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle;
and taking the platform data in the third platform data whose matching degree meets the matching requirement as the platform data of the target vehicle.
In one possible implementation manner, the generating an activity track of the target vehicle based on the park entry data, the park exit data and the platform data of the target vehicle in the target time period includes:
acquiring a reference platform stop record based on the platform data of the target vehicle in the target time period, wherein the reference platform stop record is used for indicating the duration for which the target vehicle stops at any platform;
adjusting the reference platform stop record to obtain a platform stop record of the target vehicle;
and generating the activity track of the target vehicle in time order based on the park entry data and park exit data of the target vehicle in the target time period and the platform stop record of the target vehicle.
In a possible implementation manner, the adjusting the reference platform stop record to obtain a platform stop record of the target vehicle includes:
in response to a time interval between the platform departure time of a first platform stop record and the platform arrival time of a second platform stop record in the reference platform stop record being smaller than a first value, merging the first platform stop record and the second platform stop record to obtain candidate platform stop records, wherein the first platform stop record and the second platform stop record are two adjacent platform stop records, the platform codes of the first platform stop record and the second platform stop record are consistent, and the platform departure time of the first platform stop record is earlier than the platform arrival time of the second platform stop record;
and in response to a third platform stop record whose stop duration is smaller than a second value existing in the candidate platform stop records, discarding the third platform stop record to obtain the platform stop record of the target vehicle.
In one possible implementation manner, after the acquiring park entry data of the target vehicle in the target time period, the method further includes:
in response to the target vehicle missing the first park entry data in the target time period, taking the first platform data in the target time period as the first park entry data of the target vehicle;
and in response to the target vehicle missing non-first park entry data in the target time period, taking the park exit data that is adjacent to and precedes the non-first park entry data as the non-first park entry data.
In one possible implementation manner, after the acquiring park exit data of the target vehicle in the target time period, the method further includes:
in response to the target vehicle missing the last park exit data in the target time period, taking the last platform data in the target time period as the last park exit data of the target vehicle;
and in response to the target vehicle missing non-last park exit data in the target time period, taking the park entry data that is adjacent to and follows the non-last park exit data as the non-last park exit data.
In a possible implementation manner, the acquiring candidate park entry data includes:
acquiring a first photo and the shooting time of the first photo, wherein the first photo is a park entrance photo within a reference time period, the first photo includes a vehicle, and the duration of the reference time period is not less than the duration of the target time period;
recognizing the first photo to obtain vehicle information of the vehicle included in the first photo;
taking the shooting time of the first photo as the park entry time of the vehicle included in the first photo;
and composing the candidate park entry data from the vehicle information of the vehicle included in the first photo and the park entry time of the vehicle.
In a possible implementation manner, the acquiring candidate park exit data includes:
acquiring a second photo and the shooting time of the second photo, wherein the second photo is a park exit photo within the reference time period and includes a vehicle;
recognizing the second photo to obtain vehicle information of the vehicle included in the second photo;
taking the shooting time of the second photo as the park exit time of the vehicle included in the second photo;
and composing the candidate park exit data from the vehicle information of the vehicle included in the second photo and the park exit time of the vehicle.
In one possible implementation, the acquiring candidate platform data includes:
acquiring a third photo and the shooting time of the third photo, wherein the third photo is a platform photo within the reference time period and includes a platform code;
in response to the third photo including a vehicle, recognizing the third photo to obtain vehicle information of the vehicle included in the third photo, and composing the candidate platform data from the vehicle information of the vehicle included in the third photo, the platform code, and the recording time;
and in response to the third photo not including a vehicle, composing the candidate platform data from the platform code and the recording time, the recording time being the shooting time of the third photo.
In another aspect, an embodiment of the present application provides an apparatus for acquiring a vehicle activity track, where the apparatus includes:
a first acquisition module, configured to acquire park entry data of a target vehicle in a target time period;
a second acquisition module, configured to acquire park exit data of the target vehicle in the target time period;
a third acquisition module, configured to acquire platform data of the target vehicle in the target time period;
and a generation module, configured to generate an activity track of the target vehicle based on the park entry data, the park exit data and the platform data of the target vehicle in the target time period.
In a possible implementation manner, the first acquisition module is configured to acquire candidate park entry data, where the candidate park entry data includes park entry time and vehicle information of vehicles entering the park;
and take, as the park entry data of the target vehicle, the park entry data in first park entry data whose vehicle information is consistent with the vehicle information of the target vehicle, where the first park entry data is the park entry data in the candidate park entry data whose park entry time is within the target time period.
In a possible implementation manner, the second acquisition module is configured to acquire candidate park exit data, where the candidate park exit data includes park exit time and vehicle information of vehicles leaving the park;
and take, as the park exit data of the target vehicle, the park exit data in first park exit data whose vehicle information is consistent with the vehicle information of the target vehicle, where the first park exit data is the park exit data in the candidate park exit data whose park exit time is within the target time period.
In a possible implementation manner, the third acquisition module is configured to acquire candidate platform data, where the candidate platform data includes platform codes and recording times;
take the platform data in the candidate platform data whose recording time is within the target time period as first platform data;
take the platform data in the first platform data that contains vehicle information as second platform data;
and take the platform data in the second platform data whose vehicle information is consistent with the vehicle information of the target vehicle as the platform data of the target vehicle.
In a possible implementation manner, the third acquisition module is further configured to, in response to third platform data whose vehicle information is incompletely displayed existing in the second platform data, determine a matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle;
and take the platform data in the third platform data whose matching degree meets the matching requirement as the platform data of the target vehicle.
In one possible implementation, the apparatus further includes:
a fourth acquisition module, configured to acquire a reference platform stop record based on the platform data of the target vehicle in the target time period, where the reference platform stop record is used to indicate the duration for which the target vehicle stops at any platform;
an adjusting module, configured to adjust the reference platform stop record to obtain a platform stop record of the target vehicle;
and the generation module is configured to generate the activity track of the target vehicle in time order based on the park entry data and park exit data of the target vehicle in the target time period and the platform stop record of the target vehicle.
In a possible implementation manner, the adjusting module is configured to, in response to a time interval between the platform departure time of a first platform stop record and the platform arrival time of a second platform stop record in the reference platform stop record being smaller than a first value, merge the first platform stop record and the second platform stop record to obtain candidate platform stop records, where the first platform stop record and the second platform stop record are two adjacent platform stop records, the platform codes of the first platform stop record and the second platform stop record are consistent, and the platform departure time of the first platform stop record is earlier than the platform arrival time of the second platform stop record;
and, in response to a third platform stop record whose stop duration is smaller than a second value existing in the candidate platform stop records, discard the third platform stop record to obtain the platform stop record of the target vehicle.
In a possible implementation manner, the first acquisition module is configured to, in response to the target vehicle missing the first park entry data in the target time period, take the first platform data of the target vehicle in the target time period as the first park entry data of the target vehicle;
and, in response to the target vehicle missing non-first park entry data in the target time period, take the park exit data that is adjacent to and precedes the non-first park entry data as the non-first park entry data.
In a possible implementation manner, the second acquisition module is configured to, in response to the target vehicle missing the last park exit data in the target time period, take the last platform data in the platform data of the target vehicle in the target time period as the last park exit data of the target vehicle;
and, in response to the target vehicle missing non-last park exit data in the target time period, take the park entry data that is adjacent to and follows the non-last park exit data as the non-last park exit data.
In a possible implementation manner, the first acquisition module is configured to acquire a first photo and the shooting time of the first photo, where the first photo is a park entrance photo within a reference time period, the first photo includes a vehicle, and the duration of the reference time period is not less than the duration of the target time period;
recognize the first photo to obtain vehicle information of the vehicle included in the first photo;
take the shooting time of the first photo as the park entry time of the vehicle included in the first photo;
and compose the candidate park entry data from the vehicle information of the vehicle included in the first photo and the park entry time of the vehicle.
In a possible implementation manner, the second acquisition module is configured to acquire a second photo and the shooting time of the second photo, where the second photo is a park exit photo within the reference time period and includes a vehicle;
recognize the second photo to obtain vehicle information of the vehicle included in the second photo;
take the shooting time of the second photo as the park exit time of the vehicle included in the second photo;
and compose the candidate park exit data from the vehicle information of the vehicle included in the second photo and the park exit time of the vehicle.
In a possible implementation manner, the third acquisition module is configured to acquire a third photo and the shooting time of the third photo, where the third photo is a platform photo within the reference time period and includes a platform code;
in response to the third photo including a vehicle, recognize the third photo to obtain vehicle information of the vehicle included in the third photo, and compose the candidate platform data from the vehicle information of the vehicle included in the third photo, the platform code, and the recording time;
and, in response to the third photo not including a vehicle, compose the candidate platform data from the platform code and the recording time, the recording time being the shooting time of the third photo.
In another aspect, an embodiment of the present application provides a computer device, where the computer device includes a processor and a memory, the memory stores at least one piece of program code, and the at least one piece of program code is loaded and executed by the processor, so that the computer device implements any one of the above methods for acquiring a vehicle activity track.
In another aspect, a computer-readable storage medium is provided, where at least one piece of program code is stored in the computer-readable storage medium, and the at least one piece of program code is loaded and executed by a processor, so that a computer implements any one of the above methods for acquiring a vehicle activity track.
In another aspect, a computer program or a computer program product is provided, where at least one computer instruction is stored in the computer program or the computer program product, and the at least one computer instruction is loaded and executed by a processor, so that a computer implements any one of the above methods for acquiring a vehicle activity track.
The technical solutions provided by the embodiments of the present application bring at least the following beneficial effects:
According to the technical solutions, the park entry data and park exit data of the target vehicle are acquired, and the platform data of the target vehicle is acquired as well, so the acquired data is comprehensive. When the activity track of the target vehicle is generated based on the acquired data, the generated track can cover all track points of the target vehicle, so the generated activity track has high accuracy, reliability and usability.
Drawings
To describe the technical solutions in the embodiments of the present application more clearly, the drawings required in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a method for acquiring a vehicle activity track according to an embodiment of the present application;
Fig. 2 is a flowchart of a method for acquiring a vehicle activity track according to an embodiment of the present application;
Fig. 3 is a schematic diagram of an information acquisition page according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a first photo according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a second photo according to an embodiment of the present application;
Fig. 6 is a schematic diagram of the determination of park entry and exit data of a target vehicle according to an embodiment of the present application;
Fig. 7 is a schematic diagram of an activity track of a target vehicle according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a display of fused data sets according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an apparatus for acquiring a vehicle activity track according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment of a method for acquiring a vehicle activity track according to an embodiment of the present application. As shown in Fig. 1, the implementation environment includes a computer device 101, which may be an electronic device or a server; this is not limited in the embodiments of the present application. The computer device 101 is configured to execute the method for acquiring a vehicle activity track provided by the embodiments of the present application.
When the computer device 101 is an electronic device, the electronic device may be at least one of a smart phone, a game console, a desktop computer, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and a laptop computer.
When the computer device 101 is a server, the server may be a single server, a server cluster composed of multiple servers, or any one of a cloud computing platform and a virtualization center, which is not limited in the embodiments of the present application. Of course, the server may also have other functions, which are not limited in the embodiments of the present application either.
Based on the foregoing implementation environment, an embodiment of the present application provides a method for acquiring a vehicle activity track, which may be executed by the computer device 101 in Fig. 1. Taking the flowchart of the method shown in Fig. 2 as an example, the method includes the following steps:
in step 201, park entering data of the target vehicle in the target time period is acquired.
In a possible implementation manner, before acquiring the circle entering data of the target vehicle in the target time period, the target time period and the target vehicle need to be determined, and the determination process of the target time period and the target vehicle is as follows: an information acquisition page is displayed in the electronic device, and a time determination frame 301 and a vehicle determination frame 302 are displayed in the information acquisition page. The time determination frame is used for determining a target time period, and the vehicle determination frame is used for determining a target vehicle. The user inputs the target time period in the time determination frame, and the vehicle information of the target vehicle is input in the vehicle determination frame, so that the electronic device acquires the target time period and the target vehicle.
The starting time and the ending time of the target time period are any time, and the time length of the target time period may also be any time length, which is not limited in the embodiments of the present application. The vehicle information of the target vehicle may be a license plate number of the target vehicle, or other identifiers capable of uniquely representing the target vehicle, which is not limited in the embodiment of the present application.
In one possible implementation, in response to the computer device being an electronic device, the electronic device may directly obtain the target time period and the target vehicle based on the content input by the user. In response to the fact that the computer device is the server, after the electronic device obtains the target time period and the target vehicle, the electronic device sends the target time period and the target vehicle to the server, so that the server obtains the target time period and the target vehicle.
As shown in fig. 3, which is a schematic diagram of an information acquisition page provided in an embodiment of the present application, in fig. 3, a time determination box 301 and a vehicle determination box 302 are displayed, a user inputs a target time period in the time determination box 301, and vehicle information of a target vehicle is input in the vehicle determination box 302. A determination control 303 is also displayed, and in response to the user triggering the determination control 303, the electronic device acquires the target time period and the target vehicle. Of course, other contents may also be displayed in the information obtaining page, which is not limited in this embodiment of the application.
Illustratively, the target time period is "5 months, 24 days 00 in 2021: 45: 5/24/00 to 2021, 02:15:00 ' and the target vehicle is ' vehicle A12345 '.
In a possible implementation manner, after the target time period and the target vehicle are determined, the process of acquiring the park entry data of the target vehicle in the target time period is as follows: candidate park entry data is acquired, where the candidate park entry data includes park entry time and vehicle information of vehicles entering the park; the park entry data in first park entry data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the park entry data of the target vehicle, where the first park entry data is the park entry data in the candidate park entry data whose park entry time is within the target time period.
In one possible implementation manner, the process of acquiring the candidate park entry data is as follows: a first photo and the shooting time of the first photo are acquired, where the first photo is a park entrance photo within a reference time period, the first photo includes a vehicle entering the park, and the duration of the reference time period is not less than the duration of the target time period. The first photo is recognized to obtain the vehicle information of the vehicle included in the first photo. The shooting time of the first photo is taken as the park entry time of the vehicle included in the first photo. The vehicle information of the vehicle included in the first photo and the park entry time of the vehicle are composed into candidate park entry data.
In one possible implementation manner, after the candidate park entry data is acquired, it is stored in a storage space of the computer device, so that the park entry data of the target vehicle can be acquired based on the candidate park entry data.
The shooting time of the first photo is located at a target position of the first photo; the target position may be the upper left corner of the first photo, the lower right corner of the first photo, or another position of the first photo. Fig. 4 is a schematic diagram of a first photo according to an embodiment of the present application; the upper left corner in Fig. 4 shows the shooting time of the first photo, "2021-05-24 00:48:22", that is, the park entry time of the vehicle included in the first photo is 2021-05-24 00:48:22. Because the first photo includes the vehicle, the process of recognizing the first photo to obtain the vehicle information of the vehicle included in the first photo is as follows: the content on the vehicle included in the first photo is extracted, and the content is recognized to obtain the vehicle information of the vehicle. Illustratively, the first photo in Fig. 4 is recognized, and the vehicle information of the vehicle included in the first photo is obtained as "vehicle A12345".
In a possible implementation manner, after the vehicle information of the vehicle included in the first photo and the shooting time of the first photo are acquired, the shooting time of the first photo is taken as the park entry time of the vehicle included in the first photo, and the vehicle information of the vehicle and the park entry time of the vehicle constitute the candidate park entry data corresponding to the first photo, that is, the candidate park entry data is: vehicle A12345 entered the park at 2021-05-24 00:48:22.
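For illustration only, the sketch below shows how one candidate park entry data item could be assembled from an entrance photo. The helper functions read_timestamp_overlay and recognize_plate are hypothetical stand-ins for the timestamp-reading and plate-recognition steps; they are not part of the patent and return fixed demo values here.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ParkEntryRecord:
    plate: str            # vehicle information, e.g. "A12345"
    entry_time: datetime  # shooting time of the photo, used as the park entry time

def read_timestamp_overlay(photo_path: str) -> datetime:
    # Hypothetical helper: read the timestamp stamped on the photo (e.g. its upper-left corner).
    # A real system would OCR the overlay; a fixed demo value is returned here.
    return datetime(2021, 5, 24, 0, 48, 22)

def recognize_plate(photo_path: str) -> str:
    # Hypothetical helper: recognize the license plate of the vehicle shown in the photo.
    return "A12345"

def build_candidate_entry(photo_path: str) -> ParkEntryRecord:
    # The shooting time of the entrance photo is taken as the vehicle's park entry time.
    return ParkEntryRecord(plate=recognize_plate(photo_path),
                           entry_time=read_timestamp_overlay(photo_path))

print(build_candidate_entry("entrance_photo.jpg"))
```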
In a possible implementation manner, the candidate park entry data may be stored in a KEY-VALUE form, or in other forms, which is not limited in the embodiments of the present application.
Table 1 below shows the candidate park entry data stored in the form of KEY-VALUE, where KEY is the vehicle information of the vehicle entering the park and VALUE is the park entry time of the vehicle. Of course, KEY may instead be the park entry time, and VALUE the vehicle information of the vehicle entering the park.
Table 1
Vehicle information of vehicle entering the park | Park entry time
Vehicle A12345 | 2021-05-24 00:48:22
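A minimal sketch of the KEY-VALUE storage described above, assuming an in-memory mapping with the vehicle information as KEY and the park entry time as VALUE (the reverse mapping mentioned in the text would simply swap the two). The storage backend is not specified in the patent; a list of entry times is kept per vehicle so that repeated entries are not lost.

```python
from datetime import datetime

# KEY = vehicle information of the vehicle entering the park, VALUE = its park entry time(s).
candidate_entry_store: dict[str, list[datetime]] = {}

def record_entry(plate: str, entry_time: datetime) -> None:
    candidate_entry_store.setdefault(plate, []).append(entry_time)

record_entry("vehicle A12345", datetime(2021, 5, 24, 0, 48, 22))
print(candidate_entry_store)
```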
It should be noted that the above only describes the determination process of the candidate park entry data corresponding to one first photo. More first photos may be obtained within the reference time period, and the determination process of the candidate park entry data corresponding to the other first photos is consistent with the process described above, so it is not repeated here.
In one possible implementation manner, after the candidate park entry data is acquired, the park entry data of the target vehicle is determined from the candidate park entry data in either of the following two manners.
In the first manner, the park entry data in the candidate park entry data whose park entry time is within the target time period is taken as first park entry data, and the park entry data in the first park entry data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the park entry data of the target vehicle.
Illustratively, the candidate park entry data includes three pieces. Candidate park entry data 1: vehicle A12345 entered the park at 2021-05-24 00:48:22. Candidate park entry data 2: vehicle A12345 entered the park on 2021-07-01 (around 12:00). Candidate park entry data 3: vehicle B12345 entered the park on 2021-05-24 (around 01:00). The park entry times of candidate park entry data 1 and candidate park entry data 3 are within the target time period, so candidate park entry data 1 and candidate park entry data 3 are taken as the first park entry data. The vehicle information in candidate park entry data 1 is consistent with the vehicle information of the target vehicle, while the vehicle information in candidate park entry data 3 is not, so candidate park entry data 1 is taken as the park entry data of the target vehicle. That is, the park entry data of the target vehicle is: vehicle A12345 entered the park at 2021-05-24 00:48:22.
In the second manner, the park entry data in the candidate park entry data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as second park entry data, and the park entry data in the second park entry data whose park entry time is within the target time period is taken as the park entry data of the target vehicle.
It should be noted that the process of acquiring the park entry data of the target vehicle in the second manner is similar to that in the first manner, so it is not repeated here.
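Both manners amount to intersecting a time-window filter with a vehicle-information filter, so applying the two filters in either order gives the same result. The sketch below illustrates the first manner; the record fields, function names and candidate values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EntryRecord:
    plate: str            # vehicle information of the vehicle entering the park
    entry_time: datetime  # park entry time

def entries_of_target(candidates: list[EntryRecord], target_plate: str,
                      start: datetime, end: datetime) -> list[EntryRecord]:
    # First manner: keep records whose park entry time falls in the target time period ...
    first_entry_data = [r for r in candidates if start <= r.entry_time <= end]
    # ... then keep those whose vehicle information is consistent with the target vehicle.
    return [r for r in first_entry_data if r.plate == target_plate]

candidates = [
    EntryRecord("A12345", datetime(2021, 5, 24, 0, 48, 22)),  # in window, target vehicle
    EntryRecord("A12345", datetime(2021, 7, 1, 12, 0, 0)),    # outside the target time period
    EntryRecord("B12345", datetime(2021, 5, 24, 1, 0, 0)),    # in window, different vehicle
]
result = entries_of_target(candidates, "A12345",
                           datetime(2021, 5, 24, 0, 45, 0),
                           datetime(2021, 5, 24, 2, 15, 0))
print(result)  # only the 2021-05-24 00:48:22 record remains
```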
In one possible implementation manner, in response to the target vehicle missing the first park entry data in the target time period, the first platform data in the platform data of the target vehicle in the target time period is taken as the first park entry data of the target vehicle. In response to the target vehicle missing non-first park entry data in the target time period, the park exit data that is adjacent to and precedes the non-first park entry data is taken as the non-first park entry data.
In step 202, park exit data of the target vehicle in the target time period is acquired.
In one possible implementation manner, the process of acquiring the park exit data of the target vehicle in the target time period is as follows: candidate park exit data is acquired, where the candidate park exit data includes park exit time and vehicle information of vehicles leaving the park; the park exit data in first park exit data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the park exit data of the target vehicle, where the first park exit data is the park exit data in the candidate park exit data whose park exit time is within the target time period.
In one possible implementation, the process of acquiring the candidate park exit data is as follows: a second photo and the shooting time of the second photo are acquired, where the second photo is a park exit photo within the reference time period and includes a vehicle leaving the park. The second photo is recognized to obtain the vehicle information of the vehicle included in the second photo. The shooting time of the second photo is taken as the park exit time of the vehicle included in the second photo. The vehicle information of the vehicle included in the second photo and the park exit time of the vehicle are composed into candidate park exit data.
In one possible implementation manner, after the candidate park exit data is acquired, it is stored in a storage space of the computer device, so that the park exit data of the target vehicle can be acquired based on the candidate park exit data.
The shooting time of the second photo is located at a target position of the second photo; the target position may be the upper left corner of the second photo, the lower right corner of the second photo, or another position of the second photo, which is not limited in the embodiments of the present application. Fig. 5 is a schematic diagram of a second photo according to an embodiment of the present application; the upper left corner in Fig. 5 shows the shooting time of the second photo, "2021-05-24 02:13:22", that is, the park exit time of the vehicle included in the second photo is 2021-05-24 02:13:22. Because the second photo includes the vehicle, the process of recognizing the second photo to obtain the vehicle information of the vehicle included in the second photo is as follows: the content on the vehicle included in the second photo is extracted, and the content is recognized to obtain the vehicle information of the vehicle. Illustratively, the second photo in Fig. 5 is recognized, and the vehicle information of the vehicle included in the second photo is obtained as "vehicle A12345".
In a possible implementation manner, after the vehicle information of the vehicle included in the second photo and the shooting time of the second photo are acquired, the shooting time of the second photo is taken as the park exit time of the vehicle included in the second photo, and the vehicle information of the vehicle and the park exit time of the vehicle constitute the candidate park exit data corresponding to the second photo, that is, the candidate park exit data is: vehicle A12345 left the park at 2021-05-24 02:13:22.
In a possible implementation manner, the candidate park exit data may be stored in a KEY-VALUE form, or in other forms, which is not limited in the embodiments of the present application.
Table 2 below shows the candidate park exit data stored in the form of KEY-VALUE, where KEY is the vehicle information of the vehicle leaving the park and VALUE is the park exit time of the vehicle. Of course, KEY may instead be the park exit time, and VALUE the vehicle information of the vehicle leaving the park.
Table 2
Vehicle information of vehicle leaving the park | Park exit time
Vehicle A12345 | 2021-05-24 02:13:22
It should be noted that the above only describes the determination process of the candidate park exit data corresponding to one second photo. More second photos may be obtained within the reference time period, and the determination process of the candidate park exit data corresponding to the other second photos is consistent with the process described above, so it is not repeated here.
In one possible implementation manner, after the candidate park exit data is acquired, the park exit data of the target vehicle is determined from the candidate park exit data in either of the following two manners.
In the first manner, the park exit data in the candidate park exit data whose park exit time is within the target time period is taken as first park exit data, and the park exit data in the first park exit data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the park exit data of the target vehicle.
Illustratively, the candidate park exit data includes three pieces. Candidate park exit data 1: vehicle A12345 left the park at 2021-05-24 02:13:22. Candidate park exit data 2: vehicle A12345 left the park on 2021-07-03 (around 12:00). Candidate park exit data 3: vehicle B12345 left the park on 2021-05-24 (around 01:00). The park exit times of candidate park exit data 1 and candidate park exit data 3 are within the target time period, so candidate park exit data 1 and candidate park exit data 3 are taken as the first park exit data. The vehicle information in candidate park exit data 1 is consistent with the vehicle information of the target vehicle, while the vehicle information in candidate park exit data 3 is not, so candidate park exit data 1 is taken as the park exit data of the target vehicle. That is, the park exit data of the target vehicle is: vehicle A12345 left the park at 2021-05-24 02:13:22.
In the second manner, the park exit data in the candidate park exit data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as second park exit data, and the park exit data in the second park exit data whose park exit time is within the target time period is taken as the park exit data of the target vehicle.
It should be noted that the process of acquiring the park exit data of the target vehicle in the second manner is similar to that in the first manner, so it is not repeated here.
In one possible implementation, in response to the target vehicle missing the last park exit data in the target time period, the last platform data of the target vehicle in the target time period is taken as the last park exit data of the target vehicle. In response to the target vehicle missing non-last park exit data in the target time period, the park entry data that is adjacent to and follows the non-last park exit data is taken as the non-last park exit data.
Fig. 6 is a schematic diagram of the determination of park entry and exit data of a target vehicle according to an embodiment of the present application. In Fig. 6, in response to the first park entry data (park entry data 1) not being acquired, platform data 1 is taken as the first park entry data. In response to non-first park entry data (park entry data 2) not being acquired, park exit data 1 is taken as the non-first park entry data. In response to the last park exit data (park exit data 2) not being acquired, platform data 4 is taken as the last park exit data. In response to non-last park exit data (park exit data 1) not being acquired, park entry data 2 is taken as the non-last park exit data.
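A sketch of the gap-filling rules just described: a missing first park entry is taken from the first platform record, a missing non-first entry from the adjacent preceding park exit, a missing last exit from the last platform record, and a missing non-last exit from the adjacent following park entry. The per-visit list layout is an assumption made for illustration; the patent does not prescribe a data structure.

```python
from datetime import datetime
from typing import Optional

def fill_boundary_events(entry_times: list[Optional[datetime]],
                         exit_times: list[Optional[datetime]],
                         platform_times: list[datetime]) -> None:
    """Fill missing park entry/exit times in place.

    entry_times[i] / exit_times[i] describe the i-th visit of the target vehicle in the
    target time period; platform_times are the recording times of the vehicle's platform
    data, sorted in ascending order.
    """
    if entry_times and entry_times[0] is None and platform_times:
        entry_times[0] = platform_times[0]          # missing first entry -> first platform record
    for i in range(1, len(entry_times)):
        if entry_times[i] is None and exit_times[i - 1] is not None:
            entry_times[i] = exit_times[i - 1]      # missing non-first entry -> preceding exit
    if exit_times and exit_times[-1] is None and platform_times:
        exit_times[-1] = platform_times[-1]         # missing last exit -> last platform record
    for i in range(len(exit_times) - 1):
        if exit_times[i] is None and i + 1 < len(entry_times) and entry_times[i + 1] is not None:
            exit_times[i] = entry_times[i + 1]      # missing non-last exit -> following entry
```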
In step 203, the platform data of the target vehicle within the target time period is acquired.
In one possible implementation, the process of acquiring the platform data of the target vehicle in the target time period is as follows: candidate platform data is acquired, where the candidate platform data includes platform codes and recording times. The platform data in the candidate platform data whose recording time is within the target time period is taken as first platform data. The platform data in the first platform data that contains vehicle information is taken as second platform data. The platform data in the second platform data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the platform data of the target vehicle.
In a possible implementation manner, each platform is provided with a camera. The camera takes a platform photo every target duration and sends the taken platform photo to the computer device, so that the computer device analyzes the platform photo to obtain candidate platform data and stores the candidate platform data in a storage space of the computer device. The target duration may be 20 seconds or 30 seconds, which is not limited in the embodiments of the present application, and may be adjusted based on the application scenario.
In one possible implementation, the process of acquiring the candidate platform data is as follows: a third photo and the shooting time of the third photo are acquired, where the third photo is a platform photo within the reference time period and includes a platform code. In response to the third photo including a vehicle, the third photo is recognized to obtain the vehicle information of the vehicle included in the third photo, and the vehicle information of the vehicle included in the third photo, the platform code, and the recording time are composed into candidate platform data. In response to the third photo not including a vehicle, the platform code and the recording time are composed into candidate platform data. The recording time is the shooting time of the third photo, which is located at the target position of the third photo.
Illustratively, the shooting time of a third photo is 2021-05-24 00:53, and the third photo includes a vehicle. The third photo is recognized, and the vehicle information of the vehicle included in the third photo is obtained as "vehicle A12345"; the candidate platform data is thus: vehicle A12345 was parked at platform 1 at 2021-05-24 00:53.
For another example, another third photo taken on 2021-05-24 does not include a vehicle; the corresponding candidate platform data is: no vehicle was parked at platform 1 at the shooting time of that photo.
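A sketch of how one candidate platform data item might be represented and built. The recognized_plate argument stands in for the output of a plate-recognition step (hypothetical here); when the photo contains no vehicle, only the platform code and recording time are kept, as described above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class PlatformRecord:
    platform_code: str
    record_time: datetime        # shooting time of the platform photo
    plate: Optional[str] = None  # None when no vehicle is parked at the platform

def build_candidate_platform_record(platform_code: str, shot_time: datetime,
                                    recognized_plate: Optional[str]) -> PlatformRecord:
    if recognized_plate is None:
        # Photo without a vehicle: only the platform code and the recording time are kept.
        return PlatformRecord(platform_code=platform_code, record_time=shot_time)
    return PlatformRecord(platform_code=platform_code, record_time=shot_time,
                          plate=recognized_plate)

# Example: platform 1 photographed periodically (the target duration is configurable).
r1 = build_candidate_platform_record("platform 1", datetime(2021, 5, 24, 0, 53, 0), "A12345")
r2 = build_candidate_platform_record("platform 1", datetime(2021, 5, 24, 0, 59, 0), None)
print(r1, r2, sep="\n")
```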
Table 3 below shows candidate platform data provided in an embodiment of the present application.
Table 3
(The contents of Table 3 appear as images in the original publication. The table lists 13 pieces of candidate platform data, each consisting of a platform code, a recording time, and, where a vehicle is parked at the platform, the vehicle information of that vehicle.)
Table 3 includes 13 pieces of candidate platform data. Candidate platform data 1 is: vehicle A12345 was parked at platform 1 at 2021-05-24 00:53. Candidate platform data 4 is: no vehicle was parked at platform 1 at the corresponding recording time on 2021-05-24. The other candidate platform data are detailed in Table 3 above and are not described here.
It should be noted that only a few pieces of candidate platform data are given above. In practical applications, the amount of candidate platform data is very large, and the vehicle information of the vehicle included in the candidate platform data may be incompletely displayed; for example, the vehicle information in the 13th piece of candidate platform data in Table 3 is "vehicle A11", which is not completely displayed.
In one possible implementation manner, after the candidate platform data is acquired, the process of acquiring the platform data of the target vehicle in the target time period is as follows: the platform data in the candidate platform data whose recording time is within the target time period is taken as first platform data; the platform data in the first platform data that contains vehicle information is taken as second platform data; and the platform data in the second platform data whose vehicle information is consistent with the vehicle information of the target vehicle is taken as the platform data of the target vehicle.
In one possible implementation, in response to third platform data whose vehicle information is incompletely displayed existing in the second platform data, a matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle is determined. The platform data in the third platform data whose matching degree meets the matching requirement is taken as the platform data of the target vehicle.
The matching requirement may be that the matching degree is higher than a target threshold. The target threshold may be adjusted based on the application scenario, and its value is not limited in the embodiments of the present application. Illustratively, the target threshold is 0.8.
The matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle is determined as follows: the number of identical content items (characters and/or digits) in the vehicle information in the third platform data and the vehicle information of the target vehicle is determined, and the quotient of this number and the total number of content items included in the vehicle information of the target vehicle is taken as the matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle.
Illustratively, the vehicle information of the target vehicle is "vehicle A12345", and the vehicle information included in the third platform data is "vehicle A1234". The number of identical content items is 6, the total number of content items in the vehicle information of the target vehicle is 7, and the matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle is therefore 6/7 ≈ 0.857. Since the matching degree is greater than the target threshold, the third platform data is taken as the platform data of the target vehicle.
For another example, the vehicle information of the target vehicle is "vehicle A12345", and the vehicle information included in the third platform data is "vehicle A11". The number of identical content items is 2, the total number of content items in the vehicle information of the target vehicle is 7, and the matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle is therefore 2/7 ≈ 0.286. Since the matching degree is less than the target threshold, the third platform data cannot be taken as the platform data of the target vehicle.
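A sketch of the matching-degree check for incompletely displayed vehicle information. The text does not fully specify how identical content items are counted (its own second example arrives at 2/7), so the position-wise comparison below is only one plausible interpretation, with 0.8 used as the example target threshold.

```python
def matching_degree(partial_plate: str, target_plate: str) -> float:
    # Ratio of position-wise identical characters to the total length of the target plate.
    # The exact counting rule is an assumption; the text only defines the degree as
    # (number of identical content items) / (total content items of the target plate).
    same = sum(1 for a, b in zip(partial_plate, target_plate) if a == b)
    return same / len(target_plate) if target_plate else 0.0

TARGET_THRESHOLD = 0.8  # example value from the text; adjustable per application scenario

def accept_partial_plate(partial_plate: str, target_plate: str) -> bool:
    return matching_degree(partial_plate, target_plate) > TARGET_THRESHOLD

print(matching_degree("A1234", "A12345"))       # 5/6 with this rule (the text counts 6/7)
print(accept_partial_plate("A1234", "A12345"))  # True: accepted as data of the target vehicle
print(accept_partial_plate("A11", "A12345"))    # False: rejected
```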
In step 204, an activity track of the target vehicle is generated based on the park entry data, the park exit data and the platform data of the target vehicle in the target time period.
In a possible implementation manner, a reference platform stop record is acquired based on the platform data of the target vehicle in the target time period, where the reference platform stop record is used to indicate the duration for which the target vehicle stops at any platform. The reference platform stop record is adjusted to obtain the platform stop record of the target vehicle. The activity track of the target vehicle is generated in time order based on the park entry data and park exit data of the target vehicle in the target time period and the platform stop record of the target vehicle.
The process of acquiring the reference platform stop record based on the platform data of the target vehicle in the target time period is as follows: for a group of consecutive platform data with the same platform code and the same vehicle information of the vehicle parked at the platform, the recording time of the first platform data item in the group is taken as the start time of the reference platform stop record, and the recording time of the last platform data item in the group is taken as the end time of the reference platform stop record.
Among the candidate platform data shown in Table 3 above, candidate platform data 1 to candidate platform data 3 are consecutive platform data with the same platform code and the same vehicle information. Taking the recording time of candidate platform data 1 as the start time of the reference platform stop record and the recording time of candidate platform data 3 as the end time, the reference platform stop record is obtained: vehicle A12345 stopped at platform 1 from 2021-05-24 00:53 to 2021-05-24 00:58.
The reference platform stop record includes a platform arrival time, a platform departure time, and a stop duration. For the reference platform stop record shown above, the platform arrival time is 2021-05-24 00:53, the platform departure time is 2021-05-24 00:58, and the stop duration is the interval between them (5 minutes).
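A sketch of turning per-photo platform data into reference platform stop records by grouping consecutive records of one platform that show the same vehicle; the record layout follows the earlier sketches and is an illustrative assumption.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class PlatformRecord:
    platform_code: str
    record_time: datetime
    plate: Optional[str] = None   # None when no vehicle is parked at the platform

@dataclass
class StopRecord:
    platform_code: str
    plate: str
    arrive: datetime   # recording time of the first record in the group
    depart: datetime   # recording time of the last record in the group

def reference_stop_records(records: list[PlatformRecord]) -> list[StopRecord]:
    # Assumes `records` are the candidate platform data of one platform, sorted by time.
    stops: list[StopRecord] = []
    open_stop: Optional[StopRecord] = None
    for rec in records:
        if open_stop is not None and rec.plate == open_stop.plate:
            open_stop.depart = rec.record_time   # same vehicle still present: extend the group
        elif rec.plate is not None:
            open_stop = StopRecord(rec.platform_code, rec.plate, rec.record_time, rec.record_time)
            stops.append(open_stop)              # a vehicle appears: start a new group
        else:
            open_stop = None                     # empty platform: close the current group
    return stops
```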
In a possible implementation manner, the process of adjusting the reference platform stop record to obtain the platform stop record of the target vehicle is as follows: in response to a time interval between the platform departure time of a first platform stop record and the platform arrival time of a second platform stop record in the reference platform stop records being smaller than a first value, the first platform stop record and the second platform stop record are merged to obtain candidate platform stop records. The first platform stop record and the second platform stop record are two adjacent platform stop records, their platform codes are consistent, and the platform departure time of the first platform stop record is earlier than the platform arrival time of the second platform stop record. In response to a third platform stop record whose stop duration is smaller than a second value existing in the candidate platform stop records, the third platform stop record is discarded to obtain the platform stop record of the target vehicle.
The first value and the second value may be adjusted according to the application scenario, and the first value and the second value may be the same or different, which is not limited in the embodiments of the present application. Illustratively, both the first value and the second value are 3 minutes.
Illustratively, the first platform stop record is: vehicle A12345 stopped at platform 1 from 2021-05-24 00:53 to 2021-05-24 00:58; the second platform stop record is: vehicle A12345 stopped at platform 1 again starting less than 3 minutes after 2021-05-24 00:58. Since the time interval between the platform departure time of the first platform stop record and the platform arrival time of the second platform stop record is less than 3 minutes, the first platform stop record and the second platform stop record are merged to obtain a candidate platform stop record: vehicle A12345 stopped at platform 1 from 2021-05-24 00:53 until the platform departure time of the second platform stop record.
For another example, there is a third platform stop record in the candidate platform stop records: vehicle A12345 stopped at platform 2 starting on 2021-07-01, with a stop duration of less than 3 minutes, so the third platform stop record is discarded to obtain the platform stop record of the target vehicle.
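A sketch of the adjustment step: adjacent stop records at the same platform whose gap is smaller than the first value are merged, and records whose stop duration is smaller than the second value are discarded (both values set to 3 minutes here, as in the example).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class StopRecord:
    platform_code: str
    plate: str
    arrive: datetime
    depart: datetime

def adjust_stop_records(stops: list[StopRecord],
                        first_value: timedelta = timedelta(minutes=3),
                        second_value: timedelta = timedelta(minutes=3)) -> list[StopRecord]:
    # Merge adjacent records at the same platform whose gap is smaller than first_value.
    merged: list[StopRecord] = []
    for stop in sorted(stops, key=lambda s: s.arrive):
        prev = merged[-1] if merged else None
        if (prev is not None and prev.platform_code == stop.platform_code
                and stop.arrive - prev.depart < first_value):
            prev.depart = max(prev.depart, stop.depart)
        else:
            merged.append(StopRecord(stop.platform_code, stop.plate, stop.arrive, stop.depart))
    # Discard candidate records whose stop duration is smaller than second_value.
    return [s for s in merged if s.depart - s.arrive >= second_value]
```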
In a possible implementation manner, the time sequence may be an order from early to late, or may be another order, which is not limited in this application.
Illustratively, the park entering data of the target vehicle shows that vehicle A12345 entered the park after 00:00 on 24 May 2021, and the park exiting data shows that vehicle A12345 left the park after 02:00 on 24 May 2021. The platform parking records of the target vehicle include three records, the first of which shows vehicle A12345 parked at platform 1 starting at 00:53:34 on 24 May 2021. Based on the above data, the activity track of the target vehicle shown in fig. 7 is obtained.
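The chronological assembly of the activity track can be sketched as below; the event tuples and wording are illustrative assumptions, not the format used by the embodiments.

```python
def build_activity_track(park_entry_times, park_exit_times, parking_records):
    """Merge park entering times, park exiting times and platform parking
    records into one track ordered from early to late. `parking_records`
    follows the dict layout of the earlier sketches."""
    events = []
    events += [(t, "entered the park") for t in park_entry_times]
    events += [(t, "left the park") for t in park_exit_times]
    events += [(r["approach_time"],
                f"parked at platform {r['platform_code']} "
                f"until {r['departure_time']}")
               for r in parking_records]
    return sorted(events, key=lambda event: event[0])
```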
In one possible implementation manner, in response to the park entering data and the park exiting data of the target vehicle in the target time period not being acquired, only the platform data of the target vehicle in the target time period is acquired, and the activity track of the target vehicle is generated only according to the platform data of the target vehicle in the target time period.
According to the method, the park entering data and the park exiting data of the target vehicle are acquired, the platform data of the target vehicle are also acquired, and the acquired data are comprehensive. When the movement track of the target vehicle is generated based on the acquired data, the generated movement track can cover all track points of the target vehicle, so that the generated movement track is high in accuracy and reliability, and the usability of the generated movement track is high.
Fig. 8 is a schematic display diagram of a fused data set according to an embodiment of the present application, where seven data sets, namely, a data set a, a data set B, a data set C, a data set D, a data set E, a data set F, and a data set G, are displayed in fig. 8.
The data set A includes vehicle information, park entering and exiting data corresponding to the vehicle information, and platform data corresponding to the vehicle information. The data set B includes park entering and exiting data and platform data. The data set C includes vehicle information and park entering and exiting data corresponding to the vehicle information. The data set D includes vehicle information and platform data corresponding to the vehicle information. The data set E includes platform data. The data set F includes park entering and exiting data. The data set G includes vehicle information.
The data set A is the data set required by the method for acquiring a vehicle activity track provided by the embodiments of this application; that is, in this method, both the park entering and exiting data corresponding to the vehicle information and the platform data corresponding to the vehicle information need to be acquired.
Fig. 9 is a schematic structural diagram of an apparatus for acquiring a vehicle motion trajectory according to an embodiment of the present application, and as shown in fig. 9, the apparatus includes:
a first obtaining module 901, configured to obtain park entering data of a target vehicle in a target time period;
a second obtaining module 902, configured to obtain park exiting data of the target vehicle in the target time period;
a third obtaining module 903, configured to obtain platform data of the target vehicle in the target time period;
a generating module 904, configured to generate an activity track of the target vehicle based on the park entering data, the park exiting data, and the platform data of the target vehicle in the target time period.
In a possible implementation manner, the first obtaining module 901 is configured to obtain candidate entering-garden data, where the candidate entering-garden data includes entering-garden time and vehicle information of entering-garden vehicles; and taking the park entering data in which the vehicle information of the park entering vehicle is consistent with the vehicle information of the target vehicle in the first park entering data as the park entering data of the target vehicle, wherein the first park entering data is the park entering data of which the park entering time in the candidate park entering data is within the target time period.
In a possible implementation manner, the second obtaining module 902 is configured to obtain candidate park exiting data, where the candidate park exiting data includes park exiting time and vehicle information of park exiting vehicles; and to take the park exiting data in which the vehicle information of the park exiting vehicle is consistent with the vehicle information of the target vehicle in the first park exiting data as the park exiting data of the target vehicle, where the first park exiting data is the park exiting data of which the park exiting time in the candidate park exiting data is within the target time period.
In a possible implementation manner, the third obtaining module 903 is configured to obtain candidate platform data, where the candidate platform data includes a platform code and a recording time; to take the platform data whose recording time is within the target time period in the candidate platform data as first platform data; to take the platform data that includes vehicle information in the first platform data as second platform data; and to take the platform data whose vehicle information is consistent with the vehicle information of the target vehicle in the second platform data as the platform data of the target vehicle.
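As an illustrative sketch only, with the field names assumed earlier, the three filtering steps of this module could be written as:

```python
def select_platform_data_of_target(candidate_platform_data, target_vehicle_info,
                                   period_start, period_end):
    """Keep entries recorded within the target time period (first platform data),
    then entries that carry vehicle information (second platform data), then
    entries whose vehicle information matches the target vehicle."""
    first = [e for e in candidate_platform_data
             if period_start <= e["recorded_at"] <= period_end]
    second = [e for e in first if e.get("vehicle_info")]
    return [e for e in second if e["vehicle_info"] == target_vehicle_info]
```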
In a possible implementation manner, the third obtaining module 903 is further configured to, in response to third platform data in which the vehicle information is incompletely displayed existing in the second platform data, determine a matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle; and to take the platform data whose matching degree meets the matching requirement in the third platform data as the platform data of the target vehicle.
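The embodiments do not spell out how the matching degree is computed; one plausible sketch, assuming unreadable plate characters are marked with '*', scores the share of recognised characters that appear in order in the target plate:

```python
def plate_match_degree(partial_plate, target_plate):
    """Fraction of the readable characters of a partially recognised plate that
    appear, in order, in the target plate. '*' marks unreadable characters."""
    readable = [c for c in partial_plate if c != "*"]
    if not readable:
        return 0.0
    remaining = iter(target_plate)
    # `c in remaining` advances the iterator, so matches must occur in order.
    matched = sum(1 for c in readable if c in remaining)
    return matched / len(readable)

# Example: a plate recognised as "A12**5" scores 1.0 against "A12345",
# which would satisfy a matching requirement of, say, 0.8.
```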
In one possible implementation, the apparatus further includes:
the fourth acquisition module is used for acquiring a reference platform parking record based on platform data of the target vehicle in the target time period, wherein the reference platform parking record is used for indicating the parking duration of the target vehicle at any platform;
the adjusting module is used for adjusting the reference platform parking record to obtain a platform parking record of the target vehicle;
a generating module 904, configured to generate an activity track of the target vehicle according to a time sequence based on the park entering data, the park exiting data, and the platform parking record of the target vehicle in the target time period.
In a possible implementation manner, the adjusting module is configured to, in response to the time interval between the platform departure time of a first platform parking record and the platform approach time of a second platform parking record in the reference platform parking records being smaller than a first numerical value, merge the first platform parking record and the second platform parking record to obtain a candidate platform parking record, where the first platform parking record and the second platform parking record are two adjacent platform parking records, the platform codes of the two records are consistent, and the platform departure time of the first platform parking record is earlier than the platform approach time of the second platform parking record; and, in response to a third platform parking record whose parking duration is smaller than a second numerical value existing in the candidate platform parking records, discard the third platform parking record to obtain the platform parking records of the target vehicle.
In a possible implementation manner, the first obtaining module 901 is configured to, in response to the target vehicle missing the first park entering data in the target time period, take the first platform data in the platform data of the target vehicle in the target time period as the first park entering data of the target vehicle; and, in response to the target vehicle missing a non-first park entering data in the target time period, take the park exiting data that is adjacent to and before the missing non-first park entering data as the non-first park entering data.
In a possible implementation manner, the second obtaining module 902 is configured to, in response to the target vehicle missing the last park exiting data in the target time period, take the last platform data in the platform data of the target vehicle in the target time period as the last park exiting data of the target vehicle; and, in response to the target vehicle missing a non-last park exiting data in the target time period, take the park entering data that is adjacent to and after the missing non-last park exiting data as the non-last park exiting data.
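A compact sketch of these completion rules, under the assumption that the i-th items of two aligned lists describe the i-th visit of the vehicle to the park and that missing values are represented as None, could be:

```python
def fill_missing_park_events(entry_times, exit_times, platform_data):
    """Fill missing park entering / park exiting timestamps (None values).
    `platform_data` is the target vehicle's platform data sorted by time."""
    entry_times, exit_times = list(entry_times), list(exit_times)
    for i, t in enumerate(entry_times):
        if t is None:
            # First entry missing: fall back to the first platform data;
            # later entries missing: fall back to the preceding park exit.
            entry_times[i] = (platform_data[0]["recorded_at"]
                              if i == 0 else exit_times[i - 1])
    for i, t in enumerate(exit_times):
        if t is None:
            # Last exit missing: fall back to the last platform data;
            # earlier exits missing: fall back to the following park entry.
            exit_times[i] = (platform_data[-1]["recorded_at"]
                             if i == len(exit_times) - 1 else entry_times[i + 1])
    return entry_times, exit_times
```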
In a possible implementation manner, the first obtaining module 901 is configured to obtain a first photo and the shooting time of the first photo, where the first photo is a park entering photo within a reference time period, the first photo includes a vehicle, and the length of the reference time period is not less than the length of the target time period; to recognize the first photo to obtain vehicle information of the vehicle included in the first photo; to take the shooting time of the first photo as the park entering time of the vehicle included in the first photo; and to compose the vehicle information of the vehicle included in the first photo and the park entering time of the vehicle into candidate park entering data.
In a possible implementation manner, the second obtaining module 902 is configured to obtain a second photo and the shooting time of the second photo, where the second photo is a park exiting photo within the reference time period and the second photo includes a vehicle; to recognize the second photo to obtain vehicle information of the vehicle included in the second photo; to take the shooting time of the second photo as the park exiting time of the vehicle included in the second photo; and to compose the vehicle information of the vehicle included in the second photo and the park exiting time of the vehicle into candidate park exiting data.
In a possible implementation manner, the third obtaining module 903 is configured to obtain a third photo and the shooting time of the third photo, where the third photo is a platform photo within the reference time period and the third photo includes a platform code; in response to the third photo including a vehicle, to recognize the third photo to obtain vehicle information of the vehicle included in the third photo and to compose the vehicle information of the vehicle included in the third photo, the platform code and the recording time into candidate platform data; and, in response to the third photo not including a vehicle, to compose the platform code and the recording time into candidate platform data, the recording time being the shooting time of the third photo.
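For completeness, composing one candidate platform data entry from a platform photo can be sketched as below; the recognition step itself is abstracted away, and the dict layout matches the earlier sketches rather than any format prescribed by the embodiments.

```python
def compose_candidate_platform_data(platform_code, shot_at, recognised_plate=None):
    """Build one candidate platform data entry from a platform photo.
    `recognised_plate` is the vehicle information returned by an image
    recognition step, or None when the photo contains no vehicle."""
    entry = {"platform_code": platform_code, "recorded_at": shot_at}
    if recognised_plate is not None:
        entry["vehicle_info"] = recognised_plate
    return entry
```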
The device not only acquires the park entering data and the park exiting data of the target vehicle, but also acquires the platform data of the target vehicle, and the acquired data are comprehensive. When the movement track of the target vehicle is generated based on the acquired data, the generated movement track can cover all track points of the target vehicle, so that the generated movement track is high in accuracy and reliability, and the generated movement track is high in usability.
It should be understood that, when the apparatus provided in fig. 9 implements its functions, it is only illustrated by the division of the functional modules, and in practical applications, the above functions may be distributed by different functional modules according to needs, that is, the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments, which are not described herein again.
Fig. 10 shows a block diagram of an electronic device 1000 according to an exemplary embodiment of the present application. The electronic device 1000 may be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The electronic device 1000 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so on.
In general, the electronic device 1000 includes: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as 4-core processors, 8-core processors, and so on. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a calculation operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. Memory 1002 can also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1002 is used to store at least one instruction for execution by processor 1001 to implement the method of obtaining a vehicle activity trajectory provided by the method embodiments herein.
In some embodiments, the electronic device 1000 may further include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, display screen 1005, camera assembly 1006, audio circuitry 1007, positioning assembly 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1001 and the memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1004 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 also has the ability to capture touch signals on or over the surface of the display screen 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this point, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display screen 1005 may be one, disposed on the front panel of the electronic device 1000; in other embodiments, the display screens 1005 may be at least two, respectively disposed on different surfaces of the electronic device 1000 or in a folded design; in other embodiments, the display 1005 may be a flexible display disposed on a curved surface or a folded surface of the electronic device 1000. Even more, the display screen 1005 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of a terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, the camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp and can be used for light compensation under different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing or inputting the electric signals to the radio frequency circuit 1004 for realizing voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be respectively disposed at different portions of the electronic device 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
The positioning component 1008 is used to locate the current geographic location of the electronic device 1000 to implement navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
The power supply 1009 is used to supply power to various components in the electronic device 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery can also be used to support fast charge technology.
In some embodiments, the electronic device 1000 also includes one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: an acceleration sensor 1011, a gyro sensor 1012, a pressure sensor 1013, a fingerprint sensor 1014, an optical sensor 1015, and a proximity sensor 1016.
The acceleration sensor 1011 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the electronic apparatus 1000. For example, the acceleration sensor 1011 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1001 may control the display screen 1005 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the electronic device 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to acquire a 3D motion of the user on the electronic device 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1013 may be disposed on a side bezel of the electronic device 1000 and/or on a lower layer of the display screen 1005. When the pressure sensor 1013 is disposed on a side frame of the electronic device 1000, a user's holding signal of the electronic device 1000 can be detected, and the processor 1001 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the user according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1014 may be disposed on the front, back, or side of the electronic device 1000. When a physical button or vendor Logo is provided on the electronic device 1000, the fingerprint sensor 1014 may be integrated with the physical button or vendor Logo.
Optical sensor 1015 is used to collect ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the display screen 1005 according to the ambient light intensity collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the display screen 1005 is turned down. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
A proximity sensor 1016, also known as a distance sensor, is typically disposed on the front panel of the electronic device 1000. The proximity sensor 1016 is used to capture the distance between the user and the front of the electronic device 1000. In one embodiment, when the proximity sensor 1016 detects that the distance between the user and the front surface of the electronic device 1000 gradually decreases, the processor 1001 controls the display screen 1005 to switch from the screen-on state to the screen-off state; when the proximity sensor 1016 detects that the distance between the user and the front surface of the electronic device 1000 gradually increases, the processor 1001 controls the display screen 1005 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 10 is not limiting of the electronic device 1000 and may include more or fewer components than shown, or combine certain components, or employ a different arrangement of components.
Fig. 11 is a schematic structural diagram of a server provided in this embodiment of the present application, where the server 1100 may generate relatively large differences due to different configurations or performances, and may include one or more processors (CPUs) 1101 and one or more memories 1102, where at least one program code is stored in the one or more memories 1102, and is loaded and executed by the one or more processors 1101 to implement the method for obtaining a vehicle activity track provided in each method embodiment described above. Of course, the server 1100 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface, so as to perform input/output, and the server 1100 may also include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, a computer-readable storage medium is further provided, in which at least one program code is stored, and the at least one program code is loaded and executed by a processor, so as to make a computer implement any one of the above-mentioned methods for acquiring a vehicle activity track.
Alternatively, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program or a computer program product is further provided, in which at least one computer instruction is stored, and the at least one computer instruction is loaded and executed by a processor, so as to enable a computer to implement any one of the above-mentioned methods for acquiring a vehicle activity track.
It should be understood that reference herein to "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for acquiring a vehicle activity track is characterized by comprising the following steps:
acquiring park entering data of a target vehicle in a target time period;
acquiring park exiting data of the target vehicle in the target time period;
acquiring platform data of the target vehicle in the target time period;
and generating an activity track of the target vehicle based on the park entering data, the park exiting data and the platform data of the target vehicle in the target time period.
2. The method of claim 1, wherein the acquiring park entering data of a target vehicle in a target time period comprises:
acquiring candidate park entering data, wherein the candidate park entering data comprises park entering time and vehicle information of park entering vehicles;
and taking the park entering data with the consistent vehicle information of the park entering vehicles in the first park entering data and the vehicle information of the target vehicle as the park entering data of the target vehicle, wherein the first park entering data is the park entering data with the park entering time in the candidate park entering data within the target time period.
3. The method of claim 1, wherein the acquiring park exiting data of the target vehicle in the target time period comprises:
acquiring candidate park exiting data, wherein the candidate park exiting data comprises park exiting time and vehicle information of park exiting vehicles;
and taking the park exiting data in which the vehicle information of the park exiting vehicle in first park exiting data is consistent with the vehicle information of the target vehicle as the park exiting data of the target vehicle, wherein the first park exiting data is the park exiting data of which the park exiting time in the candidate park exiting data is within the target time period.
4. The method of claim 1, wherein the acquiring platform data of the target vehicle in the target time period comprises:
obtaining candidate platform data, wherein the candidate platform data comprise platform codes and recording time;
taking the platform data with the recording time within the target time period in the candidate platform data as first platform data;
taking the platform data that includes vehicle information in the first platform data as second platform data;
and taking the platform data of the second platform data, in which the vehicle information is consistent with the vehicle information of the target vehicle, as the platform data of the target vehicle.
5. The method of claim 4, further comprising:
determining a matching degree between the vehicle information in the third platform data and the vehicle information of the target vehicle in response to the existence of the third platform data in which the vehicle information is incompletely displayed in the second platform data;
and taking the platform data with the matching degree meeting the matching requirement in the third platform data as the platform data of the target vehicle.
6. The method according to any one of claims 1 to 5, wherein the generating the activity track of the target vehicle based on the park entering data, the park exiting data and the platform data of the target vehicle in the target time period comprises:
acquiring a reference platform parking record based on platform data of the target vehicle in the target time period, wherein the reference platform parking record is used for indicating the parking duration of the target vehicle at any platform;
adjusting the reference platform parking record to obtain a platform parking record of the target vehicle;
and generating the activity track of the target vehicle according to the time sequence based on the park entering data and the park exiting data of the target vehicle in the target time period and the platform parking record of the target vehicle.
7. The method of claim 6, wherein the adjusting the reference platform parking record to obtain the platform parking record of the target vehicle comprises:
in response to a time interval between a platform departure time of a first platform parking record and a platform approach time of a second platform parking record being smaller than a first numerical value, merging the first platform parking record and the second platform parking record to obtain a candidate platform parking record, wherein the first platform parking record and the second platform parking record are two adjacent platform parking records, platform codes of the first platform parking record and the second platform parking record are consistent, and the platform departure time of the first platform parking record is earlier than the platform approach time of the second platform parking record;
and in response to a third platform parking record with a parking duration smaller than a second numerical value existing in the candidate platform parking records, discarding the third platform parking record to obtain the platform parking record of the target vehicle.
8. The method according to any one of claims 1 to 5, wherein after the acquiring park entering data of the target vehicle in the target time period, the method further comprises:
in response to the target vehicle missing first park entering data in the target time period, taking first platform data in the platform data of the target vehicle in the target time period as the first park entering data of the target vehicle;
and in response to the target vehicle missing non-first park entering data in the target time period, taking park exiting data which is adjacent to the non-first park entering data and is before the non-first park entering data as the non-first park entering data.
9. The method according to any one of claims 1 to 5, wherein after the acquiring park exiting data of the target vehicle in the target time period, the method further comprises:
in response to the target vehicle missing last park exiting data in the target time period, taking last platform data in platform data of the target vehicle in the target time period as last park exiting data of the target vehicle;
and in response to the target vehicle missing non-last park exiting data in the target time period, taking park entering data which is adjacent to the non-last park exiting data and is after the non-last park exiting data as the non-last park exiting data.
10. The method of claim 2, wherein the obtaining candidate inbound data comprises:
acquiring a first photo and the shooting time of the first photo, wherein the first photo is a park entering photo within a reference time period, the first photo comprises a vehicle, and the time length of the reference time period is not less than the time length of the target time period;
identifying the first photo to obtain vehicle information of a vehicle included in the first photo;
taking the shooting time of the first photo as the park entering time of the vehicle included in the first photo;
and composing the vehicle information of the vehicle included in the first photo and the park entering time of the vehicle into the candidate park entering data.
11. The method of claim 3, wherein the acquiring candidate park exiting data comprises:
acquiring a second photo and the shooting time of the second photo, wherein the second photo is a park exiting photo within a reference time period and the second photo comprises a vehicle;
recognizing the second photo to obtain vehicle information of the vehicle included in the second photo;
taking the shooting time of the second photo as the park exiting time of the vehicle included in the second photo;
and composing the vehicle information of the vehicle included in the second photo and the park exiting time of the vehicle into the candidate park exiting data.
12. The method of claim 4, wherein the obtaining candidate platform data comprises:
acquiring a third photo and the shooting time of the third photo, wherein the third photo is a platform photo within a reference time period and comprises a platform code;
in response to the third photo including a vehicle, recognizing the third photo to obtain vehicle information of the vehicle included in the third photo, and composing the vehicle information of the vehicle included in the third photo, the platform code and the recording time into the candidate platform data;
and in response to the third photo not including a vehicle, composing the platform code and the recording time into the candidate platform data, the recording time being the shooting time of the third photo.
13. An apparatus for obtaining a trajectory of a vehicle, the apparatus comprising:
the first acquisition module is used for acquiring park entering data of a target vehicle in a target time period;
the second acquisition module is used for acquiring park exiting data of the target vehicle in the target time period;
the third acquisition module is used for acquiring the platform data of the target vehicle in the target time period;
and the generation module is used for generating the activity track of the target vehicle based on the park entering data, the park exiting data and the platform data of the target vehicle in the target time period.
14. A computer device, characterized in that it comprises a processor and a memory, in which at least one program code is stored, which is loaded and executed by the processor, to make the computer device implement the method of acquiring a trajectory of a vehicle activity according to any one of claims 1 to 12.
15. A computer-readable storage medium, wherein at least one program code is stored in the computer-readable storage medium, and the at least one program code is loaded and executed by a processor to cause a computer to implement the method for acquiring a vehicle motion trajectory according to any one of claims 1 to 12.
CN202110932737.7A 2021-08-13 2021-08-13 Method, device and equipment for acquiring vehicle motion track and readable storage medium Pending CN115705551A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110932737.7A CN115705551A (en) 2021-08-13 2021-08-13 Method, device and equipment for acquiring vehicle motion track and readable storage medium


Publications (1)

Publication Number Publication Date
CN115705551A true CN115705551A (en) 2023-02-17

Family

ID=85180268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110932737.7A Pending CN115705551A (en) 2021-08-13 2021-08-13 Method, device and equipment for acquiring vehicle motion track and readable storage medium

Country Status (1)

Country Link
CN (1) CN115705551A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination