CN112632316B - Data processing method and device, electronic equipment and storage medium - Google Patents

Data processing method and device, electronic equipment and storage medium

Info

Publication number
CN112632316B
CN112632316B (granted publication of application CN202011521905.5A)
Authority
CN
China
Prior art keywords
vehicle
person
snapshot
vehicles
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011521905.5A
Other languages
Chinese (zh)
Other versions
CN112632316A (en)
Inventor
茅陈庆
杨海舟
楼炯
胡滨
叶雅妮
李方生
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision System Technology Co Ltd
Original Assignee
Hangzhou Hikvision System Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision System Technology Co Ltd filed Critical Hangzhou Hikvision System Technology Co Ltd
Priority to CN202011521905.5A priority Critical patent/CN112632316B/en
Publication of CN112632316A publication Critical patent/CN112632316A/en
Application granted granted Critical
Publication of CN112632316B publication Critical patent/CN112632316B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a data processing method and apparatus, an electronic device, and a storage medium. It relates to the field of data processing, can determine human-vehicle associations promptly and effectively, and is widely applicable. The method includes the following steps: receiving a query request comprising query parameters, where the query request requests the human-vehicle association corresponding to the query parameters, and the human-vehicle association represents the vehicles used by persons under the conditions specified by the query parameters; acquiring snapshot images corresponding to the query parameters, and recognizing the acquired snapshot images to obtain spatiotemporal information of the target objects (persons and vehicles) in the snapshot images; and determining the human-vehicle association according to the acquired spatiotemporal information.

Description

Data processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a data processing method and apparatus, an electronic device, and a storage medium.
Background
Vehicles are now in increasingly widespread use. When traveling, a person may choose among vehicles such as balance scooters, electric bicycles, bicycles, motorcycles, and automobiles. The vehicle a person chooses can indicate that person's travel behavior.
Determining the association between a person and a vehicle makes it possible to determine the person's travel behavior. At present, however, associating a vehicle with a person mainly relies on a communication network, and the applicable scenarios of that approach are limited.
Disclosure of Invention
The application provides a data processing method and apparatus, an electronic device, and a storage medium, in which the association between persons and vehicles can be determined effectively by analyzing snapshot images, with wide applicability.
To achieve this purpose, the technical solutions are as follows:
In a first aspect, the present application provides a data processing method, including: receiving a query request comprising query parameters, where the query request requests the human-vehicle association corresponding to the query parameters, and the human-vehicle association represents the vehicles used by persons under the conditions specified by the query parameters; acquiring snapshot images corresponding to the query parameters, and recognizing the acquired snapshot images to obtain spatiotemporal information of the target objects (persons and vehicles) in the snapshot images; and determining the human-vehicle association according to the acquired spatiotemporal information.
According to this data processing method, multiple acquired snapshot images are analyzed to determine the spatiotemporal information of the persons and of the vehicles in the images. The human-vehicle association is then determined from that spatiotemporal information. In this way, the association between persons and vehicles can be determined effectively, and compared with existing schemes the method is widely applicable.
With reference to the first aspect, in a possible implementation manner, the spatiotemporal information includes a snapshot time and a snapshot position. Correspondingly, determining the human-vehicle association according to the acquired spatiotemporal information includes the following steps: determining a first person and a first vehicle, both belonging to the target objects, such that a first preset condition is met between the snapshot time of the first person and the snapshot time of the first vehicle; determining that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle; and determining that the first person is associated with the first vehicle.
By matching the snapshot times and snapshot positions of persons against those of vehicles, person-vehicle pairs whose times and positions match closely can be identified, and the human-vehicle association can thus be determined.
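The matching step above can be sketched as follows. This is a minimal illustration in Python; the data structures and the concrete thresholds used for the first and second preset conditions are assumptions, since the concrete conditions are left open here:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obj_id: str
    snap_time: float  # capture timestamp, in seconds
    x: float          # snapshot position (center point), in metres
    y: float

# Hypothetical preset conditions (the text leaves the thresholds unspecified):
MAX_TIME_GAP = 2.0   # first preset condition: snapshot times within 2 s
MAX_DISTANCE = 1.5   # second preset condition: snapshot positions within 1.5 m

def associate(persons, vehicles):
    """Pair each person with every vehicle whose snapshot time and position match."""
    pairs = []
    for p in persons:
        for v in vehicles:
            time_ok = abs(p.snap_time - v.snap_time) <= MAX_TIME_GAP
            dist_ok = ((p.x - v.x) ** 2 + (p.y - v.y) ** 2) ** 0.5 <= MAX_DISTANCE
            if time_ok and dist_ok:
                pairs.append((p.obj_id, v.obj_id))
    return pairs
```

A person captured one second after, and half a metre away from, a vehicle would be associated with it under these assumed thresholds.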
With reference to the first aspect, in a possible implementation manner, "recognizing the snapshot image to obtain the spatiotemporal information of the target object in the snapshot image" includes: identifying a vehicle in the snapshot image; and determining a geographic coordinate area of the vehicle and taking it as the snapshot position of the vehicle, where the geographic coordinate area of the vehicle includes the vertical projection area of the vehicle on level ground.
With reference to the first aspect, in a possible implementation manner, if the snapshot position of the first person is the center point coordinate of the first person, the second preset condition includes: the snapshot position of the first person is located within the geographic coordinate area of the first vehicle. If the snapshot position of the first person is the vertical projection area of the first person on level ground, the second preset condition includes: the snapshot position of the first person overlaps the geographic coordinate area of the first vehicle.
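The two variants of the second preset condition can be illustrated as follows, under the assumption that the projection areas are axis-aligned rectangles (the shape of the geographic coordinate region is not fixed here):

```python
def point_in_region(px, py, region):
    """Second preset condition when the person's position is a center point:
    the point lies inside the vehicle's ground-projection region.
    region = (x_min, y_min, x_max, y_max), an axis-aligned rectangle
    (an assumed shape, for illustration only)."""
    x_min, y_min, x_max, y_max = region
    return x_min <= px <= x_max and y_min <= py <= y_max

def regions_overlap(a, b):
    """Second preset condition when the person's position is itself a
    projection region: the two rectangles share some area."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1
```

Either predicate can be plugged into the person-vehicle matching step, depending on how the person's snapshot position is represented.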
With reference to the first aspect, in a possible implementation manner, the data processing method further includes: determining the features of each vehicle used by each person in the human-vehicle association; for each person, vehicles with identical features are treated as the same vehicle; and determining, from the determined vehicle features, characteristic information of the same vehicle used by each person, where the characteristic information includes at least one of a count, a frequency, or a usage pattern.
With reference to the first aspect, in a possible implementation manner, the data processing method further includes: determining the pattern with which each person in the human-vehicle association uses vehicles.
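The feature-based grouping described above can be sketched as follows; the `(person_id, vehicle_feature)` pair encoding is illustrative, not part of the scheme itself:

```python
from collections import Counter

def vehicle_usage(sightings):
    """sightings: (person_id, vehicle_feature) pairs, one per associated snapshot.
    Per the scheme above, vehicles with identical features count as the same
    vehicle for a given person; the string feature encoding is an assumption."""
    per_person = {}
    for person, feature in sightings:
        per_person.setdefault(person, Counter())[feature] += 1
    # For each person: how many distinct vehicles they used, and how often each.
    return {
        person: {"distinct_vehicles": len(counts), "use_counts": dict(counts)}
        for person, counts in per_person.items()
    }
```

The per-vehicle counts computed here correspond to the count/frequency part of the characteristic information; deriving a usage pattern would require further time-based analysis.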
In a second aspect, the present application provides a data processing apparatus comprising: the device comprises a communication unit, a processing unit and a determining unit.
The communication unit is configured to receive a query request, where the query request comprises query parameters and requests the human-vehicle association corresponding to those parameters, and the human-vehicle association represents the vehicles used by persons under the conditions specified by the query parameters. The communication unit is further configured to acquire the snapshot images corresponding to the query parameters. The processing unit is configured to recognize the snapshot images to obtain spatiotemporal information of the target objects (persons and vehicles) in the images. The determining unit is configured to determine the human-vehicle association according to the acquired spatiotemporal information.
With reference to the second aspect, in a possible implementation manner, the determining unit is specifically configured to: determine a first person and a first vehicle, both belonging to the target objects, such that a first preset condition is met between the snapshot time of the first person and the snapshot time of the first vehicle; determine that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle; and determine that the first person is associated with the first vehicle.
With reference to the second aspect, in a possible implementation manner, the processing unit is specifically configured to: identifying a vehicle in the snapshot image, determining a geographic coordinate area of the vehicle, and taking the geographic coordinate area of the vehicle as a snapshot position of the vehicle; the geographic coordinate region of the vehicle includes a vertical projection region of the vehicle on a level ground.
With reference to the second aspect, in a possible implementation manner, the determining unit is further configured to: determine the features of each vehicle used by each person in the human-vehicle association; for each person, vehicles with identical features are treated as the same vehicle; and determine, from the determined vehicle features, characteristic information of the same vehicle used by each person, where the characteristic information includes at least one of a count, a frequency, or a usage pattern.
With reference to the second aspect, in a possible implementation manner, if the snapshot position of the first person is the center point coordinate of the first person, the second preset condition includes: the snapshot location of the first person is located within a geographic coordinate area of the first vehicle. If the snapshot position of the first person is a vertical projection area of the first person on the horizontal ground, the second preset condition comprises: the snapshot location of the first person overlaps with the geographic coordinate region of the first vehicle.
In a third aspect, the present application provides an electronic device comprising a memory and a processor. A memory for storing a computer program and a processor for executing the computer program to perform the data processing method according to the first aspect and any possible implementation thereof.
In a fourth aspect, the present application provides a chip system, which is applied to a data processing apparatus; the system-on-chip includes one or more interface circuits, and one or more processors. The interface circuit and the processor are interconnected through a line; the interface circuit is configured to receive signals from a memory of the data processing apparatus and to transmit the signals to the processor, the signals including computer instructions stored in the memory. When the processor executes the computer instructions, the data processing apparatus performs the data processing method according to the first aspect and any possible implementation manner thereof.
In a fifth aspect, the present application provides a computer-readable storage medium comprising computer instructions which, when run on a computer, cause the computer to perform a data processing method as described in the first aspect and any one of its possible implementations.
In a sixth aspect, the present application provides a computer program product comprising computer instructions which, when run on a data processing apparatus, cause the data processing apparatus to perform the data processing method according to the first aspect and any one of its possible implementations.
For details of the second to sixth aspects and their various implementation manners, reference may be made to the first aspect; likewise, for their beneficial effects, reference may be made to the analysis of beneficial effects in the first aspect and its implementation manners, which is not repeated here.
These and other aspects of the present application will be more readily apparent from the following description.
Drawings
Fig. 1 is a system architecture diagram of a monitoring system according to an embodiment of the present application;
FIG. 2A is a schematic diagram of a data processing method according to an embodiment of the present disclosure;
FIG. 2B is a schematic diagram of another data processing method provided by an embodiment of the present application;
FIG. 3 is a block diagram of a hardware of a computing device according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of a data processing method according to an embodiment of the present application;
fig. 5 is a schematic flow chart of another data processing method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of geographic coordinates of an automobile determined by an electronic device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an electronic device determining geographic coordinates of a bicycle according to an embodiment of the present application;
fig. 8 is a schematic flowchart of another data processing method according to an embodiment of the present application;
fig. 9 is a schematic view of a display interface of an electronic device according to an embodiment of the present disclosure;
fig. 10 is a schematic view of a display interface of another electronic device according to an embodiment of the present application;
fig. 11 is a schematic view of a display interface of another electronic device according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In the description of this application, "/" means "or" unless otherwise stated; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. Further, "at least one" means one or more, and "a plurality" means two or more. The terms "first", "second", and the like are used only to distinguish between objects and do not limit their number or order of execution.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
With the development of society, traveling with various vehicles (e.g., balance scooters, electric bicycles, bicycles, motorcycles, automobiles) has become routine.
At present, once the association between persons and vehicles is established, a person's travel trajectory can be described from information such as travel route and travel time, and whether the person's travel is abnormal can be determined from that trajectory.
In the prior art, associating a person with the vehicle they use mainly relies on a communication network to locate the person's mobile terminal.
As one example, a current method of associating persons with vehicles is as follows. A vehicle acquisition device and a mobile terminal signal acquisition device are installed at a fixed checkpoint. When a vehicle passes the checkpoint, the vehicle acquisition device collects the vehicle information, and the mobile terminal signal acquisition device collects signals from the mobile terminal in the vehicle to determine the terminal's information. A background server then determines the person bound to that mobile terminal. For example, if the mobile terminal is a mobile phone, the background server identifies the person through the registration information of the SIM card in the phone. Finally, the background server associates the vehicle information with the person information to determine the association between the vehicle and the person.
This method of human-vehicle association depends on a communication network to locate the person: the person can be located accurately, and the association completed, only if they carry a terminal that has a positioning function and is bound to them.
If the person does not carry such a mobile terminal, no association between the person and the vehicle can be established. And if the mobile terminal carried by the person is not actually bound to that person (for example, the SIM card in the phone is registered under someone else's identity), the resulting human-vehicle association will be wrong.
In view of these technical problems, the application provides a data processing method in which the spatiotemporal information of the persons and of the vehicles in multiple acquired snapshot images is determined by analyzing those images, and the human-vehicle association is then determined from that spatiotemporal information. In this way, the association between persons and vehicles can be determined effectively.
The data processing method provided by the embodiment of the application can be applied to the monitoring system 100. Fig. 1 shows a system architecture diagram of a monitoring system 100.
As shown in fig. 1, a monitoring system 100 provided in the embodiment of the present application includes a terminal device 10, a backend server 20, a storage server 30, and an image capturing apparatus 40.
The terminal device 10 may be configured to generate a query request according to a user's operation, obtain the query result corresponding to the request, and display it. Optionally, the terminal device may itself determine the human-vehicle association corresponding to the query request, or it may send the query request to the backend server 20 and then receive the human-vehicle association from the backend server 20.
For example, after the user inputs a request for querying a relationship between a person and a vehicle into the terminal device 10, the terminal device 10 obtains a snapshot image corresponding to a query parameter in the query request, and identifies the snapshot image to obtain spatiotemporal information of a target object (a person and a vehicle) in the snapshot image. Then, the terminal device 10 determines a human-vehicle association relationship according to the acquired spatiotemporal information, and then displays the human-vehicle association relationship.
The background server 20 may be configured to, in response to the query request of the terminal device 10, obtain an image corresponding to the query parameter in the query request from the storage server 30, and analyze the obtained image to determine the human-vehicle association relationship. After determining the human-vehicle association relationship, the backend server 20 may also send the association relationship to the terminal device 10 so that the terminal device 10 displays the association relationship.
The storage server 30 is used for receiving the image from the image acquisition device 40 and storing the image.
The image capturing device 40 is configured to capture picture images, video images, and the like within its coverage area, and to send the captured images to the storage server 30.
In practical application, the terminal device 10 and the background server 20 may be integrated into one computing device, or may be located in two computing devices independent from each other, and the position relationship between the terminal device 10 and the background server 20 is not limited in this embodiment. Similarly, the backend server 20 and the storage server 30 may be integrated in one device, or may be located in two devices independent from each other, and the embodiment of the present application does not limit any position relationship between the backend server 20 and the storage server 30.
For convenience of description, the present application is described by taking an example in which the terminal device 10, the backend server 20, and the storage server 30 are independently provided.
The principles of the data processing method provided by the present application will now be described with reference to fig. 1.
As shown in fig. 2A, the image capture apparatus 40 transmits the captured image to the storage server 30 after capturing the image. The storage server 30 stores the captured image after receiving the image. After detecting that the first operation for establishing the association relationship is triggered by the staff, the terminal device 10 generates a corresponding query request and sends the query request to the background server 20. After receiving the query request, the background server 20 obtains a corresponding image from the storage server 30, and analyzes the image to determine the association relationship between the people and the vehicle. The background server 20 sends the association relationship between the person and the vehicle to the terminal device 10. After receiving the human-vehicle association relationship from the background server 20, the terminal device 10 displays the human-vehicle association relationship.
As shown in fig. 2B, the image capture device 40 transmits the captured image to the storage server 30 after capturing the image. The storage server 30 stores the captured image after receiving the image. After detecting that the staff triggers the first operation of establishing the association relationship, the terminal device 10 generates a corresponding query request. The terminal device 10 acquires the corresponding image from the storage server 30 and analyzes the image to determine the human-vehicle association relationship. After that, the terminal device 10 displays the human-vehicle association relationship.
The basic hardware structures of the terminal device 10, the backend server 20, the storage server 30, and the image capturing apparatus 40 are similar, and all include elements included in the computing apparatus shown in fig. 3. The hardware structures of the terminal device 10, the backend server 20, the storage server 30, and the image capturing apparatus 40 will be described below by taking the computing apparatus shown in fig. 3 as an example.
As shown in fig. 3, the computing device may include a processor 31, a memory 32, a communication interface 33, and a bus 34. The processor 31, the memory 32 and the communication interface 33 may be connected by a bus 34.
The processor 31 is a control center of the computing device, and may be a single processor or a collective term for a plurality of processing elements. For example, the processor 31 may be a Central Processing Unit (CPU), or may be another general-purpose processor. Wherein a general purpose processor may be a microprocessor or any conventional processor or the like.
For one embodiment, processor 31 may include one or more CPUs, such as CPU 0 and CPU1 shown in FIG. 3.
The memory 32 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
In a possible implementation, the memory 32 may exist separately from the processor 31, and the memory 32 may be connected to the processor 31 through a bus 34 for storing instructions or program codes. The processor 31 can implement the data processing method provided by the following embodiments of the present application when calling and executing the instructions or program codes stored in the memory 32.
In the embodiment of the present application, the terminal device 10, the backend server 20, the storage server 30, and the image capturing apparatus 40 are different in software programs stored in the memory 32, so that the functions implemented by the terminal device 10, the backend server 20, the storage server 30, and the image capturing apparatus 40 are different. The functions performed by the devices will be described in connection with the following flow charts.
In another possible implementation, the memory 32 may also be integrated with the processor 31.
A communication interface 33, configured to connect the computing apparatus with other devices through a communication network, where the communication network may be an ethernet, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), or the like. The communication interface 33 may include a receiving unit for receiving data, and a transmitting unit for transmitting data.
The bus 34 may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 3, but this does not mean only one bus or one type of bus.
It should be noted that the configuration shown in fig. 3 does not constitute a limitation of the computing device, which may include more or less components than those shown in fig. 3, or some components may be combined, or a different arrangement of components than those shown in fig. 3.
The execution body of the data processing method provided by the embodiment of the application is a data processing apparatus. The data processing apparatus may be the terminal device 10, a CPU in the terminal device 10, a control module for analyzing images in the terminal device 10, or a client for analyzing images in the terminal device 10. The data processing apparatus may equally be the backend server 20, a CPU in the backend server 20, or a control module for analyzing images in the backend server 20. The embodiment of the application describes the data processing method by taking an electronic device (which may be the terminal device 10 or the backend server 20) as the executor.
The data processing method provided by the embodiment of the present application is described below with reference to the drawings.
As shown in fig. 4, the data processing method provided in the embodiment of the present application includes S400, S401, S402, and S403, which are described in detail below.
S400, the electronic equipment receives the query request.
The query request comprises query parameters and requests the human-vehicle association corresponding to those parameters. The human-vehicle association represents the vehicles used by persons under the conditions specified by the query parameters.
The query parameters are user-selected in the actual application. The following describes the contents included in the query parameter, taking scene a and scene b as examples.
And a scene a, wherein the query parameters comprise an area parameter and a time parameter.
The area parameters are used for representing the area range of the association relation between the people and the vehicle needing to be inquired. The time parameter is used for representing the time range of the association relation of the people and the vehicles needing to be inquired.
Illustratively, the area parameter is city H and the time parameter is January 2020 to June 2020. Accordingly, the query request requests the vehicles used by all citizens for travel during January to June 2020.
Scene b: the query parameters include a person identification parameter and a time parameter.
The person identification parameter represents the persons for whom the person-vehicle association relationship is to be queried. The time parameter represents the time range over which the person-vehicle association relationship is to be queried.
Illustratively, the person identification parameters include Zhang San and Li Si, and the time parameter is January 2020 to June 2020. Accordingly, the query request requests the vehicles used for travel by Zhang San and by Li Si during January 2020 to June 2020.
In practical applications, the query parameters may further include other parameters. For example, persons may be divided by type into normal persons and abnormal persons (such as monitored persons); a query can then target a certain type of person (such as abnormal persons), in which case the query parameters may include an abnormal-person identification. Alternatively, the application may also query by vehicle, for example using a vehicle identification as a parameter to match the persons who used that vehicle during a period of time.
The type of the parameter included in the query parameter may be determined according to actual requirements, which is not limited in the present application.
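As a sketch only, the query parameters of scenes a and b could be carried in a structure like the following; all class and field names are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical container for the query parameters described above.
@dataclass
class QueryRequest:
    start_time: str                      # time parameter, e.g. "2020-01"
    end_time: str                        # e.g. "2020-06"
    region: Optional[str] = None         # area parameter (scene a)
    person_ids: List[str] = field(default_factory=list)  # scene b
    person_type: Optional[str] = None    # e.g. "abnormal" for monitored persons
    vehicle_id: Optional[str] = None     # reverse query: who used this vehicle

# Scene a: all citizens of city H, January to June 2020.
req_a = QueryRequest("2020-01", "2020-06", region="city H")
# Scene b: specific persons over the same period.
req_b = QueryRequest("2020-01", "2020-06", person_ids=["Zhang San", "Li Si"])
```

Either form of request resolves to the same downstream processing; only the filter applied to the snapshot images differs.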
S401, the electronic equipment acquires the snapshot image corresponding to the query parameter.
The snapshot image comprises an image captured by the image acquisition device corresponding to the query parameter. The image capturing device may be the image capturing device 40 of the monitoring system shown in fig. 1.
The images described in this application include still images captured by the image acquisition device and video images captured by the image acquisition device. The images described in this application may also include other types of images, which is not limited in this application.
With reference to the example in scene a, the electronic device may acquire images captured by the image acquisition device in city H during January 2020 to June 2020.
In combination with the example in scene b, the electronic device may acquire images containing image information of Zhang San and Li Si, captured by the image acquisition device during January 2020 to June 2020.
S402, the electronic equipment identifies the snapshot image to acquire the space-time information of the target object in the snapshot image.
The target objects include people and vehicles. The spatiotemporal information is used for representing the snapshot position of the target object and the snapshot time of the target object.
In the case where the target object is a person, the spatiotemporal information of the person includes a snapshot position of the person and a snapshot time of the person. The position of the person to be photographed and the time of the person to be photographed together characterize the position of the person at a certain moment.
For example, the image a has image information of the person M. For the image a, the electronic device determines the time when the image a is captured by the image capturing device, which is recorded as time T1. The electronic equipment determines the snapshot position S1 of the person M according to the image information of the person M in the image A, and the position information and internal and external reference information of the image acquisition device for acquiring the image A.
Optionally, the electronic device may determine the snapshot position of a person as follows: the electronic device identifies the person in the snapshot image and determines the geographic coordinates of the face. The geographic coordinates of the face may be the coordinates of the person's center point, or the vertical projection area of the person on the horizontal ground.
In the case where the target object is a vehicle, the spatio-temporal information of the vehicle includes the snapshot position of the vehicle and the snapshot time of the vehicle. The snapshot position of the vehicle and the snapshot time of the vehicle together characterize the position of the vehicle at a certain moment.
For example, the image B has therein image information of the vehicle N. For image B, the electronic device determines the time at which the image capturing device captures image B, which is denoted as time T2. And the electronic equipment determines the snapshot position S2 of the vehicle N according to the image information of the vehicle N in the image B, and the position information and the internal and external reference information of the image acquisition device for acquiring the image B.
Alternatively, with reference to fig. 4, as shown in fig. 5, in the case that the target object is a vehicle, the above S402 may be specifically implemented by S402 a.
S402a, the electronic equipment identifies the vehicles in the snapshot image, determines the geographic coordinate areas of the vehicles, and takes the geographic coordinate areas of the vehicles as snapshot positions of the vehicles.
Optionally, the geographic coordinate region of the vehicle includes a vertical projection region of the vehicle on level ground.
In one example, as shown in fig. 6, the target object is an automobile, and the electronic device determines that the snapshot position of the automobile is a region composed of C1, C2, C3, and C4.
As yet another example, as shown in fig. 7, the target object is a bicycle, and the electronic device determines that the snapshot position of the bicycle is an area consisting of C5, C6, C7, and C8.
It should be noted that, when determining the snapshot position of the vehicle, the electronic device may appropriately enlarge the vertical projection area of the vehicle on the horizontal ground. In this way, persons and vehicles can still be matched while a person is boarding or alighting from the vehicle.
Whether the target object is a person or a vehicle, the electronic device can determine its snapshot position according to the first parameter of the image acquisition device and the position of the target object in the image. The first parameter of the image acquisition device includes at least one of: position information (e.g., latitude and longitude) of the image acquisition device, its resolution, its focal length, its mounting height, and its mounting angle.
Optionally, when determining the snapshot position of the target object in the snapshot image, the electronic device may first establish a rectangular coordinate system in the snapshot image, and determine a coordinate point of the target object in the rectangular coordinate system. The electronic equipment establishes a three-dimensional coordinate system for the image acquisition device and determines an internal reference matrix and an external reference matrix corresponding to the three-dimensional coordinate system. The electronic equipment determines the coordinate point of the target object in the three-dimensional coordinate system according to the coordinate point of the target object in the rectangular coordinate system, and the internal reference matrix and the external reference matrix corresponding to the three-dimensional coordinate system by adopting a first preset algorithm. The electronic equipment adopts a second preset algorithm to convert the coordinate points in the three-dimensional coordinate system into coordinate points in the world coordinate system, and therefore the actual geographic position information of the target object can be determined.
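Under a pinhole-camera assumption, the pixel-to-world conversion described above reduces, for targets on the ground plane (Z = 0), to inverting a homography built from the intrinsic matrix and the extrinsic rotation and translation. This is a minimal sketch; the matrices and numbers are illustrative, not taken from the patent.

```python
import numpy as np

# Minimal sketch of the two-step conversion described above, assuming a
# pinhole camera model and that the target stands on the ground plane Z = 0.
# K is the intrinsic matrix; R, t are the extrinsic rotation and translation.
def pixel_to_ground(u, v, K, R, t):
    # With Z = 0 the projection reduces to a homography:
    # s * [u, v, 1]^T = K * [r1 r2 t] * [X, Y, 1]^T
    H = K @ np.column_stack((R[:, 0], R[:, 1], t))
    X, Y, w = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return X / w, Y / w   # ground-plane coordinates of the pixel

# Toy example: identity rotation, camera 10 m above the origin, focal
# length 1000 px, principal point (960, 540).
K = np.array([[1000.0, 0.0, 960.0], [0.0, 1000.0, 540.0], [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 10.0])
x, y = pixel_to_ground(960, 540, K, R, t)  # principal point maps to (0, 0)
```

In practice the intrinsic and extrinsic matrices would come from calibrating each image acquisition device against its first parameter (position, resolution, focal length, mounting height, and mounting angle).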
In addition, the electronic device determines the snapshot time of the snapshot image as the snapshot time of the target object.
It should be noted that, when the image capturing device captures an image, the image capturing device may synchronously record the time for capturing the image. When the electronic device acquires the snapshot image, the snapshot time of the snapshot image is generally acquired. When the electronic equipment analyzes the target object in the snapshot image, the snapshot time of the snapshot image is used as the snapshot time of the target object.
In this way, the electronic device can determine the snapshot time and snapshot position of the target object according to the method described above. This provides the data basis for the subsequent matching of persons and vehicles.
And S403, the electronic equipment determines the association relationship between the people and the vehicle according to the acquired space-time information.
The electronic device analyzes the spatio-temporal information of the vehicles and the spatio-temporal information of the persons, and determines associations between those vehicles and persons whose snapshot times and snapshot positions both meet the conditions.
For example, as shown in fig. 6, if the snapshot position of the person is located in the area formed by C1, C2, C3, and C4 or overlaps with the area, and the snapshot time of the car is the same as the snapshot time of the person or the time difference is within a preset range, it is determined that the person and the car have an association relationship.
As another example, as shown in fig. 7, if the snapshot position of the person is located in, or overlaps with, the area formed by C5, C6, C7, and C8, and the snapshot time of the bicycle is the same as the snapshot time of the person or the time difference is within the preset range, it is determined that the person has an association relationship with the bicycle.
For scene a, the person-vehicle association relationship characterizes the associations between all citizens of city H and the vehicles they used from January 2020 to June 2020.
For scene b, the person-vehicle association relationship characterizes the associations between Zhang San and Li Si and the vehicles they used from January 2020 to June 2020.
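The combined time-and-position test illustrated by figs. 6 and 7 can be sketched as below; the 10-second threshold, the rectangle coordinates, and the function names are illustrative assumptions.

```python
from datetime import datetime

# A person and a vehicle are associated when their snapshot times are
# close and the person's snapshot position falls inside the vehicle's
# geographic coordinate area.
TIME_THRESHOLD_S = 10  # illustrative preset range

def in_region(point, region):
    """Ray-casting point-in-polygon test; region is [(x, y), ...]."""
    x, y = point
    inside = False
    n = len(region)
    for i in range(n):
        x1, y1 = region[i]
        x2, y2 = region[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def associated(person_time, person_pos, vehicle_time, vehicle_region):
    dt = abs((person_time - vehicle_time).total_seconds())
    return dt <= TIME_THRESHOLD_S and in_region(person_pos, vehicle_region)

car = [(0, 0), (4, 0), (4, 2), (0, 2)]        # area formed by C1..C4
t_person = datetime(2020, 1, 2, 13, 11, 0)
t_car = datetime(2020, 1, 2, 13, 10, 55)
print(associated(t_person, (1.0, 1.0), t_car, car))  # True
```

A person standing outside the vehicle's area, or captured outside the time window, fails the test and produces no association.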
For example, the following table 1 shows the association relationship between people and vehicles determined by the electronic device.
TABLE 1
Person snapshot time Vehicle snapshot time Person ID Associated vehicle ID
2020/01/02-13:11 2020/01/02-13:10 A1 C1
2020/01/02-13:16 2020/01/02-13:18 A2 C2
2020/01/02-13:19 2020/01/02-13:19 A3 C3
2020/01/02-13:20 2020/01/02-13:20 A4 C4
2020/01/02-13:22 2020/01/02-13:21 A5 C5
2020/01/02-13:23 2020/01/02-13:22 A6 C6
2020/01/02-13:24 2020/01/02-13:23 A7 C7
2020/01/02-13:26 2020/01/02-13:24 A8 C8
2020/01/02-13:27 2020/01/02-13:25 A9 C9
2020/01/02-13:30 2020/01/02-13:26 A10 C10
2020/01/02-13:32 2020/01/02-13:30 A11 C11
2020/01/03-14:26 2020/01/03-14:26 A1 C12
2020/01/03-14:29 2020/01/03-14:28 A2 C2
2020/01/03-14:32 2020/01/03-14:30 A12 C14
2020/01/06-15:10 2020/01/06-15:10 A13 C15
2020/01/06-15:12 2020/01/06-15:11 A14 C16
2020/01/06-15:13 2020/01/06-15:12 A15 C17
2020/01/06-15:14 2020/01/06-15:13 A16 C18
2020/01/06-15:17 2020/01/06-15:14 A17 C19
2020/01/06-15:18 2020/01/06-15:15 A18 C20
2020/01/06-15:23 2020/01/06-15:24 A1 C21
In conjunction with table 1 above, the vehicles associated with person A1 include three vehicles, C1, C12, and C21.
Based on the above technical solution, this application provides a data processing method in which the electronic device analyzes multiple acquired images and determines the spatio-temporal information of the persons and the vehicles appearing in them. By matching the spatio-temporal information of persons with the spatio-temporal information of vehicles, the electronic device can determine the vehicle each person used at each point in time, and from this establish the person-vehicle association relationship. The electronic device can thus determine the association between persons and vehicles directly from images.
S403 described above is explained in detail below. With reference to fig. 4, as shown in fig. 5, S403 may be implemented by S403a, S403b, and S403c.
S403a, the electronic device determines a first person and a first vehicle.
Both the first person and the first vehicle are target objects. The snapshot time of the first person and the snapshot time of the first vehicle satisfy a first preset condition.
In one possible implementation manner, the first preset condition is that a difference value between the snapshot time of the first person and the snapshot time of the first vehicle is smaller than or equal to a preset threshold value.
The size of the preset threshold may be determined according to the actual situation; for example, the preset threshold may be 10 seconds, in which case the electronic device pairs a person with a vehicle whose snapshot time differs from the person's snapshot time by at most 10 seconds.
And S403b, the electronic equipment determines that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle.
Optionally, if the snapshot position of the first person is the center point coordinate of the first person, the second preset condition in this application includes: the snapshot location of the first person is located within a geographic coordinate area of the first vehicle.
If the snapshot position of the first person is a vertical projection area of the first person on the horizontal ground, the second preset condition comprises: the snapshot location of the first person overlaps with the geographic coordinate region of the first vehicle.
More specifically, the second preset condition in the present application may be any one of the following conditions:
Condition a: the distance between the center coordinates of the first person and the center coordinates of the geographic coordinate area of the first vehicle is less than or equal to a preset distance.
Condition b: the center coordinates of the first person are located within the geographic coordinate area of the first vehicle.
Condition c: the vertical projection area of the first person on the horizontal ground intersects the geographic coordinate area of the first vehicle.
Condition d: the center coordinates of the first person are located within the appropriately enlarged geographic coordinate area of the first vehicle.
Condition e: the vertical projection area of the first person on the horizontal ground intersects the appropriately enlarged geographic coordinate area of the first vehicle.
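Conditions d and e allow the vehicle's geographic coordinate area to be appropriately enlarged so that a person boarding or alighting still matches. A minimal sketch, scaling the polygon about its centroid; the 1.5x factor is an illustrative assumption, not fixed by the patent.

```python
# Enlarge a vehicle's geographic coordinate area by scaling each vertex
# away from the polygon's centroid.
def expand_region(region, factor=1.5):
    cx = sum(x for x, _ in region) / len(region)
    cy = sum(y for _, y in region) / len(region)
    return [(cx + (x - cx) * factor, cy + (y - cy) * factor)
            for x, y in region]

bike = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]  # area C5..C8
bigger = expand_region(bike)
# centroid is (1.0, 0.5); corner (0, 0) moves out to (-0.5, -0.25)
```

The point-in-region or intersection test of conditions b and c is then applied to the enlarged polygon instead of the original one.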
S403c, the electronic device determines that the first person is associated with the first vehicle.
When the conditions in S403a and S403b are both satisfied, the electronic device determines that the first person is associated with the first vehicle.
Further optionally, the electronic device may further determine, according to the obtained human-vehicle association relationship, feature information (at least one of quantity or frequency) that each person uses the same vehicle, or determine a rule that each person uses the vehicle.
With reference to fig. 4, as shown in fig. 5, after S403, the data processing method provided in the embodiment of the present application may further include S500 and S501, and may further include S502.
S500, the electronic device determines the characteristics of each vehicle used by each person in the person-vehicle association relationship.
For each person, the vehicles with the same characteristics are the same vehicle.
In one possible implementation, a method for an electronic device to determine characteristics of each vehicle used by each person includes:
the electronic device builds a vehicle characteristic analysis model.
After determining the vehicles associated with each person, the electronic device invokes the vehicle feature analysis model to analyze the features of all vehicles corresponding to each person and to judge whether the vehicles in different images are the same vehicle.
The electronic device treats vehicles whose features the model determines to be the same as the same vehicle, and vehicles with different features as different vehicles.
In this way, based on the feature analysis model, the electronic device can identify vehicles without license plates, such as electric bicycles, self-balancing scooters, and motorcycles.
It should be noted that the vehicle characteristic analysis model may be preset in the electronic device, or may be trained by the electronic device, which is not limited in this application.
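The deduplication step in S500 can be sketched with appearance feature vectors compared by cosine similarity; the three-dimensional toy features and the 0.9 threshold are illustrative assumptions standing in for the patent's unspecified vehicle feature analysis model.

```python
import math

# Two snapshots show the same unplated vehicle when their appearance
# feature vectors are sufficiently similar.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def same_vehicle(feat_a, feat_b, threshold=0.9):
    return cosine(feat_a, feat_b) >= threshold

ebike_1 = [0.9, 0.1, 0.4]    # two snapshots of the same e-bike
ebike_2 = [0.88, 0.12, 0.41]
scooter = [0.1, 0.9, 0.2]    # a different vehicle
print(same_vehicle(ebike_1, ebike_2))  # True
print(same_vehicle(ebike_1, scooter))  # False
```

Grouping all of a person's vehicle snapshots by this test yields the "same vehicle" sets used in S501.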
S501, the electronic equipment determines feature information of the same vehicle used by each person according to the determined features of the vehicles.
The characteristic information includes at least one of a quantity, a frequency, or a usage rule.
Illustratively, table 2 shows the association of person a with the vehicle.
TABLE 2
Person Vehicle Face snapshot time Vehicle snapshot time
A J1 2020/01/02-13:22:02 2020/01/02-13:22:04
A J2 2020/01/04-15:17:08 2020/01/04-15:17:09
A J1 2020/01/17-15:17:08 2020/01/17-15:17:09
A J1 2020/02/01-11:47:20 2020/02/01-11:47:20
A J3 2020/02/07-19:27:20 2020/02/07-19:27:20
A J1 2020/02/18-23:08:34 2020/02/18-23:08:36
From the association relationship shown in table 2, the electronic device determines that person A used three vehicles in total, namely vehicle J1, vehicle J2, and vehicle J3, between January 2020 and February 2020.
The electronic device determines the frequency with which person A uses each vehicle for travel according to the association relationship shown in table 2, as shown in table 3 below.
TABLE 3
Person Vehicle Frequency in January Frequency in February
A J1 2 times/month 4 times/month
A J2 1 time/month 0 times/month
A J3 0 times/month 1 time/month
Based on table 3, the electronic device determines that in January person A used vehicle J1 2 times, vehicle J2 1 time, and vehicle J3 0 times, and that in February person A used vehicle J1 4 times, vehicle J2 0 times, and vehicle J3 1 time.
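The tally in S501 can be sketched by counting association records per person, vehicle, and month; the record format is an illustrative assumption, and the sample rows follow the table 2 entries above.

```python
from collections import Counter

# (person, vehicle, date) association records, as in table 2.
records = [
    ("A", "J1", "2020/01/02"), ("A", "J2", "2020/01/04"),
    ("A", "J1", "2020/01/17"), ("A", "J1", "2020/02/01"),
    ("A", "J3", "2020/02/07"), ("A", "J1", "2020/02/18"),
]

# Key each record by (person, vehicle, month) and count occurrences.
freq = Counter((person, vehicle, date[:7])
               for person, vehicle, date in records)
print(freq[("A", "J1", "2020/01")])  # 2
print(freq[("A", "J2", "2020/01")])  # 1
```

The same counter, keyed by (person, vehicle) alone, gives the total number of times each person used each vehicle over the whole query period.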
S502, the electronic equipment determines the rule that each person in the person-vehicle association relation uses the vehicle.
Optionally, in the embodiment of the present application, rules of using a vehicle by each person are divided into two cases, namely regular travel (denoted as case 1) and irregular travel (denoted as case 2), which are described below:
case 1, regular trip
Regular travel means that the vehicles used by a person within the query period, along with the travel routes and travel times, follow a pattern.
For example, person B is associated with electric vehicle J4 at about 6 a.m. many times a month, with a travel route from home to the vegetable market, and is associated with electric vehicle J4 again at about 6:30 a.m., with a travel route from the vegetable market back home.
At about 8 a.m., person B is associated with bicycle J5, with a travel route from home to the company; at about 6 p.m., person B is associated with bicycle J5 again, with a travel route from the company back home.
At about 8 p.m., person B is associated with bicycle J6, with a travel route from home to the park; at about 9 p.m., person B is associated with bicycle J6 again, with a travel route from the park back home.
Based on the above analysis of person B's vehicle associations, travel routes, and travel times, person B can be determined to be a regularly traveling person. Such persons are usually law-abiding, so person B is marked as a regular traveler to distinguish person B from persons who need focused monitoring at a later stage.
Case 2, irregular trip
Irregular travel means that the vehicles used by a person within the query period, together with the travel routes and travel times, are random and show no regularity.
For example, person E is associated with three vehicles, J7, J8, and J9, during the query period. Within that period, person E traveled 5 times using J7, 6 times using J8, and 5 times using J9. The times at which person E uses each vehicle show no regularity, and the travel routes include places such as internet cafes and night shops.
Based on the above analysis of person E's vehicle associations, travel routes, and travel times, it can be determined that person E travels irregularly. Such persons carry a higher risk of being abnormal persons, so person E is marked as an irregular traveler for further assessment at a later stage, to determine whether focused monitoring of person E is needed.
Based on this technical solution, the electronic device analyzes the vehicles of each person, can determine whether each person travels regularly, and provides a basis for subsequent monitoring of persons.
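The regular/irregular distinction in S502 can be sketched with a simple spread test over departure times; the 1-hour threshold and hour-of-day representation are illustrative assumptions, since the patent does not fix a concrete rule.

```python
from statistics import pstdev

# If a person's departure times (hours of day) for a given vehicle
# cluster tightly, the travel is treated as regular.
def travel_is_regular(departure_hours, max_spread_h=1.0):
    if len(departure_hours) < 2:
        return False          # too few trips to show a pattern
    return pstdev(departure_hours) <= max_spread_h

person_b = [6.0, 6.1, 5.9, 6.0, 6.2]     # ~6 a.m. every day, like person B
person_e = [2.5, 14.0, 23.0, 9.5, 19.0]  # scattered times, like person E
print(travel_is_regular(person_b))  # True
print(travel_is_regular(person_e))  # False
```

A production system would also weigh route repetition and vehicle consistency, but the same cluster-versus-scatter idea applies.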
It should be noted that the present application provides a solution that analyzes a person's use of vehicles to determine whether the person travels regularly.
In practical applications, the electronic device may further analyze a user of the vehicle, or determine the user of the vehicle for the vehicle, so as to determine whether the vehicle or the user is abnormal. The specific implementation manner is similar to the above method, and details are not described herein.
With reference to fig. 5, as shown in fig. 8, in a case where the electronic device provided in the embodiment of the present application is a terminal device, the electronic device can further execute the following S801.
S801, the electronic device displays the person-vehicle association relationship, or displays the feature information of the vehicles used by each person.
The characteristic information includes at least one of a quantity, a frequency, or a usage rule.
In one example, taking a mobile phone as the electronic device, fig. 9 (a) shows the operation interface on which a worker queries the person-vehicle association relationship. The query parameters entered by the worker include: target group F, with a time parameter of January 2020 to February 2020.
As shown in fig. 9 (b), after the mobile phone determines the person-vehicle association relationship of each person in target group F, it displays an interface for selecting a person.
As shown in fig. 9 (c), after the worker selects person A in the target group, the mobile phone displays the corresponding content. In this case, the display includes the table shown in table 1 above.
It should be noted that, when the electronic device displays the human-vehicle association relationship of the target group F, the electronic device may directly display the human-vehicle association relationship of all the people in the target group F without displaying the people selection interface, which is not limited in this application.
In another example, fig. 10 (a) shows the operation interface on which a worker queries the number and frequency of vehicles used. The query parameters entered by the worker include: target group F, with a time parameter of January 2020 to February 2020.
As shown in fig. 10 (b), after the mobile phone determines the number and frequency of vehicles used by each person in target group F, it displays an interface for selecting a person.
As shown in fig. 10 (c), after the worker selects person A in the target group, the mobile phone displays the number and frequency of the vehicles used by person A. In this case, the display includes the table shown in table 2 above.
It should be noted that, when the electronic device displays the number and the frequency of the vehicles used by the target group F, the number and the frequency of the vehicles used by all the people in the target group F may also be directly displayed without displaying the people selection interface, which is not limited in the present application.
In another example, fig. 11 (a) shows the operation interface on which a worker queries the rules of the vehicles used by target group F. The query parameters entered by the worker include: target group F, with a time parameter of January 2020 to February 2020.
As shown in fig. 11 (b), after the mobile phone determines the person-vehicle association relationship of each person in target group F, it displays an interface for selecting a person.
As shown in fig. 11 (c), after the worker selects person A in the target group, the mobile phone displays the rule interface of the vehicles used by person A, with the displayed result being irregular travel.
It should be noted that, when the electronic device displays the rule of the transportation means used by the target group F, the rule of the transportation means used by all the persons in the target group F may also be directly displayed without displaying the person selection interface, which is not limited in the present application.
The scheme provided by the embodiment of the application is mainly introduced from the perspective of a method. To implement the above functions, it includes hardware structures and/or software modules for performing the respective functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application also provides a data processing device. The data processing device may be the electronic device, a CPU in the electronic device, or a control module for analyzing an image in the electronic device.
Fig. 12 is a schematic structural diagram of a data processing apparatus 120 according to an embodiment of the present disclosure. The data processing device 120 is configured to execute the data processing method shown in fig. 4, fig. 5, or fig. 8. The data processing apparatus 120 may include a determination unit 1201, a processing unit 1202, and a communication unit 1203.
A communication unit 1203, configured to receive the query request; the inquiry request comprises an inquiry parameter and is used for requesting to inquire the human-vehicle association relation corresponding to the inquiry parameter. The human-vehicle association relationship is used for representing vehicles used by personnel under the condition of inquiring parameters. For example, in conjunction with fig. 4, 5, or 8, the communication unit 1203 may be configured to perform S400.
The communication unit 1203 is further configured to acquire a snapshot image corresponding to the query parameter. For example, in conjunction with fig. 4, 5, or 8, the communication unit 1203 may be configured to perform S401.
And the processing unit 1202 is used for identifying the snapshot image to acquire the spatio-temporal information of the target objects in the snapshot image, where the target objects include persons and vehicles. For example, in conjunction with fig. 4, 5, or 8, the processing unit 1202 may be configured to perform S402.
And the determining unit 1201 is configured to determine the person-vehicle association relationship according to the acquired spatio-temporal information. For example, in conjunction with fig. 4, 5, or 8, the determining unit 1201 may be configured to perform S403.
Optionally, the determining unit 1201 is specifically configured to: determining a first person and a first vehicle; the first person and the first vehicle both belong to the target object; the method comprises the following steps that a first preset condition is met between the snapshot time of a first person and the snapshot time of a first vehicle; determining that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle; a first person is determined to be associated with a first vehicle.
Optionally, the processing unit 1202 is specifically configured to: identifying a vehicle in the snapshot image, determining a geographic coordinate area of the vehicle, and taking the geographic coordinate area of the vehicle as a snapshot position of the vehicle; the geographic coordinate region of the vehicle includes a vertical projection region of the vehicle on a level ground.
Optionally, if the snapshot position of the first person is the center point coordinate of the first person, the second preset condition includes: the snapshot location of the first person is located within a geographic coordinate area of the first vehicle. If the snapshot position of the first person is a vertical projection area of the first person on the horizontal ground, the second preset condition comprises: the snapshot location of the first person overlaps the geographic coordinate area of the first vehicle.
Optionally, the determining unit 1201 is further configured to: determining the characteristics of each vehicle used by each person in the person-vehicle association relationship; for each person, the vehicles with the same characteristics are the same vehicle; determining the characteristic information of the same vehicle used by each person according to the determined characteristics of the vehicles; the characteristic information includes at least one of a quantity, a frequency, or a usage rule.
Optionally, the data processing apparatus 120 further comprises a display unit 1205, and the display unit 1205 is configured to display at least one of: the association relationship between people and vehicles, the characteristic information of the vehicles used by each person, and the rule of the vehicles used by each person.
Of course, the data processing apparatus 120 provided in the embodiment of the present application includes, but is not limited to, the above modules.
In practical implementation, the determining unit 1201 and the processing unit 1202 may be implemented by the processor 31 shown in fig. 3 calling the program code in the memory 32. For a specific implementation process, reference may be made to the description of the data processing method portion shown in fig. 4, fig. 5, or fig. 8, which is not described herein again.
Another embodiment of the present application further provides a computer-readable storage medium, in which computer instructions are stored, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform the steps performed by the electronic device in the method flows shown in the foregoing method embodiments.
Another embodiment of the present application further provides a chip system applied to an electronic device. The chip system includes one or more interface circuits and one or more processors, which are interconnected by lines. The interface circuit is configured to receive signals from a memory of the electronic device and send the signals to the processor, the signals including the computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device performs the steps performed by the electronic device in the method flows shown in the foregoing method embodiments.
In another embodiment of the present application, a computer program product is also provided, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the steps performed by the electronic device in the method flow shown in the above method embodiment.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented using a software program, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)).
The foregoing is only illustrative of the present application. Those skilled in the art should appreciate that changes and substitutions can be made in the embodiments provided herein without departing from the scope of the present disclosure.

Claims (8)

1. A data processing method, comprising:
receiving a query request; the query request comprises query parameters, and the query request is used for requesting to query the human-vehicle association relationship corresponding to the query parameters; the human-vehicle association relationship is used for representing a vehicle used by a person under the condition of the query parameters;
acquiring a plurality of snapshot images corresponding to the query parameters;
identifying the plurality of snapshot images to acquire space-time information of target objects appearing in the plurality of snapshot images, wherein the target objects comprise people and vehicles; the space-time information of the target object comprises the snapshot time of the person and the geographic coordinates of the person, and further comprises the snapshot time of the vehicle and the geographic coordinates of the vehicle, wherein the geographic coordinates of the vehicle comprise a vertical projection area of the vehicle on the horizontal ground; and the geographic coordinates of the person are the coordinates of the center point of the person, or the geographic coordinates of the person are the vertical projection area of the person on the horizontal ground;
determining the human-vehicle association relationship according to the acquired space-time information;
the space-time information comprises a snapshot time and a snapshot position; and the determining the human-vehicle association relationship according to the acquired space-time information comprises the following steps:
determining a first person and a first vehicle; a first preset condition is met between the snapshot time of the first person and the snapshot time of the first vehicle; the first person and the first vehicle both belong to the target object;
determining that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle; the snapshot position of the first person is the geographic coordinate of the first person, and the snapshot position of the first vehicle is the geographic coordinate of the first vehicle;
determining that the first person is associated with the first vehicle;
if the snapshot position of the first person is the coordinates of the center point of the first person, the second preset condition includes: the snapshot position of the first person is located within the geographic coordinate area of the first vehicle; and if the snapshot position of the first person is the vertical projection area of the first person on the horizontal ground, the second preset condition includes: the snapshot position of the first person overlaps with the geographic coordinate area of the first vehicle.
2. The data processing method of claim 1, wherein the identifying the plurality of snapshot images to acquire the space-time information of the target objects in the plurality of snapshot images comprises:
identifying the vehicles in the plurality of snapshot images; and
determining the geographic coordinate areas of the vehicles in the plurality of snapshot images, and taking the geographic coordinate areas of the vehicles in the plurality of snapshot images as the snapshot positions of the vehicles in the plurality of snapshot images.
3. The data processing method according to claim 1 or 2, characterized in that the data processing method further comprises:
determining characteristics of each vehicle used by each person in the person-vehicle association relationship; for each of the persons, the vehicles with the same characteristics are the same vehicle;
determining the characteristic information of the same vehicle used by each person according to the determined characteristics of the vehicles; the characteristic information includes at least one of a quantity, a frequency, or a usage rule.
4. The data processing method according to claim 1 or 2, characterized in that the data processing method further comprises:
and determining a vehicle usage rule of each person in the person-vehicle association relationship.
5. A data processing apparatus, comprising: a communication unit, a processing unit and a determination unit;
the communication unit is used for receiving a query request; the query request comprises query parameters, and the query request is used for requesting to query the human-vehicle association relationship corresponding to the query parameters; the human-vehicle association relationship is used for representing a vehicle used by a person under the condition of the query parameters;
the communication unit is further used for acquiring a plurality of snapshot images corresponding to the query parameters;
the processing unit is used for identifying the plurality of snapshot images acquired by the communication unit to acquire space-time information of target objects appearing in the plurality of snapshot images, wherein the target objects comprise people and vehicles; the space-time information of the target object comprises the snapshot time of the person and the geographic coordinates of the person, and further comprises the snapshot time of the vehicle and the geographic coordinates of the vehicle, wherein the geographic coordinates of the vehicle comprise a vertical projection area of the vehicle on the horizontal ground; and the geographic coordinates of the person are the coordinates of the center point of the person, or the geographic coordinates of the person are the vertical projection area of the person on the horizontal ground;
the determining unit is used for determining the human-vehicle association relationship according to the space-time information acquired by the processing unit;
the space-time information includes a snapshot time and a snapshot position, and the determining unit is specifically configured to: determine a first person and a first vehicle, wherein a first preset condition is met between the snapshot time of the first person and the snapshot time of the first vehicle, and the first person and the first vehicle both belong to the target object; determine that a second preset condition is met between the snapshot position of the first person and the snapshot position of the first vehicle, wherein the snapshot position of the first person is the geographic coordinates of the first person, and the snapshot position of the first vehicle is the geographic coordinates of the first vehicle; and determine that the first person is associated with the first vehicle;
if the snapshot position of the first person is the coordinates of the center point of the first person, the second preset condition includes: the snapshot position of the first person is located within the geographic coordinate area of the first vehicle; and
if the snapshot position of the first person is the vertical projection area of the first person on the horizontal ground, the second preset condition includes: the snapshot position of the first person overlaps with the geographic coordinate area of the first vehicle.
6. The data processing apparatus according to claim 5, wherein the processing unit is specifically configured to: identify the vehicles in the plurality of snapshot images, determine the geographic coordinate areas of the vehicles in the plurality of snapshot images, and take the geographic coordinate areas of the vehicles in the plurality of snapshot images as the snapshot positions of the vehicles in the plurality of snapshot images;
the determination unit is further configured to: determining characteristics of each vehicle used by each person in the person-vehicle association relationship; for each of the persons, the vehicles with the same characteristics are the same vehicle; determining the characteristic information of the same vehicle used by each person according to the determined characteristics of the vehicles; the characteristic information includes at least one of a quantity, a frequency, or a usage rule.
7. An electronic device, comprising: a memory for storing a computer program and a processor for executing the computer program to perform the data processing method of any one of claims 1-4.
8. A computer-readable storage medium, having stored thereon a computer program which, when run on a computer, causes the computer to carry out the data processing method of any one of claims 1 to 4.
CN202011521905.5A 2020-12-21 2020-12-21 Data processing method and device, electronic equipment and storage medium Active CN112632316B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011521905.5A CN112632316B (en) 2020-12-21 2020-12-21 Data processing method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112632316A CN112632316A (en) 2021-04-09
CN112632316B true CN112632316B (en) 2023-03-14

Family

ID=75320423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011521905.5A Active CN112632316B (en) 2020-12-21 2020-12-21 Data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112632316B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9031948B1 (en) * 2011-07-06 2015-05-12 Shawn B. Smith Vehicle prediction and association tool based on license plate recognition
CN106534798A (en) * 2016-12-06 2017-03-22 武汉烽火众智数字技术有限责任公司 Integrated multidimensional data application system for security monitoring and method thereof
CN110008379A (en) * 2019-03-19 2019-07-12 北京旷视科技有限公司 Monitoring image processing method and processing device
CN110874362A (en) * 2019-10-29 2020-03-10 青岛海信网络科技股份有限公司 Data association analysis method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180131864A1 (en) * 2016-11-04 2018-05-10 International Business Machines Corporation Image parameter-based spatial positioning
CN111065044B (en) * 2019-10-30 2021-11-16 武汉烽火众智数字技术有限责任公司 Big data based data association analysis method and device and computer storage medium
CN111079033A (en) * 2019-11-29 2020-04-28 武汉烽火众智数字技术有限责任公司 Personnel positioning analysis method based on intelligent community data



Similar Documents

Publication Publication Date Title
US11330096B2 (en) Emergency data statistics aggregation with data privacy protection
US20140140578A1 (en) Parking enforcement system and method of parking enforcement
CN110837582B (en) Data association method and device, electronic equipment and computer-readable storage medium
CN107909668B (en) Sign-in method and terminal equipment
JP2004166024A (en) Monitoring camera system and monitoring method
US10026003B2 (en) Method and arrangement for receiving data about site traffic derived from imaging processing
KR101738443B1 (en) Method, apparatus, and system for screening augmented reality content
CN108846911A (en) A kind of Work attendance method and device
CN106303399A (en) The collection of vehicle image data, offer method, equipment, system and server
CN112819989A (en) Park patrol method, device and equipment
KR102274108B1 (en) Method and computer readable storage medium to operate parking lot for monitoring designated vehicles
CN107610452B (en) Quick car booking method and system for short-distance Bluetooth hotspot positioning
CN111065044B (en) Big data based data association analysis method and device and computer storage medium
CN112201044B (en) Road violation vehicle identification method and system, storage medium and terminal
CN108320497A (en) Pedestrian running red light behavioral value method, apparatus and computer readable storage medium
CN112632316B (en) Data processing method and device, electronic equipment and storage medium
KR20150092402A (en) System and method for smart vehicular camera technology using visual metadata tagging and wireless communication, and information trading services
CN111539274B (en) Method for informing vehicle owner to move vehicle
CN112953952A (en) Industrial security situation awareness method, platform, electronic device and storage medium
TWI450211B (en) Taxi calling system with matching function and method thereof
CN113781792B (en) Parking detection system, method and related equipment
CN113365216B (en) Tracking early warning method, system and equipment
KR20130006387A (en) A method & a system for finding lost child using smart phone, and a storage medium
CN113473092A (en) Production workshop management system, method, equipment and computer program product
CN113888881A (en) Method and system for analyzing and planning urban roadside parking resources based on microcomputer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant