CN108287873B - Data processing method and related product - Google Patents
- Publication number: CN108287873B
- Application number: CN201711472718.0A
- Authority
- CN
- China
- Prior art keywords
- retrieval
- records
- search
- comparison
- image
- Prior art date
- Legal status (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Active
Classifications
- G06F16/738 — Presentation of query results (G: Physics › G06: Computing; Calculating or Counting › G06F: Electric Digital Data Processing › G06F16/00: Information retrieval; database structures therefor; file system structures therefor › G06F16/70: retrieval of video data › G06F16/73: Querying)
- G06F16/7867 — Retrieval using information manually generated, e.g. tags, keywords, comments, title and artist information, manually generated time, location and usage information, user ratings (G: Physics › G06: Computing; Calculating or Counting › G06F: Electric Digital Data Processing › G06F16/00: Information retrieval; database structures therefor; file system structures therefor › G06F16/70: retrieval of video data › G06F16/78: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually)
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computational Linguistics (AREA)
- Library & Information Science (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Image Analysis (AREA)
Abstract
The application provides a data processing method and related products. The method includes the following steps: displaying retrieval range information, which includes a time interval, on a retrieval record statistical page of a video monitoring system; receiving a designated time period selected from the time interval; acquiring the retrieval records of the specified time period to obtain N retrieval records, where N is an integer greater than 1; aggregating the N retrieval records to obtain M retrieval result sets, where each retrieval result set corresponds to one object and M is a positive integer smaller than N; determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values; and displaying the M retrieval result sets according to the M frequency values. The embodiments of the application enrich the functions of the video monitoring system and have greater practical value.
Description
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a data processing method and a related product.
Background
With the rapid development of the economy, society and culture, and with influence at home and abroad increasing day by day, more and more people are moving into cities. Population growth accelerates urbanization and brings new challenges to city management, while video monitoring provides technical support for urban safety. Cameras are now widely deployed in cities, video monitoring technology is increasingly mature, and a large number of images can be captured at every moment. Video monitoring systems are prevalent, and users can monitor targets through them, but existing video monitoring systems are limited to a single function.
Summary of the Application
The embodiments of the application provide a data processing method and related products, which can enrich the functions of a video monitoring system and have greater practical value.
In a first aspect, an embodiment of the present application provides a data processing method, where the method includes the following steps:
displaying retrieval range information on a retrieval record statistical page of a video monitoring system, wherein the retrieval range information comprises a time interval;
receiving a designated time period selected from the time interval;
acquiring retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 1;
performing aggregation processing on the N retrieval records to obtain M retrieval result sets, wherein each retrieval result set corresponds to an object, and M is a positive integer smaller than N;
determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
and displaying the M retrieval result sets according to the M frequency values.
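The steps of the first aspect can be sketched end to end as follows. This is a minimal illustration, not the patent's implementation: the record layout (timestamp, object-id pairs) and all names are assumptions made for the example.

```python
from collections import defaultdict

def display_search_statistics(records, start, end):
    """Sketch of the claimed method: filter records to the specified time
    period, aggregate them per object, derive each object's frequency
    value, and return the result sets ordered by frequency (highest first).
    `records` is a hypothetical list of (timestamp, object_id) tuples."""
    # Acquire the N retrieval records within the designated time period
    in_period = [(t, obj) for (t, obj) in records if start <= t <= end]
    # Aggregate the N records into M result sets, one per object
    result_sets = defaultdict(list)
    for t, obj in in_period:
        result_sets[obj].append(t)
    # Frequency value = number of records in each result set
    frequencies = {obj: len(v) for obj, v in result_sets.items()}
    # Display the M result sets ordered by frequency value
    return sorted(result_sets.items(), key=lambda kv: -frequencies[kv[0]])

records = [(1, "A"), (2, "B"), (3, "A"), (4, "A"), (9, "C")]
ranked = display_search_statistics(records, start=1, end=5)
```

With the sample data, object "A" (three records in the period) ranks ahead of "B" (one record), and "C" falls outside the period entirely.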
In a second aspect, an embodiment of the present application provides a data processing apparatus, including:
the display unit is used for displaying search range information on a search record statistical page of the video monitoring system, wherein the search range information comprises a time interval;
the receiving unit is used for receiving a designated time period selected from the time interval;
the acquisition unit is used for acquiring the retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 1;
the aggregation unit is used for performing aggregation processing on the N retrieval records to obtain M retrieval result sets, each retrieval result set corresponds to one object, and M is a positive integer smaller than N;
the determining unit is used for determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
the display unit is further configured to display the M retrieval result sets according to the M frequency values.
In a third aspect, an embodiment of the present application provides a data processing apparatus, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing some or all of the steps described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is used to make a computer execute some or all of the steps described in the first aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
It can be seen that, in the embodiments of the present application, retrieval range information including a time interval is displayed on the retrieval record statistical page of a video monitoring system; a designated time period selected from the time interval is received; the retrieval records of the specified time period are acquired to obtain N retrieval records (N being an integer greater than 1); the N retrieval records are aggregated into M retrieval result sets, each corresponding to one object (M being a positive integer smaller than N); the retrieval frequency of each object is determined from the M retrieval result sets to obtain M frequency values; and the M retrieval result sets are displayed according to the M frequency values. The retrieval records within a period of time can thus be aggregated on the retrieval record statistical page and the aggregated results displayed by retrieval frequency, which enriches the functions of the video monitoring system and has greater practical value.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1A is a schematic flow chart of a data processing method disclosed in an embodiment of the present application;
FIG. 1B is a schematic diagram illustrating a statistical demonstration of search records according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart diagram of another data processing method disclosed in the embodiments of the present application;
fig. 3 is another schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 4A is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 4B is a schematic structural diagram of an aggregation unit of the data processing apparatus depicted in fig. 4A according to an embodiment of the present disclosure;
FIG. 4C is a schematic diagram of a first alignment module of the aggregation unit depicted in FIG. 4B according to an embodiment of the present disclosure;
FIG. 4D is a schematic diagram of another structure of the data processing apparatus depicted in FIG. 4A according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of another data processing apparatus disclosed in the embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The data processing apparatus according to the embodiment of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), servers, video matrices, monitoring platforms, and so on. For convenience of description, the above-mentioned apparatuses are collectively referred to as a data processing device.
It should be noted that the data processing apparatus in the embodiments of the present application may be equipped with a video monitoring system and connected to a plurality of cameras. Each camera can be used to capture video images, and each camera may have a corresponding position mark or a corresponding number. The cameras are typically located in public places, such as schools, museums, intersections, pedestrian streets, office buildings, garages, airports, hospitals, subway stations, bus stations, supermarkets, hotels and entertainment venues. After a camera shoots a video image, the image can be stored in the memory of the system where the data processing apparatus is located. The memory can store a plurality of registered image libraries, each of which can contain a plurality of registered objects; each registered object corresponds to an identity information set that includes at least one of the following: at least one registered face image, an identification number, a home address, family members, political affiliation, nationality, a telephone number, a name, a diploma number, a student number, a house number, a bank card number, a social account number, a job number, a license plate number, and the like.
Further optionally, in this embodiment of the application, each frame of video image shot by the camera corresponds to one attribute information, where the attribute information is at least one of the following: camera number, shooting time of the video image, position of the video image, attribute parameters (format, size, resolution, etc.) of the video image, number of the video image, and character feature attributes in the video image. The character attributes in the video image may include, but are not limited to: the number of people in the video image, the position of people, the angle of the face, etc.
It should be further noted that the video images acquired by each camera are usually dynamic face images; therefore, in the embodiments of the present application, angle constraints may be specified for the face image. The face angle may include, but is not limited to, a horizontal rotation angle, a pitch angle, or an inclination. For example, the dynamic face image data may be required to have an interocular distance of not less than 30 pixels, with more than 60 pixels recommended; the horizontal rotation angle should not exceed ±30 degrees, the pitch angle should not exceed ±20 degrees, and the inclination should not exceed ±45 degrees, and it is recommended that the horizontal rotation angle not exceed ±15 degrees, the pitch angle not exceed ±10 degrees, and the inclination not exceed ±15 degrees. Whether the face image is blocked by other objects may also be screened: in general, ornaments such as dark sunglasses, masks and exaggerated jewelry should not block the main facial area. The camera lens may also be covered in dust, blocking the face image, so parts of the captured video image may be unclear. The picture format of the video image in the embodiments of the present application may include, but is not limited to, BMP, JPEG, JPEG2000, PNG and the like. The size of the video image can be 10-30 KB, and each video image can also correspond to information such as the shooting time, the unified number of the camera that shot it, and a link to the corresponding panoramic image (a feature correspondence file is established between the face image and the global image).
Further, a user may input a search image and perform a retrieval based on it, producing a retrieval record. The retrieval record may include at least one of the following: the attribute information of the search image (source, size, format, shooting time, shooting place, etc.); the retrieval result (i.e., whether there is an image matching the search image, or the matching image itself); the object corresponding to the search image, for example the object's name, native place, identification number, mobile phone number, bank card number, height, weight, educational background, work history, home address, and social relationships; and the information of the user initiating the retrieval (account number, password, IP address, position, name, age, etc.). Each retrieval generates a corresponding retrieval record.
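The fields enumerated above can be gathered into a simple record structure. The following sketch is purely illustrative — the class and field names are assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class SearchRecord:
    """Hypothetical shape of one retrieval record, holding a subset of
    the fields the description enumerates (names are illustrative)."""
    image_source: str      # attribute info of the search image
    shot_time: str         # shooting time of the search image
    matched: bool          # whether a matching image was found
    object_name: str = ""  # identity of the retrieved object, if known
    user_account: str = "" # user who initiated the retrieval
    user_ip: str = ""

rec = SearchRecord(image_source="camera_001",
                   shot_time="2017-10-09 08:00",
                   matched=True,
                   object_name="object_42",
                   user_account="op01")
```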
The following describes embodiments of the present application in detail.
Please refer to fig. 1A, which is a flowchart illustrating an embodiment of a data processing method according to an embodiment of the present application. It may include the following steps 101-106, which are detailed as follows:
101. Display retrieval range information on a retrieval record statistical page of the video monitoring system, where the retrieval range information includes a time interval.
The retrieval range information in the embodiments of the present application includes at least a time interval, for example from October 9, 2017 to December 28, 2017.
Alternatively, when the user enters the retrieval record statistical page, the page may display yesterday's retrieval record statistics by default (e.g., the system automatically runs the statistics once at midnight).
102. Receiving a designated time period selected from the time interval.
The specified time period can be set by the user or defaulted by the system; the user can select it according to his or her retrieval requirements.
103. Acquire the retrieval records of the specified time period to obtain N retrieval records, where N is an integer greater than 1.
The retrieval records of the specified time period can be acquired to obtain N retrieval records; that is, N retrievals were performed in the specified time period, and each retrieval generated a corresponding retrieval record.
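Step 103 amounts to a timestamp filter over the stored records. A minimal sketch, assuming a hypothetical dict-based record layout with a `"time"` field:

```python
from datetime import datetime

def records_in_period(records, start, end):
    """Sketch of step 103: keep only the retrieval records whose
    timestamp falls within the user-specified time period."""
    return [r for r in records if start <= r["time"] <= end]

logs = [
    {"time": datetime(2017, 10, 9), "object": "A"},
    {"time": datetime(2017, 11, 1), "object": "B"},
    {"time": datetime(2017, 12, 28), "object": "A"},
]
n_records = records_in_period(logs,
                              datetime(2017, 10, 1),
                              datetime(2017, 11, 30))
```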
Optionally, the retrieval range information further includes a plurality of search elements. In that case, between steps 102 and 103 the method may further include the following step:
receiving at least one search element selected from the plurality of search elements. Acquiring the retrieval records of the specified time period in step 103 may then be implemented as follows:
acquiring the retrieval records of the specified time period according to the at least one search element.
The search element may include at least one of the following: a designated camera (each camera can correspond to a number, for example camera 001), a designated region (for example, the Nanshan district of Shenzhen), a designated face angle (for example, an elevation angle of 45 degrees), a designated age, a designated gender, a designated height, a designated retrieval reason (i.e., the retrieval purpose), a designated facial feature (for example, a beauty mark on the face), and the like. The user can input search elements and a designated time period according to actual needs, and the retrieval records for that period can then be acquired according to the at least one search element.
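Filtering by the selected search elements can be sketched as a conjunction of field matches. The field names (`camera`, `region`, `gender`) are illustrative assumptions about the record layout:

```python
def filter_by_elements(records, **elements):
    """Sketch: narrow the retrieval records to those matching every
    selected search element (camera, region, gender, ...)."""
    return [r for r in records
            if all(r.get(k) == v for k, v in elements.items())]

logs = [
    {"camera": "001", "region": "Nanshan", "gender": "F"},
    {"camera": "002", "region": "Nanshan", "gender": "M"},
]
hits = filter_by_elements(logs, region="Nanshan", camera="001")
```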
104. Aggregate the N retrieval records to obtain M retrieval result sets, where each retrieval result set corresponds to one object and M is a positive integer smaller than N.
Since several of the N retrieval records may correspond to the same object, the N retrieval records can be classified by object to obtain M retrieval result sets. Each retrieval result set corresponds to one object and contains at least one retrieval record, and M is a positive integer not greater than N.
Optionally, in the step 104, the aggregating process performed on the N search records may include the following steps:
41. Acquire the retrieval image corresponding to each of the N retrieval records to obtain N retrieval images;
42. Compare the N retrieval images pairwise to obtain P comparison values, where P is an integer greater than N;
43. Select the comparison values greater than a second preset threshold from the P comparison values to obtain Q comparison values, where Q is a positive integer less than or equal to P;
44. Merge the N retrieval records according to the Q comparison values.
The second preset threshold may be set by the user or set by default. Each retrieval record is based on one retrieval image, so N retrieval images can be obtained from the N retrieval records. Some of the N retrieval images may depict the same object, so the images are compared pairwise to obtain P comparison values, one per pair of images (N images yield N(N-1)/2 pairs); a comparison value usually lies between 0 and 1. The comparison values greater than the second preset threshold are selected from the P values to obtain Q comparison values, where Q is a positive integer less than or equal to P. If a comparison value is greater than the second preset threshold, the similarity between the corresponding retrieval objects is very high, and the corresponding retrieval records can basically be considered to concern the same object, so they can be merged. The N retrieval records are thus merged according to the Q comparison values to obtain M retrieval result sets, each corresponding to one object and containing at least one retrieval record. Aggregating the retrieval records through similarity comparison between the retrieval objects classifies the records by object, which makes the aggregated result easy for the user to review and convenient for subsequent analysis.
Further optionally, in step 44, merging the N search records according to the Q comparison values, which may be implemented as follows:
merging the retrieval records corresponding to each of the Q comparison values, where each comparison value corresponds to two retrieval records, and retaining the unmerged records among the N retrieval records.
The two retrieval records corresponding to each of the Q comparison values may be merged, while the unmerged records among the N retrieval records are retained. Each merged group of retrieval records (and each remaining single record) corresponds to one retrieval result set, so M retrieval result sets are obtained. Aggregation is thereby realized: after it, the N retrieval records are divided by object, each object corresponds to one retrieval result set, and big-data analysis is also enabled.
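Because merging is pairwise and transitive (if records 0 and 2 match, and 2 and 3 match, all three belong to one object), the merge of step 44 can be sketched with a union-find structure. This is one possible realization under assumed inputs — here the Q above-threshold comparison values are represented only by their record-index pairs:

```python
def merge_records(n_records, q_pairs):
    """Sketch of step 44: each above-threshold comparison value links two
    record indices; union-find groups linked records into result sets,
    and unlinked records remain as singleton sets."""
    parent = list(range(len(n_records)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i, j in q_pairs:               # merge the two records of each pair
        parent[find(i)] = find(j)

    sets = {}
    for idx, rec in enumerate(n_records):
        sets.setdefault(find(idx), []).append(rec)
    return list(sets.values())         # M result sets, M <= N

result_sets = merge_records(["r0", "r1", "r2", "r3"], [(0, 2), (2, 3)])
```

Here records r0, r2 and r3 merge into one result set and r1 stays as a singleton, so N = 4 records yield M = 2 result sets.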
Further optionally, in the step 42, pairwise comparing the N search images may include the following steps:
421. Perform face segmentation on retrieval image i and retrieval image j to obtain face region i and face region j, where retrieval image i and retrieval image j are any two of the N retrieval images;
422. Perform binarization processing on face region i and face region j to obtain binarized face region i and binarized face region j;
423. Extract feature points from binarized face region i and binarized face region j respectively to obtain a first feature point set and a second feature point set;
424. Compare the first feature point set with the second feature point set.
Retrieval images i and j are any two of the N retrieval images. Since neither retrieval image contains only a face but also other content, face segmentation can be performed first to enable rapid face recognition: face region i is obtained from retrieval image i, and face region j from retrieval image j. Binarization processing is then performed on face region i and face region j to obtain binarized face regions i and j; the purpose of binarization is to further reduce image complexity, which helps realize fast face comparison. On this basis, feature points can be extracted from the binarized face regions using the Harris corner detection algorithm or Scale-Invariant Feature Transform (SIFT), yielding a first feature point set corresponding to binarized face region i and a second feature point set corresponding to binarized face region j. Finally, the two feature point sets are compared. Because steps 421-423 greatly reduce image complexity, face comparison efficiency is improved and rapid image matching can be realized.
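The binarize-extract-compare pipeline of steps 422-424 can be sketched on toy data. The feature extractor below is a deliberate stand-in, not Harris or SIFT (which operate on real images), and the similarity metric is an assumption — the patent does not fix how the two point sets are scored:

```python
def binarize(gray, threshold=128):
    """Step 422: threshold a grayscale face region into a 0/1 map."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def feature_points(binary):
    """Toy stand-in for Harris/SIFT (step 423): use the coordinates of
    foreground pixels as the feature point set."""
    return {(r, c) for r, row in enumerate(binary)
            for c, px in enumerate(row) if px}

def compare(face_i, face_j):
    """Step 424: similarity of the two feature point sets in [0, 1]
    (Jaccard overlap here, chosen for illustration)."""
    pi = feature_points(binarize(face_i))
    pj = feature_points(binarize(face_j))
    union = pi | pj
    return len(pi & pj) / len(union) if union else 1.0

face_a = [[200, 10], [200, 200]]   # tiny 2x2 "face regions"
face_b = [[200, 10], [10, 200]]
score = compare(face_a, face_b)
```

On this toy input the two point sets share 2 of 3 points, giving a comparison value of 2/3, which would then be tested against the second preset threshold.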
105. Determine the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values.
Each retrieval result set corresponds to one object and may record, for that object, the number of times it was searched (for example, 20 times) and the information of each searcher (retrieval time, account number, unit, name, retrieval result, and search image). For example, the number of times the object was searched may be taken as the object's frequency value, or the number of distinct users who searched for it may be taken as the frequency value. Since each of the M retrieval result sets corresponds to one object, each set also corresponds to that object's retrieval frequency, giving M frequency values. The larger a frequency value, the more times the object was searched, or the more users searched for it.
106. Display the M retrieval result sets according to the M frequency values.
The M retrieval result sets can be sorted from high to low by frequency value, and the sorted result sets displayed.
Optionally, the M retrieval result sets may be presented as list information, which may include at least one of the following: the time period, the number of aggregated retrieval records, a face photo, the number of searchers, and the number of searches. When the user clicks a retrieval record, the detail view on the right switches to it; each retrieval result set can include the retrieval time, the searcher's account number, unit, name, retrieval reason, and the searched picture. The face photo in the list information may be the most recently searched photo.
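Building one list row per result set can be sketched as follows; the row keys mirror the fields named above, and the per-record `"user"` field is an assumed layout:

```python
def build_list_rows(result_sets):
    """Sketch of the list display in step 106: one row per retrieval
    result set, with the counts derived from the set's records."""
    rows = []
    for obj, recs in result_sets:
        rows.append({
            "object": obj,
            "aggregated_records": len(recs),
            "searcher_count": len({r["user"] for r in recs}),  # distinct users
            "search_times": len(recs),
        })
    return rows

sets_ = [("A", [{"user": "u1"}, {"user": "u2"}, {"user": "u1"}])]
rows = build_list_rows(sets_)
```

For object "A", three records by two distinct users yield a search count of 3 and a searcher count of 2 — the two candidate frequency values the description mentions.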
Optionally, after the step 106, the following steps may be further included:
selecting the frequency values greater than a first preset threshold from the M frequency values to obtain K frequency values, and marking the objects corresponding to the K frequency values as objects of interest, where K is a positive integer.
The first preset threshold may be set by the user or by system default. If a frequency value is greater than the first preset threshold, it can be considered that many users are searching for the corresponding object, or that the object has been searched many times. The objects corresponding to the K frequency values may therefore be marked as objects of interest: an object that is searched many times, or by many users, has evidently attracted wide attention.
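The thresholding step above reduces to a simple selection; names are illustrative:

```python
def mark_attention_objects(frequencies, first_threshold):
    """Sketch: pick the K frequency values greater than the first preset
    threshold and mark their objects as objects of interest."""
    return {obj for obj, f in frequencies.items() if f > first_threshold}

attention = mark_attention_objects({"A": 20, "B": 3, "C": 15},
                                   first_threshold=10)
```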
Optionally, the M retrieval result sets may be sorted according to the M frequency values, for example with larger frequency values ranked higher. As shown in fig. 1B, the higher the number of searches, the higher the ranking. This realizes ordered statistics of the retrieval records and facilitates user analysis.
It can be seen that, in the embodiments of the present application, retrieval range information including a time interval is displayed on the retrieval record statistical page of a video monitoring system; a designated time period selected from the time interval is received; the retrieval records of the specified time period are acquired to obtain N retrieval records (N being an integer greater than 1); the N retrieval records are aggregated into M retrieval result sets, each corresponding to one object (M being a positive integer smaller than N); the retrieval frequency of each object is determined from the M retrieval result sets to obtain M frequency values; and the M retrieval result sets are displayed according to the M frequency values. The retrieval records within a period of time can thus be aggregated on the retrieval record statistical page and the aggregated results displayed by retrieval frequency, which enriches the functions of the video monitoring system and has greater practical value.
In accordance with the above, please refer to fig. 2, which is a flowchart illustrating an embodiment of a data processing method according to an embodiment of the present application. The data processing method described in this embodiment may include the following steps:
201. Display retrieval range information on a retrieval record statistical page of the video monitoring system, where the retrieval range information includes a time interval.
202. Receive a designated time period selected from the time interval.
203. Acquire the retrieval records of the specified time period to obtain N retrieval records, where N is an integer greater than 1.
204. Aggregate the N retrieval records to obtain M retrieval result sets, where each retrieval result set corresponds to one object and M is a positive integer smaller than N.
205. Determine the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values.
206. Display the M retrieval result sets according to the M frequency values.
For details of steps 201-206, refer to the data processing method described in fig. 1A; they are not repeated here.
207. Select the frequency values greater than a first preset threshold from the M frequency values to obtain K frequency values, and mark the objects corresponding to the K frequency values as objects of interest, where K is a positive integer.
208. Acquire the activity area of a target object, where the target object is any object of interest.
The activity area can be set by the user or by system default, and the target object may be any object of interest. An activity area may be preset, for example an area covered by a plurality of cameras, meaning the target object is expected to move only within that area. In practice, a key monitoring object usually has its activity area specified so that it moves within the activity area.
209. When the target object appears in the inactive area, performing alarm processing.
The inactive area is the area outside the activity area. The target object can be tracked in real time, and if the target object appears in the inactive area, an alarm can be raised. For example, the position where the target object appears is determined: if a certain camera captures the target object, the administrator in charge of that camera can be notified, so that the target object can be monitored anytime and anywhere.
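Step 209 reduces to a containment test on the target's last observed position. A minimal sketch, assuming a rectangular activity area (the embodiment only says the area is covered by several cameras) and a hypothetical `notify` callback for alerting the administrator:

```python
def in_activity_area(position, area):
    """True if position (x, y) lies inside the rectangular activity
    area (x1, y1, x2, y2). The rectangle shape is an assumption."""
    x, y = position
    x1, y1, x2, y2 = area
    return x1 <= x <= x2 and y1 <= y <= y2

def check_target(position, area, notify):
    """Step 209: raise an alarm when the target appears outside
    the activity area, i.e. in the inactive area."""
    if not in_activity_area(position, area):
        notify(f"target seen outside activity area at {position}")
        return True
    return False

alerts = []
check_target((50, 50), (0, 0, 100, 100), alerts.append)   # inside: no alarm
check_target((150, 50), (0, 0, 100, 100), alerts.append)  # outside: alarm
```

In a deployed system the position would come from whichever camera captured the target, and the polygon of the covered area would replace the rectangle.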
It can be seen that, in the embodiment of the present application, retrieval range information including a time interval is displayed on a retrieval record statistics page of a video monitoring system, and a specified time period selected from the time interval is received. The retrieval records of the specified time period are acquired to obtain N retrieval records, where N is an integer greater than 1. The N retrieval records are aggregated to obtain M retrieval result sets, each corresponding to one object, where M is a positive integer smaller than N. The retrieval frequency of each object is determined from the M retrieval result sets to obtain M frequency values, and the M retrieval result sets are displayed according to the M frequency values. Frequency values greater than a first preset threshold are selected from the M frequency values to obtain K frequency values, and the objects corresponding to the K frequency values are marked as objects of interest, where K is a positive integer. An activity area of a target object, which is any object of interest, is acquired, and when the target object appears in the inactive area, alarm processing is performed. In this way, the retrieval records within a period of time can be aggregated, and objects with a high retrieval frequency can be determined from the aggregated result as objects of interest, so that the retrieval records are fully utilized.
The following describes an apparatus for implementing the above data processing method, specifically as follows:
in accordance with the above, please refer to fig. 3, in which fig. 3 is a data processing apparatus according to an embodiment of the present application, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps of:
displaying retrieval range information on a retrieval record statistical page of a video monitoring system, wherein the retrieval range information comprises a time interval;
receiving a designated time period selected from the time interval;
acquiring retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 1;
performing aggregation processing on the N retrieval records to obtain M retrieval result sets, wherein each retrieval result set corresponds to an object, and M is a positive integer smaller than N;
determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
and displaying the M retrieval result sets according to the M frequency values.
In one possible example, in the aspect of the aggregation processing on the N retrieval records, the program includes instructions for performing the steps of:
acquiring a retrieval image corresponding to each retrieval record in the N retrieval records to obtain N retrieval images;
comparing the N retrieval images pairwise to obtain P comparison values, wherein P is an integer larger than N;
selecting a comparison value larger than a second preset threshold value from the P comparison values to obtain Q comparison values, wherein Q is a positive integer smaller than or equal to P;
and merging the N retrieval records according to the Q comparison values.
In one possible example, in the merging the N search records according to the Q comparison values, the program includes instructions for:
and merging the retrieval records corresponding to each of the Q comparison values, wherein each comparison value corresponds to two retrieval records, and retaining the un-merged retrieval records among the N retrieval records.
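One natural way to realize this merging (the embodiment does not mandate a particular data structure) is a union-find over the record indices, where each of the Q above-threshold comparison values links its two records and un-merged records survive as singleton result sets:

```python
class DisjointSet:
    """Minimal union-find with path halving."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, i):
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]
            i = self.parent[i]
        return i

    def union(self, i, j):
        self.parent[self.find(i)] = self.find(j)

def merge_records(n, comparisons, second_threshold):
    """`comparisons` maps an index pair (i, j) to its comparison value.
    Pairs above the second preset threshold are merged; records that
    never qualify are retained as singleton result sets."""
    ds = DisjointSet(n)
    for (i, j), value in comparisons.items():
        if value > second_threshold:      # the Q qualifying comparison values
            ds.union(i, j)
    sets = {}
    for i in range(n):
        sets.setdefault(ds.find(i), []).append(i)
    return list(sets.values())            # M result sets, M < N

# 4 records: pairs (0, 1) and (1, 2) match; record 3 stays un-merged.
sets = merge_records(4, {(0, 1): 0.9, (1, 2): 0.85, (0, 3): 0.2}, 0.8)
```

Union-find also handles the transitive case in the example above, where records 0 and 2 end up in the same set because both match record 1.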
In one possible example, in the pairwise comparing the N search images, the program includes instructions for:
carrying out face segmentation on the retrieval image i and the retrieval image j to obtain a face region i and a face region j, wherein the retrieval image i and the retrieval image j are any two retrieval images in the N retrieval images;
carrying out binarization processing on the face area i and the face area j to obtain a binarized face area i and a binarized face area j;
respectively extracting feature points of the binarization face area i and the binarization face area j to obtain a first feature point set and a second feature point set;
and comparing the first feature point set with the second feature point set.
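The binarization and feature-point comparison can be sketched as follows; the horizontal-transition feature extractor and the Jaccard comparison value are illustrative stand-ins, since the embodiment fixes neither a concrete extractor nor a similarity measure:

```python
import numpy as np

def binarize(face_region, threshold=128):
    """Binarization: 1 where the grey value exceeds the threshold."""
    return (face_region > threshold).astype(np.uint8)

def feature_points(binary):
    """Illustrative extractor: coordinates where the binary image
    changes value between horizontally adjacent pixels."""
    shifted = np.roll(binary, 1, axis=1)
    edges = binary != shifted
    edges[:, 0] = False          # roll wraps around; ignore the first column
    return set(zip(*np.nonzero(edges)))

def compare(region_i, region_j):
    """Comparison value as the Jaccard overlap of the two point sets."""
    pi = feature_points(binarize(region_i))
    pj = feature_points(binarize(region_j))
    if not pi and not pj:
        return 1.0
    return len(pi & pj) / len(pi | pj)

# Two synthetic 8x8 "face regions": identical, and one with a shifted edge.
a = np.full((8, 8), 200, dtype=np.uint8); a[:, :4] = 50
b = a.copy()
c = np.full((8, 8), 200, dtype=np.uint8); c[:, :2] = 50
score_same = compare(a, b)
score_diff = compare(a, c)
```

In a real system a proper facial feature detector would replace the toy extractor; the point of the sketch is only the pipeline order: segment, binarize, extract, compare.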
In one possible example, the retrieval range information further includes: a plurality of search elements;
the program further includes instructions for performing the steps of:
receiving at least one search element selected from the plurality of search elements;
in the aspect of the acquiring the retrieval record of the specified time period, the program includes instructions for performing the steps of:
and acquiring the retrieval record of the specified time period according to the at least one retrieval element.
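A minimal sketch of retrieving the records of the specified time period narrowed by the selected search elements; the record fields `time` and `element` are assumed names, not from the embodiment:

```python
def fetch_records(records, start, end, elements=None):
    """Keep records whose timestamp falls in [start, end]; if search
    elements were selected, keep only records matching one of them."""
    hits = [r for r in records if start <= r["time"] <= end]
    if elements:
        hits = [r for r in hits if r["element"] in elements]
    return hits

log = [
    {"time": 10, "element": "face"},
    {"time": 20, "element": "vehicle"},
    {"time": 30, "element": "face"},
]
selected = fetch_records(log, 10, 25, elements={"face"})
```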
In one possible example, the program further comprises instructions for performing the steps of:
and selecting frequency values larger than a first preset threshold from the M frequency values to obtain K frequency values, and marking the objects corresponding to the K frequency values as objects of interest, wherein K is a positive integer.
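The marking step is a plain threshold selection over the M frequency values; a sketch, assuming the frequencies arrive as a mapping from object to frequency value:

```python
def mark_objects_of_interest(frequencies, first_threshold):
    """Select the K frequency values greater than the first preset
    threshold and mark their objects as objects of interest."""
    return {obj for obj, f in frequencies.items() if f > first_threshold}

watched = mark_objects_of_interest({"A": 12, "B": 3, "C": 7}, 5)  # K = 2 here
```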
Referring to fig. 4A, fig. 4A is a schematic structural diagram of a data processing apparatus according to the present embodiment. The data processing apparatus may include: a display unit 401, a receiving unit 402, an obtaining unit 403, an aggregation unit 404 and a determining unit 405, wherein,
the display unit 401 is configured to display search range information on a search record statistics page of a video monitoring system, where the search range information includes a time interval;
a receiving unit 402, configured to receive a specified time period selected from the time interval;
an obtaining unit 403, configured to obtain the search records of the specified time period, to obtain N search records, where N is an integer greater than 1;
an aggregation unit 404, configured to perform aggregation processing on the N search records to obtain M search result sets, where each search result set corresponds to an object, and M is a positive integer smaller than N;
a determining unit 405, configured to determine a retrieval frequency of each object according to the M retrieval result sets, so as to obtain M frequency values;
the display unit 401 is further specifically configured to display the M search result sets according to the M frequency values.
Optionally, as shown in fig. 4B, fig. 4B is a detailed structure of the aggregation unit 404 of the data processing apparatus depicted in fig. 4A, where the aggregation unit 404 may include a first obtaining module 4041, a first comparison module 4042, a selecting module 4043, and a merging module 4044, which are as follows:
a first obtaining module 4041, configured to obtain a retrieval image corresponding to each retrieval record in the N retrieval records, to obtain N retrieval images;
the first comparison module 4042 is configured to compare every two of the N search images to obtain P comparison values, where P is an integer greater than N;
a selecting module 4043, configured to select, from the P comparison values, a comparison value larger than a second preset threshold to obtain Q comparison values, where Q is a positive integer smaller than or equal to P;
a merging module 4044, configured to merge the N search records according to the Q comparison values.
Optionally, the merging module 4044 is specifically configured to:
and merging the retrieval records corresponding to each of the Q comparison values, wherein each comparison value corresponds to two retrieval records, and retaining the un-merged retrieval records among the N retrieval records.
Optionally, as shown in fig. 4C, fig. 4C is a detailed structure of the first comparison module 4042 of the aggregation unit 404 depicted in fig. 4B, and the first comparison module 4042 may include: a segmentation module 501, a processing module 502, an extraction module 503 and a second comparison module 504, which are as follows:
a segmentation module 501, configured to perform face segmentation on the retrieval image i and the retrieval image j to obtain a face region i and a face region j, where the retrieval image i and the retrieval image j are any two retrieval images in the N retrieval images;
a processing module 502, configured to perform binarization processing on the face region i and the face region j to obtain a binarized face region i and a binarized face region j;
an extraction module 503, configured to perform feature point extraction on the binarized face region i and the binarized face region j, respectively, to obtain a first feature point set and a second feature point set;
a second comparing module 504, configured to compare the first feature point set with the second feature point set.
Optionally, the retrieval range information further includes: a plurality of search elements;
the receiving unit 402 is further specifically configured to:
receiving at least one search element selected from the plurality of search elements;
in terms of the acquiring the retrieval record of the specified time period, the acquiring unit 403 is specifically configured to:
and acquiring the retrieval record of the specified time period according to the at least one retrieval element.
Optionally, as shown in fig. 4D, fig. 4D is a further modified structure of the data processing apparatus depicted in fig. 4A, which may further include a marking unit 406, as follows:
a marking unit 406, configured to select frequency values greater than a first preset threshold from the M frequency values to obtain K frequency values, and mark the objects corresponding to the K frequency values as objects of interest, where K is a positive integer.
It can be seen that, in the data processing apparatus described in the embodiment of the present application, retrieval range information including a time interval is displayed on a retrieval record statistics page of a video monitoring system, and a specified time period selected from the time interval is received. The retrieval records of the specified time period are acquired to obtain N retrieval records, where N is an integer greater than 1. The N retrieval records are aggregated to obtain M retrieval result sets, each corresponding to one object, where M is a positive integer smaller than N. The retrieval frequency of each object is determined from the M retrieval result sets to obtain M frequency values, and the M retrieval result sets are displayed according to the M frequency values. In this way, on the retrieval record statistics page, the retrieval records within a period of time are aggregated and the aggregated result is displayed according to retrieval frequency, which enriches the functions of the video monitoring system and has high practical value.
It is to be understood that the functions of each program module of the data processing apparatus in this embodiment may be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process may refer to the relevant description of the foregoing method embodiment, which is not described herein again.
As shown in fig. 5, for convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details not disclosed, please refer to the method portion of the embodiments of the present application. The data processing apparatus may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example:
fig. 5 is a block diagram illustrating a partial structure of a mobile phone related to a data processing apparatus provided in an embodiment of the present application. Referring to fig. 5, the handset includes: radio Frequency (RF) circuit 910, memory 920, input unit 930, sensor 950, audio circuit 960, Wireless Fidelity (WiFi) module 970, processor 980, and power supply 990. Those skilled in the art will appreciate that the handset configuration shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 5:
the input unit 930 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 930 may include a touch display 933, a biometric recognition device 931, and other input devices 932. The biometric device 931 may be a fingerprint recognition device, or a face recognition device, or an iris recognition device, etc. The input unit 930 may also include other input devices 932. In particular, other input devices 932 may include, but are not limited to, one or more of physical keys, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
Wherein the processor 980 is configured to perform the following steps:
displaying retrieval range information on a retrieval record statistical page of a video monitoring system, wherein the retrieval range information comprises a time interval;
receiving a designated time period selected from the time interval;
acquiring retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 1;
performing aggregation processing on the N retrieval records to obtain M retrieval result sets, wherein each retrieval result set corresponds to an object, and M is a positive integer smaller than N;
determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
and displaying the M retrieval result sets according to the M frequency values.
The processor 980 is the control center of the mobile phone. It connects the various parts of the entire mobile phone through various interfaces and lines, and performs the various functions of the mobile phone and processes data by running or executing software programs and/or modules stored in the memory 920 and calling data stored in the memory 920, thereby monitoring the mobile phone as a whole. Optionally, the processor 980 may include one or more processing units. Optionally, the processor 980 may integrate an application processor that mainly handles the operating system, user interfaces and application programs, and a modem processor that mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 980.
Further, the memory 920 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The handset may also include at least one sensor 950, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the touch display screen according to the brightness of ambient light, and the proximity sensor may turn off the touch display screen and/or the backlight when the mobile phone moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
WiFi is a short-range wireless transmission technology. Through the WiFi module 970, the mobile phone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing wireless broadband Internet access. Although fig. 5 shows the WiFi module 970, it is understood that it is not an essential part of the handset and may be omitted as needed within a scope that does not change the essence of the invention.
The handset also includes a power supply 990 (e.g., a battery) for powering the various components, which may optionally be logically connected to the processor 980 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiments shown in fig. 1A to fig. 2, the method flows of the steps may be implemented based on the structure of the mobile phone.
In the embodiments shown in fig. 3, 4A to 4D, the functions of the units can be implemented based on the structure of the mobile phone.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the data processing methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the data processing methods as set forth in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application may be substantially implemented or a part of or all or part of the technical solution contributing to the prior art may be embodied in the form of a software product stored in a memory, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned memory comprises: various media capable of storing program codes, such as a usb disk, a read-only memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and the like.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (7)
1. A method of data processing, the method comprising the steps of:
displaying retrieval range information on a retrieval record statistical page of a video monitoring system, wherein the retrieval range information comprises a time interval;
receiving a designated time period selected from the time interval;
acquiring retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 3;
performing aggregation processing on the N retrieval records to obtain M retrieval result sets, wherein each retrieval result set corresponds to one object, and M is a positive integer smaller than N, which specifically includes: acquiring a retrieval image corresponding to each retrieval record in the N retrieval records to obtain N retrieval images; comparing the N retrieval images pairwise to obtain P comparison values, wherein P is an integer larger than N; selecting comparison values larger than a second preset threshold from the P comparison values to obtain Q comparison values, wherein Q is a positive integer less than or equal to P; and merging the retrieval records corresponding to each of the Q comparison values, wherein each comparison value corresponds to two retrieval records, and retaining the un-merged retrieval records among the N retrieval records;
determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
and displaying the M retrieval result sets according to the M frequency values.
2. The method of claim 1, wherein the pairwise comparing the N search images comprises:
carrying out face segmentation on the retrieval image i and the retrieval image j to obtain a face region i and a face region j, wherein the retrieval image i and the retrieval image j are any two retrieval images in the N retrieval images;
carrying out binarization processing on the face area i and the face area j to obtain a binarized face area i and a binarized face area j;
respectively extracting feature points of the binarization face area i and the binarization face area j to obtain a first feature point set and a second feature point set;
and comparing the first characteristic point set with the second characteristic point set.
3. The method according to claim 1 or 2, wherein the retrieval range information further comprises: a plurality of search elements;
the method further comprises the following steps:
receiving at least one search element selected from the plurality of search elements;
the acquiring of the retrieval record of the specified time period includes:
and acquiring the retrieval record of the specified time period according to the at least one retrieval element.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
and selecting a frequency value larger than a first preset threshold value from the M frequency values to obtain K frequency values, and marking objects corresponding to the K frequency values as attention objects, wherein K is a positive integer.
5. A data processing apparatus, characterized in that the apparatus comprises:
the display unit is used for displaying search range information on a search record statistical page of the video monitoring system, wherein the search range information comprises a time interval;
the receiving unit is used for receiving a designated time period selected from the time interval;
the acquisition unit is used for acquiring the retrieval records of the specified time period to obtain N retrieval records, wherein N is an integer greater than 3;
the aggregation unit is used for performing aggregation processing on the N retrieval records to obtain M retrieval result sets, each retrieval result set corresponds to one object, and M is a positive integer smaller than N; the polymerization unit includes: the first acquisition module is used for acquiring a retrieval image corresponding to each retrieval record in the N retrieval records to obtain N retrieval images; the first comparison module is used for comparing every two of the N retrieval images to obtain P comparison values, wherein P is an integer larger than N; a selecting module, configured to select, from the P comparison values, a comparison value larger than a second preset threshold to obtain Q comparison values, where Q is a positive integer smaller than or equal to P; a merging module, configured to merge the retrieval records corresponding to each of the Q comparison values, where each comparison value corresponds to two retrieval records, and retain the un-merged retrieval records in the N retrieval records;
the determining unit is used for determining the retrieval frequency of each object according to the M retrieval result sets to obtain M frequency values;
the display unit is further specifically configured to display the M search result sets according to the M frequency values.
6. The apparatus of claim 5, wherein the first comparison module comprises:
the segmentation module is used for carrying out face segmentation on the retrieval image i and the retrieval image j to obtain a face region i and a face region j, wherein the retrieval image i and the retrieval image j are any two retrieval images in the N retrieval images;
the processing module is used for carrying out binarization processing on the face area i and the face area j to obtain a binarized face area i and a binarized face area j;
the extraction module is used for respectively extracting the feature points of the binarization face area i and the binarization face area j to obtain a first feature point set and a second feature point set;
and the second comparison module is used for comparing the first characteristic point set with the second characteristic point set.
7. A computer-readable storage medium for storing a computer program, wherein the computer program causes a computer to perform the method according to any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711472718.0A CN108287873B (en) | 2017-12-29 | 2017-12-29 | Data processing method and related product |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108287873A CN108287873A (en) | 2018-07-17 |
CN108287873B true CN108287873B (en) | 2020-08-11 |
Family
ID=62831923
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711472718.0A Active CN108287873B (en) | 2017-12-29 | 2017-12-29 | Data processing method and related product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108287873B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111241139B (en) * | 2020-01-15 | 2022-09-30 | 深圳平安医疗健康科技服务有限公司 | Data statistical method, device, computer equipment and storage medium |
CN111915616A (en) * | 2020-05-26 | 2020-11-10 | 华瑞新智科技(北京)有限公司 | Method and device for infrared temperature measurement based on weak supervision image segmentation |
CN113434732A (en) * | 2021-06-04 | 2021-09-24 | 浙江大华技术股份有限公司 | Data retrieval method, device and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101739455A (en) * | 2009-12-24 | 2010-06-16 | 北京世纪互联宽带数据中心有限公司 | Analysis method of streaming media information on demand and method thereof |
CN103186571A (en) * | 2011-12-28 | 2013-07-03 | 腾讯科技(深圳)有限公司 | Method and device for displaying mobile media information in mobile search system |
CN106126698A (en) * | 2016-06-29 | 2016-11-16 | 武汉斗鱼网络科技有限公司 | A kind of retrieval method for pushing based on Lucence and system |
CN106897398A (en) * | 2017-02-08 | 2017-06-27 | 北京奇艺世纪科技有限公司 | A kind of video display method and device |
CN106937087A (en) * | 2017-02-07 | 2017-07-07 | 深圳云天励飞技术有限公司 | A kind of method for processing video frequency and device |
CN107291810A (en) * | 2017-05-18 | 2017-10-24 | 深圳云天励飞技术有限公司 | Data processing method, device and storage medium |
CN107316011A (en) * | 2017-05-24 | 2017-11-03 | 杭州励飞软件技术有限公司 | Data processing method, device and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8358808B2 (en) * | 2010-01-08 | 2013-01-22 | University Of Washington | Video-based vehicle detection and tracking using spatio-temporal maps |
Non-Patent Citations (1)
Title |
---|
Research and Implementation of Information Retrieval Technology Based on Log Analysis; Chen Haoran; China Master's Theses Full-text Database, Information Science and Technology; 2009-11-15 (No. 11); I138-1492 * |
Also Published As
Publication number | Publication date |
---|---|
CN108287873A (en) | 2018-07-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106844484B (en) | Information searching method and device and mobile terminal | |
CN107273510B (en) | Photo recommendation method and related product | |
US10068130B2 (en) | Methods and devices for querying and obtaining user identification | |
CN107729815B (en) | Image processing method, image processing device, mobile terminal and computer readable storage medium | |
CN106407984B (en) | Target object identification method and device | |
CN106131627B (en) | Video processing method, apparatus and system | |
CN108022274B (en) | Image processing method, image processing device, computer equipment and computer readable storage medium | |
WO2021093375A1 (en) | Method, apparatus, and system for detecting people walking together, electronic device and storage medium | |
CN111368934A (en) | Image recognition model training method, image recognition method and related device | |
CN105956518A (en) | Face identification method, device and system | |
CN104239535A (en) | Method and system for matching pictures with characters, server and terminal | |
WO2020048392A1 (en) | Application virus detection method, apparatus, computer device, and storage medium | |
CN106921791B (en) | Multimedia file storage and viewing method and device and mobile terminal | |
WO2022227562A1 (en) | Identity recognition method and apparatus, and electronic device, storage medium and computer program product | |
CN107992822B (en) | Image processing method and apparatus, computer device, computer-readable storage medium | |
CN108287873B (en) | Data processing method and related product | |
CN105162984B (en) | Telephone number recognition methods and device | |
CN108965977B (en) | Method, device, storage medium, terminal and system for displaying live gift | |
CN106851345B (en) | Information pushing method, system and server | |
TW202044107A | Image processing method and device, electronic apparatus, and storage medium | |
CN112101216A (en) | Face recognition method, device, equipment and storage medium | |
CN107948093A (en) | Adjust the method and device that network speed is applied in terminal device | |
CN110955788A (en) | Information display method and electronic equipment | |
CN105335714A (en) | Photograph processing method, device and apparatus | |
CN108021669B (en) | Image classification method and device, electronic equipment and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||