CN114173086A - User data screening method based on data processing - Google Patents

User data screening method based on data processing

Info

Publication number
CN114173086A
CN114173086A
Authority
CN
China
Prior art keywords
user
monitoring
video
user monitoring
videos
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111245780.2A
Other languages
Chinese (zh)
Inventor
陈正跃
夏志齐
谭子奕
宋琛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202111245780.2A
Publication of CN114173086A
Current legal status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices

Abstract

The invention provides a user data screening method based on data processing, and relates to the technical field of data processing. In the invention, user monitoring videos respectively sent by a plurality of user monitoring terminal devices are obtained to yield a plurality of corresponding user monitoring videos, the plurality of user monitoring terminal devices each being used for image acquisition of a monitored environment area; video feature information is determined for each of the plurality of user monitoring videos, the video feature information characterizing the corresponding user monitoring video; and a video screening mode is determined for each user monitoring video based on its video feature information, and the corresponding user monitoring video is screened according to that mode to obtain a target monitoring video corresponding to the user monitoring video. On this basis, the method addresses the poor screening effect that prior-art approaches achieve on user monitoring videos.

Description

User data screening method based on data processing
Technical Field
The invention relates to the technical field of data processing, in particular to a user data screening method based on data processing.
Background
A monitoring system generally includes at least a user monitoring terminal device deployed at the front end (such as an image capturing device, e.g. a camera) and a user monitoring server deployed at the back end. The front-end user monitoring terminal device monitors a user to form a monitoring video and then sends the monitoring video to the back-end user monitoring server; in view of factors such as its data processing capacity, the user monitoring server generally screens the monitoring video after receiving it. However, the inventors have found through research that the prior art generally screens a single video, or performs de-duplication screening at a fixed proportion, so the screening effect on user monitoring videos is poor.
Disclosure of Invention
In view of the above, an object of the present invention is to provide a user data screening method based on data processing, so as to solve the problem in the prior art that the screening effect for the user monitoring video is not good.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
a user data screening method based on data processing is applied to a user monitoring server, the user monitoring server is in communication connection with a plurality of user monitoring terminal devices, and the user data screening method based on data processing comprises the following steps:
acquiring user monitoring videos respectively sent by the user monitoring terminal devices to obtain a plurality of user monitoring videos corresponding to the user monitoring terminal devices, wherein the user monitoring terminal devices are respectively used for carrying out image acquisition on a monitored environment area to obtain a plurality of corresponding user monitoring videos, and each user monitoring video comprises a plurality of frames of user monitoring video frames;
determining video feature information of each user monitoring video in the plurality of user monitoring videos, wherein the video feature information is used for representing features of the corresponding user monitoring video;
determining a video screening mode of each user monitoring video based on the video characteristic information of each user monitoring video, and screening the corresponding user monitoring video based on the video screening mode to obtain a target monitoring video corresponding to the user monitoring video, wherein each target monitoring video comprises at least one frame of user monitoring video.
In some preferred embodiments, in the method for screening user data based on data processing, the step of obtaining the user monitoring videos respectively sent by the user monitoring terminal devices to obtain the user monitoring videos corresponding to the user monitoring terminal devices includes:
when a monitoring starting instruction is received, generating corresponding monitoring starting notification information, and sending the monitoring starting notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for acquiring images of a monitoring environment area in which the user monitoring terminal device is located after receiving the monitoring starting notification information;
and respectively acquiring a user monitoring video acquired and sent by each user monitoring terminal device in the plurality of user monitoring terminal devices based on the monitoring starting notification information.
In some preferred embodiments, in the method for screening user data based on data processing, the step of generating corresponding monitoring start notification information when receiving a monitoring start instruction, and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices includes:
judging whether a monitoring starting instruction is received or not;
and when the monitoring starting instruction is judged to be received, generating monitoring starting notification information carrying an equipment synchronization instruction, and sending the monitoring starting notification information to each user monitoring terminal equipment in the plurality of user monitoring terminal equipments, wherein each user monitoring terminal equipment is used for sending starting confirmation information to each other user monitoring terminal equipment based on the equipment synchronization instruction carried in the monitoring starting notification information after receiving the monitoring starting notification information, and starting image acquisition on the monitoring environment area after receiving the starting confirmation information sent by each other user monitoring terminal equipment.
In some preferred embodiments, in the method for screening user data based on data processing, the step of generating corresponding monitoring start notification information when receiving a monitoring start instruction, and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices includes:
judging whether a monitoring starting instruction is received or not;
and when the monitoring starting instruction is judged to be received, generating monitoring starting notification information carrying a monitoring stopping instruction, and sending the monitoring starting notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for starting image acquisition on the monitoring environment area after receiving the monitoring starting notification information, acquiring the data volume of the acquired user monitoring video based on the monitoring stopping instruction carried in the monitoring starting notification information, and stopping image acquisition on the monitoring environment area when the data volume of the currently acquired user monitoring video is larger than or equal to a data volume threshold value.
In some preferred embodiments, in the method for screening user data based on data processing, the step of respectively obtaining a user monitoring video acquired and sent by each of the plurality of user monitoring terminal devices based on the monitoring start notification information includes:
acquiring current time information after the monitoring start notification information is sent to each user monitoring terminal device in the plurality of user monitoring terminal devices;
judging whether the current time information belongs to target time information or not, and generating corresponding monitoring stop notification information when the current time information belongs to the target time information;
and respectively sending the monitoring stop notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for stopping image acquisition in the monitoring environment area after receiving the monitoring stop notification information, and sending the acquired user monitoring video to the user monitoring server.
In some preferred embodiments, in the data processing-based user data filtering method, the step of determining video feature information of each user surveillance video in the plurality of user surveillance videos includes:
for each user monitoring video in the plurality of user monitoring videos, carrying out object identification processing on a user monitoring video frame included in the user monitoring video to obtain a target user object corresponding to the user monitoring video frame included in the user monitoring video;
and determining the object identity information of the target user object corresponding to the user monitoring video frame included in the user monitoring video as the video characteristic information of the user monitoring video aiming at each user monitoring video in the plurality of user monitoring videos.
In some preferred embodiments, in the data processing-based user data filtering method, the step of determining video feature information of each user surveillance video in the plurality of user surveillance videos includes:
determining a monitoring environment area where the user monitoring terminal equipment corresponding to each user monitoring video in the plurality of user monitoring videos is located;
and determining the area position information of the monitoring environment area where the user monitoring terminal equipment is located corresponding to each user monitoring video in the user monitoring videos as the video characteristic information of the user monitoring video.
In some preferred embodiments, in the data processing-based user data screening method, the step of determining a video screening method for each user surveillance video based on the video feature information of each user surveillance video, and performing screening processing on the corresponding user surveillance video based on the video screening method to obtain a target surveillance video corresponding to the user surveillance video includes:
for every two user monitoring videos in the plurality of user monitoring videos, determining a video feature correlation representation value between the two user monitoring videos based on the video feature information of the two user monitoring videos, wherein the video feature correlation representation value is used for representing the video feature correlation degree between the two corresponding user monitoring videos;
determining a video screening mode of each user monitoring video based on a video feature correlation representation value between every two user monitoring videos in the plurality of user monitoring videos, and screening the corresponding user monitoring video based on the video screening mode of each user monitoring video to obtain a target monitoring video corresponding to the user monitoring video.
In some preferred embodiments, in the method for screening user data based on data processing, the step of determining, for each two user surveillance videos in the plurality of user surveillance videos, a video feature correlation characterization value between the two user surveillance videos based on the video feature information of the two user surveillance videos includes:
for every two user monitoring videos in the plurality of user monitoring videos, calculating video frame similarity between every two user monitoring video frames included in the two user monitoring videos, and calculating an average value of the video frame similarity between every two user monitoring video frames included in the two user monitoring videos as a first characteristic correlation relation representation value between the two user monitoring videos;
for every two user monitoring videos in the plurality of user monitoring videos, respectively counting object identity information of the target user object identified in the two user monitoring videos, and taking object correlation between the object identity information of the target user object in the two user monitoring videos as a second characteristic correlation relation representation value between the two user monitoring videos;
for every two user monitoring videos in the plurality of user monitoring videos, calculating area position distance information between area position information of monitoring environment areas where the user monitoring terminal equipment is located corresponding to the two user monitoring videos, and determining an area position distance representation value with a negative correlation based on the area position distance information to serve as a third feature correlation representation value between the two user monitoring videos;
acquiring a first weight coefficient, a second weight coefficient and a third weight coefficient which respectively correspond to the first feature correlation characterization value, the second feature correlation characterization value and the third feature correlation characterization value, wherein the sum of the first weight coefficient, the second weight coefficient and the third weight coefficient is 1, the first weight coefficient is greater than the second weight coefficient, and the second weight coefficient is greater than the third weight coefficient;
and for every two user monitoring videos in the plurality of user monitoring videos, performing weighted summation calculation on the first feature correlation characteristic value, the second feature correlation characteristic value and the third feature correlation characteristic value between the two user monitoring videos based on the first weight coefficient, the second weight coefficient and the third weight coefficient to obtain a video feature correlation characteristic value between the two user monitoring videos.
In some preferred embodiments, in the data processing-based user data screening method, the step of determining a video screening manner of each user surveillance video based on a video feature correlation characterization value between every two user surveillance videos in the plurality of user surveillance videos, and performing screening processing on the corresponding user surveillance video based on the video screening manner of each user surveillance video to obtain a target surveillance video corresponding to the user surveillance video includes:
clustering the user monitoring videos to obtain at least one corresponding monitoring video set based on a video feature correlation representation value between every two user monitoring videos in the user monitoring videos, wherein each monitoring video set in the at least one monitoring video set comprises at least one user monitoring video;
counting the number of the user monitoring videos included in each monitoring video set to obtain the number of target videos corresponding to the monitoring video set, and determining a screening degree characterization value which has a positive correlation with the number of the target videos and corresponds to the monitoring video set based on the number of the target videos, wherein the screening degree characterization value is used for characterizing the maximum proportion or the maximum number of the screened user monitoring video frames after screening processing is performed on each user monitoring video in the corresponding monitoring video set;
and for each monitoring video set, sequentially determining each user monitoring video included in the monitoring video set as a first user monitoring video, and executing target screening operation based on the screening degree representation value corresponding to the monitoring video set to obtain a target monitoring video corresponding to each user monitoring video included in the monitoring video set.
In the user data screening method based on data processing provided by the embodiment of the invention, after the user monitoring videos respectively sent by the plurality of user monitoring terminal devices are obtained, video feature information is first determined for each of the plurality of user monitoring videos. A video screening mode is then determined for each user monitoring video based on its video feature information, and the corresponding user monitoring video is screened according to the determined mode to obtain the corresponding target monitoring video. Because the video screening mode is derived from the video feature information of the user monitoring video itself, it matches that video closely, so the reliability of the resulting screening mode can be guaranteed and the poor screening effect of the prior art on user monitoring videos is remedied.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a block diagram of a user monitoring server according to an embodiment of the present invention.
Fig. 2 is a schematic flowchart of steps included in the user data screening method based on data processing according to an embodiment of the present invention.
Fig. 3 is a system block diagram of modules included in the data processing-based user data screening system according to the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, an embodiment of the present invention provides a user monitoring server. Wherein the user monitoring server may include a memory and a processor.
In detail, the memory and the processor are electrically connected, directly or indirectly, to realize data transmission or interaction. For example, they may be electrically connected to each other via one or more communication buses or signal lines. The memory may store at least one software functional module (computer program), which may exist in the form of software or firmware. The processor may be configured to execute the executable computer program stored in the memory, so as to implement the user data screening method based on data processing provided by the embodiment of the present invention.
For example, in some preferred embodiments, the memory may be, but is not limited to, Random Access Memory (RAM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
For example, in some preferred embodiments, the processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), a System on Chip (SoC), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
For example, in some preferred embodiments, the structure shown in fig. 1 is merely illustrative, and the user monitoring server may further include more or fewer components than those shown in fig. 1, or have a different configuration than that shown in fig. 1, such as a communication unit for information interaction with other devices.
With reference to fig. 2, an embodiment of the present invention further provides a user data screening method based on data processing, which is applicable to the above user monitoring server. The method steps defined by the flow of the data processing-based user data screening method may be implemented by the user monitoring server, which may be communicatively connected to a plurality of user monitoring terminal devices.
The specific process shown in FIG. 2 will be described in detail below.
Step S110, obtaining the user monitoring videos respectively sent by the plurality of user monitoring terminal devices, and obtaining a plurality of user monitoring videos corresponding to the plurality of user monitoring terminal devices.
In the embodiment of the present invention, the user monitoring server may obtain the user monitoring videos respectively sent by the plurality of user monitoring terminal devices, and obtain a plurality of user monitoring videos corresponding to the plurality of user monitoring terminal devices. The plurality of user monitoring terminal devices are respectively used for carrying out image acquisition on the monitored environment area to obtain a plurality of corresponding user monitoring videos, and each user monitoring video comprises a plurality of frames of user monitoring video frames.
Step S120, determining video feature information of each user surveillance video in the plurality of user surveillance videos.
In an embodiment of the present invention, the user monitoring server may determine video feature information of each of the plurality of user monitoring videos. The video feature information is used for representing the features of the corresponding user monitoring video.
Step S130, determining a video screening mode of each user monitoring video based on the video characteristic information of each user monitoring video, and screening the corresponding user monitoring video based on the video screening mode to obtain a target monitoring video corresponding to the user monitoring video.
In the embodiment of the present invention, the user monitoring server may determine a video screening manner of each user monitoring video based on the video feature information of each user monitoring video, and perform screening processing on the corresponding user monitoring video based on the video screening manner to obtain a target monitoring video corresponding to the user monitoring video. Wherein each target surveillance video comprises at least one user surveillance video frame.
Based on step S110, step S120 and step S130 in the above embodiment, after the user monitoring videos respectively sent by the plurality of user monitoring terminal devices are acquired, video feature information is first determined for each of the plurality of user monitoring videos. A video screening mode is then determined for each user monitoring video based on its video feature information, and the corresponding user monitoring video is screened according to the determined mode to obtain the corresponding target monitoring video. Because the video screening mode is derived from the video feature information of the user monitoring video itself, it matches that video closely, so the reliability of the resulting screening mode can be guaranteed and the poor screening effect of the prior art on user monitoring videos is remedied.
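By way of illustration only, and not as part of the original disclosure, the following Python sketch outlines the three-step flow of steps S110 to S130. All function names, data structures and numeric values are assumptions; the placeholder feature extraction and screening mode stand in for the concrete embodiments described below.

```python
from typing import Any, Dict, List

Frame = Any          # placeholder for one decoded user monitoring video frame
Video = List[Frame]  # a user monitoring video as a list of frames


def extract_video_features(video: Video) -> Dict[str, Any]:
    # Placeholder for step S120; the disclosure uses object identity information
    # and/or the region of the capturing device as the feature information.
    return {"frame_count": len(video)}


def choose_keep_ratio(features: Dict[str, Any]) -> float:
    # Placeholder for the screening mode chosen in step S130: the share of
    # frames to keep depends on the features rather than on a fixed proportion.
    return 0.5 if features["frame_count"] > 100 else 1.0


def screen_user_videos(videos_by_device: Dict[str, Video]) -> Dict[str, Video]:
    """Skeleton of steps S110-S130."""
    targets: Dict[str, Video] = {}
    for device_id, video in videos_by_device.items():   # S110: received videos
        features = extract_video_features(video)        # S120
        keep_ratio = choose_keep_ratio(features)        # S130: determine mode
        step = max(1, round(1 / keep_ratio))
        targets[device_id] = video[::step]              # S130: apply screening
    return targets
```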
For example, in some preferred embodiments, the step S110 in the above embodiments may include the following steps to obtain a plurality of user monitoring videos corresponding to the plurality of user monitoring terminal devices:
firstly, when a monitoring starting instruction is received, generating corresponding monitoring starting notification information, and sending the monitoring starting notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for acquiring images of a monitoring environment area after receiving the monitoring starting notification information;
and secondly, respectively acquiring a user monitoring video acquired and sent by each user monitoring terminal device in the plurality of user monitoring terminal devices based on the monitoring starting notification information.
For example, in some preferred embodiments, the step of generating corresponding monitoring start notification information when receiving the monitoring start instruction, and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices may include:
firstly, judging whether a monitoring starting instruction is received or not;
secondly, when the monitoring starting instruction is judged to be received, monitoring starting notification information carrying an equipment synchronization instruction is generated and sent to each user monitoring terminal equipment in the plurality of user monitoring terminal equipment, wherein each user monitoring terminal equipment is used for sending starting confirmation information to each other user monitoring terminal equipment based on the equipment synchronization instruction carried in the monitoring starting notification information after receiving the monitoring starting notification information, and starting image acquisition on the monitored environment area after receiving the starting confirmation information sent by each other user monitoring terminal equipment.
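A toy simulation of the synchronization handshake described above may help make it concrete: each terminal confirms the start to every peer and begins image acquisition only once every other terminal has confirmed. The class, message names and three-camera setup are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set


@dataclass
class Terminal:
    device_id: str
    peers: List[str] = field(default_factory=list)
    confirmations: Set[str] = field(default_factory=set)
    capturing: bool = False

    def on_start_notification(self, network: Dict[str, "Terminal"]) -> None:
        # Device synchronization instruction: send start confirmation to peers.
        for peer_id in self.peers:
            network[peer_id].on_start_confirmation(self.device_id)

    def on_start_confirmation(self, sender_id: str) -> None:
        self.confirmations.add(sender_id)
        # Begin image acquisition only once every other terminal has confirmed.
        if self.confirmations == set(self.peers):
            self.capturing = True


# Usage: the server broadcasts the start notification to three terminals.
ids = ["cam-1", "cam-2", "cam-3"]
network = {i: Terminal(i, peers=[p for p in ids if p != i]) for i in ids}
for terminal in network.values():
    terminal.on_start_notification(network)
assert all(t.capturing for t in network.values())
```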
For another example, in another preferred embodiment, the step of generating corresponding monitoring start notification information when receiving the monitoring start instruction, and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices may include:
firstly, judging whether a monitoring starting instruction is received or not;
secondly, when the monitoring starting instruction is judged to be received, monitoring starting notification information carrying a monitoring stopping instruction is generated, and the monitoring starting notification information is sent to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for starting image acquisition on the monitored environment area after receiving the monitoring starting notification information, acquiring the data volume of the acquired user monitoring video based on the monitoring stopping instruction carried in the monitoring starting notification information, and stopping image acquisition on the monitored environment area when the data volume of the currently acquired user monitoring video is larger than or equal to a data volume threshold value (can be configured according to actual requirements).
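A minimal terminal-side sketch of the data-volume stop condition is given below; the per-frame size and the threshold value are illustrative assumptions, not values fixed by the disclosure.

```python
from typing import Iterable, List


def capture_until_threshold(frame_source: Iterable[dict], bytes_per_frame: int,
                            data_volume_threshold: int) -> List[dict]:
    """Terminal-side sketch: stop image acquisition once the data volume of the
    collected user monitoring video reaches the configured threshold."""
    video: List[dict] = []
    collected_bytes = 0
    for frame in frame_source:
        video.append(frame)
        collected_bytes += bytes_per_frame
        if collected_bytes >= data_volume_threshold:  # stop condition
            break
    return video


# Usage: dummy 1 MB frames against a 10 MB threshold.
frames = ({"frame": i} for i in range(100))
print(len(capture_until_threshold(frames, 1_000_000, 10_000_000)))  # 10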
For example, in some preferred embodiments, the step of respectively obtaining the user monitoring video acquired and sent by each of the plurality of user monitoring terminal devices based on the monitoring start notification information may include:
firstly, after the monitoring start notification information is sent to each user monitoring terminal device in the plurality of user monitoring terminal devices, current time information is obtained;
secondly, judging whether the current time information belongs to target time information or not, and generating corresponding monitoring stop notification information when the current time information belongs to the target time information;
and then, respectively sending the monitoring stop notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for stopping image acquisition in the monitoring environment area after receiving the monitoring stop notification information, and sending the acquired user monitoring video to the user monitoring server.
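A small server-side sketch of the time-based stop decision follows. Modelling the "target time information" as a daily time window is an assumption made here for illustration; the disclosure does not fix its form.

```python
from datetime import datetime, time


def should_stop_monitoring(now: datetime, window_start: time, window_end: time) -> bool:
    """Server-side sketch: once the current time falls inside the assumed
    target time window, a monitoring stop notification is generated."""
    return window_start <= now.time() <= window_end


if should_stop_monitoring(datetime.now(), time(22, 0), time(23, 59)):
    stop_notification = {"type": "monitoring_stop"}  # broadcast to all terminals
```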
For example, in some preferred embodiments, the step S120 in the above embodiments may include the following steps to determine the video feature information of each user monitoring video:
firstly, for each user monitoring video in the plurality of user monitoring videos, performing object recognition processing (for example, recognizing based on a neural network model for performing object recognition) on a user monitoring video frame included in the user monitoring video to obtain a target user object corresponding to the user monitoring video frame included in the user monitoring video;
secondly, for each user monitoring video in the plurality of user monitoring videos, determining the object identity information of the target user object corresponding to the user monitoring video frame included in the user monitoring video as the video feature information of the user monitoring video.
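A minimal sketch of this feature-extraction embodiment is shown below, assuming a per-frame recognizer function is available (here stubbed out); in a real system this would be a trained object-recognition model as mentioned above.

```python
from typing import Callable, List, Set

Frame = bytes  # placeholder for raw frame data


def video_identity_features(frames: List[Frame],
                            recognize: Callable[[Frame], Set[str]]) -> Set[str]:
    """Collect the object identity information recognized in every frame and
    use it as the video's feature information."""
    identities: Set[str] = set()
    for frame in frames:
        identities |= recognize(frame)  # e.g. a neural-network object recognizer
    return identities


# Usage with a stubbed recognizer; a real system would call a trained model.
dummy_recognizer = lambda frame: {"user-A"} if frame else set()
print(video_identity_features([b"\x01", b""], dummy_recognizer))  # {'user-A'}
```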
For another example, in other preferred embodiments, the step S120 in the above embodiments may include the following steps to determine the video feature information of each user monitoring video:
firstly, aiming at each user monitoring video in the plurality of user monitoring videos, determining a monitoring environment area where the user monitoring terminal equipment corresponding to the user monitoring video is located;
secondly, for each user monitoring video in the plurality of user monitoring videos, determining the area position information of the monitoring environment area where the user monitoring terminal equipment corresponding to the user monitoring video is located as the video characteristic information of the user monitoring video.
For example, in some preferred embodiments, the step S130 in the foregoing embodiments may include the following steps, so as to perform screening processing on the corresponding user monitoring video based on the video screening manner, to obtain a target monitoring video corresponding to the user monitoring video:
firstly, for every two user monitoring videos in the plurality of user monitoring videos, determining a video feature correlation representation value between the two user monitoring videos based on the video feature information of the two user monitoring videos, wherein the video feature correlation representation value is used for representing the video feature correlation degree between the two corresponding user monitoring videos;
secondly, determining a video screening mode of each user monitoring video based on a video feature correlation characteristic value between every two user monitoring videos in the plurality of user monitoring videos, and screening the corresponding user monitoring video based on the video screening mode of each user monitoring video to obtain a target monitoring video corresponding to the user monitoring video.
For example, in some preferred embodiments, the step of determining, for each two user surveillance videos in the plurality of user surveillance videos, a video feature correlation relationship characterization value between the two user surveillance videos based on the video feature information of the two user surveillance videos may include:
firstly, for every two user monitoring videos in the plurality of user monitoring videos, calculating video frame similarity between every two user monitoring video frames included in the two user monitoring videos, and calculating an average value of the video frame similarity between every two user monitoring video frames included in the two user monitoring videos as a first characteristic correlation relation representation value between the two user monitoring videos;
secondly, for every two user monitoring videos in the plurality of user monitoring videos, respectively counting the object identity information of the target user objects identified in the two user monitoring videos, and taking the object correlation between the object identity information of the target user objects in the two user monitoring videos as a second feature correlation relationship characterization value between the two user monitoring videos (for example, the correlation between the target user objects is determined from their object identity information: the correlation between members of a couple may be greater than that between relatives, and the correlation between relatives or between friends may be greater than that between co-workers, where the specific correlation degree values may be defined and configured in advance);
then, for every two user monitoring videos in the plurality of user monitoring videos, calculating area position distance information between area position information of monitoring environment areas where the user monitoring terminal devices are located corresponding to the two user monitoring videos, and determining an area position distance representation value with a negative correlation based on the area position distance information to serve as a third feature correlation representation value between the two user monitoring videos;
then, obtaining a first weight coefficient, a second weight coefficient and a third weight coefficient corresponding to the first feature correlation characterization value, the second feature correlation characterization value and the third feature correlation characterization value respectively, wherein the sum of the first weight coefficient, the second weight coefficient and the third weight coefficient is 1, the first weight coefficient is greater than the second weight coefficient, and the second weight coefficient is greater than the third weight coefficient;
finally, for every two user monitoring videos in the plurality of user monitoring videos, based on the first weight coefficient, the second weight coefficient and the third weight coefficient, performing weighted summation calculation on the first feature correlation characteristic value, the second feature correlation characteristic value and the third feature correlation characteristic value between the two user monitoring videos to obtain a video feature correlation characteristic value between the two user monitoring videos.
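The weighted combination of the three characterization values can be sketched as follows. The weights satisfy the stated constraints (w1 > w2 > w3, summing to 1), but their concrete values, the cosine frame similarity, the Jaccard stand-in for the object correlation and the reciprocal distance mapping are all assumptions for illustration.

```python
import math
from typing import List, Sequence, Set

# Assumed weights satisfying w1 > w2 > w3 and w1 + w2 + w3 = 1.
W1, W2, W3 = 0.5, 0.3, 0.2


def frame_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Toy per-frame similarity on fixed-length feature vectors (cosine)."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = math.sqrt(sum(x * x for x in a)), math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0


def first_value(frames_a: List[Sequence[float]], frames_b: List[Sequence[float]]) -> float:
    # Average similarity over every pair of frames drawn from the two videos.
    sims = [frame_similarity(fa, fb) for fa in frames_a for fb in frames_b]
    return sum(sims) / len(sims) if sims else 0.0


def second_value(ids_a: Set[str], ids_b: Set[str]) -> float:
    # Stand-in for the object correlation: overlap of the identity sets
    # (the disclosure allows a pre-configured relationship table instead).
    union = ids_a | ids_b
    return len(ids_a & ids_b) / len(union) if union else 0.0


def third_value(distance_m: float) -> float:
    # Any monotonically decreasing mapping of the region distance works here.
    return 1.0 / (1.0 + distance_m)


def correlation_value(frames_a, frames_b, ids_a, ids_b, distance_m) -> float:
    return (W1 * first_value(frames_a, frames_b)
            + W2 * second_value(ids_a, ids_b)
            + W3 * third_value(distance_m))


# Usage with tiny dummy frame vectors, identity sets and a 25 m region distance.
print(correlation_value([[1.0, 0.0]], [[0.9, 0.1]], {"user-A"}, {"user-A", "user-B"}, 25.0))
```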
For example, in some preferred embodiments, the step of determining a video screening manner of each user surveillance video based on a video feature correlation characterization value between every two user surveillance videos in the plurality of user surveillance videos, and performing screening processing on the corresponding user surveillance video based on the video screening manner of each user surveillance video to obtain a target surveillance video corresponding to the user surveillance video may include:
firstly, clustering processing (such as KNN algorithm) is carried out on the user monitoring videos based on a video feature correlation representation value between every two user monitoring videos in the user monitoring videos to obtain at least one corresponding monitoring video set, wherein each monitoring video set in the at least one monitoring video set comprises at least one user monitoring video;
secondly, counting the number of the user monitoring videos included in each monitoring video set to obtain the number of target videos corresponding to the monitoring video set, and determining a screening degree characterization value which has a positive correlation with the number of the target videos and corresponds to the monitoring video set based on the number of the target videos, wherein the screening degree characterization value is used for characterizing the maximum proportion or the maximum number of the screened user monitoring video frames after screening processing is performed on each user monitoring video in the corresponding monitoring video set;
then, for each monitoring video set, sequentially determining each user monitoring video included in the monitoring video set as a first user monitoring video, and executing a target screening operation based on a screening degree characterization value corresponding to the monitoring video set to obtain a target monitoring video corresponding to each user monitoring video included in the monitoring video set.
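A condensed sketch of the grouping and of the screening degree characterization value is given below. The disclosure only requires some clustering of the videos by their correlation values, so the single-linkage grouping with a fixed threshold and the linear screening-degree rule used here are assumptions.

```python
from typing import Dict, List, Tuple


def cluster_videos(video_ids: List[str],
                   correlation: Dict[Tuple[str, str], float],
                   threshold: float) -> List[List[str]]:
    """Toy single-linkage grouping: two videos land in the same monitoring
    video set when their correlation value reaches the threshold."""
    parent = {v: v for v in video_ids}

    def find(v: str) -> str:
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for (a, b), value in correlation.items():
        if value >= threshold:
            parent[find(a)] = find(b)

    clusters: Dict[str, List[str]] = {}
    for v in video_ids:
        clusters.setdefault(find(v), []).append(v)
    return list(clusters.values())


def screening_degree(cluster_size: int, max_ratio: float = 0.9) -> float:
    # Positively correlated with the number of videos in the set: the more
    # redundant videos a set holds, the larger the share of frames that may
    # be screened out of each of them.
    return min(max_ratio, 0.1 * cluster_size)


# Usage: three videos, one strongly correlated pair.
ids = ["v1", "v2", "v3"]
corr = {("v1", "v2"): 0.8, ("v1", "v3"): 0.2, ("v2", "v3"): 0.1}
for cluster in cluster_videos(ids, corr, threshold=0.6):
    print(cluster, screening_degree(len(cluster)))
```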
For example, in some preferred embodiments, the target screening operation in the above embodiments may include the following first to sixth steps:
the method comprises the steps that firstly, the video quantity of user monitoring videos included in a monitoring video set where a first user monitoring video is located is counted, and whether the video quantity is larger than a first preset value (such as 1) or not is determined;
secondly, if the number of the videos is larger than the first preset value, determining at least one user monitoring video with the largest video feature correlation relationship representation value between the user monitoring video and the first user monitoring video from the user monitoring videos included in the monitoring video set where the first user monitoring video is located, and using the user monitoring video as the associated user monitoring video of the first user monitoring video;
thirdly, if the number of videos is less than or equal to the first preset value, determining, among the other monitoring video sets, the monitoring video set whose set correlation characteristic value with the set containing the first user monitoring video is the largest as a target monitoring video set, based on the video feature correlation characterization values between the user monitoring videos included in every two monitoring video sets (where the set correlation characteristic value of two monitoring video sets is the average of the video feature correlation characterization values between the user monitoring videos they include); then determining, from the user monitoring videos included in the target monitoring video set, at least one user monitoring video whose video feature correlation characterization value with the first user monitoring video is the largest, as an associated user monitoring video of the first user monitoring video;
fourthly, in at least one associated user monitoring video corresponding to the first user monitoring video, determining at least one associated user monitoring video with the smallest area position distance between corresponding user monitoring terminal equipment as at least one target associated user monitoring video, and determining the association degree of each target associated user monitoring video and the first user monitoring video in the data volume dimension (the smaller the data volume difference value is, the larger the corresponding data volume association degree is), so as to obtain at least one data volume association degree;
fifthly, determining a target data volume relevance degree in the at least one data volume relevance degree, and screening out at least one representative data volume relevance degree from the at least one data volume relevance degree based on the target data volume relevance degree, wherein the target data volume relevance degree is an average value of the at least one data volume relevance degree, and each representative data volume relevance degree is greater than or equal to the target data volume relevance degree;
and sixthly, updating the screening degree characterization value corresponding to the monitoring video set where the first user monitoring video is located based on the number of representative data volume relevance degrees (the larger that number, the larger the updated screening degree characterization value; the smaller that number, the smaller the updated screening degree characterization value), and screening the first user monitoring video based on the updated screening degree characterization value (for example, screening out the user monitoring video frames with the greatest mutual similarity, where the maximum proportion or maximum number that may be screened out is determined by the updated screening degree characterization value), so as to obtain the target monitoring video corresponding to the first user monitoring video.
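The fourth to sixth steps can be sketched in condensed form as follows. The exact update rule for the screening degree and the way "most similar" frames are chosen are not fixed by the text, so a simple proportional update and an every-second-frame stand-in are assumed here; the first to third steps (selecting the associated videos) are taken as already done.

```python
from typing import List


def data_volume_association(size_a: int, size_b: int) -> float:
    # Smaller data-volume difference -> larger association degree.
    return 1.0 / (1.0 + abs(size_a - size_b))


def target_screening(first_video: List[dict],
                     associated_sizes: List[int],
                     base_degree: float) -> List[dict]:
    """Condensed sketch of the fourth to sixth steps of the target screening
    operation, under the assumptions stated in the lead-in."""
    first_size = len(first_video)

    # Fourth/fifth steps: data-volume association degrees of the target
    # associated videos, keeping the 'representative' ones >= the average.
    degrees = [data_volume_association(first_size, s) for s in associated_sizes]
    mean_degree = sum(degrees) / len(degrees)
    representative = [d for d in degrees if d >= mean_degree]

    # Sixth step: more representative degrees -> larger updated screening
    # degree (assumed here: +5 % per representative degree, capped at 90 %).
    updated_degree = min(0.9, base_degree + 0.05 * len(representative))

    # Screen out up to the allowed share of frames; dropping every second
    # frame stands in for "the most mutually similar frames".
    max_removals = int(updated_degree * first_size)
    kept, removed = [], 0
    for i, frame in enumerate(first_video):
        if i % 2 == 1 and removed < max_removals:
            removed += 1
            continue
        kept.append(frame)
    return kept


# Usage: a 20-frame first video with associated videos of 18, 40 and 21 frames.
video = [{"frame": i} for i in range(20)]
print(len(target_screening(video, associated_sizes=[18, 40, 21], base_degree=0.2)))  # 14
```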
With reference to fig. 3, an embodiment of the present invention further provides a user data screening system based on data processing, which is applicable to the user monitoring server. The user data screening system based on data processing may include the following modules:
the user monitoring video acquisition module is used for acquiring user monitoring videos sent by the user monitoring terminal devices respectively to obtain a plurality of user monitoring videos corresponding to the user monitoring terminal devices, wherein the user monitoring terminal devices are used for acquiring images of monitored environment areas respectively to obtain a plurality of corresponding user monitoring videos, and each user monitoring video comprises a plurality of frames of user monitoring video frames;
the video characteristic information determining module is used for determining video characteristic information of each user monitoring video in the user monitoring videos, wherein the video characteristic information is used for representing the characteristics of the corresponding user monitoring video;
and the user monitoring video screening module is used for determining a video screening mode of each user monitoring video based on the video characteristic information of each user monitoring video, screening the corresponding user monitoring video based on the video screening mode, and obtaining a target monitoring video corresponding to the user monitoring video, wherein each target monitoring video comprises at least one user monitoring video frame.
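A minimal class skeleton for the three modules listed above is sketched below; the method names, the dummy terminal type and the trivial feature and screening logic are assumptions made only to show how the modules fit together.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class DummyTerminal:
    """Stand-in for a user monitoring terminal device (illustration only)."""
    device_id: str
    frames: List[bytes]

    def fetch_video(self) -> List[bytes]:
        return self.frames


class UserDataScreeningSystem:
    """Skeleton of the three modules of Fig. 3; method names are assumptions."""

    def acquire_user_videos(self, terminals: List[DummyTerminal]) -> Dict[str, List[bytes]]:
        # User monitoring video acquisition module.
        return {t.device_id: t.fetch_video() for t in terminals}

    def determine_video_features(self, videos: Dict[str, List[bytes]]) -> Dict[str, dict]:
        # Video feature information determining module.
        return {vid: {"frame_count": len(frames)} for vid, frames in videos.items()}

    def screen_user_videos(self, videos: Dict[str, List[bytes]]) -> Dict[str, List[bytes]]:
        # User monitoring video screening module: every target monitoring video
        # keeps at least one user monitoring video frame.
        return {vid: frames[: max(1, len(frames) // 2)] for vid, frames in videos.items()}
```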
In summary, in the user data screening method based on data processing provided by the present invention, after the user monitoring videos respectively sent by the plurality of user monitoring terminal devices are obtained, video feature information is first determined for each of the plurality of user monitoring videos. A video screening mode is then determined for each user monitoring video based on its video feature information, and the corresponding user monitoring video is screened according to the determined mode to obtain the corresponding target monitoring video. Because the video screening mode is derived from the video feature information of the user monitoring video itself, it matches that video closely, so the reliability of the resulting screening mode can be guaranteed and the poor screening effect of the prior art on user monitoring videos is remedied.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A user data screening method based on data processing is characterized in that the method is applied to a user monitoring server, the user monitoring server is in communication connection with a plurality of user monitoring terminal devices, and the user data screening method based on data processing comprises the following steps:
acquiring user monitoring videos respectively sent by the user monitoring terminal devices to obtain a plurality of user monitoring videos corresponding to the user monitoring terminal devices, wherein the user monitoring terminal devices are respectively used for carrying out image acquisition on a monitored environment area to obtain a plurality of corresponding user monitoring videos, and each user monitoring video comprises a plurality of frames of user monitoring video frames;
determining video feature information of each user monitoring video in the plurality of user monitoring videos, wherein the video feature information is used for representing features of the corresponding user monitoring video;
determining a video screening mode of each user monitoring video based on the video characteristic information of each user monitoring video, and screening the corresponding user monitoring video based on the video screening mode to obtain a target monitoring video corresponding to the user monitoring video, wherein each target monitoring video comprises at least one user monitoring video frame.
2. The method for screening user data based on data processing according to claim 1, wherein the step of obtaining the user monitoring videos respectively sent by the plurality of user monitoring terminal devices to obtain the plurality of user monitoring videos corresponding to the plurality of user monitoring terminal devices comprises:
when a monitoring starting instruction is received, generating corresponding monitoring starting notification information, and sending the monitoring starting notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for acquiring images of a monitoring environment area in which the user monitoring terminal device is located after receiving the monitoring starting notification information;
and respectively acquiring a user monitoring video acquired and sent by each user monitoring terminal device in the plurality of user monitoring terminal devices based on the monitoring starting notification information.
3. The method for screening user data based on data processing according to claim 2, wherein the step of generating corresponding monitoring start notification information and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices when receiving the monitoring start instruction comprises:
judging whether a monitoring starting instruction is received or not;
and when the monitoring starting instruction is judged to be received, generating monitoring starting notification information carrying an equipment synchronization instruction, and sending the monitoring starting notification information to each user monitoring terminal equipment in the plurality of user monitoring terminal equipments, wherein each user monitoring terminal equipment is used for sending starting confirmation information to each other user monitoring terminal equipment based on the equipment synchronization instruction carried in the monitoring starting notification information after receiving the monitoring starting notification information, and starting image acquisition on the monitoring environment area after receiving the starting confirmation information sent by each other user monitoring terminal equipment.
4. The method for screening user data based on data processing according to claim 2, wherein the step of generating corresponding monitoring start notification information and sending the monitoring start notification information to each of the plurality of user monitoring terminal devices when receiving the monitoring start instruction comprises:
judging whether a monitoring starting instruction is received or not;
and when the monitoring starting instruction is judged to be received, generating monitoring starting notification information carrying a monitoring stopping instruction, and sending the monitoring starting notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for starting image acquisition on the monitoring environment area after receiving the monitoring starting notification information, acquiring the data volume of the acquired user monitoring video based on the monitoring stopping instruction carried in the monitoring starting notification information, and stopping image acquisition on the monitoring environment area when the data volume of the currently acquired user monitoring video is larger than or equal to a data volume threshold value.
5. The method for screening user data based on data processing according to claim 2, wherein the step of respectively obtaining the user monitoring video acquired and transmitted by each of the plurality of user monitoring terminal devices based on the monitoring start notification information comprises:
acquiring current time information after the monitoring start notification information is sent to each user monitoring terminal device in the plurality of user monitoring terminal devices;
judging whether the current time information belongs to target time information or not, and generating corresponding monitoring stop notification information when the current time information belongs to the target time information;
and respectively sending the monitoring stop notification information to each user monitoring terminal device in the plurality of user monitoring terminal devices, wherein each user monitoring terminal device is used for stopping image acquisition in the monitoring environment area after receiving the monitoring stop notification information, and sending the acquired user monitoring video to the user monitoring server.
6. The data processing-based user data screening method of claim 1, wherein the step of determining video feature information of each of the plurality of user surveillance videos comprises:
for each user monitoring video in the plurality of user monitoring videos, carrying out object identification processing on a user monitoring video frame included in the user monitoring video to obtain a target user object corresponding to the user monitoring video frame included in the user monitoring video;
and determining the object identity information of the target user object corresponding to the user monitoring video frame included in the user monitoring video as the video characteristic information of the user monitoring video aiming at each user monitoring video in the plurality of user monitoring videos.
7. The data processing-based user data screening method of claim 1, wherein the step of determining video feature information of each of the plurality of user surveillance videos comprises:
determining a monitoring environment area where the user monitoring terminal equipment corresponding to each user monitoring video in the plurality of user monitoring videos is located;
and determining the area position information of the monitoring environment area where the user monitoring terminal equipment is located corresponding to each user monitoring video in the user monitoring videos as the video characteristic information of the user monitoring video.
8. The method for screening user data based on data processing according to any one of claims 1 to 7, wherein the step of determining a video screening method for each user surveillance video based on the video feature information of each user surveillance video, and performing screening processing on the corresponding user surveillance video based on the video screening method to obtain a target surveillance video corresponding to the user surveillance video comprises:
for every two user monitoring videos in the plurality of user monitoring videos, determining a video feature correlation representation value between the two user monitoring videos based on the video feature information of the two user monitoring videos, wherein the video feature correlation representation value is used for representing the video feature correlation degree between the two corresponding user monitoring videos;
determining a video screening mode of each user monitoring video based on a video feature correlation representation value between every two user monitoring videos in the plurality of user monitoring videos, and screening the corresponding user monitoring video based on the video screening mode of each user monitoring video to obtain a target monitoring video corresponding to the user monitoring video.
9. The method as claimed in claim 8, wherein the step of determining, for each two user surveillance videos in the plurality of user surveillance videos, a video feature correlation relationship characterization value between the two user surveillance videos based on the video feature information of the two user surveillance videos comprises:
for every two user monitoring videos in the plurality of user monitoring videos, calculating video frame similarity between every two user monitoring video frames included in the two user monitoring videos, and calculating an average value of the video frame similarity between every two user monitoring video frames included in the two user monitoring videos as a first characteristic correlation relation representation value between the two user monitoring videos;
for every two user monitoring videos in the plurality of user monitoring videos, respectively counting object identity information of the target user object identified in the two user monitoring videos, and taking object correlation between the object identity information of the target user object in the two user monitoring videos as a second characteristic correlation relation representation value between the two user monitoring videos;
for every two user monitoring videos in the plurality of user monitoring videos, calculating area position distance information between area position information of monitoring environment areas where the user monitoring terminal equipment is located corresponding to the two user monitoring videos, and determining an area position distance representation value with a negative correlation based on the area position distance information to serve as a third feature correlation representation value between the two user monitoring videos;
acquiring a first weight coefficient, a second weight coefficient and a third weight coefficient which respectively correspond to the first feature correlation characterization value, the second feature correlation characterization value and the third feature correlation characterization value, wherein the sum of the first weight coefficient, the second weight coefficient and the third weight coefficient is 1, the first weight coefficient is greater than the second weight coefficient, and the second weight coefficient is greater than the third weight coefficient;
and for every two user monitoring videos in the plurality of user monitoring videos, performing weighted summation calculation on the first feature correlation characteristic value, the second feature correlation characteristic value and the third feature correlation characteristic value between the two user monitoring videos based on the first weight coefficient, the second weight coefficient and the third weight coefficient to obtain a video feature correlation characteristic value between the two user monitoring videos.
10. The method according to claim 8, wherein the step of determining a video screening manner of each user surveillance video based on the video feature correlation characterization value between every two user surveillance videos in the plurality of user surveillance videos, and screening the corresponding user surveillance video based on the video screening manner of each user surveillance video to obtain a target surveillance video corresponding to the user surveillance video comprises:
clustering the user monitoring videos to obtain at least one corresponding monitoring video set based on a video feature correlation representation value between every two user monitoring videos in the user monitoring videos, wherein each monitoring video set in the at least one monitoring video set comprises at least one user monitoring video;
counting the number of the user monitoring videos included in each monitoring video set to obtain the number of target videos corresponding to the monitoring video set, and determining a screening degree characterization value which has a positive correlation with the number of the target videos and corresponds to the monitoring video set based on the number of the target videos, wherein the screening degree characterization value is used for characterizing the maximum proportion or the maximum number of the screened user monitoring video frames after screening processing is performed on each user monitoring video in the corresponding monitoring video set;
and for each monitoring video set, sequentially determining each user monitoring video included in the monitoring video set as a first user monitoring video, and executing target screening operation based on the screening degree representation value corresponding to the monitoring video set to obtain a target monitoring video corresponding to each user monitoring video included in the monitoring video set.
CN202111245780.2A 2021-10-26 2021-10-26 User data screening method based on data processing Withdrawn CN114173086A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111245780.2A CN114173086A (en) 2021-10-26 2021-10-26 User data screening method based on data processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111245780.2A CN114173086A (en) 2021-10-26 2021-10-26 User data screening method based on data processing

Publications (1)

Publication Number Publication Date
CN114173086A 2022-03-11

Family

ID=80477344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111245780.2A Withdrawn CN114173086A (en) 2021-10-26 2021-10-26 User data screening method based on data processing

Country Status (1)

Country Link
CN (1) CN114173086A (en)


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 2022-03-11)