CN113869372A - Image clustering method, electronic device and storage medium - Google Patents

Image clustering method, electronic device and storage medium

Info

Publication number
CN113869372A
CN113869372A
Authority
CN
China
Prior art keywords
bayonet
bayonets
target
target image
clustering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111034186.9A
Other languages
Chinese (zh)
Inventor
王凯垚
周明伟
陈立力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202111034186.9A priority Critical patent/CN113869372A/en
Publication of CN113869372A publication Critical patent/CN113869372A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/231Hierarchical techniques, i.e. dividing or merging pattern sets so as to obtain a dendrogram

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application discloses an image clustering method, an electronic device and a computer-readable storage medium. The method comprises the following steps: acquiring a target image set, wherein the target image set comprises a plurality of target images shot by a plurality of bayonets in a preset area within a preset time; dividing the plurality of bayonets into a first preset number of groups based on the degree of association between different bayonets, and dividing the preset time into a plurality of time domains, wherein each group of bayonets constitutes a spatial domain, and one time domain and one spatial domain form a time-space domain; dividing the target image set according to the time-space domains to obtain a plurality of target image subsets; and clustering each target image subset to obtain a clustering result. By this method, the recall rate of the clustering result can be improved without affecting the accuracy of the clustering result.

Description

Image clustering method, electronic device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image clustering method, an electronic device, and a computer-readable storage medium.
Background
Through image clustering technology, images shot by bayonets (surveillance checkpoints) within a preset time and a preset region can be clustered to form archives of different objects (which may be people or other living bodies), helping police departments better manage and control those objects.
The existing image clustering method is based on deep learning technology: it extracts the features of the objects in the images and clusters the images based on those features. Due to differences in shooting conditions, the similarity between images of the same object is sometimes slightly lower than the clustering threshold, so that images are omitted from archives and the recall rate of the clustering result is low. However, if the clustering threshold is lowered, the accuracy of the clustering result is affected. Therefore, how to improve the recall rate of the clustering result without affecting its accuracy is an urgent problem to be solved.
Disclosure of Invention
The application provides an image clustering method, an electronic device and a computer readable storage medium, which can improve the recall rate of clustering results without influencing the accuracy rate of the clustering results.
In order to solve the above technical problem, the application adopts a technical scheme: an image clustering method is provided. The method comprises the following steps: acquiring a target image set, wherein the target image set comprises a plurality of target images shot by a plurality of bayonets in a preset area within a preset time; dividing the plurality of bayonets into a first preset number of groups based on the degree of association between different bayonets, and dividing the preset time into a plurality of time domains, wherein each group of bayonets constitutes a spatial domain, and one time domain and one spatial domain form a time-space domain; dividing the target image set according to the time-space domains to obtain a plurality of target image subsets; and clustering each target image subset to obtain a clustering result.
In order to solve the above technical problem, another technical solution adopted by the present application is: an electronic device is provided, which comprises a processor and a memory connected with the processor, wherein the memory stores program instructions; the processor is configured to execute the program instructions stored by the memory to implement the above-described method.
In order to solve the above technical problem, the present application adopts another technical solution: there is provided a computer readable storage medium storing program instructions that when executed are capable of implementing the above method.
In this manner, clustering is performed in units of time-space domains (target image subsets). Compared with clustering the whole target image set directly, the probability of similar objects appearing within a target image subset is reduced, so the accuracy of the clustering result can be improved. Secondly, because the present application divides the plurality of bayonets into spatial domains based on the degree of association between different bayonets, the bayonets corresponding to each time-space domain formed by a spatial domain and a time domain have a higher degree of association, further improving the accuracy of the clustering result. On this basis, the clustering threshold can be lowered, so that the recall rate of the clustering result can be improved without affecting its accuracy.
Drawings
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of an image clustering method provided in the present application;
FIG. 2 is a schematic illustration of the time-space domain of the present application;
FIG. 3 is a schematic flow chart diagram illustrating another embodiment of an image clustering method provided in the present application;
FIG. 4 is a schematic view of the detailed process of S22 in FIG. 3;
FIG. 5 is a schematic flow chart diagram illustrating a further embodiment of an image clustering method provided by the present application;
FIG. 6 is a schematic view of a bayonet view of the present application;
FIG. 7 is a schematic flow chart diagram illustrating a further embodiment of an image clustering method provided by the present application;
FIG. 8 is a schematic view of the detailed process of S42 in FIG. 7;
FIG. 9 is a schematic structural diagram of an embodiment of an electronic device of the present application;
FIG. 10 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any indication of the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Fig. 1 is a schematic flowchart of an embodiment of an image clustering method provided in the present application. It should be noted that, if the result is substantially the same, the flow sequence shown in fig. 1 is not limited in this embodiment. As shown in fig. 1, the present embodiment may include:
S11: a target image set is acquired.
The target image set comprises a plurality of target images shot by a plurality of bayonets in a preset area within preset time.
Each target image carries corresponding identification information, which identifies its shooting time and shooting bayonet. A target image may be understood as an image of an object captured as it passes the shooting bayonet at the shooting time. The shooting times of the target images in the target image set fall within the preset time, and the shooting bayonets belong to the preset area. The preset time may be in units of hours, days, months, etc. For example, the preset time is one day.
S12: based on the degree of association between different bayonets, the plurality of bayonets are divided into a first preset number of groups, and the preset time is divided into a plurality of time domains.
Each group of bayonets constitutes a spatial domain, and a time domain and a spatial domain form a time-space domain. Thus, a first preset number of time-space domains can be obtained. Fig. 2 is a schematic diagram of the time-space domains. As shown in fig. 2, the horizontal axis represents the preset time (T) and the vertical axis represents the preset region (R). The preset time is divided into 6 time domains, T1 through T6, and the preset region is divided into 5 spatial domains, R1 through R5. Thus, the time domains and spatial domains form a total of 6 × 5 = 30 time-space domains, each of which can be denoted as (Tx, Ry).
A time domain threshold may be set, and the preset time is divided into a plurality of time domains according to the time domain threshold. For example, the preset time is one day, the time domain threshold is 2 hours, and the preset time is divided into 12 time domains, each of which has a length of 2 hours.
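The time-domain division described above can be sketched as follows. This is a minimal illustration under stated assumptions: the patent only specifies division by a time-domain threshold; the helper name, signature and the example date are not from the source.

```python
from datetime import datetime, timedelta

def divide_time_domains(start, threshold_hours, total_hours):
    """Split a preset time span into consecutive fixed-length time domains.

    Hypothetical helper: the patent only states that the preset time is
    divided according to a time-domain threshold (e.g. 2 hours).
    """
    step = timedelta(hours=threshold_hours)
    return [(start + i * step, start + (i + 1) * step)
            for i in range(total_hours // threshold_hours)]

# One day divided with a 2-hour threshold yields 12 time domains.
domains = divide_time_domains(datetime(2021, 9, 1), 2, 24)
```

With a 2-hour threshold over one day, this matches the example in the text: twelve 2-hour time domains.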
Dividing the bayonets based on the degree of association between different bayonets allows bayonets with a high degree of association to be placed in the same group, so that the bayonets corresponding to the same spatial domain/time-space domain have a high degree of association. The degree of association between two bayonets reflects the likelihood that the same object passes through the two bayonets in succession, so performing the subsequent clustering in units of time-space domains can improve the accuracy of the clustering result.
S13: and dividing the target image set according to a time-space domain to obtain a plurality of target image subsets.
One time-space domain corresponds to one target image subset.
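The partition in S13 amounts to bucketing each target image by its identification information (shooting time and shooting bayonet). A minimal sketch; all names and the data layout are illustrative assumptions, not prescribed by the source:

```python
def partition_images(images, time_domain_of, space_domain_of):
    """Group target images into subsets keyed by (time domain, spatial domain).

    `images` is a list of (shot_time, bayonet_id) identification pairs;
    `time_domain_of` and `space_domain_of` map them to domain labels.
    """
    subsets = {}
    for shot_time, bayonet in images:
        key = (time_domain_of(shot_time), space_domain_of(bayonet))
        subsets.setdefault(key, []).append((shot_time, bayonet))
    return subsets

# Hours bucketed into 2-hour time domains; bayonets grouped by a lookup table.
space_of = {"C1": "R1", "C2": "R1", "C3": "R2"}
subsets = partition_images(
    [(1, "C1"), (2, "C2"), (1, "C3")],
    time_domain_of=lambda h: h // 2,
    space_domain_of=space_of.get,
)
```

Each resulting key is one time-space domain (Tx, Ry), and its value is the corresponding target image subset.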
S14: and clustering each target image subset to obtain a clustering result.
It will be appreciated that similar objects may interfere with the clustering. Compared with the preset time and the preset area (the target image set), the probability that similar objects appear in the same time-space domain (the target image subset) is low, so that the clustering is performed by taking the time-space domain as a unit, and the clustering accuracy can be improved.
In this step, each target image subset may be clustered to obtain a plurality of corresponding second archives. An archive (first archive/second archive) in the present application may consist of the corresponding target images, or of the features of the objects in those target images.
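The patent does not fix the clustering algorithm for a subset. One common choice consistent with the background (comparing feature similarity against a clustering threshold) is greedy threshold clustering, sketched here under assumed names:

```python
def cluster_subset(items, threshold, sim):
    """Greedy threshold clustering of one target image subset.

    `items` is a list of (image_id, feature) pairs; `sim` is a similarity
    function (e.g. cosine similarity). Each image joins the first archive
    whose representative it resembles above the clustering threshold,
    otherwise it starts a new archive. Illustrative only: the patent does
    not prescribe this particular algorithm.
    """
    archives = []  # list of (representative_feature, [image_ids])
    for image_id, feature in items:
        for rep, members in archives:
            if sim(rep, feature) >= threshold:
                members.append(image_id)
                break
        else:
            archives.append((feature, [image_id]))
    return [members for _, members in archives]

# 1-D toy features with similarity = 1 - distance.
result = cluster_subset(
    [("A", 0.0), ("B", 0.05), ("C", 0.9)],
    threshold=0.9,
    sim=lambda a, b: 1 - abs(a - b),
)
```

In the toy run, A and B fall within the threshold of the first archive's representative, while C starts a new archive.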
As one implementation, the second archives may be used directly as the clustering result. As another implementation, considering that an object may pass through different time domains within the same spatial domain, or through different spatial domains within the same time domain, the second archives may be merged in order to improve the recall rate of the clustering result, with the merging result taken as the clustering result.
Through this embodiment, clustering is performed in units of time-space domains (target image subsets). Compared with clustering the whole target image set directly, the probability of similar objects appearing within a target image subset is reduced, so the accuracy of the clustering result can be improved. Secondly, because the plurality of bayonets are divided into spatial domains based on the degree of association between different bayonets, the bayonets corresponding to each time-space domain formed by a spatial domain and a time domain have a higher degree of association, further improving the accuracy of the clustering result. On this basis, the clustering threshold can be lowered, so that the recall rate of the clustering result can be improved without affecting its accuracy.
Before S12, the association degree between different bayonets needs to be determined. Specifically, the following may be mentioned:
fig. 3 is a schematic flowchart of another embodiment of an image clustering method provided in the present application. It should be noted that, if the result is substantially the same, the flow sequence shown in fig. 3 is not limited in this embodiment. As shown in fig. 3, the present embodiment may include:
S21: the target image set is clustered to obtain a plurality of first archives.
Each first file corresponds to an object.
S22: and determining the association degree between different bayonets based on the bayonet tracks of the first files.
The target images in the first file are arranged according to the sequence of the shooting time, the shooting time corresponding to each target image in the first file after arrangement can form a time sequence (time track), and the shooting bayonets corresponding to each target image in the first file after arrangement can form a bayonet sequence (bayonet track).
For example, a first archive comprises five target images {F1, F2, F3, F4, F5}: F1 is shot at 1:30 by bayonet Cj; F2 at 1:15 by Cj; F3 at 1:00 by Cj; F4 at 1:20 by Ck; F5 at 1:45 by Cl. After arrangement, the first archive is F = {F3, F2, F4, F1, F5}, the time trajectory is T = {1:00, 1:15, 1:20, 1:30, 1:45}, and the bayonet trajectory is C = {Cj, Cj, Ck, Cj, Cl}.
As an implementation, the association degree between different bayonets may be determined directly based on the original bayonet trajectory of the first archive.
It can be understood that identical adjacent bayonets in a bayonet trajectory reduce computational efficiency and may cause invalid calculations. Therefore, as another implementation, the original bayonet trajectory of the first archive may first be deduplicated, and the degree of association between different bayonets then determined based on the deduplicated trajectory. Deduplication removes adjacent repeated bayonets from the trajectory, processed in shooting-time order: if two adjacent bayonets are identical, the one with the earlier shooting time is removed. For example, for C = {Cj, Cj, Ck, Cj, Cl}, deduplication yields C' = {Cj, Ck, Cj, Cl}.
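The deduplication step can be sketched as follows (removing the earlier of two identical adjacent bayonets leaves the same result as keeping one occurrence per run):

```python
def dedup_trajectory(track):
    """Collapse runs of identical adjacent bayonets in a time-ordered
    bayonet trajectory to a single occurrence."""
    deduped = []
    for bayonet in track:
        if not deduped or deduped[-1] != bayonet:
            deduped.append(bayonet)
    return deduped

deduped = dedup_trajectory(["Cj", "Cj", "Ck", "Cj", "Cl"])
# matches the example in the text: {Cj, Ck, Cj, Cl}
```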
Referring to fig. 4 in combination, S22 may include the following sub-steps:
S221: the number of times each bayonet pair is passed is determined. A bayonet pair consists of two adjacent bayonets in a bayonet trajectory.
Bayonet pairs consisting of the same two bayonets are the same bayonet pair. For example, the bayonet trajectory C = {Ca, Cb, Ca, Cc} corresponds to the bayonet pairs {Ca, Cb}, {Cb, Ca} and {Ca, Cc}, where {Ca, Cb} and {Cb, Ca} are the same bayonet pair.
S222: and obtaining the association degree between different bayonets based on the number of times each bayonet pair is passed.
The number of times a bayonet pair is passed is the number of occurrences of that bayonet pair in the bayonet trajectories of all first archives.
Each bayonet can be used as a target bayonet, the target bayonet and each other bayonet form a target bayonet pair, and the ratio of the number of times of passing the target bayonet pair to the total number of times of passing the target bayonet is used as the association degree of the target bayonet pair. Wherein, the total number of times of passing of the target bayonet is the sum of the number of times of passing of all bayonet pairs including the target bayonet.
The degree of association of a target bayonet pair is the degree of association between the two bayonets it contains. A bayonet pair containing the target bayonet is any bayonet pair in which the target bayonet participates.
For example, counting the bayonet trajectories of the first archives yields the following numbers of passes: pair {C1, C2} is passed x1 times, {C1, C3} x2 times, {C1, C4} x3 times, {C2, C3} x4 times, {C2, C4} x5 times, and {C3, C4} x6 times; recorded as {{C1,C2}: x1, {C1,C3}: x2, {C1,C4}: x3, {C2,C3}: x4, {C2,C4}: x5, {C3,C4}: x6}.
Taking C1 as the target bayonet, the bayonet pairs containing it are {C1, C2}, {C1, C3} and {C1, C4}, and its total number of passes is x1 + x2 + x3. Taking C2 as the target bayonet, the bayonet pairs containing it are {C1, C2}, {C2, C3} and {C2, C4}, and its total number of passes is x1 + x4 + x5; and so on.
Thereby, the degree of association between different bayonets can be obtained:
P(C2/C1) = x1/(x1 + x2 + x3), P(C3/C1) = x2/(x1 + x2 + x3), P(C4/C1) = x3/(x1 + x2 + x3)
P(C1/C2) = x1/(x1 + x4 + x5), P(C3/C2) = x4/(x1 + x4 + x5), P(C4/C2) = x5/(x1 + x4 + x5)
P(C1/C3) = x2/(x2 + x4 + x6), P(C2/C3) = x4/(x2 + x4 + x6), P(C4/C3) = x6/(x2 + x4 + x6)
P(C1/C4) = x3/(x3 + x5 + x6), P(C2/C4) = x5/(x3 + x5 + x6), P(C3/C4) = x6/(x3 + x5 + x6)
where P(Cm/Cn) denotes the degree of association between bayonet Cm and bayonet Cn, taking Cn as the target bayonet.
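S221 and S222 can be sketched together: count unordered adjacent pairs across all first-archive trajectories, then normalize per target bayonet. Function names are illustrative assumptions:

```python
from collections import Counter

def pair_counts(trajectories):
    """Count how many times each unordered bayonet pair is passed;
    adjacent bayonets in each (deduplicated) trajectory form a pair."""
    counts = Counter()
    for track in trajectories:
        for a, b in zip(track, track[1:]):
            counts[frozenset((a, b))] += 1
    return counts

def association(counts, target, other):
    """Degree of association P(other/target): passes of {target, other}
    over the total passes of all pairs containing the target."""
    total = sum(n for pair, n in counts.items() if target in pair)
    return counts.get(frozenset((target, other)), 0) / total if total else 0.0

counts = pair_counts([["C1", "C2", "C1", "C3"]])
# pairs: {C1,C2} twice, {C1,C3} once, so P(C2/C1) = 2/3
```

Using `frozenset` keys makes {Ca, Cb} and {Cb, Ca} the same pair, as the text requires.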
Fig. 5 is a schematic flowchart of another embodiment of an image clustering method provided in the present application. It should be noted that, if the result is substantially the same, the flow sequence shown in fig. 5 is not limited in this embodiment. The present embodiment is a further extension of S12. As shown in fig. 5, the present embodiment may include:
S31: bayonets whose pairwise degrees of association meet a preset condition are connected in sequence to form a bayonet diagram.
Under a threshold-type preset condition, two bayonets whose degree of association is greater than an association-degree threshold are connected.
Alternatively, for each bayonet, the bayonet pairs formed by that bayonet and the other bayonets can be arranged in descending order of degree of association. Under a ranking-type preset condition, the two bayonets contained in each of the top-ranked specified number of bayonet pairs are connected in sequence. For example, the specified number is 2. For bayonet C1, when x1 > x2 > x3, arranging the bayonet pairs formed with C1 in descending order of degree of association gives {C1, C2}, {C1, C3}, {C1, C4}; so C1 is connected to C2, and C1 is connected to C3. The other bayonets are processed similarly and are not described again here. The specified number may be determined based on the desired number of spatial domains (the first preset number), to which it is inversely related: the larger the specified number, the more connections the bayonet diagram contains, and the more difficult it subsequently is to cut the diagram by removing connections.
S32: the connections of a second preset number of bayonet pairs are removed from the bayonet diagram, such that the product of the numbers of bayonets corresponding to the first preset number of bayonet subgraphs obtained after removal is maximized.
The number of bayonets corresponding to each bayonet subgraph is less than a number threshold, and each bayonet subgraph corresponds to one group of bayonets.
Removing the connection of a bayonet pair means splitting the connecting line segment between the two bayonets it contains. Splitting/removal can divide the bayonet diagram into several independent parts (bayonet subgraphs). The removal process can be understood as removing the connections of bayonet pairs with a low degree of association from the bayonet diagram, so that the degree of association between bayonets of different subgraphs is low while that between bayonets of the same subgraph is high.
The second preset number may be manually specified or calculated by the system, and can be regarded as the optimal number of cuts of the bayonet diagram: the minimum number of cuts that divides the bayonet diagram into the first preset number of bayonet subgraphs, with the number of bayonets in each subgraph below the number threshold. That is, if cutting x line segments achieves this division while cutting x − 1 line segments cannot, then x is the optimal number of cuts.
The maximized product of the numbers of bayonets corresponding to the bayonet subgraphs can be recorded as:
S = max ∏_{i=1..X} s_i
where S denotes the product, X denotes the number of bayonet subgraphs, and s_i denotes the number of bayonets corresponding to the i-th bayonet subgraph. The product of the numbers of bayonets corresponding to the subgraphs is positively correlated with the degree of association between the bayonets within each subgraph. Therefore, when the product is maximized under the optimal number of cuts, the degree of association between the bayonets corresponding to each resulting subgraph is highest. In the present application, the line segments cut when this product is maximal are referred to as the optimal segmentation line segments.
This step is illustrated in connection with fig. 6. Fig. 6 is a schematic view of a bayonet diagram composed of bayonets A to H and the connecting line segments between them. The optimal number of cuts is 1, i.e. the bayonet diagram only needs to be cut once. When the cut segment is AD, 1 bayonet subgraph corresponding to 8 bayonets (A–H) is obtained, and the product is 8; when the cut segment is DG, 2 bayonet subgraphs are obtained, one corresponding to 5 bayonets (A–E) and the other to 3 bayonets (F–H), and the product is 5 × 3 = 15; when the cut segment is GF, 2 bayonet subgraphs are obtained, one corresponding to 7 bayonets (A–E, G, H) and the other to 1 bayonet (F), and the product is 7 × 1 = 7; and so on. Thus, the optimal segmentation line segment is DG: cutting DG divides bayonets A–H into 2 groups, one comprising A–E and the other F–H.
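For a small bayonet diagram, the optimal cut of S32 can be found by brute force: try every set of `cuts` removed connections and keep the one that yields the required number of subgraphs with the largest size product. A sketch under assumed names; the patent does not prescribe this search strategy, and the example graph below is a toy path, not the diagram of fig. 6:

```python
from itertools import combinations

def components(nodes, edges):
    """Connected components of an undirected bayonet diagram (union-find)."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), []).append(n)
    return list(groups.values())

def best_cut(nodes, edges, cuts, groups_wanted, size_limit):
    """Return (max product, subgraphs) over all ways of removing `cuts`
    connections that leave `groups_wanted` subgraphs, each with fewer
    than `size_limit` bayonets. Exponential; only for small diagrams."""
    best = (0, None)
    for removed in combinations(edges, cuts):
        remaining = [e for e in edges if e not in removed]
        comps = components(nodes, remaining)
        if len(comps) != groups_wanted:
            continue
        if any(len(c) >= size_limit for c in comps):
            continue
        product = 1
        for c in comps:
            product *= len(c)
        if product > best[0]:
            best = (product, comps)
    return best

# Path A-B-C-D-E cut once into 2 groups: cutting B-C or C-D gives
# product 2 x 3 = 6, the maximum.
nodes = list("ABCDE")
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")]
product, comps = best_cut(nodes, edges, cuts=1, groups_wanted=2, size_limit=5)
```

Production systems would replace the exhaustive search with a graph-partitioning heuristic, since the number of edge subsets grows combinatorially.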
If the second archives need to be further merged in S14, S14 may be expanded as follows:
Fig. 7 is a schematic flowchart of a further embodiment of the image clustering method provided in the present application. It should be noted that, if the result is substantially the same, the flow sequence shown in fig. 7 is not limited in this embodiment. The present embodiment is a further extension of S14. As shown in fig. 7, the present embodiment may include:
S41: each target image subset is clustered to obtain a plurality of second archives corresponding to each target image subset.
Each second file corresponds to an object.
S42: the plurality of second archives are merged to obtain the clustering result.
It can be understood that the same object may cross between adjacent (critical) spatial domains within the same time domain. Therefore, second archives corresponding to critical spatial domains in the same time domain can be merged. In this manner, referring to fig. 8 in combination, S42 may include the following sub-steps:
S421: the spatial domains passed through by each first archive are determined based on the bayonet trajectories of the plurality of first archives.
S422: and obtaining a critical bayonet pair set based on the spatial domain passed by the first file.
The critical bayonet pair set comprises a plurality of critical bayonet pairs, and each critical bayonet pair consists of a last bayonet of a previous space domain and a first bayonet of a next space domain, through which the first file passes.
The former space domain and the latter space domain can be regarded as critical space domains, so that the two bayonets contained in the critical bayonet pair have high association degree.
After this step is performed, the process may directly proceed to S423.
Alternatively, to improve the accuracy of the merging result, the set of critical bayonet pairs may be denoised before entering S423. Specifically, the number of times each critical bayonet pair is passed may be determined, and the critical bayonet pairs passed fewer times than a count threshold are removed from the set. For example, if the count threshold is set to P, critical bayonet pairs passed fewer than P times are removed.
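The denoising described above is a simple count filter over the collected critical bayonet pairs; a sketch with assumed names:

```python
from collections import Counter

def denoise_critical_pairs(critical_pairs, count_threshold):
    """Keep only the critical bayonet pairs passed at least
    `count_threshold` times across all first archives."""
    counts = Counter(critical_pairs)
    return {pair for pair, n in counts.items() if n >= count_threshold}

kept = denoise_critical_pairs(
    [("C1", "C5"), ("C1", "C5"), ("C2", "C6")], count_threshold=2
)
# only ("C1", "C5") is passed often enough to survive
```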
S423: and sequentially closing the second files passing through the critical bayonet pairs in the same time domain.
The second file passing through the critical bayonet pair in the same time domain is closed, and the second file passing through one of the bayonet pairs in the same time domain and the second file passing through the other bayonet pair in the critical bayonet pair are closed. That is, for the critical bayonet pair (bayonet 1, bayonet 2), the second file passing through bayonet 1 and the second file passing through bayonet 2 are shifted in the same time domain.
For example, first 0 ~ 2 second archives that the point passed through bayonet 1 and second archive that passes through bayonet 2 close the shelves, and then 2 ~ 4 second archives that the point passed through bayonet 1 and second archive that passes through bayonet 2 close the shelves, and so on in proper order.
Because the two bayonets contained in a critical bayonet pair have a high degree of association, merging the second archives that pass through a critical bayonet pair in the same time domain can improve the recall rate of the clustering result.
In this step, before S423, second archives corresponding to adjacent time domains in the same spatial domain may be merged in sequence. For example, under the same spatial domain, the second archives corresponding to 0:00–2:00 are merged with those corresponding to 2:00–4:00, then those corresponding to 2:00–4:00 with those corresponding to 4:00–6:00, and so on. S423 is then performed on the result of merging the second archives of adjacent time domains in the same spatial domain; expanding the activity time range of the same object in this way can further improve the recall rate of the clustering result.
In addition, considering that the same object may appear in different spatial domains in different time domains, a full merge may be performed on the different second archives after S423 to further improve the recall rate of the clustering result; that is, all the archives obtained after the merging in S423 are merged with one another.
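The merging passes above (adjacent time domains, critical bayonet pairs, and the final full merge) all reduce to repeatedly combining any two archives that a matching criterion accepts. The criterion itself, e.g. similarity of archive features, is not fixed by the text, so it appears here as an assumed plug-in predicate; the greedy loop is only one possible strategy:

```python
def merge_archives(archives, should_merge):
    """Greedily merge archives: repeatedly combine any two archives for
    which `should_merge` returns True. `should_merge` is an assumed
    predicate, e.g. archive-feature similarity above a threshold; the
    patent does not specify the matching rule."""
    pools = [list(a) for a in archives]
    changed = True
    while changed:
        changed = False
        for i in range(len(pools)):
            for j in range(i + 1, len(pools)):
                if pools[i] and pools[j] and should_merge(pools[i], pools[j]):
                    pools[i] += pools[j]
                    pools[j] = []
                    changed = True
    return [p for p in pools if p]

# Toy criterion: merge archives whose images carry the same object label.
merged = merge_archives(
    [[("p1", "img1")], [("p1", "img2")], [("p2", "img3")]],
    should_merge=lambda a, b: a[0][0] == b[0][0],
)
```

In the toy run, the two "p1" archives collapse into one while the "p2" archive remains separate.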
Fig. 9 is a schematic structural diagram of an embodiment of an electronic device according to the present application. As shown in fig. 9, the electronic device comprises a processor 51, a memory 52 coupled to the processor 51.
Wherein the memory 52 stores program instructions for implementing the method of any of the above embodiments; the processor 51 is operative to execute program instructions stored by the memory 52 to implement the steps of the above-described method embodiments. The processor 51 may also be referred to as a CPU (Central Processing Unit). The processor 51 may be an integrated circuit chip having signal processing capabilities. The processor 51 may also be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
FIG. 10 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application. As shown in fig. 10, the computer-readable storage medium 60 of the embodiment of the present application stores program instructions 61, and the program instructions 61, when executed, implement the method provided by the above-mentioned embodiments of the present application. The program instructions 61 may form a program file stored in the computer-readable storage medium 60 in the form of a software product, so as to enable a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned computer-readable storage medium 60 includes various media capable of storing program code, such as a USB flash drive, a portable hard drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, as well as terminal devices such as a computer, a server, a mobile phone, or a tablet.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. The above embodiments are merely examples and are not intended to limit the scope of the present disclosure, and all modifications, equivalents, and flow charts using the contents of the specification and drawings of the present disclosure, or those directly or indirectly applied to other related technical fields, are intended to be included in the scope of the present disclosure.

Claims (12)

1. An image clustering method, comprising:
acquiring a target image set, wherein the target image set comprises a plurality of target images shot by a plurality of bayonets in a preset area within a preset time;
dividing the plurality of bayonets into a first preset number of groups and dividing the preset time into a plurality of time domains based on the association degree between different bayonets, wherein each group of bayonets belongs to one spatial domain, and one time domain together with one spatial domain forms a time-space domain;
dividing the target image set according to the time-space domain to obtain a plurality of target image subsets;
and clustering each target image subset to obtain a clustering result.
2. The method of claim 1, wherein the dividing the plurality of bayonets into a first preset number of groups based on the association degree between different bayonets comprises:
sequentially connecting pairs of bayonets whose association degree meets a preset condition, to form a bayonet graph;
and removing the connection relation of a second preset number of bayonet pairs from the bayonet graph, so that the product of the numbers of bayonets in the first preset number of bayonet subgraphs obtained after the removal is maximized, wherein the number of bayonets in each bayonet subgraph is smaller than a number threshold, and each bayonet subgraph corresponds to one group of bayonets.
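The partitioning step recited in claim 2 can be sketched as a brute-force search: remove a fixed number of edges from the bayonet graph and keep the removal whose connected components maximize the product of component sizes, subject to the size threshold. The function names and the exhaustive search over edge subsets are illustrative assumptions; the claim does not prescribe a particular search method.

```python
# Hypothetical sketch of claim 2: split a bayonet graph into subgraphs by
# removing k_remove edges so that the product of component sizes is maximal
# and no component exceeds max_size. Exhaustive search is fine for the
# small graphs of a single surveillance area.
from itertools import combinations

def components(nodes, edges):
    """Connected components of an undirected graph, as a list of sets."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            cur = stack.pop()
            if cur in comp:
                continue
            comp.add(cur)
            stack.extend(adj[cur] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def best_partition(nodes, edges, k_remove, max_size):
    """Try every removal of k_remove edges; return the component list
    with the largest size product whose components all fit max_size."""
    best, best_score = None, -1
    for removed in combinations(edges, k_remove):
        kept = [e for e in edges if e not in removed]
        comps = components(nodes, kept)
        if any(len(c) > max_size for c in comps):
            continue
        score = 1
        for c in comps:
            score *= len(c)
        if score > best_score:
            best, best_score = comps, score
    return best
```

For a chain 1-2-3-4 with one edge removed, cutting the middle edge gives components of sizes 2 and 2 (product 4), which beats the 1 x 3 splits.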
3. The method of claim 1, wherein before the dividing the plurality of bayonets into a first preset number of groups based on the association degree between different bayonets, the method further comprises:
clustering the target image set to obtain a plurality of first archives;
and determining the association degree between different bayonets based on the bayonet tracks of the plurality of first archives.
4. The method of claim 3, wherein the determining the association degree between different bayonets based on the bayonet tracks of the plurality of first archives comprises:
determining the number of times each bayonet pair is passed through, wherein each bayonet pair consists of two adjacent bayonets in a bayonet track;
and obtaining the association degree between different bayonets based on the number of times each bayonet pair is passed through.
5. The method of claim 4, wherein the obtaining the association degree between different bayonets based on the number of times each bayonet pair is passed through comprises:
taking each bayonet in turn as a target bayonet;
and forming a target bayonet pair from the target bayonet and each of the other bayonets, and taking the ratio of the number of times the target bayonet pair is passed through to the total number of times the target bayonet is passed through as the association degree of the target bayonet pair, wherein the total number of times the target bayonet is passed through is the sum of the numbers of times all bayonet pairs including the target bayonet are passed through.
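The association degree of claims 4-5 can be sketched as follows. Bayonet tracks are modeled as lists of bayonet identifiers; representing a pair as a `frozenset` (so that (A, B) and (B, A) count as the same pair) is an assumption made for illustration.

```python
# Hypothetical sketch of claims 4-5: count how often each adjacent bayonet
# pair appears in the archives' bayonet tracks, then score a target pair as
# its pass count divided by the total passes involving the target bayonet.
from collections import Counter

def pair_counts(tracks):
    """Count passes of each unordered pair of adjacent bayonets."""
    counts = Counter()
    for track in tracks:
        for a, b in zip(track, track[1:]):
            counts[frozenset((a, b))] += 1
    return counts

def association(counts, target, other):
    """Pass count of (target, other) over all passes involving target."""
    total = sum(c for pair, c in counts.items() if target in pair)
    if total == 0:
        return 0.0
    return counts[frozenset((target, other))] / total
```

Note that the measure is asymmetric: with tracks A-B-C and A-B, all of A's passes involve B, so association(A, B) is 1.0, while only two of B's three passes involve A.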
6. The method of claim 2, wherein the clustering each of the target image subsets to obtain a clustering result comprises:
clustering each target image subset to obtain a plurality of second archives corresponding to each target image subset;
and merging the plurality of second archives to obtain the clustering result.
7. The method of claim 6, wherein the merging the plurality of second archives to obtain the clustering result comprises:
determining the spatial domains passed through by each first archive based on the bayonet tracks of the plurality of first archives;
obtaining a critical bayonet pair set based on the spatial domains passed through by the first archives, wherein the critical bayonet pair set comprises a plurality of critical bayonet pairs, and each critical bayonet pair consists of the last bayonet of the previous spatial domain passed through by a first archive and the first bayonet of the next spatial domain;
and sequentially merging the second archives that pass through the critical bayonet pairs in the same time domain.
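The critical bayonet pairs of claim 7 can be derived from a first archive's bayonet track given a mapping from bayonets to spatial domains: whenever two consecutive bayonets in the track lie in different spatial domains, they form a critical pair. The names `critical_pairs` and `domain_of` are illustrative assumptions.

```python
# Hypothetical sketch of claim 7: extract critical bayonet pairs from a
# bayonet track. `domain_of` maps each bayonet to its spatial domain; a
# critical pair is the last bayonet of one domain followed by the first
# bayonet of the next domain along the track.
def critical_pairs(track, domain_of):
    pairs = set()
    for a, b in zip(track, track[1:]):
        if domain_of[a] != domain_of[b]:
            pairs.add((a, b))  # (last bayonet of previous domain, first of next)
    return pairs
```

Second archives whose tracks cross such a pair within the same time domain are candidates for merging.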
8. The method of claim 7, wherein before the merging of the two second archives that respectively pass through the two bayonets constituting a critical bayonet pair in the same time domain, the method further comprises:
determining the number of times each critical bayonet pair is passed through;
and removing, from the critical bayonet pair set, the critical bayonet pairs whose number of passes is less than a count threshold.
9. The method of claim 7, wherein before the merging of the two second archives that respectively pass through the two bayonets constituting a critical bayonet pair in the same time domain, the method further comprises:
sequentially merging the second archives corresponding to adjacent time domains in the same spatial domain.
10. The method of claim 9, wherein after the sequentially merging the second archives that pass through the critical bayonet pairs in the same time domain, the method further comprises:
performing full archive merging on the different second archives.
11. An electronic device, comprising a processor and a memory coupled to the processor, wherein:
the memory stores program instructions;
the processor is configured to execute the program instructions stored by the memory to implement the method of any of claims 1-10.
12. A computer-readable storage medium, characterized in that the storage medium stores program instructions that, when executed, implement the method of any of claims 1-10.
CN202111034186.9A 2021-09-03 2021-09-03 Image clustering method, electronic device and storage medium Pending CN113869372A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111034186.9A CN113869372A (en) 2021-09-03 2021-09-03 Image clustering method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111034186.9A CN113869372A (en) 2021-09-03 2021-09-03 Image clustering method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN113869372A true CN113869372A (en) 2021-12-31

Family

ID=78989564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111034186.9A Pending CN113869372A (en) 2021-09-03 2021-09-03 Image clustering method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113869372A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115687249A (en) * 2022-12-30 2023-02-03 浙江大华技术股份有限公司 Image gathering method and device, terminal and computer readable storage medium


Similar Documents

Publication Publication Date Title
WO2021217934A1 (en) Method and apparatus for monitoring number of livestock, and computer device and storage medium
CN110909205B (en) Video cover determination method and device, electronic equipment and readable storage medium
KR20180031024A (en) Future audience prediction of video segments to optimize system resource utilization
US20130177252A1 (en) Detecting Video Copies
JP2022518469A (en) Information processing methods and devices, storage media
TW202105374A (en) File application method, device and storage medium
US9665773B2 (en) Searching for events by attendants
CN109800318B (en) Filing method and device
Sakarya et al. Video scene detection using graph-based representations
CN113869372A (en) Image clustering method, electronic device and storage medium
CN113706502B (en) Face image quality assessment method and device
CN111488813B (en) Video emotion marking method and device, electronic equipment and storage medium
CN112052251B (en) Target data updating method and related device, equipment and storage medium
CN111177436A (en) Face feature retrieval method, device and equipment
WO2023284181A1 (en) Method for filtering face images, electronic device, and computer-readable non-transitory storage medium
CN111046882A (en) Disease name standardization method and system based on profile hidden Markov model
CN111479168B (en) Method, device, server and medium for marking multimedia content hot spot
DE112021003550T5 (en) DOWNSAMPLING GENOMIC SEQUENCE DATA
CN113158773A (en) Training method and training device for living body detection model
CN114863364B (en) Security detection method and system based on intelligent video monitoring
CN103916677A (en) Advertisement video identifying method and device
CN113743533B (en) Picture clustering method and device and storage medium
CN116156416A (en) Method and device for extracting liveplace based on signaling data
CN114268730A (en) Image storage method and device, computer equipment and storage medium
CN112000293B (en) Monitoring data storage method, device, equipment and storage medium based on big data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination