CN114860750A - Data synchronization method and device, electronic equipment and computer readable medium - Google Patents

Data synchronization method and device, electronic equipment and computer readable medium

Info

Publication number
CN114860750A
Authority
CN
China
Prior art keywords
face
data
information
updated
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210807802.8A
Other languages
Chinese (zh)
Other versions
CN114860750B (en)
Inventor
张岳
马小川
徐玉阳
李颖
王宣
徐家辉
孙玉红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Galaxy Technology Beijing Co ltd
Original Assignee
China Galaxy Technology Beijing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Galaxy Technology Beijing Co ltd
Priority to CN202210807802.8A
Publication of CN114860750A
Application granted
Publication of CN114860750B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval of structured data, e.g. relational data
    • G06F 16/23 Updating
    • G06F 16/27 Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements using pattern recognition or machine learning
    • G06V 10/82 Arrangements using neural networks
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions

Abstract

The embodiments of the present disclosure disclose a data synchronization method and apparatus, an electronic device, and a computer readable medium. One embodiment of the method comprises: updating a stored face data set to obtain an updated face data set; determining a person information set corresponding to each piece of updated face data in the updated face data set; performing information deduplication on each piece of person information in the person information set to obtain a deduplicated person information set; deduplicating the updated face data set according to the deduplicated person information set to obtain a deduplicated face data set; determining a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set to obtain a historical recognition count information set group; and synchronizing face data subsets corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network. This embodiment can efficiently synchronize the face data to be updated to each panel machine device in real time.

Description

Data synchronization method and device, electronic equipment and computer readable medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a data synchronization method, apparatus, electronic device, and computer-readable medium.
Background
Currently, data synchronization technology is often used in various fields to keep data up to date in real time. Data synchronization is generally implemented as follows: first, the update data sent by each data storage node is received; then, the received update data is aggregated into a total update data set; finally, the aggregated total update data set is sent to each data storage node.
However, the inventor finds that when the data synchronization is realized in the above manner, the following technical problems often exist:
First, each data storage node does not necessarily need the aggregated total update data set, so sending the total update data set to each data storage node in real time consumes significant network bandwidth and results in low transmission efficiency;
Second, the data to be transmitted includes pictures, which are often high-resolution pictures, so the data transmission consumes significant network bandwidth and the transmission efficiency is low;
Third, for data to be transmitted that includes pictures, the processing of the pictures is often not accurate enough, which may result in loss of picture content.
The information disclosed in this background section is only for enhancement of understanding of the background of the inventive concept and therefore may contain information that does not constitute prior art already known to a person of ordinary skill in the art in this country.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose data synchronization methods, apparatuses, electronic devices and computer readable media to solve one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a data synchronization method, including: in response to receiving face data files to be updated for a target time period sent by each node in a panel machine device information network, updating a stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set; determining a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information includes person encoding information; performing information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set; deduplicating the updated face data set according to the deduplicated person information set, to obtain a deduplicated face data set; determining a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain a historical recognition count information set group, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information; and synchronizing, according to the historical recognition count information set group, face data subsets corresponding to the deduplicated face data set to the panel machine devices corresponding to each node in the panel machine device information network.
In a second aspect, some embodiments of the present disclosure provide a data synchronization apparatus, including: an updating unit configured to, in response to receiving face data files to be updated for a target time period sent by each node in a panel machine device information network, update a stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set; a first determination unit configured to determine a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information includes person encoding information; an information deduplication unit configured to perform information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set; a data deduplication unit configured to deduplicate the updated face data set according to the deduplicated person information set, to obtain a deduplicated face data set; a second determination unit configured to determine a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain a historical recognition count information set group, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information; and a data synchronization unit configured to synchronize, according to the historical recognition count information set group, face data subsets corresponding to the deduplicated face data set to the panel machine devices corresponding to each node in the panel machine device information network.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method as described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the disclosure provide a computer readable medium having a computer program stored thereon, where the program when executed by a processor implements a method as described in any of the implementations of the first aspect.
In a fifth aspect, some embodiments of the present disclosure provide a computer program product comprising a computer program that, when executed by a processor, implements the method described in any of the implementations of the first aspect above.
The above embodiments of the present disclosure have the following advantages: the data synchronization method of some embodiments of the present disclosure can efficiently synchronize the face data to be updated to each panel machine device in real time. Specifically, the reason the face data to be updated could not previously be synchronized to each panel machine device efficiently and in real time is that each data storage node does not necessarily need the aggregated total update data set, and sending the total update data set to each data storage node in real time consumes significant network bandwidth and results in low transmission efficiency. Based on this, in the data synchronization method of some embodiments of the present disclosure, first, in response to receiving the face data files to be updated for a target time period sent by each node in the panel machine device information network, the stored face data set is updated according to the obtained set of face data files to be updated, to obtain an updated face data set. The updated face data set may contain at least two pieces of updated face data that correspond to the same person. From the updated face data set, an up-to-date full face data set for the target time period may subsequently be generated. Then, the person information set corresponding to each piece of updated face data in the updated face data set is determined, so that the updated face data can later be deduplicated. Next, information deduplication is performed on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set, which is used to deduplicate the updated face data set. Further, the updated face data set is deduplicated according to the deduplicated person information set, so that a deduplicated face data set without repeated face data can be obtained accurately; that is, the deduplicated face data set may be the latest face data set for the target time period. Furthermore, the historical recognition count information set corresponding to each piece of person information in the deduplicated person information set is determined, so that the face data subset corresponding to each node can be determined in a targeted manner. By determining the face data subset corresponding to each node, it becomes unnecessary to subsequently synchronize all mutually different face data to every node. Finally, according to the historical recognition count information set group, the face data subsets corresponding to the deduplicated face data set are efficiently synchronized to the panel machine devices corresponding to each node in the panel machine device information network. In the process of synchronizing the face data subsets, the network bandwidth consumed by data transmission is greatly reduced and the data transmission efficiency is greatly improved.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
FIG. 1 is a flow diagram of some embodiments of a data synchronization method according to the present disclosure;
FIG. 2 is a schematic block diagram of some embodiments of a data synchronization apparatus according to the present disclosure;
FIG. 3 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting, and those skilled in the art will understand that they should be read as "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Referring to fig. 1, a flow 100 of some embodiments of a data synchronization method according to the present disclosure is shown. The data synchronization method comprises the following steps:
Step 101: in response to receiving the face data files to be updated for a target time period sent by each node in a panel machine device information network, updating a stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set.
In some embodiments, in response to receiving the face data files to be updated for the target time period sent by each node in the panel machine device information network, the execution body (e.g., an electronic device) of the data synchronization method may update the stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set. The panel machine device information network may be a pre-established information network. At least one panel machine device is located at each node in the panel machine device information network, and each node has corresponding location information. For example, a panel machine device information network includes a first node, a second node, a third node, and a fourth node. The location information corresponding to the first node is "Xi'an", that of the second node is "Beijing", that of the third node is "Harbin", and that of the fourth node is "Shanghai". 3 panel machine devices are provided at the location corresponding to the first node to perform face recognition, 4 at the second node, 5 at the third node, and 3 at the fourth node. A panel machine device may be a device for performing face recognition. Each node has a corresponding database for storing face data. The execution body stores the full face data set, which is updated in real time. The target time period may be a preset period; for example, the target time period may be 1 week. Therefore, the execution body may update the face data once every target time period, so as to ensure that the data in the database corresponding to each node is kept up to date. A face data file to be updated may be a file in which face data to be updated is stored. The face data to be updated may be one of the following: face data to be deleted, face data to be replaced, and face data to be added. The face data in the face data set may include a person face image and person contact information. Specifically, the person contact information may include a person communication mode and person office location information.
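By way of illustration only, the network and data layout described above could be represented in memory as in the following Python sketch; every class name and field name here is an illustrative assumption rather than part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class PanelDevice:
    """A face recognition panel machine device deployed under a node."""
    device_id: str

@dataclass
class Node:
    """A node of the panel machine device information network."""
    node_id: str
    location: str                                  # e.g. "Beijing"
    devices: List[PanelDevice] = field(default_factory=list)

@dataclass
class FaceData:
    """One piece of face data: person face image plus person contact information."""
    code_id: str                                   # unique encoding identifier of the person
    face_image_path: str                           # person face image
    communication_mode: str                        # person communication mode
    office_location: str                           # person office location information

# The execution body keeps the full, continuously updated face data set,
# keyed here by the person's unique encoding identifier.
stored_face_data_set: Dict[str, FaceData] = {}
```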
Optionally, the execution body may receive the face data file to be updated for the target time period, sent by the panel machine device corresponding to each node in the panel machine device information network, through an application programming interface (API) service over Hypertext Transfer Protocol Secure (HTTPS).
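As a hedged illustration only, such an HTTPS API interface service could be exposed as follows; the web framework (Flask), the route, and the field names are assumptions and are not specified in the disclosure.

```python
# Hypothetical HTTPS endpoint for receiving a node's face data file to be
# updated for the target time period; framework, route and field names are assumed.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/face-update-files", methods=["POST"])
def receive_update_file():
    node_id = request.form.get("node_id")
    target_period = request.form.get("target_period")   # e.g. "2022-11-01_2022-11-08"
    update_file = request.files.get("update_file")       # the face data file to be updated
    update_file.save(f"./incoming/{node_id}_{target_period}.bin")
    return jsonify({"status": "received"})

if __name__ == "__main__":
    # HTTPS would normally be terminated by a reverse proxy; an ad hoc TLS
    # context is used here only to indicate secure transport.
    app.run(ssl_context="adhoc")
```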
It should be noted that there may be repeated face data in the updated face data set.
For example, Company A has deployed the electronic device (e.g., a server) at location A and stores the latest face data set as of November 1. Since the target time period is 1 week, the face data set of November 1 needs to be updated on November 8. Company A has 3 subsidiaries: a first subsidiary located in Beijing, a second subsidiary located in Shanghai, and a third subsidiary located in Shenzhen. The first subsidiary has 4 panel machine devices, the second subsidiary has 3 panel machine devices, and the third subsidiary has 5 panel machine devices. The panel machine device information network is generated with the subsidiaries as nodes. On November 3, a new employee B of Company A temporarily worked at the first subsidiary, and new face data was entered. Because the face data of new employee B had not yet been synchronized to the subsidiaries, on November 4, when new employee B temporarily worked at the second subsidiary, new face data had to be entered at the second subsidiary as well. Therefore, on November 8, the face data files to be updated (covering November 1 to November 8) sent by the panel machine devices corresponding to the subsidiaries contain multiple pieces of face data for new employee B. Thus, since the updated face data set has not been deduplicated, there may be duplicate face data for new employee B in the updated face data set.
The electronic device may be hardware or software. When the electronic device is hardware, the electronic device may be implemented as a distributed cluster formed by a plurality of servers or terminal devices, or may be implemented as a single server or a single terminal device. When the electronic device is embodied as software, it may be installed in the above-listed hardware devices. It may be implemented, for example, as multiple software or software modules to provide distributed services, or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of electronic devices is merely illustrative. There may be any number of electronic devices, as desired for implementation.
In some optional implementation manners of some embodiments, the data file to be updated of the data file set to be updated of the face includes: and the face adding coding data subfile, the face deleting data subfile and the face replacing coding data subfile.
In some optional implementations of some embodiments, updating the stored face data set according to the obtained set of face data files to be updated to obtain the updated face data set may include the following steps (an illustrative sketch in code follows these steps):
the execution main body can input each newly added face coding data in the newly added face coding data subfile into a decoding model in a first coding and decoding model to obtain a newly added face decoding data set.
Wherein a decoding model of the first encoding and decoding model may be at least one layer of convolutional neural network implementing upsampling. And after the face newly added coded data is input into a decoding model in the first coding and decoding model, the obtained face newly added decoding data is equal to the face newly added data input by the child node.
And secondly, for each face replacement encoding data subfile in the face replacement encoding data subfile set, the execution main body can input each face replacement encoding data in the face replacement encoding data subfile into a decoding model in a second encoding and decoding model to obtain a face replacement decoding data set.
Wherein a decoding model of the first encoding and decoding model may be at least one layer of a second convolutional neural network implementing upsampling. And after the face replacement coded data is input into a decoding model in the second coding and decoding model, the obtained face replacement decoded data is equal to the face new data replaced by the child nodes.
Optionally, the model structure of the second encoding and decoding model is the same as the model structure of the first encoding and decoding model.
And thirdly, the executing main body can update the face data set according to the obtained face newly-added decoding data set, the face deletion data subfile set and the face replacement decoding data set, so as to obtain the updated face data set.
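A minimal sketch of the third step, assuming the additions and replacements have already been decoded and the deletions are given as encoding identifiers; the record layout is an assumption.

```python
from typing import Dict, Iterable

def apply_updates(face_data_set: Dict[str, dict],
                  new_records: Iterable[dict],
                  delete_code_ids: Iterable[str],
                  replace_records: Iterable[dict]) -> Dict[str, dict]:
    """Update the stored face data set with the newly added face decoded data,
    the face deletion data (encoding identifiers) and the face replacement
    decoded data, returning the updated face data set."""
    updated = dict(face_data_set)
    for record in new_records:                    # newly added face decoded data
        updated[record["code_id"]] = record
    for code_id in delete_code_ids:               # face deletion data
        updated.pop(code_id, None)
    for record in replace_records:                # face replacement decoded data
        updated[record["code_id"]] = record
    return updated
```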
In some optional implementations of some embodiments, the newly added face encoding data in the newly added face encoding data subfile is data obtained by encoding newly added face data, the face replacement encoding data in the face replacement encoding data subfile is data obtained by encoding face replacement data, and the face deletion data in the face deletion data subfile is the encoding identifier of the face data to be deleted. Each piece of face data has a unique corresponding encoding identifier, so the face deletion data may be the encoding identifier of the user corresponding to the face data to be deleted. The encoding identifier corresponding to each user may be preset.
In some optional implementations of some embodiments, the newly added face encoding data is generated by the following steps (a code sketch follows these steps):
In the first step, a person image of the newly added person is obtained.
In the second step, the person image is input into the encoding model of the first encoding and decoding model to obtain a first encoding matrix.
The matrix dimension of the first encoding matrix is smaller than the matrix dimension of the matrix corresponding to the person image. The encoding model of the first encoding and decoding model may be a multi-layer convolutional neural network implementing downsampling.
In the third step, the first encoding matrix is determined as the newly added face encoding data.
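The following is a sketch of these three steps, assuming a PyTorch implementation; the layer sizes and channel counts are illustrative, and the detailed 3-convolution, 3-pooling structure described later is sketched separately after that description.

```python
import torch
import torch.nn as nn

# Assumed minimal down-sampling encoder standing in for the encoding model
# of the first encoding and decoding model.
encoding_model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AvgPool2d(2),                              # halves the spatial resolution
    nn.Conv2d(16, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AvgPool2d(2),
)

person_image = torch.rand(1, 3, 256, 256)         # placeholder for the new person's image
first_encoding_matrix = encoding_model(person_image)

# The encoding matrix is smaller than the matrix of the person image
# (here (1, 8, 64, 64) versus (1, 3, 256, 256)); this matrix is what is
# transmitted as the newly added face encoding data.
print(person_image.shape, first_encoding_matrix.shape)
```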
In some optional implementations of some embodiments, the face replacement encoding data is generated by the following steps (a code sketch follows these steps):
In the first step, a historical person image, a person image to be replaced, and replacement information corresponding to the person being replaced are obtained. The historical person image has the same image resolution as the person image to be replaced, and the replacement information includes at least one of the following: office location replacement information and communication mode replacement information. The historical person image may be the person image in the previously stored face data of the same person as the person corresponding to the person image to be replaced.
In the second step, the matrix corresponding to the person image to be replaced is subtracted from the matrix corresponding to the historical person image to obtain a subtraction matrix.
The subtraction matrix may represent the image pixel value differences between the person image to be replaced and the historical person image.
In the third step, the subtraction matrix is input into the encoding model of the second encoding and decoding model to obtain a second encoding matrix.
The matrix dimension of the second encoding matrix is smaller than that of the subtraction matrix.
In the fourth step, word embedding is performed on the office location replacement information and the communication mode replacement information contained in the replacement information, respectively, to obtain an office location replacement matrix and a communication mode replacement matrix.
The matrix dimensions of the office location replacement matrix and of the communication mode replacement matrix are the same as the matrix dimension of the second encoding matrix.
In the fifth step, the office location replacement matrix, the communication mode replacement matrix, and the second encoding matrix are superposed along a target direction to obtain a superposed matrix. The target direction may be the vertical direction of the matrix.
In the sixth step, the superposed matrix is determined as the face replacement encoding data.
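A hedged sketch of the six steps above in PyTorch; the encoder for the subtraction matrix, the embedding shapes, and the way the two replacement matrices are produced are assumptions made only so that the three matrices can be stacked along the vertical direction.

```python
import torch
import torch.nn as nn

# Assumed encoder standing in for the encoding model of the second
# encoding and decoding model.
second_encoding_model = nn.Sequential(
    nn.Conv2d(3, 4, kernel_size=3, padding=1), nn.ReLU(),
    nn.AvgPool2d(4),          # output matrix dimension < subtraction matrix dimension
)

def face_replacement_encoding(historical_image: torch.Tensor,
                              image_to_replace: torch.Tensor,
                              office_replacement_matrix: torch.Tensor,
                              contact_replacement_matrix: torch.Tensor) -> torch.Tensor:
    """Steps 2 to 6: subtract, encode, and stack with the word-embedded
    replacement information along the target (vertical) direction."""
    # Step 2: subtraction matrix (the same image resolution is assumed).
    subtraction_matrix = historical_image - image_to_replace
    # Step 3: second encoding matrix.
    second_encoding_matrix = second_encoding_model(subtraction_matrix.unsqueeze(0)).squeeze(0)
    # Steps 4 and 5: the two replacement matrices are assumed to be word-embedded
    # already and shaped like the second encoding matrix; stack all three vertically.
    superposed = torch.cat(
        [office_replacement_matrix, contact_replacement_matrix, second_encoding_matrix],
        dim=-2,                                    # vertical (row) direction
    )
    # Step 6: the superposed matrix is the face replacement encoding data.
    return superposed
```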
Optionally, the encoding model of the first encoding and decoding model comprises 3 convolutional layers and 3 pooling layers for downsampling, and the decoding model of the first encoding and decoding model comprises 1 convolutional layer and 1 pooling layer for upsampling. Inputting each piece of newly added face data in the newly added face data subfile into the encoding model of the first encoding and decoding model to obtain the newly added face encoding data set may include the following steps (a code sketch follows these steps):
For each piece of newly added face data, the following newly added face encoding data generation steps are executed:
in the first step, the execution main body may input the face newly added encoded data to a first convolution layer of the 3 convolution layers for downsampling to obtain a first convolution result.
In the second step, the execution body may input the first convolution result into the first pooling layer to obtain a first pooling result.
The first pooling layer may be a network layer that averages the first convolution result.
In the third step, the execution body may input the first pooling result into the second of the 3 convolutional layers for downsampling to obtain a second convolution result.
In the fourth step, the execution body may input the second convolution result into the second pooling layer to obtain a second pooling result.
The second pooling layer may be a network layer that averages the second convolution result.
In the fifth step, the execution body may input the second pooling result into the third of the 3 convolutional layers for downsampling to obtain a third convolution result.
In the sixth step, the execution body may input the third convolution result into the third pooling layer to obtain a third pooling result.
The third pooling layer may be a network layer that averages the third convolution result.
In the seventh step, the execution body may determine the third pooling result as the newly added face encoding data.
Optionally, the execution body may input the third pooling result into a convolutional neural network for person face recognition, to obtain a face recognition result for the newly added face data.
Optionally, the execution body may input the third pooling result into a convolutional neural network for face age recognition, to obtain age information for the newly added face data.
In this way, during training of the first encoding and decoding model, the newly added face encoding data output by the encoding model is additionally fed into multiple convolutional neural networks that recognize person information, forming multi-task training, which helps ensure that the newly added face encoding data subsequently obtained is more accurate. The person information may include, but is not limited to, at least one of the following: person age information and person face information.
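The following is a sketch, under assumed channel counts and head sizes, of an encoding model with 3 convolutional layers and 3 averaging pooling layers, together with the optional multi-task heads for person face recognition and age recognition used during training.

```python
import torch
import torch.nn as nn

class FirstEncodingModel(nn.Module):
    """Sketch of the encoding model of the first encoding and decoding model:
    3 convolutional layers and 3 pooling layers for downsampling.
    Channel counts and kernel sizes are illustrative assumptions."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)
        self.conv3 = nn.Conv2d(8, 4, kernel_size=3, padding=1)
        self.pool = nn.AvgPool2d(2)   # each pooling layer averages the convolution result

    def forward(self, x):
        p1 = self.pool(torch.relu(self.conv1(x)))    # first pooling result
        p2 = self.pool(torch.relu(self.conv2(p1)))   # second pooling result
        p3 = self.pool(torch.relu(self.conv3(p2)))   # third pooling result,
        return p1, p2, p3                             # i.e. the newly added face encoding data

# Assumed multi-task heads attached only during training: one recognizes the
# person's face identity, the other estimates the person's age.
face_recognition_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(4, 1000))
age_recognition_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(4, 1))

encoding_model = FirstEncodingModel()
_, _, third_pooling_result = encoding_model(torch.rand(1, 3, 256, 256))
face_logits = face_recognition_head(third_pooling_result)   # face recognition result
age_estimate = age_recognition_head(third_pooling_result)   # age information
```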
Optionally, inputting each piece of newly added face encoding data in the newly added face encoding data subfile into the decoding model of the first encoding and decoding model to obtain the newly added face decoded data set may include the following steps (a code sketch follows these steps):
For each piece of newly added face encoding data, the following newly added face decoded data generation steps are executed:
in the first step, the execution main body may superimpose the first pooling result, the second pooling result, and the third pooling result along the target direction in a data filling manner, so as to obtain a post-superimposing pooling result.
In the second step, the execution body may input the pooled result after the stacking to 1 convolution layer for upsampling, so as to obtain a fourth convolution result.
And thirdly, the executing body can input the fourth convolution result to a fourth pooling layer to obtain a fourth pooling result.
And fourthly, the execution main body can generate new face decoding data according to the fourth pooling result.
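A hedged sketch of the decoding steps above, reusing the three pooling results from the encoder sketch; the exact data-filling (padding) scheme, the up-sampling layer, and the fourth pooling layer are assumptions consistent with, but not dictated by, the description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FirstDecodingModel(nn.Module):
    """Sketch of the decoding model of the first encoding and decoding model:
    the three pooling results are padded to a common width and channel count
    ("data filling"), superposed along the vertical direction, then passed
    through 1 up-sampling convolutional layer and 1 pooling layer."""
    def __init__(self):
        super().__init__()
        self.upconv = nn.ConvTranspose2d(16, 3, kernel_size=2, stride=2)  # up-sampling layer
        self.pool4 = nn.AvgPool2d(2)                                       # fourth pooling layer

    def forward(self, p1, p2, p3):
        target_c, target_w = p1.shape[1], p1.shape[3]
        padded = []
        for p in (p1, p2, p3):
            pad_w, pad_c = target_w - p.shape[3], target_c - p.shape[1]
            padded.append(F.pad(p, (0, pad_w, 0, 0, 0, pad_c)))   # data filling
        superposed = torch.cat(padded, dim=2)        # superpose along the vertical direction
        fourth_conv_result = self.upconv(superposed)
        fourth_pooling_result = self.pool4(fourth_conv_result)
        return fourth_pooling_result                  # used to generate the decoded face data

decoding_model = FirstDecodingModel()
p1 = torch.rand(1, 16, 128, 128)
p2 = torch.rand(1, 8, 64, 64)
p3 = torch.rand(1, 4, 32, 32)
decoded = decoding_model(p1, p2, p3)
```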
As an inventive point of the embodiments of the present disclosure, this solves the second and third technical problems mentioned in the background: "Second, the data to be transmitted includes pictures, which are often high-resolution pictures, so the data transmission consumes significant network bandwidth and the transmission efficiency is low. Third, for data to be transmitted that includes pictures, the processing of the pictures is often not accurate enough, which may result in loss of picture content." Based on this, the present disclosure encodes and decodes each piece of newly added face data and each piece of face replacement data, thereby greatly reducing the network bandwidth consumed during data transmission and improving data transmission efficiency. In addition, by attaching multiple convolutional neural networks for recognizing person information to the first encoding and decoding model and the second encoding and decoding model, the encoding accuracy of each piece of newly added face data and each piece of face replacement data can be greatly improved, which avoids loss of picture content during transmission.
Step 102: determining a person information set corresponding to each piece of updated face data in the updated face data set.
In some embodiments, the execution body may determine, by means of a person information query, the person information set corresponding to each piece of updated face data in the updated face data set. The person information corresponding to at least two pieces of updated face data may be the same.
Step 103: performing information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set.
In some embodiments, the execution body may perform information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain the deduplicated person information set. The person encoding information may be an encoding identifier uniquely determined by the person information.
For example, the execution body may first determine, according to the person encoding information set, the encoding identifier corresponding to each piece of person information in the person information set. Then, the execution body may deduplicate the encoding identifiers to obtain the deduplicated encoding identifiers. Finally, the pieces of person information in the person information set corresponding to the deduplicated encoding identifiers are determined as the deduplicated person information set.
For example, the person information set is {"encoding identifier: 0011, person name: Sun, person contact: 134**231", "encoding identifier: 0012, person name: Li, person contact: 132**211", "encoding identifier: 0011, person name: Sun, person contact: 134**231", "encoding identifier: 0013, person name: Zhou, person contact: 157**123", "encoding identifier: 0014, person name: Wang, person contact: 157**123"}. The deduplicated person information set is {"encoding identifier: 0011, person name: Sun, person contact: 134**231", "encoding identifier: 0012, person name: Li, person contact: 132**211", "encoding identifier: 0013, person name: Zhou, person contact: 157**123", "encoding identifier: 0014, person name: Wang, person contact: 157**123"}.
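A minimal sketch of the deduplication in step 103, keeping the first occurrence of each encoding identifier; the dictionary layout mirrors the example above but is otherwise an assumption.

```python
from typing import Dict, List

def deduplicate_person_info(person_info_set: List[Dict[str, str]]) -> List[Dict[str, str]]:
    """Keep one piece of person information per unique encoding identifier."""
    seen_code_ids = set()
    deduplicated = []
    for info in person_info_set:
        code_id = info["code_id"]                  # person encoding information
        if code_id not in seen_code_ids:
            seen_code_ids.add(code_id)
            deduplicated.append(info)
    return deduplicated

person_info_set = [
    {"code_id": "0011", "name": "Sun",  "contact": "134**231"},
    {"code_id": "0012", "name": "Li",   "contact": "132**211"},
    {"code_id": "0011", "name": "Sun",  "contact": "134**231"},   # duplicate of 0011
    {"code_id": "0013", "name": "Zhou", "contact": "157**123"},
    {"code_id": "0014", "name": "Wang", "contact": "157**123"},
]
print(deduplicate_person_info(person_info_set))    # 4 entries remain
```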
Step 104: deduplicating the updated face data set according to the deduplicated person information set, to obtain a deduplicated face data set.
In some embodiments, the execution body may deduplicate the updated face data set according to the deduplicated person information set, to obtain the deduplicated face data set.
Step 105: determining a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain a historical recognition count information set group.
In some embodiments, the execution body may determine, by means of a historical recognition count information query, the historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain the historical recognition count information set group. The historical recognition count information may be the historical number of times the person has been recognized by a panel machine device.
Step 106: synchronizing, according to the historical recognition count information set group, the face data subsets corresponding to the deduplicated face data set to the panel machine devices corresponding to each node in the panel machine device information network.
In some embodiments, the execution body may synchronize, according to the historical recognition count information set group, the face data subset corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network in various manners. Each node and its corresponding panel machine devices have a one-to-one corresponding face data subset, and the face data subsets corresponding to different nodes may be the same or different.
In some optional implementations of some embodiments, synchronizing, according to the historical recognition count information set group, the face data subset corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network may include the following:
In the first step, for each piece of person information in the deduplicated person information set, the execution body may execute the following first data synchronization step (a code sketch of these sub-steps follows the seventh sub-step below):
the first sub-step, in response to determining that the personal information is not newly added personal information, the executing agent may filter out historical identification frequency information having a value greater than a target threshold from a set of historical identification frequency information corresponding to the personal information to obtain at least one piece of historical identification frequency information.
Wherein, the target threshold may be preset. For example, the target threshold may be 5.
Optionally, after the first sub-step, the first data synchronization step may further comprise:
A determination step: in response to determining that the person information is newly added person information, determining the node corresponding to the person information as a target node. The person information has corresponding face data, and the face data has a corresponding source node, namely the target node.
A data synchronization step: synchronizing the face data corresponding to the person information to the panel machine devices corresponding to a plurality of nodes in the panel machine device information network, where the plurality of nodes are the nodes in the panel machine device information network other than the target node.
In the second sub-step, the execution body may determine at least one panel machine device corresponding to the at least one piece of historical recognition count information.
The historical recognition counts in the at least one piece of historical recognition count information correspond one to one to the panel machine devices in the at least one panel machine device.
In the third sub-step, the execution body may determine at least one node corresponding to the at least one panel machine device. The at least one panel machine device may have a one-to-one correspondence with the at least one node. When the at least one panel machine device comprises a plurality of panel machine devices, at least two of them may correspond to the same node among the at least one node, i.e., a many-to-one correspondence.
For example, the at least one panel machine device comprises: a first panel machine device, a second panel machine device, and a third panel machine device, and the at least one node may include: a first node, a second node, and a third node. The first panel machine device is a panel machine device under the first node, the second panel machine device is a panel machine device under the second node, and the third panel machine device is a panel machine device under the third node.
For another example, the at least one panel machine device comprises: a first panel machine device, a second panel machine device, and a third panel machine device, and the at least one node may include: a first node and a second node. The first panel machine device is a panel machine device under the first node, while the second panel machine device and the third panel machine device are panel machine devices under the second node.
As an example, the execution body may determine the at least one node corresponding to the at least one panel machine device through a panel machine device and node correspondence table, which represents the correspondence between panel machine devices and nodes.
In the fourth sub-step, the execution body may screen out, from the at least one node, the node associated with the source of the face data corresponding to the person information as a source node. The source node may be the node that uploaded the face data file to be updated to which the face data corresponding to the person information belongs.
In the fifth sub-step, the execution body may remove the source node from the at least one node to obtain a removed node set.
In the sixth sub-step, the execution body may determine, as target face data, the face data corresponding to the person information in the deduplicated face data set.
In the seventh sub-step, the execution body may add the target face data to a message queue so that it is synchronized to the panel machine devices corresponding to the nodes in the removed node set.
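A sketch of the first data synchronization step for one piece of person information that is not newly added; the threshold value, the correspondence tables, and the message queue client below are assumptions used only to illustrate the flow of the seven sub-steps.

```python
from typing import Dict

TARGET_THRESHOLD = 5    # example target threshold from the description

def synchronize_person(recognition_counts: Dict[str, int],   # panel device id -> count
                       device_to_node: Dict[str, str],       # device and node correspondence table
                       source_node: str,
                       target_face_data: Dict,
                       message_queue) -> None:
    """Sub-steps 1 to 7 for person information that is not newly added."""
    # Sub-step 1: keep the historical recognition count information above the threshold.
    frequent_devices = [d for d, count in recognition_counts.items() if count > TARGET_THRESHOLD]
    # Sub-steps 2 and 3: map the remaining panel machine devices to nodes (possibly many-to-one).
    nodes = {device_to_node[d] for d in frequent_devices}
    # Sub-steps 4 and 5: remove the source node the face data originally came from.
    nodes.discard(source_node)
    # Sub-steps 6 and 7: add the target face data to a message queue so that it is
    # synchronized to the panel machine devices under the remaining nodes.
    for node in nodes:
        message_queue.publish(node, target_face_data)   # hypothetical queue client
```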
With further reference to fig. 2, as an implementation of the method shown in the above figures, the present disclosure provides some embodiments of a data synchronization apparatus. These apparatus embodiments correspond to the method embodiments shown in fig. 1, and the apparatus may be applied to various electronic devices.
As shown in fig. 2, a data synchronization apparatus 200 includes: an updating unit 201, a first determining unit 202, an information deduplication unit 203, a data deduplication unit 204, a second determining unit 205, and a data synchronization unit 206. The updating unit 201 is configured to, in response to receiving face data files to be updated for a target time period sent by each node in a panel machine device information network, update a stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set; the first determining unit 202 is configured to determine a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information includes person encoding information; the information deduplication unit 203 is configured to perform information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set; the data deduplication unit 204 is configured to deduplicate the updated face data set according to the deduplicated person information set, to obtain a deduplicated face data set; the second determining unit 205 is configured to determine a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain a historical recognition count information set group, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information; and the data synchronization unit 206 is configured to synchronize, according to the historical recognition count information set group, the face data subsets corresponding to the deduplicated face data set to the panel machine devices corresponding to each node in the panel machine device information network.
It will be understood that the units described in the apparatus 200 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 200 and the units included therein, and are not described herein again.
Referring now to fig. 3, a block diagram of an electronic device 300 suitable for implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 3 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device 300 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device 300 to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device 300 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 3 may represent one device or may represent multiple devices, as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 309, or installed from the storage device 308, or installed from the ROM 302. The computer program, when executed by the processing apparatus 301, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication (e.g., a communications network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device, or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to receiving face data files to be updated for a target time period sent by each node in a panel machine device information network, update a stored face data set according to the obtained set of face data files to be updated, to obtain an updated face data set; determine a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information includes person encoding information; perform information deduplication on each piece of person information in the person information set according to the person encoding information set, to obtain a deduplicated person information set; deduplicate the updated face data set according to the deduplicated person information set, to obtain a deduplicated face data set; determine a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set, to obtain a historical recognition count information set group, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information; and synchronize, according to the historical recognition count information set group, the face data subsets corresponding to the deduplicated face data set to the panel machine devices corresponding to each node in the panel machine device information network.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor including an updating unit, a first determining unit, an information deduplication unit, a data deduplication unit, a second determining unit, and a data synchronization unit. The names of these units do not constitute a limitation on the units themselves in some cases; for example, the data deduplication unit may also be described as "a unit that deduplicates the updated face data set according to the deduplicated person information set to obtain a deduplicated face data set".
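As a purely illustrative sketch of how such units might be composed in software (the class name, the callable-per-unit representation, and the handle() flow are all assumptions, not part of the disclosure):

    class DataSynchronizationApparatus:
        # Illustrative skeleton only: the unit names mirror those listed above,
        # and each unit is represented as an injected callable.
        def __init__(self, updating_unit, first_determining_unit, information_deduplication_unit,
                     data_deduplication_unit, second_determining_unit, data_synchronization_unit):
            self.updating_unit = updating_unit
            self.first_determining_unit = first_determining_unit
            self.information_deduplication_unit = information_deduplication_unit
            self.data_deduplication_unit = data_deduplication_unit
            self.second_determining_unit = second_determining_unit
            self.data_synchronization_unit = data_synchronization_unit

        def handle(self, update_files):
            updated_faces = self.updating_unit(update_files)
            person_info = self.first_determining_unit(updated_faces)
            deduped_info = self.information_deduplication_unit(person_info)
            deduped_faces = self.data_deduplication_unit(updated_faces, deduped_info)
            recognition_counts = self.second_determining_unit(deduped_info)
            return self.data_synchronization_unit(deduped_faces, recognition_counts)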
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
Some embodiments of the present disclosure also provide a computer program product comprising a computer program which, when executed by a processor, implements any of the data synchronization methods described above.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combinations of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A method of data synchronization, comprising:
in response to receiving, from each node in a panel machine device information network, a face data file to be updated for a target time period, updating a stored face data set according to the obtained set of face data files to be updated to obtain an updated face data set;
determining a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information comprises: person encoding information;
deduplicating, according to the person encoding information set, each piece of person information in the person information set to obtain a deduplicated person information set;
deduplicating the updated face data set according to the deduplicated person information set to obtain a deduplicated face data set;
determining a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set to obtain a group of historical recognition count information sets, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information;
and synchronizing, according to the group of historical recognition count information sets, the face data subsets corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network.
2. The method of claim 1, wherein the face data files to be updated in the set of face data files to be updated comprise: a newly-added face encoding data subfile, a face deletion data subfile, and a face replacement encoding data subfile; and
the updating the stored face data set according to the obtained set of face data files to be updated to obtain an updated face data set comprises:
for each newly-added face encoding data subfile in the set of newly-added face encoding data subfiles, inputting each piece of newly-added face encoding data in the subfile into the decoding model of a first encoding-decoding model to obtain a newly-added face decoding data set;
for each face replacement encoding data subfile in the set of face replacement encoding data subfiles, inputting each piece of face replacement encoding data in the subfile into the decoding model of a second encoding-decoding model to obtain a face replacement decoding data set;
and updating the face data set according to the obtained newly-added face decoding data set, the set of face deletion data subfiles, and the face replacement decoding data set to obtain the updated face data set.
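A minimal Python sketch of the update step recited in claim 2, under assumed data layouts: the added and replacement subfiles are modeled as dictionaries keyed by person encoding, the deletion subfiles as lists of encoding identifiers, and first_decoder/second_decoder stand in for the decoding models of the first and second encoding-decoding models. None of these names appear in the disclosure.

    def apply_update_files(stored_faces, added_subfiles, deletion_subfiles,
                           replacement_subfiles, first_decoder, second_decoder):
        # Decode every piece of newly-added face encoding data with the first decoder.
        for subfile in added_subfiles:
            for person_code, encoded in subfile.items():
                stored_faces[person_code] = first_decoder(encoded)

        # Decode every piece of face replacement encoding data with the second decoder.
        for subfile in replacement_subfiles:
            for person_code, encoded in subfile.items():
                stored_faces[person_code] = second_decoder(encoded)

        # Deletion subfiles carry only the encoding identifiers of the face data
        # to be deleted (claim 5), so the corresponding entries are removed directly.
        for subfile in deletion_subfiles:
            for person_code in subfile:
                stored_faces.pop(person_code, None)
        return stored_faces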
3. The method of claim 2, wherein the synchronizing, according to the group of historical recognition count information sets, the face data subsets corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network comprises:
for each piece of person information in the deduplicated person information set, executing the following data synchronization step:
in response to determining that the person information is not newly-added person information, screening out, from the historical recognition count information set corresponding to the person information, historical recognition count information whose value is greater than a target threshold to obtain at least one piece of historical recognition count information;
determining at least one panel machine device corresponding to the at least one piece of historical recognition count information;
determining at least one node corresponding to the at least one panel machine device;
screening out, from the at least one node, the node associated with the source of the face data corresponding to the person information as a source node;
removing the source node from the at least one node to obtain a remaining node set;
determining, as target face data, the face data in the deduplicated face data set corresponding to the person information;
and adding the target face data to a message queue for synchronization to the panel machine device corresponding to each node in the remaining node set.
4. The method of claim 3, wherein, before the screening out, in response to determining that the person information is not newly-added person information, of the historical recognition count information whose value is greater than the target threshold from the historical recognition count information set corresponding to the person information to obtain the at least one piece of historical recognition count information, the method further comprises:
in response to determining that the person information is newly-added person information, determining a node corresponding to the person information as a target node;
and synchronizing the face data corresponding to the person information to the panel machine devices corresponding to a plurality of nodes in the panel machine device information network, wherein the plurality of nodes are the nodes of the panel machine device information network other than the target node.
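The per-person synchronization step of claims 3 and 4 can be sketched as follows; person_info, message_queue and the other identifiers are illustrative assumptions, and the branch structure simply mirrors the two claims.

    def synchronize_person(person_info, deduplicated_faces, recognition_counts,
                           all_nodes, target_threshold, message_queue):
        # person_info is assumed to expose .code, .is_new and .source_node;
        # message_queue is any object with a put() method (e.g. queue.Queue).
        if person_info.is_new:
            # Claim 4: newly-added person information is pushed to every node in
            # the network except the target (originating) node.
            target_nodes = [n for n in all_nodes if n != person_info.source_node]
        else:
            # Claim 3: keep only the nodes whose historical recognition count for
            # this person exceeds the target threshold, then remove the source node.
            counts = recognition_counts.get(person_info.code, {})
            hot_nodes = [n for n, c in counts.items() if c > target_threshold]
            target_nodes = [n for n in hot_nodes if n != person_info.source_node]

        target_face_data = deduplicated_faces[person_info.code]
        for node in target_nodes:
            # Enqueue so the panel machine device at each remaining node can
            # consume the target face data asynchronously.
            message_queue.put((node, target_face_data))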
5. The method according to claim 4, wherein the newly-added face encoding data in the newly-added face encoding data subfile is data obtained by encoding newly-added face data, the face replacement encoding data in the face replacement encoding data subfile is data obtained by encoding face replacement data, and the face deletion data in the face deletion data subfile is an encoding identifier of the face data to be deleted.
6. The method of claim 5, wherein the newly-added face encoding data is generated by:
acquiring a person image of the newly-added person;
inputting the person image into the encoding model of the first encoding-decoding model to obtain a first encoding matrix, wherein the matrix dimension of the first encoding matrix is smaller than the matrix dimension of the matrix corresponding to the person image;
and determining the first encoding matrix as the newly-added face encoding data.
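A toy illustration of claim 6, assuming the encoding model is a single learned linear projection; the disclosure only requires that the encoding matrix have a smaller dimension than the image matrix, so the 64x64 input size and 128-dimensional code below are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)
    # Assumed stand-in for the learned encoding model of the first
    # encoding-decoding model: a 4096 -> 128 linear projection.
    ENCODER_WEIGHTS = rng.standard_normal((64 * 64, 128)).astype(np.float32)

    def encode_new_face(person_image):
        # person_image: a 64x64 array standing in for the newly-added person image.
        flattened = person_image.astype(np.float32).reshape(-1)
        first_encoding_matrix = flattened @ ENCODER_WEIGHTS  # 128 values, fewer than 4096 inputs
        return first_encoding_matrix  # used as the newly-added face encoding data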
7. The method of claim 6, wherein the face replacement encoding data is generated by:
acquiring a historical person image, a person image to be replaced, and replacement information corresponding to the replaced person, wherein the historical person image and the person image to be replaced have the same image resolution, and the replacement information comprises at least one of the following: office place replacement information and communication mode replacement information;
subtracting the matrix corresponding to the person image to be replaced from the matrix corresponding to the historical person image to obtain a difference matrix;
inputting the difference matrix into the encoding model of the second encoding-decoding model to obtain a second encoding matrix, wherein the matrix dimension of the second encoding matrix is smaller than that of the difference matrix;
performing word embedding on the office place replacement information and the communication mode replacement information included in the replacement information to obtain an office place replacement matrix and a communication mode replacement matrix, respectively, wherein the matrix dimensions of the office place replacement matrix and the communication mode replacement matrix are the same as the matrix dimension of the second encoding matrix;
superposing the office place replacement matrix, the communication mode replacement matrix, and the second encoding matrix along a target direction to obtain a superposition matrix;
and determining the superposition matrix as the face replacement encoding data.
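Claim 7 can likewise be sketched with the same toy assumptions: a linear encoder for the second encoding-decoding model, a toy embedding table for the word-embedding step, and stacking along the first axis as the "target direction". All sizes and names are illustrative, not part of the disclosure.

    import numpy as np

    rng = np.random.default_rng(1)
    SECOND_ENCODER = rng.standard_normal((64 * 64, 128)).astype(np.float32)  # assumed second encoder
    EMBEDDING_TABLE = rng.standard_normal((1000, 128)).astype(np.float32)    # assumed word-embedding table

    def word_embed(token_ids):
        # Average the embeddings of the token ids so the result is a 128-dim vector,
        # matching the matrix dimension of the second encoding matrix.
        return EMBEDDING_TABLE[np.asarray(token_ids) % 1000].mean(axis=0)

    def encode_face_replacement(historical_image, image_to_replace,
                                office_tokens, contact_tokens):
        # Subtract the image matrices (the claim assumes equal image resolution).
        difference = historical_image.astype(np.float32) - image_to_replace.astype(np.float32)
        second_encoding = difference.reshape(-1) @ SECOND_ENCODER  # 128-dim code
        office_matrix = word_embed(office_tokens)    # office place replacement matrix
        contact_matrix = word_embed(contact_tokens)  # communication mode replacement matrix
        # Superpose the three matrices along one axis to form the superposition matrix.
        return np.stack([office_matrix, contact_matrix, second_encoding], axis=0)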
8. A data synchronization apparatus, comprising:
an updating unit configured to, in response to receiving, from each node in a panel machine device information network, a face data file to be updated for a target time period, update a stored face data set according to the obtained set of face data files to be updated to obtain an updated face data set;
a first determining unit configured to determine a person information set corresponding to each piece of updated face data in the updated face data set, wherein the person information comprises: person encoding information;
an information deduplication unit configured to deduplicate each piece of person information in the person information set according to the person encoding information set to obtain a deduplicated person information set;
a data deduplication unit configured to deduplicate the updated face data set according to the deduplicated person information set to obtain a deduplicated face data set;
a second determining unit configured to determine a historical recognition count information set corresponding to each piece of person information in the deduplicated person information set to obtain a group of historical recognition count information sets, wherein the historical recognition count information represents the number of times a panel machine device has recognized the person corresponding to the person information;
and a data synchronization unit configured to synchronize, according to the group of historical recognition count information sets, the face data subsets corresponding to the deduplicated face data set to the panel machine device corresponding to each node in the panel machine device information network.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
CN202210807802.8A 2022-07-11 2022-07-11 Data synchronization method and device, electronic equipment and computer readable medium Active CN114860750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210807802.8A CN114860750B (en) 2022-07-11 2022-07-11 Data synchronization method and device, electronic equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN114860750A true CN114860750A (en) 2022-08-05
CN114860750B CN114860750B (en) 2022-09-20

Family

ID=82626699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210807802.8A Active CN114860750B (en) 2022-07-11 2022-07-11 Data synchronization method and device, electronic equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN114860750B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103399888A (en) * 2013-07-19 2013-11-20 广东电网公司电力科学研究院 Differential synchronization method and system for power grid model data
US20200159734A1 (en) * 2018-11-21 2020-05-21 BigObject Inc. Data tracking apparatus, method, and non-transitory computer readable storage medium thereof
CN113821517A (en) * 2021-11-23 2021-12-21 太平金融科技服务(上海)有限公司深圳分公司 Data synchronization method, device, equipment and storage medium
CN114064666A (en) * 2020-07-31 2022-02-18 上海晓信信息科技有限公司 Data warehouse synchronization system and method
CN114648798A (en) * 2022-03-18 2022-06-21 成都商汤科技有限公司 Face recognition method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114860750B (en) 2022-09-20

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant