CN110992105A - Person image processing method and device, electronic device and storage medium

Person image processing method and device, electronic device and storage medium

Info

Publication number
CN110992105A
CN110992105A (application number CN201911266714.6A)
Authority
CN
China
Prior art keywords
information
tag
data
person
processing unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911266714.6A
Other languages
Chinese (zh)
Inventor
潘志军 (Pan Zhijun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Mininglamp Software System Co ltd
Original Assignee
Beijing Mininglamp Software System Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Mininglamp Software System Co ltd filed Critical Beijing Mininglamp Software System Co ltd
Priority to CN201911266714.6A priority Critical patent/CN110992105A/en
Publication of CN110992105A publication Critical patent/CN110992105A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201: Market modelling; Market analysis; Collecting market data

Landscapes

  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a person portrait processing method and apparatus, an electronic device, and a storage medium, and relates to the technical field of data processing. First, person portrait data to be processed is obtained, where the person portrait data includes at least one type of identification information. Second, a target processing unit is determined among a plurality of preset processing units based on target identification information in the at least one type of identification information. Then, a tagging operation is performed on the person portrait data by the target processing unit. This scheme addresses the poor data consistency that easily arises when person portrait data is processed with existing portrait processing techniques.

Description

Person image processing method and device, electronic device and storage medium
Technical Field
The application relates to the technical field of data processing, in particular to a person portrait processing method and device, an electronic device and a storage medium.
Background
With the development of data processing technology, its fields of application keep widening. For example, in the public safety field, big data techniques are used to manage persons: various kinds of information about the managed persons are collected, tags are assigned to the corresponding persons based on that information, and person portrait data for each person is then formed from those tags.
The inventor found through research that, when person portrait data is processed with conventional portrait processing techniques, poor data consistency easily occurs.
Disclosure of Invention
In view of the above, an object of the present application is to provide a person portrait processing method and apparatus, an electronic device, and a storage medium, so as to solve the problem of poor data consistency when person portrait data is processed with conventional portrait processing techniques.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
a person representation processing method, comprising:
acquiring personnel portrait data to be processed, wherein the personnel portrait data comprises at least one type of identification information;
determining a target processing unit in a plurality of preset processing units based on target identification information in the at least one type of identification information;
performing a tagging operation on the person representation data based on the target processing unit.
In a preferred option of the embodiment of the present application, in the person portrait processing method, the step of determining a target processing unit among a plurality of preset processing units based on target identification information in the at least one type of identification information includes:
acquiring identity information in the at least one type of identification information;
and determining a target processing unit in a plurality of preset processing units based on the identity information.
In a preferred option of the embodiment of the present application, in the person portrait processing method, the step of performing a tagging operation on the person portrait data based on the target processing unit includes:
acquiring a tag processing instruction, wherein the tag processing instruction comprises a tag adding instruction, a tag modifying instruction or a tag deleting instruction;
and executing labeling operation corresponding to the label processing instruction on the label information in the at least one type of identification information through the target processing unit.
In a preferred option of the embodiment of the present application, the person portrait processing method further includes:
judging, when the tagging operation is executed, whether an update operation is executed based on tag information included in the person portrait data; and
when the tagging operation is executed and an update operation is executed based on the tag information included in the person portrait data, generating and storing record information of the update operation.
In a preferred option of the embodiment of the present application, in the person portrait processing method, the step of generating and storing record information of the update operation includes:
acquiring operator information, operation time information and operation type information of the labeling operation;
and generating record information based on the operator information, the operation time information and the operation type information, and executing a first persistence operation on the record information to save the record information.
In a preferred option of the embodiment of the present application, the person portrait processing method further includes:
executing a second persistence operation on the personnel portrait data obtained by executing the tagging operation;
when the second persistence operation is not successfully executed, a deletion operation is executed on the record information based on the pre-generated persistence information, wherein the persistence information is generated when the first persistence operation is executed on the record information.
In a preferred option of the embodiment of the present application, the person portrait processing method further includes:
when the tagging operation is executed and an update operation is executed based on tag information included in the person portrait data, updating historical snapshot information of the tag information based on the tagging operation;
wherein the historical snapshot information includes start time information and end time information of the information content that the tag information had before each update.
The embodiment of the present application further provides a person portrait processing apparatus, including:
a data acquisition module, configured to acquire person portrait data to be processed, wherein the person portrait data comprises at least one type of identification information;
the unit determining module is used for determining a target processing unit in a plurality of preset processing units based on target identification information in the at least one type of identification information;
and the operation execution module is used for executing labeling operation on the personnel portrait data based on the target processing unit.
On the basis, an embodiment of the present application further provides an electronic device, including:
a memory for storing a computer program;
and a processor, connected to the memory, configured to execute the computer program to implement the person portrait processing method.
On the basis of the above, the present application further provides a computer-readable storage medium, on which a computer program is stored; when the computer program is executed, the person portrait processing method is implemented.
When person portrait data is processed, a target processing unit can be determined among a plurality of preset processing units based on target identification information in the person portrait data, and a tagging operation is then performed on the person portrait data by that target processing unit. As a result, the same person portrait data is always handled by the same processing unit, whether it is processed in different time periods or at the request of different operators. This solves the poor data consistency that arises in conventional portrait processing techniques, where the same person portrait data may be handled by different processing units, and therefore has high practical value.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a block diagram of an electronic device according to an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating the steps included in a person portrait processing method according to an embodiment of the present disclosure.
Fig. 3 is a flowchart illustrating sub-steps included in step S120 in Fig. 2.
Fig. 4 is a flowchart illustrating sub-steps included in step S130 in Fig. 2.
Fig. 5 is a block diagram illustrating the functional modules included in the person portrait processing apparatus according to an embodiment of the present disclosure.
Reference numerals: 10-electronic device; 12-memory; 14-processor; 100-person portrait processing apparatus; 110-data acquisition module; 120-unit determination module; 130-operation execution module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As shown in Fig. 1, an electronic device 10 according to an embodiment of the present disclosure may include a memory 12 and a processor 14, where the memory 12 may store a person portrait processing apparatus 100.
The memory 12 and the processor 14 are electrically connected, directly or indirectly, to enable data transmission or interaction. For example, they may be electrically connected to each other via one or more communication buses or signal lines. The person portrait processing apparatus 100 includes at least one software function module, which may be stored in the memory 12 in the form of software or firmware. The processor 14 is configured to execute the executable computer program stored in the memory 12, for example the software function modules and computer program included in the person portrait processing apparatus 100, so as to implement the person portrait processing method of the embodiments of the present application.
Optionally, the memory 12 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The Processor 14 may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), a System on Chip (SoC), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components.
It is understood that the structure shown in Fig. 1 is only illustrative, and the electronic device 10 may include more or fewer components than those shown in Fig. 1, or have a configuration different from that shown in Fig. 1; for example, it may further include a communication unit for information interaction with other devices (such as a database server).
The specific type of the electronic device 10 is not limited and may be selected according to actual application requirements, as long as it has a certain data processing capability. For example, in an alternative example, the electronic device 10 may be a computer such as a personal computer.
With reference to Fig. 2, an embodiment of the present application further provides a person portrait processing method applicable to the electronic device 10. The method steps defined by the flow of this method may be implemented by the electronic device 10. The specific process shown in Fig. 2 is described in detail below.
Step S110, acquiring person portrait data to be processed.
In this embodiment, the person portrait data to be processed may be generated in response to an operation by an operator, and can then be acquired. The person portrait data comprises at least one type of identification information.
Step S120, determining a target processing unit among a plurality of preset processing units based on target identification information in the at least one identification information.
In this embodiment, after the person portrait data is acquired in step S110, since the person portrait data includes at least one type of identification information, a target processing unit may be determined among a plurality of preset processing units based on target identification information in the at least one type of identification information.
Step S130, performing labeling operation on the personnel portrait data based on the target processing unit.
In this embodiment, after the target processing unit is determined based on step S120, a tagging operation may be performed on the acquired person representation data based on the target processing unit.
Based on this method, the same person portrait data is processed by the same processing unit whether it is processed in different time periods or at the request of different operators; that is, as long as the target identification information of the person portrait data does not change, the data is always handled by the same processing unit. This solves the poor data consistency of prior portrait processing techniques, in which the same person portrait data may be handled by different processing units.
It should be noted that, in step S110, the acquired person image data includes at least one type of identification information, but the specific content of the identification information is not limited and may be selected according to the actual application.
For example, in an alternative example, the identification information may include, but is not limited to, a person identification number, a person name, a cell phone number, a residence address, an avatar, a creation time, a last operation time, a creator, a tag list, a tag number, and the like.
The tag list may include, but is not limited to, information such as a tag ID, a tag name, a tag value, and a tag unit.
In detail, in a specific application example, the identification information may include a person identification number, a person name, a mobile phone number, a residence address, a creation time, a last operation time, a tag list, and the tag list may include a tag ID, a tag name, a tag value, and a tag unit.
The specific data is shown in the following table:
Person identification number   31221200012312345x
Person name                    Qian XX
Mobile phone number            12211231123
Residential address            xx City
Creation time                  2019-10-16 19:59:52
Last operation time            2019-10-16 19:59:52
Tag ID                         1
Tag name                       Selling drugs
Tag value                      1
Tag unit                       kg
Based on the contents of the tag list in the above table, it can be known that the person with identification number "31221200012312345x" has sold 1 kg of drugs.
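For illustration only, the identification information above could be modeled as a small data structure. A minimal Python sketch follows; the class and field names (Tag, PersonPortrait, and so on) are hypothetical and not prescribed by the application.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Tag:
    """One entry of the tag list carried in person portrait data."""
    tag_id: int
    tag_name: str
    tag_value: float
    tag_unit: str

@dataclass
class PersonPortrait:
    """Person portrait data with the identification fields listed in the table above."""
    id_number: str            # person identification number
    name: str
    phone: str
    address: str
    create_time: str          # e.g. "2019-10-16 19:59:52"
    last_op_time: str
    tags: List[Tag] = field(default_factory=list)

# The example record from the table above.
portrait = PersonPortrait(
    id_number="31221200012312345x",
    name="Qian XX",
    phone="12211231123",
    address="xx City",
    create_time="2019-10-16 19:59:52",
    last_op_time="2019-10-16 19:59:52",
    tags=[Tag(tag_id=1, tag_name="Selling drugs", tag_value=1, tag_unit="kg")],
)
```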
It should be noted that, in step S120, a specific manner for determining the target processing unit is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, any one of the at least one type of identification information may be used as the target identification information, and the target processing unit may then be determined among a plurality of preset processing units based on that target identification information.
For another example, in another alternative example, in order to improve the consistency and stability of the target processing unit determined at different time periods, in conjunction with fig. 3, step S120 may include step S121 and step S123, which are described in detail below.
Step S121, obtaining identity information in the at least one identification information.
In this embodiment, in consideration of the fact that the identity information generally has high stability, after the person image data is acquired in step S110, the identity information in the person image data may be acquired and used as the target identification information.
And step S123, determining a target processing unit in a plurality of preset processing units based on the identity information.
In this embodiment, after the identity information is acquired based on step S121, one processing unit may be determined among a plurality of preset processing units based on the identity information, and the processing unit may be used as the target processing unit.
In this way, as long as the identity information in the person portrait data does not change (and it generally does not), the target processing unit determined in different time periods is always the same, ensuring that the person portrait data is always processed by the same target processing unit.
Alternatively, the specific content of the identity information obtained by performing step S121 is not limited, and may be selected according to the actual application requirements, as long as the person representation data can be effectively identified.
For example, in an alternative example, the identity information may be a personal identification number such as 31221200012312345x in the previous example.
Optionally, the specific manner of determining the target processing unit based on the identity information in step S123 is not limited, and may be selected according to the actual application requirement.
For example, in one alternative example, the target processing unit may be determined among the plurality of processing units based on the entire content of the identity information.
For another example, in another alternative example, the target processing unit may be determined among the plurality of processing units based on a portion of content in the identity information.
In detail, in a specific application example, when the identity information is a person identification number, a target processing unit may be determined among the plurality of processing units based on the last bit of data of the person identification number.
Thus, considering that the last digit of the person identification number can be any of 0 to 9 or x, i.e., 11 kinds of data, 11 processing units may be preset, such as processing unit A, processing unit B, processing unit C, processing unit D, processing unit E, processing unit F, processing unit G, processing unit H, processing unit I, processing unit J, and processing unit K, and placed in one-to-one correspondence with the 11 kinds of data. The correspondence can be as shown in the following table:
Last digit of the ID number   0   1   2   3   4   5   6   7   8   9   x
Target processing unit        A   B   C   D   E   F   G   H   I   J   K
Based on the above correspondence, the target processing unit determined for the person identification number "31221200012312345x" may be processing unit K.
That is, if the last bit of data of the person identification number included in the person image data is 0, the target processing unit that can be determined may be the processing unit a. If the last bit of data of the person identification number included in the person image data is 1, the target processing unit that can be determined may be the processing unit B. If the last bit of data of the person identification number included in the person image data is 2, the target processing unit that can be determined may be the processing unit C. If the last bit of data of the person identification number included in the person image data is 3, the target processing unit that can be determined may be the processing unit D. If the last bit of data of the person identification number included in the person image data is 4, the target processing unit that can be determined may be the processing unit E. If the last bit of data of the person identification number included in the person image data is 5, the target processing unit that can be determined may be the processing unit F. If the last bit of data of the person identification number included in the person image data is 6, the target processing unit that can be determined may be the processing unit G. If the last bit of data of the person identification number included in the person image data is 7, the target processing unit that can be determined may be the processing unit H. If the last bit of data of the person identification number included in the person image data is 8, the target processing unit that can be determined may be the processing unit I. If the last bit of data of the person identification number included in the person image data is 9, the target processing unit that can be determined may be the processing unit J.
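This routing rule can be sketched as a simple lookup, shown below under the assumption that the eleven processing units are simply named A through K as in the table above; the function name is illustrative only.

```python
# Map from the last character of the person identification number to one of
# the 11 preset processing units (named A..K as in the table above).
UNIT_BY_LAST_CHAR = {
    "0": "A", "1": "B", "2": "C", "3": "D", "4": "E", "5": "F",
    "6": "G", "7": "H", "8": "I", "9": "J", "x": "K",
}

def pick_target_unit(id_number: str) -> str:
    """Return the target processing unit for a person identification number.

    The same identification number always maps to the same unit, which is
    what keeps processing of the same person portrait data consistent.
    """
    return UNIT_BY_LAST_CHAR[id_number[-1].lower()]

assert pick_target_unit("31221200012312345x") == "K"
```

Because the lookup depends only on the identification number, requests arriving at different times or from different operators land on the same unit.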
It is to be understood that the specific form of the processing units is not limited, and may be selected according to the actual application requirements. For example, in an alternative example, the plurality of processing units may refer to a plurality of predetermined threads of the processing module. For another example, in another alternative example, multiple of the processing units may refer to multiple instances of a processing module.
It should be noted that, in step S130, a specific manner of performing the tagging operation on the person representation data is not limited, and may be selected according to actual application requirements.
For example, in an alternative example, a tag addition operation, a tag modification operation, or a tag deletion operation may be directly performed on the person representation data based on a previous configuration.
For another example, in another alternative example, in order to improve the diversity of the labeling operation performed on the person representation data, in conjunction with fig. 4, step S130 may include step S131 and step S133, which are described in detail below.
In step S131, a tag processing instruction is acquired.
In this embodiment, it is considered that different operators have different processing requirements for tag information in the person image data at different stages. Therefore, after the target processing unit is determined in step S120, the tag processing instruction may be acquired first.
The tag processing instruction may be instruction information carried in the person representation data, and may include a tag adding instruction, a tag modifying instruction, or a tag deleting instruction.
Step S133, executing, by the target processing unit, a tagging operation corresponding to the tag processing instruction on tag information in the at least one type of identification information.
In this embodiment, after the tag processing instruction is acquired based on step S131, the target processing unit may perform a tagging operation corresponding to the tag processing instruction on tag information (such as content in a tag list in the foregoing example) in the at least one type of identification information.
For example, if the tag processing instruction is the tag adding instruction, the target processing unit may perform a tag adding operation on the tag information; if the tag processing instruction is the tag modification instruction, the tag modification operation can be executed on the tag information through the target processing unit; if the tag processing instruction is the tag deleting instruction, the target processing unit may execute a tag deleting operation on the tag information.
Based on the above example, it can be known that the specific content of the tag processing instruction is different, so that the specific content of the tagging operation is also different. Thus, different sub-steps may be included when performing a tagging operation based on different tag processing instructions.
For example, in an alternative example, if the tag processing instruction is a tag addition instruction, historical person portrait data having the same portrait data ID may be found based on the portrait data ID included in the tag addition instruction (different person portrait data have different portrait data IDs), and the tag information of the historical person portrait data may then be compared with the tag information of the person portrait data. Only when the tag information of the historical person portrait data differs from the tag information of the person portrait data is a tag addition operation executed based on the tag addition instruction (for example, adding the tag information of the person portrait data to the historical person portrait data).
For example, in another alternative example, if the tag processing command is a tag modification command, historical person image data having the same image data ID may be searched based on the image data ID included in the tag modification command, and then tag information of the historical person image data may be compared with tag information of the person image data. Then, only when the tag information of the historical person image data is different from the tag information of the person image data, a tag modification operation (for example, replacing the tag information in the historical person image data with the tag information in the person image data) can be performed based on the tag modification instruction.
For another example, in another alternative example, if the tag processing instruction is a tag deletion instruction, whether or not historical person image data having the same image data ID is available may be searched based on the image data ID included in the tag deletion instruction. Only when the historical person image data is found, a tag deletion operation (for example, deleting tag information in the historical person image data that is the same as tag information in the person image data) can be executed based on the tag deletion instruction.
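The three cases above could be dispatched roughly as in the sketch below. It represents portraits as plain dictionaries with hypothetical `portrait_id` and `tags` keys, and it assumes that nothing is updated when no historical portrait data with the same portrait data ID exists; these details are assumptions for illustration, not part of the application.

```python
from typing import Dict

def apply_tag_instruction(op: str, portrait: Dict, history: Dict[str, Dict]) -> bool:
    """Apply a tag add / modify / delete instruction against stored history.

    `history` maps a portrait data ID to the previously stored portrait data.
    Returns True only if the stored tag information actually changed, which is
    the condition used below to decide whether record information is generated.
    """
    stored = history.get(portrait["portrait_id"])
    if stored is None:
        return False  # no historical portrait data with the same portrait data ID
    if op == "add":
        new_tags = [t for t in portrait["tags"] if t not in stored["tags"]]
        stored["tags"].extend(new_tags)          # add only tags that differ
        return bool(new_tags)
    if op == "modify":
        if stored["tags"] == portrait["tags"]:
            return False                         # identical tag information, no update
        stored["tags"] = list(portrait["tags"])  # replace stored tag information
        return True
    if op == "delete":
        before = len(stored["tags"])
        stored["tags"] = [t for t in stored["tags"] if t not in portrait["tags"]]
        return len(stored["tags"]) != before     # deleted matching tags, if any
    raise ValueError(f"unknown tag processing instruction: {op}")
```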
As can be understood from the above examples, the tag information included in the historical person portrait data may not be updated in some cases even after a tagging operation is performed; for example, if the tag processing instruction is a tag deletion instruction and no historical person portrait data with the same portrait data ID is found, no deletion is performed on the tag information.
Therefore, in this embodiment, the person portrait processing method may further include:
first, judging, when the tagging operation is executed, whether an update operation is executed based on tag information included in the person portrait data; and, when the tagging operation is executed and an update operation is executed based on the tag information included in the person portrait data, generating and storing record information of the update operation.
That is, record information is generated only when the tagging operation actually causes the tag information in the corresponding historical portrait data to be updated based on the tag information included in the person portrait data.
In this way, each update operation can later be queried through the record information, which makes it convenient to supervise the execution of update operations.
Optionally, a specific manner of generating and storing the record information of the update operation is not limited, and may be selected according to an actual application requirement.
For example, in an alternative example, the record information of the update operation may be generated and saved based on the following sub-steps:
first, the operator information, the operation time information and the operation type information of the tagging operation can be obtained; second, record information may be generated based on the operator information, the operation time information and the operation type information, and a first persistence operation may be performed on the record information to save it.
That is, in the present embodiment, the record information may include operator information, operation time information, and operation type information. The operation type information may refer to the tag adding operation, the tag modifying operation, or the tag deleting operation.
It is understood that, in some other examples, the record information may further include some other information, such as the above-mentioned portrait data ID and tag list, which is selected according to the actual application requirement, and is not limited specifically herein.
Moreover, the first persistence operation may be to persist the record information to a target database for storage, such as an ElasticSearch database.
Further, in order to ensure that the person image data obtained by performing the tagging operation can be effectively stored, in this embodiment, the person image processing method may further include the following steps:
first, a second persistence operation may be performed on the person representation data resulting from the tagging operation; second, a deletion operation may be performed on the record information based on previously generated persistence information when the second persistence operation is not successfully performed.
Wherein the persistent information is generated when the first persistent operation is performed on the record information. For example, the persistent information may be a record ID generated for performing the first persistent operation.
That is, the person image data on which the tagging operation is performed (e.g., tag information added, modified, or deleted based on the historical person image data) may be persisted to a target database, such as an ElasticSearch database, for storage.
However, through research the inventors of the present application found that, in some cases, the second persistence operation may fail, so that the historical person portrait data stored in the target database is still the data from before the tagging operation. Meanwhile, the record information for the tagging operation has already been stored by the first persistence operation, which hinders subsequent data supervision.
Therefore, in this embodiment, when the second persistence operation is not successfully performed, a deletion operation may be performed on the record information based on the persistence information, so that the record information is no longer stored in the target database, and thus the stored record information and the historical person portrait data have higher consistency, thereby facilitating subsequent data supervision.
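A minimal sketch of this two-step persistence with a compensating delete is given below. Plain dictionaries stand in for the target database (the text mentions ElasticSearch), and `record_id` plays the role of the persistence information generated by the first persistence operation; all names are illustrative.

```python
import uuid
from typing import Dict

def persist_with_rollback(record_store: Dict[str, dict],
                          portrait_store: Dict[str, dict],
                          record: dict, portrait: dict) -> bool:
    """First persist the record information, then the tagged portrait data;
    delete the record again if the second persistence fails."""
    record_id = str(uuid.uuid4())
    record_store[record_id] = record                         # first persistence operation
    try:
        portrait_store[portrait["portrait_id"]] = portrait   # second persistence operation
        return True
    except Exception:
        # Second persistence failed: remove the record information using the
        # persistence information, so stored records and portraits stay consistent.
        record_store.pop(record_id, None)
        return False
```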
Further, based on the foregoing example, it can be known that, when the tagging operation is performed, an updating operation may be performed based on tag information included in the people image data (e.g., based on the tag information, a tag adding operation, a tag modifying operation, or a tag deleting operation is performed on the historical people image data), so that the information content of the tag information included in the historical people image data in different time periods may be different.
In this embodiment, in order to facilitate identification of a change in information content of the tag information, the person representation processing method may further include:
when the tagging operation is performed, an updating operation may be performed based on tag information included in the person image data, and the history snapshot information of the tag information may be updated based on the tagging operation.
Wherein the historical snapshot information may include start time information and end time information of information content that the tag information has before each time it is updated.
For example, in a specific application example, the tag information was created on 17 October 2017 with the content "selling drugs, 0.1 kg". A tagging operation on 20 May 2018 modified it to "selling drugs, 0.3 kg"; another tagging operation on 31 December 2018 modified it to "selling drugs, 0.4 kg"; and finally a tagging operation on 10 December 2019 modified it to "selling drugs, 0.5 kg".
Based on the above example, the historical snapshot information may be as shown in the following table:
Information content       Start time    End time
Selling drugs, 0.1 kg     2017-10-17    2018-05-20
Selling drugs, 0.3 kg     2018-05-20    2018-12-31
Selling drugs, 0.4 kg     2018-12-31    2019-12-10
Selling drugs, 0.5 kg     2019-12-10    (current)
based on the above example, information changes of the tag information included in the historical person image data at different time periods may be stored (e.g., persisted into a target database) in the form of the above historical snapshot information, so as to facilitate subsequent use.
After the historical snapshot information is stored, the historical snapshot information can be queried in different query modes based on different requirements.
For example, in an alternative example, the people representation processing method may further include the steps of: firstly, time interval information to be inquired can be obtained; secondly, all the tag information belonging to the time period information can be found in the target database.
For another example, in another alternative example, the person representation processing method may further include: firstly, a tag ID to be queried can be obtained; next, historical snapshot information of the tag information corresponding to the tag ID may be found in the target database based on the tag ID.
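The two query modes could look roughly like the sketch below, which assumes an in-memory `snapshot_store` mapping tag IDs to snapshot lists and compares ISO-formatted date strings; both are assumptions for illustration rather than the storage layout used by the application.

```python
from typing import Dict, List, Tuple

def tags_in_period(snapshot_store: Dict[int, List[dict]],
                   start: str, end: str) -> List[Tuple[int, dict]]:
    """Find all tag snapshots whose validity interval overlaps [start, end]."""
    hits = []
    for tag_id, snapshots in snapshot_store.items():
        for snap in snapshots:
            snap_end = snap["end_time"] or "9999-12-31"   # open interval: still valid
            if snap["start_time"] <= end and snap_end >= start:
                hits.append((tag_id, snap))
    return hits

def snapshots_for_tag(snapshot_store: Dict[int, List[dict]], tag_id: int) -> List[dict]:
    """Look up the historical snapshot information of one tag by its tag ID."""
    return snapshot_store.get(tag_id, [])
```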
Likewise, the produced and stored record information may be queried based on some requirements. After the record information is inquired, the record information can be displayed differently based on different requirements.
For example, in an alternative example, the display may be based on chronological order. Therefore, the person representation processing method may further include the steps of:
firstly, the time information of each piece of the stored record information can be obtained; secondly, each recorded information can be displayed directly based on the time information; or
Firstly, the time information of each piece of the stored record information can be obtained; next, each of the record information may be displayed based on the time information and label information corresponding to the record information.
With reference to fig. 5, an embodiment of the present application further provides a human image processing apparatus 100 applicable to the electronic device 10. The human representation processing apparatus 100 may include a data acquisition module 110, a unit determination module 120, and an operation execution module 130.
The data acquiring module 110 is configured to acquire person portrait data to be processed, where the person portrait data includes at least one identification information. In this embodiment, the data obtaining module 110 may be configured to execute step S110 shown in fig. 2, and reference may be made to the foregoing description of step S110 for relevant contents of the data obtaining module 110.
The unit determining module 120 is configured to determine a target processing unit among a plurality of preset processing units based on target identification information in the at least one identification information. In this embodiment, the unit determining module 120 may be configured to perform step S120 shown in fig. 2, and reference may be made to the foregoing description of step S120 for relevant contents of the unit determining module 120.
The operation executing module 130 is configured to execute a tagging operation on the person representation data based on the target processing unit. In this embodiment, the operation performing module 130 may be configured to perform step S130 shown in fig. 2, and reference may be made to the foregoing description of step S130 for relevant contents of the operation performing module 130.
In an embodiment of the present application, there is provided a computer-readable storage medium storing a computer program, where the computer program executes the steps of the human image processing method.
The steps executed when the computer program runs are not described in detail herein, and reference may be made to the explanation of the human figure processing method.
In summary, when processing person portrait data, the person portrait processing method and apparatus, electronic device, and storage medium provided in the present application determine a target processing unit among a plurality of preset processing units based on target identification information in the person portrait data, and then perform a tagging operation on the person portrait data by that target processing unit. As a result, the same person portrait data is always handled by the same processing unit, whether it is processed in different time periods or at the request of different operators. This solves the poor data consistency that arises in conventional portrait processing techniques, where the same person portrait data may be handled by different processing units, and therefore the scheme has high practical value.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus and method embodiments described above are illustrative only, as the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A person image processing method, comprising:
acquiring personnel portrait data to be processed, wherein the personnel portrait data comprises at least one type of identification information;
determining a target processing unit in a plurality of preset processing units based on target identification information in the at least one type of identification information;
performing a tagging operation on the person representation data based on the target processing unit.
2. The person representation processing method of claim 1, wherein the step of determining a target processing unit among a plurality of processing units based on a target identification information of the at least one identification information comprises:
acquiring identity information in the at least one type of identification information;
and determining a target processing unit in a plurality of preset processing units based on the identity information.
3. The people representation processing method of claim 1, wherein the step of performing a tagging operation on the people representation data based on the target processing unit comprises:
acquiring a tag processing instruction, wherein the tag processing instruction comprises a tag adding instruction, a tag modifying instruction or a tag deleting instruction;
and executing labeling operation corresponding to the label processing instruction on the label information in the at least one type of identification information through the target processing unit.
4. The person representation processing method according to any one of claims 1 to 3, further comprising:
judging, when the tagging operation is executed, whether an update operation is executed based on tag information included in the person portrait data; and
when the tagging operation is executed and an update operation is executed based on the tag information included in the person portrait data, generating and storing record information of the update operation.
5. The person image processing method according to claim 4, wherein the step of generating and storing record information of the update operation includes:
acquiring operator information, operation time information and operation type information of the labeling operation;
and generating record information based on the operator information, the operation time information and the operation type information, and executing a first persistence operation on the record information to save the record information.
6. The person representation processing method of claim 5, further comprising:
executing a second persistence operation on the personnel portrait data obtained by executing the tagging operation;
when the second persistence operation is not successfully executed, a deletion operation is executed on the record information based on the pre-generated persistence information, wherein the persistence information is generated when the first persistence operation is executed on the record information.
7. The person representation processing method of claim 4, further comprising:
when the tagging operation is executed and an update operation is executed based on tag information included in the person portrait data, updating historical snapshot information of the tag information based on the tagging operation;
wherein the historical snapshot information includes start time information and end time information of the information content that the tag information had before each update.
8. A person image processing apparatus comprising:
a data acquisition module, configured to acquire person portrait data to be processed, wherein the person portrait data comprises at least one type of identification information;
the unit determining module is used for determining a target processing unit in a plurality of preset processing units based on target identification information in the at least one type of identification information;
and the operation execution module is used for executing labeling operation on the personnel portrait data based on the target processing unit.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor coupled to the memory for executing the computer program to implement the person representation processing method of any of claims 1-7.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed, implements the person representation processing method of any one of claims 1 to 7.
CN201911266714.6A 2019-12-11 2019-12-11 Person image processing method and device, electronic device and storage medium Pending CN110992105A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911266714.6A CN110992105A (en) 2019-12-11 2019-12-11 Person image processing method and device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911266714.6A CN110992105A (en) 2019-12-11 2019-12-11 Person image processing method and device, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN110992105A true CN110992105A (en) 2020-04-10

Family

ID=70092438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911266714.6A Pending CN110992105A (en) 2019-12-11 2019-12-11 Person image processing method and device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN110992105A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170093876A1 (en) * 2015-09-29 2017-03-30 International Business Machines Corporation Access control for database
CN107016103A (en) * 2017-04-12 2017-08-04 北京焦点新干线信息技术有限公司 A kind of method and device for building user's portrait
CN108537586A (en) * 2018-03-30 2018-09-14 杭州米趣网络科技有限公司 Data processing method and device based on user's portrait
CN108874971A (en) * 2018-06-07 2018-11-23 北京赛思信安技术股份有限公司 A kind of tool and method applied to the storage of magnanimity labeling solid data
CN110046657A (en) * 2019-03-29 2019-07-23 武汉大学深圳研究院 A kind of social safety figure painting image space method based on multiple view study

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
范晓玉 (Fan Xiaoyu) et al.: "融合多源数据的科研人员画像构建方法研究" [Research on methods for constructing researcher profiles by fusing multi-source data], 《图书情报工作》 [Library and Information Service] *
薛欢雪 (Xue Huanxue): "高校图书馆学科服务用户画像创建过程" [The process of creating user profiles for subject services in university libraries], 《图书馆学研究》 [Research on Library Science] *

Similar Documents

Publication Publication Date Title
US20160127466A1 (en) Methods and systems for providing content data to content consumers
CN105791352B (en) Message pushing method and system for application
CN109491962B (en) File directory tree management method and related device
CN104834719B (en) Applied to the Database Systems under real-time big data scene
CN105094811A (en) Method can device for processing events
CN110647459B (en) Application testing method and device
CN114066533A (en) Product recommendation method and device, electronic equipment and storage medium
CN107480240B (en) Database system and data processing method thereof
WO2019085343A1 (en) Marketing customer screening method based on tag library, electronic device and storage medium
CN108549722B (en) Multi-platform data publishing method, system and medium
CN110992105A (en) Person image processing method and device, electronic device and storage medium
CN111625656A (en) Information processing method, device, equipment and storage medium
CN108959468B (en) Monitoring method of database directory, storage medium and server
CN111159422A (en) Method and system for establishing knowledge graph of medicine, server and medium
US20200311143A1 (en) Correlating user device attribute groups
CN107679908B (en) Salesperson topic auxiliary query method, electronic device and storage medium
CN107193891B (en) Content recommendation method and device
CN113420236B (en) Method and device for displaying list data, electronic equipment and storage medium
CN103778218A (en) Cloud computation-based standard information consistency early warning system and method
CN107169845B (en) Merchant attribute query method and device and server
CN113627454B (en) Article information clustering method, pushing method and device
WO2018076348A1 (en) Building and updating a connected segment graph
US20170097991A1 (en) Automatically branding topics using color
US9218579B2 (en) Method and system for facilitating retrieval of information from images
CN108632092B (en) Information management method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200410