CN113421196A - Image processing method and related device - Google Patents

Image processing method and related device

Info

Publication number
CN113421196A
Authority
CN
China
Prior art keywords
influence factor
key point
determining
smoothing
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110640062.9A
Other languages
Chinese (zh)
Other versions
CN113421196B (en)
Inventor
潘睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Douku Software Technology Co Ltd
Original Assignee
Hangzhou Douku Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Douku Software Technology Co Ltd filed Critical Hangzhou Douku Software Technology Co Ltd
Priority to CN202110640062.9A priority Critical patent/CN113421196B/en
Publication of CN113421196A publication Critical patent/CN113421196A/en
Application granted granted Critical
Publication of CN113421196B publication Critical patent/CN113421196B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/70

Abstract

The application provides an image processing method and a related device, wherein the method comprises the following steps: acquiring key points of a target object in a target image, wherein the target object is a photographed object; determining a smoothing parameter influence factor of each key point according to the target image, wherein the smoothing parameter influence factor indicates the degree of influence of the associated attributes of the key point on the original smoothing parameter of the key point; determining a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter; and smoothing the key point according to the target smoothing parameter. According to the embodiment of the application, the smoothing parameter of each individual point in the current image is calculated dynamically, improving the flexibility and intelligence with which the device smooths the image.

Description

Image processing method and related device
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method and a related apparatus.
Background
At present, when a device captures an image and smooths it, joint points are generally smoothed with uniformly set smoothing parameters. This approach cannot adapt to complex real-time scenes, and high smoothing parameters cause excessive algorithm complexity and therefore delay.
Disclosure of Invention
The application provides an image processing method and a related device, which aim to dynamically calculate the smoothing parameter of each individual point in the current image and to improve the flexibility and intelligence with which a device smooths images.
In a first aspect, the present application provides an image processing method, including:
acquiring key points of a target object in a target image, wherein the target object is a shot object;
determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the associated attribute of the key point on the original smoothing parameter of the key point;
determining a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter;
and carrying out smoothing treatment on the key points according to the target smoothing parameters.
It can be seen that, in the embodiment of the present application, the target smoothing parameter of a key point is obtained by dynamically combining a smoothing parameter influence factor with the original smoothing parameter. Since the smoothing parameter influence factor indicates the degree of influence of the key point's associated attributes on its original smoothing parameter, the smoothing complexity of each key point can be adjusted adaptively according to its associated attributes, which helps improve the flexibility and intelligence of the device's image smoothing.
In a second aspect, the present application provides an image processing apparatus comprising:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring key points of a target object in a target image, and the target object is a shot object;
a determining unit, configured to determine a smoothing parameter influence factor of the keypoint according to the target image, where the smoothing parameter influence factor is used to indicate a degree of influence of an associated attribute of the keypoint on an original smoothing parameter of the keypoint;
the determining unit is further configured to determine a target smoothing parameter of the keypoint according to the smoothing parameter influence factor and the original smoothing parameter;
and the smoothing unit is used for smoothing the key points according to the target smoothing parameters.
In a third aspect, the present application provides an electronic device, comprising: one or more processors; and
one or more memories for storing a program,
wherein the one or more memories and the program are configured to cause the electronic device, via the one or more processors, to execute the instructions of the steps in any of the methods of the first aspect of the embodiments of the present application.
In a fourth aspect, the present application provides a chip, comprising: a processor configured to call and run a computer program from a memory, so that a device on which the chip is installed executes some or all of the steps described in any method of the first aspect of the embodiments of the present application.
In a fifth aspect, the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and wherein the computer program causes a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application.
In a sixth aspect, the present application provides a computer program, wherein the computer program is operable to cause a computer to perform some or all of the steps as described in any of the methods of the first aspect of the embodiments of the present application. The computer program may be a software installation package.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure;
fig. 2a is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 2b is a schematic diagram illustrating a distribution of joints of a human body according to an embodiment of the present application;
FIG. 2c is a schematic diagram of an embodiment of an ambiguity generated in an elbow joint;
FIG. 2d is a functional block diagram of a heat map generation model provided by an embodiment of the present application;
FIG. 2e is a schematic view of a process flow of smoothing a joint point of a human body according to an embodiment of the present application;
fig. 3 is a block diagram of functional units of an image processing apparatus according to an embodiment of the present application;
fig. 4 is a block diagram of functional units of another image processing apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In the present application, "at least one" means one or more, and "a plurality" means two or more. In this application, "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of singular or plural items. For example, at least one (item) of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where each of a, b, and c may itself be an element or a set comprising one or more elements.
It should be noted that, in the embodiments of the present application, "equal to" may be combined with "greater than", in which case the technical solution for the greater-than case applies, or combined with "less than", in which case the technical solution for the less-than case applies; "equal to" is not combined with "greater than" and "less than" at the same time. In the embodiments of the present application, "of", "relevant", and "corresponding" may sometimes be used interchangeably; it should be noted that their intended meanings are consistent when the distinction is not emphasized.
First, partial terms referred to in the embodiments of the present application are explained so as to be easily understood by those skilled in the art.
1. An electronic device. In the embodiment of the present application, the electronic device is a device having an image signal processing function, and may be referred to as a User Equipment (UE), a terminal, a terminal device, a Mobile Station (MS), a Mobile Terminal (MT), an access terminal device, a vehicle-mounted terminal device, an industrial control terminal device, a UE unit, a UE station, a mobile station, a remote terminal device, a mobile device, a UE terminal device, a wireless communication device, a UE agent, or a UE apparatus. The user equipment may be fixed or mobile. For example, the user equipment may be a mobile phone, a tablet (pad), a desktop computer, a notebook computer, a kiosk, a vehicle-mounted terminal, a Virtual Reality (VR) terminal, an Augmented Reality (AR) terminal, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a wearable device, a terminal device in a future mobile communication network, or a terminal device in a future evolved Public Land Mobile Network (PLMN), etc. In some embodiments of the present application, the user equipment may also be a device having a transceiving function, such as a system-on-chip. The chip system may include a chip and may also include other discrete devices.
2. Smoothing algorithm. In the embodiment of the application, the smoothing algorithm is a post-processing means for eliminating jitter in image processing: since the model output for each frame carries a certain error, consecutive frames exhibit very obvious jitter that must be eliminated by post-processing. There are two cases. When processing a video, the current frame can be processed with reference to the information of both the previous and subsequent frames; this is called an offline algorithm. When previewing a camera picture, the current frame can only be processed with reference to previous-frame information, with no subsequent-frame information available; this is called an online algorithm. Online algorithms may introduce some lag.
3. Holt two-parameter linear exponential smoothing. In the embodiment of the application, the Holt two-parameter linear exponential smoothing method, also called the Holt two-parameter smoothing method or the Holt method, adds a trend smoothing coefficient beta on top of the simple exponential smoothing coefficient alpha, hence the name two-parameter smoothing. In the Holt two-parameter smoothing model, the prediction consists of two parts: a level part, updated by simple exponential smoothing on the basis of the previous period's level; and a trend part, smoothly adjusted on the basis of the previous period's trend and likewise updated by simple exponential smoothing. The two are added to obtain the prediction for the next period.
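The Holt update described above can be sketched as follows; the function name and the coefficient values are illustrative, not prescribed by the patent:

```python
def holt_smooth(series, alpha=0.5, beta=0.3):
    """Holt two-parameter (double) exponential smoothing.

    `alpha` smooths the level and `beta` smooths the trend; both lie in
    (0, 1]. Returns the smoothed level for each sample. This is an online
    algorithm: each output depends only on the current and past samples.
    """
    level, trend = series[0], 0.0
    out = [level]
    for x in series[1:]:
        prev_level = level
        # Level: blend the new observation with the previous forecast.
        level = alpha * x + (1 - alpha) * (level + trend)
        # Trend: blend the observed level change with the previous trend.
        trend = beta * (level - prev_level) + (1 - beta) * trend
        out.append(level)
    return out
```

A larger `alpha`/`beta` tracks the input more tightly (less lag, more jitter); smaller values smooth harder, which is exactly the trade-off the method below tunes per key point.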
In the real-time human body posture estimation task, because the model predicts each frame of the camera preview picture independently, the results of the previous frame and the next frame have slight difference because of errors, and the slight difference is reflected as the jitter of key points. Since the dithering of the key points causes inconvenience in subsequent use, it is necessary to smooth the data of each frame.
At present, a traditional smoothing approach to an online task can only refer to the result of the previous frame, which leads to a significant trade-off: the larger the smoothing parameter, the less the output jitters, but the greater the lag, so that the points cannot keep up with the person's motion.
In view of the foregoing problems, embodiments of the present application provide an image processing method and a related apparatus, so as to dynamically calculate a smoothing parameter of a single point in a current image, and improve flexibility and intelligence of smoothing processing an image by an apparatus.
Referring to fig. 1, fig. 1 is a schematic view of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 comprises an application processor 120, a memory 130, a communication module 140, and one or more programs 131, wherein the application processor 120 is communicatively coupled to the memory 130 and the communication module 140 via an internal communication bus.
In a specific implementation, the one or more programs 131 are stored in the memory 130 and configured to be executed by the application processor 120, and the one or more programs 131 include instructions for performing some or all of the steps performed by the electronic device in the embodiment of the present application.
The communication module 140 includes a local area network wireless communication module and a wired communication module.
The Application Processor 120 may be, for example, a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various illustrative logical blocks, units, and circuits described in connection with the disclosure. The processor may also be a combination implementing computing functions, e.g., a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The memory 130 may be either volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The non-volatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. Volatile memory can be Random Access Memory (RAM), which acts as external cache memory. By way of example, but not limitation, many forms of Random Access Memory (RAM) are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct bus RAM (DR RAM).
Referring to fig. 2a, fig. 2a is a schematic flowchart of an image processing method according to an embodiment of the present disclosure, applied to the electronic device 100; as shown in the figure, the image processing method includes the following steps.
Step 201, obtaining a key point of a target object in a target image, wherein the target object is a shot object.
The key point refers to a structural feature point of the target object, for example, if the target object is a human body, the key point may be any one of 17 joint points in a joint point distribution diagram shown in fig. 2 b.
In one possible embodiment, the acquiring key points of a target object in a target image includes: determining a detection frame of the target object according to the target image; and determining the key points according to the detection frame.
The detection frame may be a rectangular frame, and the device may specifically determine the contour of the target object according to the difference between the image area of the target object and the background image area, and further determine the rectangular frame with the smallest area covering the contour as the detection frame of the target object.
As can be seen, in this example, the electronic device may first accurately determine the detection frame where the target object is located, and then predict the key point of the target object.
Step 202, determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the associated attribute of the key point on the original smoothing parameter of the key point.
In one possible embodiment, the association attribute includes at least one of: the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristic of the key points and the shielding condition of the key points.
The higher the confidence of the key point is, the smaller the jitter of the corresponding coordinate value prediction result is, and correspondingly, the lower the smoothing processing requirement degree is; the lower the confidence of the key point, the larger the jitter of the corresponding coordinate value prediction result, and correspondingly, the higher the smoothing processing requirement degree.
The size of the detection frame reflects the distance between the target object and the camera, and this distance influences the jitter threshold setting. For a person at a short distance, the detection frame area is large and the relative movement per frame is large, so the jitter of the corresponding coordinate prediction is relatively small and the required degree of smoothing is low. For a person at a longer distance, the detection frame area is small and the relative movement per frame is small, so the jitter of the corresponding coordinate prediction is relatively large and the required degree of smoothing is high.
Taking the target object as a human body as an example, for the joint points of the torso, the relative motion per frame is small and the jitter of the corresponding coordinate prediction is relatively large; accordingly, the required degree of smoothing is high. For the joint points of the limbs, particularly the upper limbs, the motion per frame is relatively large and the jitter of the corresponding coordinate prediction is relatively small; accordingly, the required degree of smoothing is low.
Taking the target object as a human body as an example, in the case of a sideways posture the joint points of an occluded arm have large ambiguity due to missing information (the dark joint points shown in fig. 2c correspond to the user's right shoulder, right elbow, right wrist, and right hip joint points, where the right elbow joint point may be ambiguous because a spine joint point overlaps the user's right arm), so those joint points jitter severely between frames; accordingly, their required degree of smoothing is high.
Therefore, in this example, the device can dynamically adapt and adjust the value of the smoothing parameter of the key point in combination with the associated attribute of the key point, so as to improve the flexibility and accuracy, and balance the stability and the time delay.
In one possible embodiment, the associated attribute includes a confidence level of the keypoint; the determining the smooth parameter influence factor of the key point according to the target image comprises: determining a heat map of the keypoints from the target image; determining a confidence level for the keypoint from the heat map of the keypoint; and determining a confidence coefficient influence factor of the key point according to the confidence coefficient of the key point, wherein the relationship between the numerical value of the confidence coefficient and the numerical value of the confidence coefficient influence factor is negative correlation.
The heat map refers to the position distribution area, in the current target image, of the predicted coordinates of a key point from the previous frame image. Each coordinate point in this distribution area has a prediction probability attribute; the coordinate point with the maximum prediction probability is the peak point, and the prediction probability of the peak point is taken as the confidence of the key point.
In a specific implementation, the electronic device may process the target image using a lightweight heat map prediction model to determine the heat map of the key point. The heat map prediction model may include, for example, the processing modules shown in fig. 2d: a first upsampling module performs a first convolution operation and pixel reorganization; a second upsampling module upsamples the result of the first upsampling module; a third upsampling module upsamples the result of the second upsampling module; a second convolution module operates on the result of the third upsampling module to output a two-dimensional heat map; an average pooling module also operates on the result of the third upsampling module; and a fully connected layer module operates on the result of the average pooling module to output a one-dimensional heat map.
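The peak-extraction step can be sketched as follows (the function name is illustrative; the heat map would come from the prediction model described above):

```python
import numpy as np

def keypoint_from_heatmap(heatmap):
    """Read a keypoint off a 2-D heat map: the peak location gives the
    predicted (x, y) coordinate, and the peak's prediction probability is
    taken as the keypoint's confidence."""
    flat_idx = int(np.argmax(heatmap))         # index of the peak point
    y, x = divmod(flat_idx, heatmap.shape[1])  # row-major index -> (row, col)
    return x, y, float(heatmap[y, x])
```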
For example, the confidence influence factor is formulated as: Rconf = C1 / Conf,
where Rconf is the confidence influence factor, C1 is a constant, and Conf is the confidence.
As can be seen, in this example, the electronic device can determine the confidence of the keypoint according to the heat map of the target image, and calculate the confidence influence factor according to the confidence, thereby implementing the adaptation and adjustment of the smoothing parameter according to the confidence of the keypoint.
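As a minimal sketch of this factor (the value of the constant C1 is illustrative — the patent does not fix it — and the zero-confidence guard is an added safety assumption):

```python
def confidence_factor(conf, c1=0.5):
    """Rconf = C1 / Conf: the lower the keypoint confidence, the larger
    the factor, i.e. the heavier the smoothing applied to that keypoint."""
    return c1 / max(conf, 1e-6)  # guard against a zero confidence value
```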
In one possible embodiment, the association attribute includes an area ratio of a detection frame to which the key point belongs to the target image; the determining the smooth parameter influence factor of the key point according to the target image comprises: determining a detection frame of the target object according to the target image; calculating the area ratio of the detection frame to the target image; and determining the detection frame influence factor of the key point according to the area proportion, wherein the relationship between the numerical value of the area proportion and the numerical value of the detection frame influence factor is negative correlation.
For example, taking the target object as a human body, the electronic device can calculate the detection frame influence factor from the detected width and height of the human body frame and the width and height of the whole image as Rarea = C2 × (ImageW × ImageH) / (boxW × boxH),
Wherein Rarea is a detection frame influence factor, C2 is a constant, ImageW is the width of the target image, ImageH is the height of the target image, boxW is the width of the detection frame, and boxH is the height of the detection frame.
Therefore, in this example, the electronic device can determine the area ratio according to the detection frame of the target object, and calculate the influence factor of the detection frame according to the area ratio, thereby implementing the adjustment of the smoothing parameter according to the area ratio adaptation of the detection frame.
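The formula above can be sketched directly (the value of the constant C2 is illustrative, not taken from the patent):

```python
def box_factor(img_w, img_h, box_w, box_h, c2=0.05):
    """Rarea = C2 * (ImageW * ImageH) / (boxW * boxH): a smaller (i.e.
    more distant) person box yields a larger factor and hence heavier
    smoothing, matching the distance reasoning above."""
    return c2 * (img_w * img_h) / (box_w * box_h)
```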
In one possible embodiment, the correlation attribute comprises a motion amplitude characteristic of the keypoint; the determining the smooth parameter influence factor of the key point according to the target image comprises: and inquiring a preset mapping relation according to the identification of the key point, and determining a motion amplitude influence factor corresponding to the identification, wherein the mapping relation comprises the corresponding relation between the identification of the key point and the motion amplitude influence factor, the identification is used for indicating the motion amplitude characteristic of the key point, and the relation between the motion amplitude characteristic and the motion amplitude influence factor is negative correlation.
In a specific implementation, the electronic device can predict the motion amplitude characteristic of the key point according to the position of the key point in the image area of the target object and the motion characteristic of the target object, and may be specifically classified into the following two types of cases.
In case (1), key points with a low motion amplitude characteristic carry a class-one identifier, and the corresponding motion amplitude influence factor is a class-one factor with a higher value. For example, the joint points of the torso of the human body move little and have a small amplitude, so their required degree of smoothing is high and the value of the motion amplitude influence factor is large, e.g., a value greater than 1.
In case (2), key points with a high motion amplitude characteristic carry a class-two identifier, and the corresponding motion amplitude influence factor is a class-two factor with a lower value. For example, the joint points of the limbs of the human body move frequently and have a large amplitude, so low delay is needed, the required degree of smoothing is low, and the value of the motion amplitude influence factor is small, e.g., 1.
As can be seen, in this example, the electronic device can determine the motion amplitude influence factor according to the motion amplitude characteristic of the key point, so as to adapt and adjust the smoothing parameter according to the motion amplitude characteristic.
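The mapping-relation lookup can be sketched as a simple table; the joint names and the factor values (1.5 for torso, 1 for limbs) are hypothetical choices, not values given by the patent:

```python
# Hypothetical class-one identifiers: low-amplitude torso joints.
TORSO_JOINTS = {"neck", "left_shoulder", "right_shoulder",
                "left_hip", "right_hip"}

def motion_factor(joint_id):
    """Query the preset mapping: low-amplitude torso joints get a factor
    greater than 1 (heavier smoothing); high-amplitude limb joints get 1
    (favour low latency over smoothness)."""
    return 1.5 if joint_id in TORSO_JOINTS else 1.0
```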
In one possible embodiment, the associated attribute comprises an occlusion condition of the keypoint; the determining the smooth parameter influence factor of the key point according to the target image comprises: determining the shielding condition of the key point according to the coordinate relation between the key point and a reference key point which is symmetrical to the key point; if the occlusion condition is that the key point is not occluded, determining that an occlusion influence factor of the key point is a first occlusion influence factor; and if the occlusion condition is that the key point is occluded, determining that the occlusion influence factor of the key point is a second occlusion influence factor, wherein the first occlusion influence factor is smaller than the second occlusion influence factor.
For example, when the difference between the X-axis and Y-axis coordinate values of the left and right shoulder joint points is small but the difference between their depth values Z is large, the joint point with the larger depth value Z is the occluded point.
The first shielding influence factor has a value of 1, and the second shielding influence factor has a value of an empirical value greater than 1.
In addition, if a key point is located on the central axis of the target object, its occlusion condition is generally not judged, and the value of its occlusion influence factor defaults to 1, i.e., no influence.
As can be seen, in this example, because occluded points jitter more severely, a group of possibly occluded points is obtained through the occlusion determination mechanism, and each is then multiplied by an occlusion influence factor greater than 1 to raise the smoothing parameter and reduce the jitter.
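The symmetric-keypoint comparison can be sketched as follows; the coordinate tolerances and the second occlusion factor value are illustrative assumptions, not values specified by the patent:

```python
def occlusion_factor(kp, mirror_kp, xy_tol=10.0, z_tol=50.0, r2=1.5):
    """Compare a keypoint (x, y, z) with its left/right mirror keypoint:
    if their (x, y) coordinates are close but this point's depth z is much
    larger, the point is treated as occluded and gets the larger second
    factor `r2` (> 1); otherwise the first factor 1.0 applies."""
    x, y, z = kp
    mx, my, mz = mirror_kp
    close_xy = abs(x - mx) <= xy_tol and abs(y - my) <= xy_tol
    if close_xy and (z - mz) > z_tol:  # this point is the deeper one
        return r2
    return 1.0
```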
Step 203, determining the target smoothing parameters of the key points according to the smoothing parameter influence factors and the original smoothing parameters.
In one possible embodiment, the target object comprises a human body; the key points comprise joint points of the human body; the associated attributes comprise the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristics of the key points, and the occlusion condition of the key points. The determining the target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter includes: calculating the product of the original smoothing parameter and the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor, and the occlusion influence factor, wherein the product is the target smoothing parameter;
the confidence influence factor is the influence factor corresponding to the confidence, the detection frame influence factor is the influence factor corresponding to the area ratio, the motion amplitude influence factor is the influence factor corresponding to the motion amplitude characteristic, and the occlusion influence factor is the influence factor corresponding to the occlusion condition.
As can be seen, in this example, for a human body detection application scenario, the electronic device can multiply the original smoothing parameter by the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor, and the occlusion influence factor to determine the target smoothing parameter, balancing smoothing effect against time delay and improving smoothing efficiency.
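The product operation itself is a one-liner; this sketch only assumes the four factors have already been computed upstream:

```python
def target_smoothing_param(smooth_param, r_conf, r_area, r_joint, r_occlu):
    """Target smoothing parameter = original smoothing parameter
    multiplied by the confidence, detection-frame, motion-amplitude,
    and occlusion influence factors."""
    return smooth_param * r_conf * r_area * r_joint * r_occlu
```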
Step 204, smoothing the key points according to the target smoothing parameters.
Exemplary smoothing algorithms include, but are not limited to: Holt two-parameter smoothing, Kalman filtering, exponential smoothing, the Savitzky-Golay algorithm, and the like. The smoothing result may be used for at least one of: human body posture estimation, a detection-box stabilization algorithm, a face key point stabilization algorithm, and the like. For posture estimation applications such as 3D model driving and motion recognition, jitter can be reduced, yielding more stable, lower-latency output.
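As one of the listed algorithms, single exponential smoothing could consume the per-point target parameter as its weighting coefficient; treating the parameter as a weight in [0, 1) is an assumption of this sketch, not something the application specifies:

```python
def exponential_smooth(prev, current, alpha):
    """Single exponential smoothing of a 2D keypoint coordinate: a
    larger alpha trusts the previous (already smoothed) value more,
    suppressing jitter at the cost of latency. Deriving alpha from the
    target smoothing parameter, normalized to [0, 1), is an assumption
    of this sketch."""
    x = (1 - alpha) * current[0] + alpha * prev[0]
    y = (1 - alpha) * current[1] + alpha * prev[1]
    return (x, y)
```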
It can be seen that, in the embodiment of the present application, since the target smoothing parameter of the key point is obtained by dynamically calculating a smoothing parameter influence factor and an original smoothing parameter, and the smoothing parameter influence factor is used for indicating the influence degree of the associated attribute of the key point on the original smoothing parameter of the key point, the smoothing processing complexity of the key point can be adaptively adjusted according to the associated attribute of the key point, which is beneficial to improving the flexibility and intelligence of the device for smoothing the image.
As shown in fig. 2e, assuming that the target object is a human body, the image processing method according to the embodiment of the present application includes the following steps:
step 2d01, the electronic device determines a target joint point of the human image in the target image.
Step 2d02, the electronic device determines a confidence impact factor Rconf for the target joint.
Step 2d03, the electronic device determines the detection frame influence factor Rarea according to the area ratio of the human body frame.
Step 2d04, the electronic device determines the motion amplitude influence factor Rjoint according to the identification of the target joint point.
Step 2d05, the electronic device judges whether the target joint point is occluded.
If the target joint point is not occluded, step 2d06 is executed.
If the target joint point is occluded, step 2d07 is executed.
Step 2d06, the electronic device determines the occlusion influence factor Rocclu = Rocclu1.
Step 2d07, the electronic device determines the occlusion influence factor Rocclu = Rocclu2.
Step 2d08, the electronic device calculates the target smoothing parameter SmoothParam = SmoothParam' × Rconf × Rarea × Rjoint × Rocclu,
wherein SmoothParam is the target smoothing parameter and SmoothParam' is the original smoothing parameter.
Step 2d09, the electronic device smooths the coordinates of the target joint point according to the target smoothing parameter to obtain the processed coordinates.
It can be seen that, in this example, the electronic device can dynamically calculate the smoothing parameters of the joint points for a human body detection scene, balancing stability and efficiency.
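Putting steps 2d02-2d09 together for a single joint point, a hedged end-to-end sketch (the factor values, the clamp, and the use of exponential smoothing are all illustrative assumptions, not specified by this application) might be:

```python
def smooth_joint(prev_xy, raw_xy, base_param, r_conf, r_area, r_joint, is_occluded):
    """Illustrative per-frame flow for one joint point: combine the
    influence factors into a target smoothing parameter (step 2d08),
    clamp it to a usable weight, and exponentially smooth the raw
    coordinates against the previous smoothed ones (step 2d09)."""
    r_occlu = 1.5 if is_occluded else 1.0   # hypothetical Rocclu2 / Rocclu1
    alpha = base_param * r_conf * r_area * r_joint * r_occlu  # step 2d08
    alpha = max(0.0, min(alpha, 0.99))      # clamp to a valid weight (assumption)
    # Step 2d09: larger alpha -> stronger smoothing of each coordinate.
    return tuple((1 - alpha) * r + alpha * p for p, r in zip(prev_xy, raw_xy))
```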
The embodiment of the application provides an image processing device which can be an electronic device. Specifically, the image processing apparatus is configured to execute the steps executed by the electronic device in the above image processing method. The image processing apparatus provided by the embodiment of the present application may include modules corresponding to the respective steps.
The embodiment of the present application may divide the image processing apparatus into functional modules according to the above method. For example, each functional module may be divided corresponding to one function, or two or more functions may be integrated into one processing module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. The division of modules in the embodiment of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
Fig. 3 shows a schematic diagram of a possible structure of the image processing apparatus according to the above-described embodiment, in a case where each functional module is divided in correspondence with each function. As shown in fig. 3, the image processing apparatus 3 is applied to an electronic device; the device comprises:
an acquisition unit 30 configured to acquire a key point of a target object in a target image, the target object being a subject to be photographed;
a determining unit 31, configured to determine a smoothing parameter influence factor of the keypoint according to the target image, where the smoothing parameter influence factor is used to indicate a degree of influence of an associated attribute of the keypoint on an original smoothing parameter of the keypoint;
the determining unit 31 is further configured to determine a target smoothing parameter of the keypoint according to the smoothing parameter impact factor and the original smoothing parameter;
and a smoothing unit 32, configured to perform smoothing processing on the key points according to the target smoothing parameter.
In one possible embodiment, the associated attribute includes at least one of: the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristic of the key points, and the occlusion condition of the key points.
In one possible embodiment, the associated attribute includes the confidence of the key point; in terms of determining the smoothing parameter influence factor of the key point from the target image, the determining unit 31 is specifically configured to: determine a heat map of the key point from the target image; determine the confidence of the key point from the heat map; and determine the confidence influence factor of the key point according to the confidence, wherein the relationship between the value of the confidence and the value of the confidence influence factor is a negative correlation.
In one possible embodiment, the associated attribute includes the area ratio of the detection frame to which the key point belongs to the target image; in terms of determining the smoothing parameter influence factor of the key point from the target image, the determining unit 31 is specifically configured to: determine a detection frame of the target object according to the target image; calculate the area ratio of the detection frame to the target image; and determine the detection frame influence factor of the key point according to the area ratio, wherein the relationship between the value of the area ratio and the value of the detection frame influence factor is a negative correlation.
In one possible embodiment, the associated attribute comprises the motion amplitude characteristic of the key point; in terms of determining the smoothing parameter influence factor of the key point from the target image, the determining unit 31 is specifically configured to: query a preset mapping relationship according to the identification of the key point and determine the motion amplitude influence factor corresponding to the identification, wherein the mapping relationship comprises the correspondence between key point identifications and motion amplitude influence factors, the identification is used to indicate the motion amplitude characteristic of the key point, and the relationship between the motion amplitude characteristic and the motion amplitude influence factor is a negative correlation.
In one possible embodiment, the associated attribute comprises the occlusion condition of the key point; in terms of determining the smoothing parameter influence factor of the key point from the target image, the determining unit 31 is specifically configured to: determine the occlusion condition of the key point according to the coordinate relationship between the key point and a reference key point symmetrical to the key point; if the occlusion condition is that the key point is not occluded, determine that the occlusion influence factor of the key point is a first occlusion influence factor; and if the occlusion condition is that the key point is occluded, determine that the occlusion influence factor of the key point is a second occlusion influence factor, wherein the first occlusion influence factor is smaller than the second occlusion influence factor.
In a possible embodiment, in terms of acquiring the key points of the target object in the target image, the acquiring unit 30 is specifically configured to: determining a detection frame of the target object according to the target image; and determining the key points according to the detection frame.
In one possible embodiment, the target object comprises a human body; the key points comprise joint points of the human body; the associated attributes comprise the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristics of the key points, and the occlusion condition of the key points. In terms of determining the target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter, the determining unit 31 is specifically configured to: calculate the product of the original smoothing parameter and the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor, and the occlusion influence factor, wherein the product is the target smoothing parameter; the confidence influence factor is the influence factor corresponding to the confidence, the detection frame influence factor is the influence factor corresponding to the area ratio, the motion amplitude influence factor is the influence factor corresponding to the motion amplitude characteristic, and the occlusion influence factor is the influence factor corresponding to the occlusion condition.
In the case of using an integrated unit, a schematic structural diagram of another image processing apparatus provided in the embodiment of the present application is shown in fig. 4. In fig. 4, the image processing apparatus 4 includes: a processing module 40 and a communication module 41. The processing module 40 is used for controlling and managing the actions of the device control apparatus, such as the steps performed by the obtaining unit 30, the determining unit 31, the smoothing unit 32, and/or other processes for performing the techniques described herein. The communication module 41 is used to support interaction between the device control apparatus and other devices. As shown in fig. 4, the image processing apparatus may further include a storage module 42, and the storage module 42 is used for storing program codes and data of the image processing apparatus.
The processing module 40 may be a processor or a controller, for example, a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination implementing computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 41 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module 42 may be a memory.
All relevant details of each scenario related to the method embodiment may be found in the functional description of the corresponding functional module and are not repeated here. Both the image processing apparatus 3 and the image processing apparatus 4 can perform the steps performed by the electronic device in the image processing method shown in fig. 2 a.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. The procedures or functions according to the embodiments of the present application are wholly or partially generated when the computer instructions or the computer program are loaded or executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another computer readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire or wirelessly. The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more collections of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed method, apparatus and system may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative; for example, the division of the unit is only a logic function division, and there may be another division manner in actual implementation; for example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be physically included alone, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications can be easily made by those skilled in the art without departing from the spirit and scope of the present invention, and it is within the scope of the present invention to include different functions, combination of implementation steps, software and hardware implementations.

Claims (11)

1. An image processing method, comprising:
acquiring key points of a target object in a target image, wherein the target object is a shot object;
determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the associated attribute of the key point on the original smoothing parameter of the key point;
determining a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter;
and carrying out smoothing treatment on the key points according to the target smoothing parameters.
2. The method of claim 1, wherein the associated attribute comprises at least one of:
the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristic of the key points, and the occlusion condition of the key points.
3. The method of claim 2, wherein the associated attribute comprises a confidence of the keypoint;
the determining the smoothing parameter influence factor of the key point according to the target image comprises:
determining a heat map of the keypoints from the target image;
determining a confidence level for the keypoint from the heat map of the keypoint;
and determining a confidence coefficient influence factor of the key point according to the confidence coefficient of the key point, wherein the relationship between the numerical value of the confidence coefficient and the numerical value of the confidence coefficient influence factor is negative correlation.
4. The method according to claim 2, wherein the associated attribute comprises an area ratio of a detection frame to which the key point belongs to the target image;
the determining the smoothing parameter influence factor of the key point according to the target image comprises:
determining a detection frame of the target object according to the target image;
calculating the area ratio of the detection frame to the target image;
and determining the detection frame influence factor of the key point according to the area proportion, wherein the relationship between the numerical value of the area proportion and the numerical value of the detection frame influence factor is negative correlation.
5. The method of claim 2, wherein the associated attribute comprises a motion amplitude characteristic of the keypoint;
the determining the smoothing parameter influence factor of the key point according to the target image comprises:
and inquiring a preset mapping relation according to the identification of the key point, and determining a motion amplitude influence factor corresponding to the identification, wherein the mapping relation comprises the corresponding relation between the identification of the key point and the motion amplitude influence factor, the identification is used for indicating the motion amplitude characteristic of the key point, and the relation between the motion amplitude characteristic and the motion amplitude influence factor is negative correlation.
6. The method of claim 2, wherein the associated attribute comprises an occlusion condition of the keypoint;
the determining the smoothing parameter influence factor of the key point according to the target image comprises:
determining the occlusion condition of the key point according to the coordinate relationship between the key point and a reference key point which is symmetrical to the key point;
if the occlusion condition is that the key point is not occluded, determining that an occlusion influence factor of the key point is a first occlusion influence factor;
and if the occlusion condition is that the key point is occluded, determining that the occlusion influence factor of the key point is a second occlusion influence factor, wherein the first occlusion influence factor is smaller than the second occlusion influence factor.
7. The method according to any one of claims 1-6, wherein the obtaining key points of the target object in the target image comprises:
determining a detection frame of the target object according to the target image;
and determining the key points according to the detection frame.
8. The method of any one of claims 1-7, wherein the target object comprises a human body; the key points comprise joint points of the human body; the associated attributes comprise the confidence of the key points, the area ratio of the detection frame to which the key points belong to the target image, the motion amplitude characteristics of the key points, and the occlusion condition of the key points; the determining the target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter includes:
calculating the product of the original smoothing parameter and the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor, and the occlusion influence factor, wherein the product is the target smoothing parameter;
the confidence influence factor is the influence factor corresponding to the confidence, the detection frame influence factor is the influence factor corresponding to the area ratio, the motion amplitude influence factor is the influence factor corresponding to the motion amplitude characteristic, and the occlusion influence factor is the influence factor corresponding to the occlusion condition.
9. An image processing apparatus characterized by comprising:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring key points of a target object in a target image, and the target object is a shot object;
a determining unit, configured to determine a smoothing parameter influence factor of the keypoint according to the target image, where the smoothing parameter influence factor is used to indicate a degree of influence of an associated attribute of the keypoint on an original smoothing parameter of the keypoint;
the determining unit is further configured to determine a target smoothing parameter of the keypoint according to the smoothing parameter influence factor and the original smoothing parameter;
and the smoothing unit is used for smoothing the key points according to the target smoothing parameters.
10. An electronic device, comprising:
one or more processors;
one or more memories for storing a program,
wherein the program is configured to be executed by the one or more processors to control the electronic device to perform the steps in the method of any one of claims 1-8.
11. A computer-readable storage medium, characterized in that a computer program for electronic data exchange is stored, wherein the computer program causes a computer to perform the method according to any one of claims 1-8.
CN202110640062.9A 2021-06-08 2021-06-08 Image processing method and related device Active CN113421196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110640062.9A CN113421196B (en) 2021-06-08 2021-06-08 Image processing method and related device


Publications (2)

Publication Number Publication Date
CN113421196A true CN113421196A (en) 2021-09-21
CN113421196B CN113421196B (en) 2023-08-11

Family

ID=77788086

Country Status (1)

Country Link
CN (1) CN113421196B (en)

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296735A (en) * 2016-08-05 2017-01-04 海信集团有限公司 Filter update method, device and intelligent terminal in target following
CN107680128A (en) * 2017-10-31 2018-02-09 广东欧珀移动通信有限公司 Image processing method, device, electronic equipment and computer-readable recording medium
CN110245623A (en) * 2019-06-18 2019-09-17 重庆大学 A kind of real time human movement posture correcting method and system
CN110264430A (en) * 2019-06-29 2019-09-20 北京字节跳动网络技术有限公司 Video beautification method, device and electronic equipment
CN110401784A (en) * 2018-04-24 2019-11-01 展讯通信(天津)有限公司 Motion smoothing method, system and the video equipment of automatic adjusument filtering strength
US10620713B1 (en) * 2019-06-05 2020-04-14 NEX Team Inc. Methods and systems for touchless control with a mobile device
CN111028346A (en) * 2019-12-23 2020-04-17 北京奇艺世纪科技有限公司 Method and device for reconstructing video object
CN111090688A (en) * 2019-12-23 2020-05-01 北京奇艺世纪科技有限公司 Smoothing processing method and device for time sequence data
CN111327828A (en) * 2020-03-06 2020-06-23 Oppo广东移动通信有限公司 Photographing method and device, electronic equipment and storage medium
CN111523468A (en) * 2020-04-23 2020-08-11 北京百度网讯科技有限公司 Human body key point identification method and device
CN111670457A (en) * 2017-12-03 2020-09-15 脸谱公司 Optimization of dynamic object instance detection, segmentation and structure mapping
CN112381837A (en) * 2020-11-12 2021-02-19 联想(北京)有限公司 Image processing method and electronic equipment
WO2021034211A1 (en) * 2019-08-16 2021-02-25 Станислав Игоревич АШМАНОВ Method and system of transfer of motion of subject from video onto animated character
CN112488064A (en) * 2020-12-18 2021-03-12 平安科技(深圳)有限公司 Face tracking method, system, terminal and storage medium
KR102240403B1 (en) * 2019-12-24 2021-04-14 아주대학교 산학협력단 Image rectification method and image rectification apparatus
CN112800850A (en) * 2020-12-31 2021-05-14 上海商汤智能科技有限公司 Video processing method and device, electronic equipment and storage medium
JP2021077065A (en) * 2019-11-08 2021-05-20 Kddi株式会社 Image processing apparatus, information processing terminal, server, image processing method, and program


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
HUIHONG XU ET AL.: "A novel image edge smoothing method based on convolutional neural network", 《INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS》, pages 1 - 11 *
RAJIV VERMA ET AL.: "Grey relational analysis based adaptive smoothing parameter for non-local means image denoising", 《MULTIMEDIA TOOLS AND APPLICATIONS》, 2 March 2018 (2018-03-02), pages 1 - 20 *
ZHOU YANG: "Research on Motion Retargeting Methods for Joint-Coordinate Motion Data", 《CHINA MASTERS' THESES FULL-TEXT DATABASE, INFORMATION SCIENCE AND TECHNOLOGY》, pages 138 - 1573 *
QIU XIANGYAN: "Research on Image Stitching Algorithms Based on a New Energy Function and Adaptive Smooth Transition", 《CHINA DOCTORAL DISSERTATIONS FULL-TEXT DATABASE, INFORMATION SCIENCE AND TECHNOLOGY》, 15 March 2021 (2021-03-15), pages 41 - 78 *
MA XIANG: "Image Smoothing Algorithms Based on Parameter Fitting and Sparse Structure", 《CHINA MASTERS' THESES FULL-TEXT DATABASE, INFORMATION SCIENCE AND TECHNOLOGY》, 15 December 2020 (2020-12-15), pages 7 - 41 *

Also Published As

Publication number Publication date
CN113421196B (en) 2023-08-11

Similar Documents

Publication Publication Date Title
WO2020103647A1 (en) Object key point positioning method and apparatus, image processing method and apparatus, and storage medium
CN107230225B (en) Method and apparatus for three-dimensional reconstruction
CN108198141B (en) Image processing method and device for realizing face thinning special effect and computing equipment
KR102279813B1 (en) Method and device for image transformation
CN109598744B (en) Video tracking method, device, equipment and storage medium
CN112506340B (en) Equipment control method, device, electronic equipment and storage medium
US20200380250A1 (en) Image processing method and apparatus, and computer storage medium
CN104809687A (en) Three-dimensional human face image generation method and system
US11335025B2 (en) Method and device for joint point detection
US20200120171A1 (en) Node processing
CN110991293A (en) Gesture recognition method and device, computer equipment and storage medium
US20220375258A1 (en) Image processing method and apparatus, device and storage medium
CN111754429A (en) Motion vector post-processing method and device, electronic device and storage medium
CN110111364B (en) Motion detection method and device, electronic equipment and storage medium
US20200349382A1 (en) Method and computing device for adjusting region of interest
CN112562068B (en) Human body posture generation method and device, electronic equipment and storage medium
CN115705651A (en) Video motion estimation method, device, equipment and computer readable storage medium
CN113421196B (en) Image processing method and related device
KR20230035382A (en) Height measurement method and device, and terminal
CN112417985A (en) Face feature point tracking method, system, electronic equipment and storage medium
CN112233223A (en) Automatic human body parametric model deformation method and device based on three-dimensional point cloud
CN113989376B (en) Method and device for acquiring indoor depth information and readable storage medium
CN115278084A (en) Image processing method, image processing device, electronic equipment and storage medium
CN115223240A (en) Motion real-time counting method and system based on dynamic time warping algorithm
CN113489897B (en) Image processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant