CN113421196B - Image processing method and related device - Google Patents

Image processing method and related device

Info

Publication number
CN113421196B
Authority
CN
China
Prior art keywords
influence factor
key point
smoothing
determining
target
Prior art date
Legal status
Active
Application number
CN202110640062.9A
Other languages
Chinese (zh)
Other versions
CN113421196A (en)
Inventor
潘睿
Current Assignee
Hangzhou Douku Software Technology Co Ltd
Original Assignee
Hangzhou Douku Software Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Douku Software Technology Co Ltd filed Critical Hangzhou Douku Software Technology Co Ltd
Priority to CN202110640062.9A priority Critical patent/CN113421196B/en
Publication of CN113421196A publication Critical patent/CN113421196A/en
Application granted granted Critical
Publication of CN113421196B publication Critical patent/CN113421196B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an image processing method and a related device. The method comprises the following steps: acquiring key points of a target object in a target image, wherein the target object is a photographed object; determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the degree of influence of the association attribute of the key point on the original smoothing parameter of the key point; determining a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter; and smoothing the key point according to the target smoothing parameter. According to the embodiments of the application, by dynamically calculating the smoothing parameter of each individual point in the current image, the flexibility and intelligence of the device in smoothing images are improved.

Description

Image processing method and related device
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method and a related device.
Background
At present, when a device captures an image and smooths it, joint points are generally smoothed with uniformly set smoothing parameters. This approach cannot adapt to real-time scenes with complex changes, and higher smoothing parameters lead to excessive algorithm complexity and therefore to delay.
Disclosure of Invention
The application provides an image processing method and a related device, which dynamically calculate the smoothing parameter of each individual point in the current image and improve the flexibility and intelligence of the device in smoothing images.
In a first aspect, the present application provides an image processing method, including:
acquiring key points of a target object in a target image, wherein the target object is a photographed object;
determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the association attribute of the key point on the original smoothing parameter of the key point;
determining target smoothing parameters of the key points according to the smoothing parameter influence factors and the original smoothing parameters;
and carrying out smoothing processing on the key points according to the target smoothing parameters.
It can be seen that, in the embodiments of the present application, since the target smoothing parameter of the key point is dynamically calculated from the smoothing parameter influence factor and the original smoothing parameter, and the smoothing parameter influence factor indicates the degree of influence of the association attribute of the key point on the original smoothing parameter of the key point, the smoothing complexity of the key point can be adaptively adjusted according to the association attribute of the key point, which helps improve the flexibility and intelligence of the device in smoothing images.
In a second aspect, the present application provides an image processing apparatus comprising:
the acquisition unit is used for acquiring key points of a target object in the target image, wherein the target object is a photographed object;
the determining unit is used for determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the association attribute of the key point on the original smoothing parameter of the key point;
the determining unit is further configured to determine a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter;
and the smoothing unit is used for carrying out smoothing processing on the key points according to the target smoothing parameters.
In a third aspect, the present application provides an electronic device, comprising one or more processors;
one or more memories for storing a program,
wherein the program is configured to be executed by the one or more processors so as to control the electronic device to execute the instructions of the steps in any of the methods of the first aspect of the embodiments of the application.
In a fourth aspect, the present application provides a chip comprising: a processor for calling and running a computer program from a memory, causing a device on which the chip is mounted to perform some or all of the steps as described in any of the methods of the first aspect of the embodiments of the application.
In a fifth aspect, the present application provides a computer readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform part or all of the steps as described in any of the methods of the first aspect of the embodiments of the present application.
In a sixth aspect, the present application provides a computer program, wherein the computer program is operable to cause a computer to perform some or all of the steps described in any of the methods of the first aspect of the embodiments of the present application. The computer program may be a software installation package.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application;
fig. 2a is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 2b is a schematic diagram illustrating a distribution of human joints according to an embodiment of the present application;
FIG. 2c is a schematic illustration of an elbow joint ambiguity according to one embodiment of the present application;
FIG. 2d is a schematic diagram of a functional module of a heat map generating model according to an embodiment of the present application;
fig. 2e is a schematic diagram of a smoothing process of a human body joint according to an embodiment of the present application;
Fig. 3 is a block diagram showing the functional units of an image processing apparatus according to an embodiment of the present application;
fig. 4 is a block diagram showing the functional units of another image processing apparatus according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the present application are clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
The term "at least one" in the present application means one or more, and a plurality means two or more. In the present application and/or describing the association relationship of the association object, the representation may have three relationships, for example, a and/or B may represent: a alone, a and B together, and B alone, wherein A, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one (item) below" or the like, refers to any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one (one) of a, b or c may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein each of a, b, c may itself be an element, or may be a collection comprising one or more elements.
It should be noted that, in the embodiments of the present application, "equal to" may be used together with "greater than", in which case the technical solution adopted when the value is greater than applies, or together with "less than", in which case the technical solution adopted when the value is less than applies; when "equal to" is used together with "greater than", it is not used together with "less than", and when it is used together with "less than", it is not used together with "greater than". In the embodiments of the present application, "of", "relevant" and "corresponding" may sometimes be used interchangeably; it should be noted that the meanings they express are consistent when the distinction is not emphasized.
First, some nouns involved in the embodiments of the present application are explained for easy understanding by those skilled in the art.
1. An electronic device. The electronic device in the embodiments of the present application is a device having an image signal processing function, and may be referred to as a user equipment (UE), a terminal, a terminal device, a mobile station (MS), a mobile terminal (MT), an access terminal device, a vehicle-mounted terminal device, an industrial control terminal device, a UE unit, a UE station, a mobile station, a remote terminal device, a mobile device, a UE terminal device, a wireless communication device, a UE agent, or a UE apparatus. The user equipment may be fixed or mobile. For example, the user equipment may be a mobile phone, a tablet, a desktop computer, a notebook computer, an all-in-one machine, a vehicle-mounted terminal, a virtual reality (VR) terminal device, an augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a wearable device, a terminal in a future mobile communication network, or a terminal in an evolved public land mobile network (PLMN), etc. In some embodiments of the present application, the user equipment may also be an apparatus with transceiving functions, such as a chip system. The chip system may include a chip and may also include other discrete devices.
2. A smoothing algorithm. The smoothing algorithm in the embodiments of the present application is a post-processing means for eliminating jitter in image processing. Because the output of the model for each frame contains a certain error, obvious jitter between the preceding and following frames can be seen when continuous frames are processed, and this jitter needs to be eliminated by post-processing. There are two cases. When processing a video, the current frame can be processed with reference to the information of both previous and subsequent frames; this is called an offline algorithm. When processing a camera preview picture, the current frame can only be processed with reference to the information of previous frames, without the information of subsequent frames; this is called an online algorithm. Online algorithms may therefore introduce some hysteresis.
3. The Holt two-parameter linear exponential smoothing method. In the embodiments of the present application, the Holt two-parameter linear exponential smoothing method, also called the Holt two-parameter smoothing method, adds a smoothing coefficient beta for the trend on top of the simple exponential smoothing coefficient alpha, which is why it is also called a two-parameter smoothing method. In the Holt two-parameter smoothing model, the prediction consists of two parts: a level part, which is updated from the previous level by simple exponential smoothing, and a trend part, which is adjusted on the basis of the previous trend and updated by simple exponential smoothing; the two are added together to obtain the prediction for the next period.
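For illustration only, the following is a minimal Python sketch (not part of the patent text) of the Holt two-parameter update just described; the class name, the zero initialization of the trend, and returning the level as the smoothed value for the current frame are assumptions made for this sketch.

    # Holt two-parameter (level + trend) exponential smoothing for one coordinate.
    # alpha smooths the level, beta smooths the trend; level + trend gives the
    # prediction for the next period.
    class HoltSmoother:
        def __init__(self, alpha: float, beta: float):
            self.alpha = alpha
            self.beta = beta
            self.level = None   # no history yet
            self.trend = 0.0

        def update(self, x: float) -> float:
            if self.level is None:   # first observation: nothing to smooth against
                self.level = x
                return x
            prev_level = self.level
            # level part: simple exponential smoothing on top of the previous prediction
            self.level = self.alpha * x + (1.0 - self.alpha) * (prev_level + self.trend)
            # trend part: simple exponential smoothing of the change in level
            self.trend = self.beta * (self.level - prev_level) + (1.0 - self.beta) * self.trend
            return self.level   # smoothed value; level + trend forecasts the next frame

Because only past frames are used, this is an online smoother in the sense of the previous definition: a smaller alpha suppresses jitter more strongly but introduces more hysteresis.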
In a real-time human body pose estimation task, since the model predicts each frame of the camera preview picture independently, the results of adjacent frames differ slightly due to errors, which appears as jitter of the key points. This jitter is inconvenient for subsequent use, so the data of each frame needs to be smoothed.
At present, when a traditional smoothing approach handles an online task, it can only refer to the results of previous frames, so a notable problem is that the larger the smoothing parameter, the less the output jitters, but the greater the hysteresis, and the points cannot keep up with the movement of the person.
In view of the above problems, an embodiment of the present application provides an image processing method and a related apparatus, so as to dynamically calculate a smoothing parameter of a single point in a current image, and improve flexibility and intelligence of smoothing the image by a device, which is described below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic diagram of an electronic device 100 according to an embodiment of the application. The electronic device 100 includes an application processor 120, a memory 130, a communication module 140, and one or more programs 131, where the application processor 120 is communicatively connected to both the memory 130 and the communication module 140 via an internal communication bus.
In a specific implementation, the one or more programs 131 are stored in the memory 130 and configured to be executed by the application processor 120, where the one or more programs 131 include instructions for executing some or all of the steps executed by the electronic device in the embodiment of the present application.
The communication module 140 includes a local area network wireless communication module and a wired communication module.
The application processor 120 may be, for example, a central processing unit (Central Processing Unit, CPU), a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, units and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example a combination comprising one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
The memory 130 may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. The volatile memory may be random access memory (random access memory, RAM) which acts as an external cache. By way of example but not limitation, many forms of random access memory (random access memory, RAM) are available, such as Static RAM (SRAM), dynamic Random Access Memory (DRAM), synchronous Dynamic Random Access Memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced Synchronous Dynamic Random Access Memory (ESDRAM), synchronous Link DRAM (SLDRAM), and direct memory bus RAM (DR RAM).
Referring to fig. 2a, fig. 2a is a schematic flow chart of an image processing method according to an embodiment of the application, which is applied to the electronic device 100 described above; as shown in the figure, the image processing method includes the following steps.
Step 201, obtaining a key point of a target object in a target image, wherein the target object is a photographed object.
The key point refers to a structural feature point of the target object; for example, if the target object is a human body, the key point may be any one of the 17 joint points in the joint distribution diagram shown in fig. 2b.
In one possible embodiment, the acquiring the keypoints of the target object in the target image includes: determining a detection frame of the target object according to the target image; and determining the key points according to the detection frame.
The detection frame may be a rectangular frame. Specifically, the device may determine the outline of the target object according to the difference between the image area of the target object and the background image area, and then take the rectangle with the smallest area covering the outline as the detection frame of the target object.
In this example, the electronic device can accurately determine the detection frame in which the target object is located, and then predict the key points of the target object.
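As an illustration of one possible way to obtain such a minimal bounding rectangle, the following hedged OpenCV sketch assumes a binary foreground mask has already been derived from the difference between the object area and the background area; the function name and the mask source are assumptions, not the patent's method.

    import cv2
    import numpy as np

    def detect_box(foreground_mask: np.ndarray):
        """Return the smallest upright rectangle (x, y, w, h) covering the largest
        contour of a binary foreground mask, or None if no contour is found."""
        contours, _ = cv2.findContours(foreground_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        return cv2.boundingRect(largest)   # upright bounding box of the outline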
And step 202, determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the influence degree of the association attribute of the key point on the original smoothing parameter of the key point.
In one possible embodiment, the association attribute includes at least one of: the confidence degree of the key point, the area ratio of the detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point and the shielding condition of the key point.
The higher the confidence of the key point, the smaller the jitter of the corresponding coordinate prediction result, and the lower the need for smoothing; the lower the confidence of the key point, the larger the jitter of the corresponding coordinate prediction result, and correspondingly the higher the need for smoothing.
The size of the detection frame reflects how far the target object is from the camera, which affects the jitter threshold setting. For a person who is relatively close, the area of the detection frame is larger and the relative movement between frames is larger, so the jitter of the corresponding coordinate prediction results is relatively smaller and the need for smoothing is correspondingly lower; for a person who is farther away, the area of the detection frame is smaller and the relative movement between frames is smaller, so the jitter of the corresponding coordinate prediction results is relatively larger and the need for smoothing is correspondingly higher.
Taking a human body as the target object as an example, for the joint points of the trunk, the relative motion between frames is small, so the jitter of the corresponding coordinate prediction results is relatively large and the need for smoothing is correspondingly high. For the joint points of the limbs, especially the upper limbs, the motion between frames is relatively large, so the jitter of the corresponding coordinate prediction results is relatively small and the need for smoothing is correspondingly low.
Again taking a human body as the target object as an example, in the case of a leaning posture, information is missing at the joint points of the occluded arm, so the ambiguity is larger (the dark joint points shown in fig. 2c correspond to the user's right shoulder, right elbow, right wrist and right hip joint points; as the user's right arm swings, the right elbow joint point may overlap with the spinal joint points and become ambiguous). The joint points therefore jitter severely between adjacent frames, and accordingly the need for smoothing is higher.
In this example, the device can dynamically adapt the values of the smoothing parameters of the key points in combination with the associated attributes of the key points, so as to improve flexibility and accuracy and to balance stability against time delay.
In one possible embodiment, the association attribute includes a confidence level of the keypoint; the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps: determining a heat map of the key points according to the target image; determining the confidence level of the key point according to the heat map of the key point; and determining a confidence coefficient influence factor of the key point according to the confidence coefficient of the key point, wherein the relation between the numerical value of the confidence coefficient and the numerical value of the confidence coefficient influence factor is a negative correlation.
The heat map refers to the position distribution area, in the current target image, of the predicted coordinates of the key point from the previous frame image of the target image. Each coordinate point in this distribution area has a predicted probability attribute; the coordinate point with the largest predicted probability is the peak point, and the predicted probability of the peak point is the confidence of the key point.
In a specific implementation, the electronic device may process the target image with a lightweight heat map prediction model to determine the heat map of a key point. The heat map prediction model may include, for example, the processing modules shown in fig. 2d: a first upsampling module is used for a first convolution operation and pixel reorganization, a second upsampling module upsamples the processing result of the first upsampling module, a third upsampling module upsamples the processing result of the second upsampling module, a second convolution operation module operates on the processing result of the third upsampling module to output a two-dimensional heat map, an average pooling module operates on the processing result of the third upsampling module, and a fully connected layer module operates on the result of the average pooling module to output a one-dimensional heat map.
Illustratively, the confidence influence factor is formulated as Rconf = C1 / Conf,
where Rconf is the confidence influence factor, C1 is a constant, and Conf is the confidence.
In this example, the electronic device can determine the confidence of the key point according to the heat map of the target image and calculate the confidence influence factor according to that confidence, so as to adaptively adjust the smoothing parameter according to the confidence of the key point.
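A minimal sketch of reading the confidence from a two-dimensional heat map and computing Rconf = C1 / Conf follows; the heat-map layout, the default value of C1 and the guard against division by zero are assumptions for illustration.

    import numpy as np

    def confidence_factor(heatmap: np.ndarray, c1: float = 1.0) -> float:
        """heatmap: 2-D array of predicted probabilities for one key point."""
        conf = max(float(heatmap.max()), 1e-6)   # predicted probability of the peak point
        return c1 / conf                         # lower confidence -> larger factor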
In a possible embodiment, the association attribute includes an area ratio of a detection frame to which the key point belongs to the target image; the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps: determining a detection frame of the target object according to the target image; calculating the area ratio of the detection frame to the target image; and determining a detection frame influence factor of the key point according to the area proportion, wherein the relationship between the numerical value of the area proportion and the numerical value of the detection frame influence factor is negative correlation.
Taking a human body as the target object as an example, the electronic device can calculate the detection frame influence factor from the width and height of the detected human body frame and the width and height of the whole image as Rarea = C2 × (imageW × imageH) / (boxW × boxH),
where Rarea is the detection frame influence factor, C2 is a constant, imageW is the width of the target image, imageH is the height of the target image, boxW is the width of the detection frame, and boxH is the height of the detection frame.
In this example, the electronic device can determine the area ratio according to the detection frame of the target object, and calculate the detection frame influence factor according to the area ratio, so as to adapt and adjust the smoothing parameter according to the area ratio of the detection frame.
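A corresponding sketch of the detection frame influence factor Rarea = C2 × (imageW × imageH) / (boxW × boxH); the default value of C2 is an assumption and would be tuned empirically.

    def box_area_factor(image_w: int, image_h: int,
                        box_w: int, box_h: int, c2: float = 0.05) -> float:
        # A smaller detection frame relative to the image (a distant subject)
        # yields a larger factor, i.e. stronger smoothing.
        return c2 * (image_w * image_h) / float(box_w * box_h)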
In one possible embodiment, the associated attribute includes a motion amplitude characteristic of the keypoint; the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps: inquiring a preset mapping relation according to the identification of the key point, and determining a motion amplitude influence factor corresponding to the identification, wherein the mapping relation comprises a correspondence relation between the identification of the key point and the motion amplitude influence factor, the identification is used for indicating the motion amplitude characteristic of the key point, and the relationship between the motion amplitude characteristic and the motion amplitude influence factor is negative correlation.
In a specific implementation, the electronic device can predict the motion amplitude characteristic of the key point according to the position of the key point in the image area of the target object and the motion characteristic of the target object, which can specifically be divided into the following two cases, high and low.
The identification of a key point whose motion amplitude characteristic is low is a first-class identification, and the corresponding motion amplitude influence factor is of the class with the larger value. For example, the joint points of the trunk of the human body move little and therefore have a small amplitude; the need for smoothing is high, so the value of the motion amplitude influence factor is larger, for example a value greater than 1.
The identification of a key point whose motion amplitude characteristic is high is a second-class identification, and the corresponding motion amplitude influence factor is of the class with the smaller value. For example, the joint points of the four limbs of the human body move frequently and therefore have a high amplitude; low delay is needed and the need for smoothing is low, so the value of the motion amplitude influence factor is smaller, for example 1.
It can be seen that, in this example, the electronic device can determine the motion amplitude influencing factor according to the motion amplitude characteristic of the key point, so as to adapt and adjust the smoothing parameter according to the motion amplitude characteristic.
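A sketch of such a preset mapping from key point identifications to motion amplitude influence factors; the 17-joint indexing and the factor values below are assumptions for illustration, not values from the patent.

    # Hypothetical 17-joint indexing: 0-4 head, 5-6 shoulders, 7-10 arms,
    # 11-12 hips, 13-16 legs. Trunk-adjacent joints (low motion amplitude) get a
    # factor greater than 1; limb joints (high motion amplitude) get 1.
    MOTION_FACTORS = {j: 1.5 for j in (5, 6, 11, 12)}
    MOTION_FACTORS.update({j: 1.2 for j in (0, 1, 2, 3, 4)})
    MOTION_FACTORS.update({j: 1.0 for j in (7, 8, 9, 10, 13, 14, 15, 16)})

    def motion_amplitude_factor(joint_id: int) -> float:
        return MOTION_FACTORS.get(joint_id, 1.0)   # default: no adjustment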
In a possible embodiment, the association attribute includes an occlusion condition of the key point; the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps: determining the shielding condition of the key points according to the coordinate relation between the key points and the reference key points symmetrical to the key points; if the shielding condition is that the key point is not shielded, determining that the shielding influence factor of the key point is a first shielding influence factor; and if the shielding condition is that the key point is shielded, determining that the shielding influence factor of the key point is a second shielding influence factor, wherein the first shielding influence factor is smaller than the second shielding influence factor.
For example, when the difference between the X-axis coordinate value and the Y-axis coordinate value of the left shoulder joint point and the right shoulder joint point is small, but the difference between the depth values Z is large, the joint point with the large depth value Z is a blocked point.
Wherein the first occlusion effect factor has a value of 1 and the second occlusion effect factor has an empirical value greater than 1.
In addition, if the key point is located on the central axis of the target object, the occlusion judgment is generally not made, and the value of the occlusion influence factor of the key point defaults to 1, i.e., it has no effect.
In this example, since occluded points jitter more severely, a set of possibly occluded points is obtained through the occlusion judgment mechanism, and each of them is uniformly multiplied by an occlusion influence factor greater than 1, which raises the value of the smoothing parameter and reduces the effect of jitter.
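A hedged sketch of the occlusion test against the symmetric reference key point, assuming each key point carries (x, y, z) coordinates; the thresholds and the value of the second occlusion influence factor are assumptions for illustration.

    def occlusion_factor(kp, ref_kp, xy_thresh: float = 10.0,
                         z_thresh: float = 50.0, r_occluded: float = 1.5) -> float:
        """kp, ref_kp: (x, y, z) of a key point and its left/right symmetric
        counterpart. Returns 1.0 (the first factor) if the point is judged visible,
        otherwise a second factor greater than 1."""
        dx, dy = abs(kp[0] - ref_kp[0]), abs(kp[1] - ref_kp[1])
        dz = kp[2] - ref_kp[2]
        # close to its counterpart in the image plane but much deeper -> occluded
        if dx < xy_thresh and dy < xy_thresh and dz > z_thresh:
            return r_occluded
        return 1.0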
And 203, determining the target smoothing parameters of the key points according to the smoothing parameter influence factors and the original smoothing parameters.
In one possible embodiment, the target object comprises a human body; the key points comprise joint points of the human body; the association attribute comprises the confidence coefficient of the key point, the area proportion of a detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point and the shielding condition of the key point; the determining the target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter comprises the following steps: calculating the product of the original smoothing parameter and a confidence factor, a detection frame factor, a motion amplitude factor and a shielding factor, wherein the product is the target smoothing parameter;
The confidence factor is an influence factor corresponding to the confidence, the detection frame influence factor is an influence factor corresponding to the area proportion, the motion amplitude influence factor is an influence factor corresponding to the motion amplitude characteristic, and the shielding influence factor is an influence factor corresponding to the shielding condition.
In this example, for the human body detection application scenario, the electronic device may multiply the original smoothing parameter by the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor and the occlusion influence factor to determine the target smoothing parameter, balancing the smoothing effect against the delay and improving smoothing efficiency.
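A sketch of the product described above, using the illustrative factor helpers sketched earlier; the function name is an assumption.

    def target_smoothing_parameter(orig_param: float, r_conf: float, r_area: float,
                                   r_joint: float, r_occlude: float) -> float:
        # target = original smoothing parameter x confidence influence factor x
        #          detection frame influence factor x motion amplitude influence
        #          factor x occlusion influence factor
        return orig_param * r_conf * r_area * r_joint * r_occlude

The resulting value can then drive an online smoother such as the Holt sketch given earlier.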
And 204, smoothing the key points according to the target smoothing parameters.
Exemplary smoothing algorithms include, but are not limited to: Holt two-parameter smoothing, Kalman filtering, exponential smoothing, the Savitzky-Golay algorithm, and the like. The smoothing result may be used for at least one of: human body pose estimation, detection frame stabilization algorithms, face key point stabilization algorithms, and the like. For applications of pose estimation, such as 3D model driving and motion recognition, jitter can be reduced and a more stable, low-delay output can be produced.
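For the offline case, where both previous and subsequent frames are available, a hedged sketch using SciPy's Savitzky-Golay filter over a recorded coordinate sequence; the sample values, window length and polynomial order are assumptions for illustration.

    import numpy as np
    from scipy.signal import savgol_filter

    # x_coords: one coordinate of one key point over consecutive video frames
    x_coords = np.array([120.0, 121.5, 119.8, 122.3, 121.0, 123.4, 122.8, 124.1, 123.9])
    # window_length must be odd and no larger than the sequence length
    smoothed = savgol_filter(x_coords, window_length=5, polyorder=2)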
It can be seen that, in the embodiments of the present application, since the target smoothing parameter of the key point is dynamically calculated from the smoothing parameter influence factor and the original smoothing parameter, and the smoothing parameter influence factor indicates the degree of influence of the association attribute of the key point on the original smoothing parameter of the key point, the smoothing complexity of the key point can be adaptively adjusted according to the association attribute of the key point, which helps improve the flexibility and intelligence of the device in smoothing images.
As shown in fig. 2e, assuming that the target object is a human body, the image processing method according to the embodiment of the present application includes the following steps:
step 2d01, the electronic device determines a target node of the human body image in the target image.
In step 2d02, the electronic device determines a confidence impact factor Rconf of the target node.
And 2d03, the electronic equipment determines a detection frame influence factor Rarea according to the area occupation ratio of the human body frame.
Step 2d04, the electronic device determines a motion amplitude influence factor Rjoint according to the identification of the target node.
Step 2d05, the electronic device judges whether the target joint point is an occluded joint point.
If the target joint point is not occluded, step 2d06 is executed.
If the target joint point is occluded, step 2d07 is executed.
Step 2d06, the electronic device determines the occlusion influence factor Rocclude1.
Step 2d07, the electronic device determines the occlusion influence factor Rocclude2.
Step 2d08, the electronic device calculates the target smoothing parameter smoothParam = smoothParam' × Rconf × Rarea × Rjoint × Rocclude,
where smoothParam is the target smoothing parameter and smoothParam' is the original smoothing parameter.
Step 2d09, the electronic device smooths the coordinates of the target joint point according to the target smoothing parameter to obtain the processed coordinates.
In this example, the electronic device can dynamically calculate the smoothing parameters of the joint points, balance stability and efficiency for the human body detection scene.
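Pulling these pieces together, the following sketch runs steps 2d01 to 2d09 for a single joint point, using the illustrative helpers sketched earlier; in particular, the mapping of the target smoothing parameter onto the Holt level coefficient is an assumption, one of several possible choices.

    def smooth_joint(joint_id, raw_xyz, heatmap, box, image_size, ref_xyz,
                     orig_param, smoothers):
        """smoothers[joint_id] holds one HoltSmoother per coordinate axis."""
        image_w, image_h = image_size
        r_conf = confidence_factor(heatmap)                           # step 2d02
        r_area = box_area_factor(image_w, image_h, box[2], box[3])    # step 2d03
        r_joint = motion_amplitude_factor(joint_id)                   # step 2d04
        r_occlude = occlusion_factor(raw_xyz, ref_xyz)                # steps 2d05-2d07
        param = target_smoothing_parameter(orig_param, r_conf, r_area,
                                           r_joint, r_occlude)        # step 2d08
        # Step 2d09: a larger target parameter means stronger smoothing, realized
        # here by a smaller Holt level coefficient (assumed mapping).
        alpha = min(1.0, 1.0 / max(param, 1e-6))
        for s in smoothers[joint_id]:
            s.alpha = alpha
        return tuple(s.update(v) for s, v in zip(smoothers[joint_id], raw_xyz))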
The embodiment of the application provides an image processing device which can be an electronic device. Specifically, the image processing apparatus is configured to perform the steps performed by the electronic device in the above image processing method. The image processing device provided by the embodiment of the application can comprise modules corresponding to the corresponding steps.
The embodiment of the present application may divide the functional modules of the image processing apparatus according to the above-described method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. The division of the modules in the embodiment of the application is schematic, only one logic function is divided, and other division modes can be adopted in actual implementation.
Fig. 3 shows a possible configuration diagram of the image processing apparatus involved in the above-described embodiment in the case where respective functional blocks are divided with corresponding respective functions. As shown in fig. 3, the image processing apparatus 3 is applied to an electronic device; the device comprises:
an acquiring unit 30, configured to acquire a key point of a target object in a target image, where the target object is a photographed object;
a determining unit 31, configured to determine a smoothing parameter impact factor of the keypoint according to the target image, where the smoothing parameter impact factor is used to indicate an impact degree of an associated attribute of the keypoint on an original smoothing parameter of the keypoint;
the determining unit 31 is further configured to determine a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter;
and a smoothing unit 32, configured to perform smoothing processing on the key points according to the target smoothing parameter.
In one possible embodiment, the association attribute includes at least one of: the confidence degree of the key point, the area ratio of the detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point and the shielding condition of the key point.
In one possible embodiment, the association attribute includes a confidence level of the keypoint; in terms of the determination of the smoothing parameter impact factor of the keypoint from the target image, the determining unit 31 is specifically configured to: determining a heat map of the key points according to the target image; determining the confidence of the key point according to the heat map of the key point; and determining a confidence coefficient influence factor of the key point according to the confidence coefficient of the key point, wherein the relation between the numerical value of the confidence coefficient and the numerical value of the confidence coefficient influence factor is a negative correlation.
In a possible embodiment, the association attribute includes an area ratio of a detection frame to which the key point belongs to the target image; in terms of the determination of the smoothing parameter impact factor of the keypoint from the target image, the determining unit 31 is specifically configured to: determining a detection frame of the target object according to the target image; calculating the area ratio of the detection frame to the target image; and determining a detection frame influence factor of the key point according to the area proportion, wherein the relationship between the value of the area proportion and the value of the detection frame influence factor is negative correlation.
In one possible embodiment, the associated attribute includes a motion amplitude characteristic of the keypoint; in terms of the determination of the smoothing parameter impact factor of the keypoint from the target image, the determining unit 31 is specifically configured to: inquiring a preset mapping relation according to the identification of the key point, and determining a motion amplitude influence factor corresponding to the identification, wherein the mapping relation comprises a correspondence relation between the identification of the key point and the motion amplitude influence factor, the identification is used for indicating the motion amplitude characteristic of the key point, and the relationship between the motion amplitude characteristic and the motion amplitude influence factor is negative correlation.
In a possible embodiment, the association attribute includes an occlusion condition of the key point; in terms of the determination of the smoothing parameter impact factor of the keypoint from the target image, the determining unit 31 is specifically configured to: determining the shielding condition of the key points according to the coordinate relation between the key points and the reference key points symmetrical to the key points; if the shielding condition is that the key point is not shielded, determining that the shielding influence factor of the key point is a first shielding influence factor; and if the shielding condition is that the key point is shielded, determining that the shielding influence factor of the key point is a second shielding influence factor, wherein the first shielding influence factor is smaller than the second shielding influence factor.
In one possible embodiment, in terms of the key points of the target object in the acquired target image, the acquiring unit 30 is specifically configured to: determining a detection frame of the target object according to the target image; and determining the key point according to the detection frame.
In one possible embodiment, the target object comprises a human body; the key points comprise joint points of the human body; the association attribute comprises the confidence coefficient of the key point, the area proportion of a detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point and the shielding condition of the key point; in terms of the determining the target smoothing parameter of the keypoint based on the smoothing parameter influence factor and the original smoothing parameter, the determining unit 31 is specifically configured to: calculating the product of the original smoothing parameter and a confidence factor, a detection frame factor, a motion amplitude factor and a shielding factor, wherein the product is the target smoothing parameter; the confidence factor is an influence factor corresponding to the confidence, the detection frame influence factor is an influence factor corresponding to the area proportion, the motion amplitude influence factor is an influence factor corresponding to the motion amplitude characteristic, and the shielding influence factor is an influence factor corresponding to the shielding condition.
In the case of using an integrated unit, a schematic structural diagram of another image processing apparatus provided in an embodiment of the present application is shown in fig. 4. In fig. 4, the image processing apparatus 4 includes: a processing module 40 and a communication module 41. The processing module 40 is configured to control and manage actions of the device control apparatus, e.g., steps performed by the acquisition unit 30, the determination unit 31, the smoothing unit 32, and/or other processes for performing the techniques described herein. The communication module 41 is used to support interactions between the device control apparatus and other devices. As shown in fig. 4, the image processing apparatus may further include a storage module 42, the storage module 42 storing program codes and data of the image processing apparatus.
The processing module 40 may be a processor or controller, such as a central processing unit (Central Processing Unit, CPU), a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules and circuits described in connection with this disclosure. The processor may also be a combination that performs computing functions, for example a combination comprising one or more microprocessors, a combination of a DSP and a microprocessor, and the like. The communication module 41 may be a transceiver, an RF circuit, a communication interface, or the like. The memory module 42 may be a memory.
All relevant contents of each scenario related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein. The image processing apparatus 3 and the image processing apparatus 4 may each perform the steps performed by the electronic device in the image processing method shown in fig. 2 a.
The embodiments of the present application have been described in detail above. The principles and implementations of the present application are explained herein using specific examples, which are provided only to help understand the method and core concept of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the idea of the present application; in summary, this description should not be construed as limiting the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any other combination. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer program are loaded or executed on a computer, the processes or functions described in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website site, computer, server, or data center to another website site, computer, server, or data center by wired or wireless means. The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more sets of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
The embodiment of the application also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program makes a computer execute part or all of the steps of any one of the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform part or all of the steps of any one of the methods described in the method embodiments above. The computer program product may be a software installation package, said computer comprising an electronic device.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed method, apparatus and system may be implemented in other manners. For example, the device embodiments described above are merely illustrative; for example, the division of the units is only one logic function division, and other division modes can be adopted in actual implementation; for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may be physically included separately, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Although the present invention is disclosed above, the present invention is not limited thereto. Variations and modifications, including combinations of the different functions and implementation steps, as well as embodiments of the software and hardware, may be readily apparent to those skilled in the art without departing from the spirit and scope of the invention.

Claims (10)

1. An image processing method, comprising:
acquiring key points of a target object in a target image, wherein the target object is a photographed object, and the key points refer to structural feature points of the target object;
determining a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the degree of influence of the association attribute of the key point on the original smoothing parameter of the key point, and the association attribute comprises at least one of the following: the confidence of the key point, the area ratio of the detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point and the shielding condition of the key point, wherein the confidence is the predicted probability of the peak point, namely the coordinate point with the largest predicted probability, in the position distribution area, in the target image, of the predicted coordinates of the key point from the previous frame image of the target image; the motion amplitude characteristic is predicted according to the position of the key point in the image area of the target object and the motion characteristic of the target object, and the smoothing parameter influence factor corresponding to the motion amplitude characteristic is a motion amplitude influence factor; the determining the smoothing parameter influence factor of the key point according to the target image comprises: setting the identification of a key point with a low motion amplitude characteristic as a first-class identification and setting the identification of a key point with a high motion amplitude characteristic as a second-class identification; when the identification is the first-class identification, the joint points of the trunk part of the target object have a small amplitude due to less movement and the need for smoothing is high, so the value of the motion amplitude influence factor is set to be larger; when the identification is the second-class identification, the joint points of the four limbs of the target object have a high amplitude due to frequent movement, low delay is needed and the need for smoothing is low, so the value of the motion amplitude influence factor is set to be smaller;
Determining target smoothing parameters of the key points according to the smoothing parameter influence factors and the original smoothing parameters;
and carrying out smoothing processing on the key points according to the target smoothing parameters.
2. The method of claim 1, wherein the association attribute comprises a confidence level of the keypoint;
the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps:
determining a heat map of the key points according to the target image;
determining the confidence level of the key point according to the heat map of the key point;
and determining a confidence coefficient influence factor of the key point according to the confidence coefficient of the key point, wherein the relation between the numerical value of the confidence coefficient and the numerical value of the confidence coefficient influence factor is a negative correlation.
3. The method of claim 1, wherein the associated attribute comprises an area ratio of a detection box to which the keypoint belongs to the target image;
the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps:
determining a detection frame of the target object according to the target image;
calculating the area ratio of the detection frame to the target image;
And determining a detection frame influence factor of the key point according to the area proportion, wherein the relationship between the numerical value of the area proportion and the numerical value of the detection frame influence factor is negative correlation.
4. The method of claim 1, wherein the correlation attribute comprises a motion amplitude characteristic of the keypoint;
the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps:
querying a preset mapping relationship according to the identification of the key point, and determining the motion amplitude influence factor corresponding to the identification, wherein the mapping relationship comprises the correspondence between the identification of the key point and the motion amplitude influence factor, the identification is used for indicating the motion amplitude characteristic of the key point, and the motion amplitude characteristic and the motion amplitude influence factor are negatively correlated.
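One way the preset mapping of claim 4 could be represented is a lookup table keyed by the key point identification. The COCO-style indices and the concrete factor values below are hypothetical; only the structure (low-amplitude trunk joints mapped to larger factors, high-amplitude limb joints to smaller factors) follows the claims.

```python
# Hypothetical key point indices grouped by body part (COCO-style ordering assumed).
TORSO_KEYPOINTS = {5, 6, 11, 12}                    # shoulders and hips: low motion amplitude
LIMB_KEYPOINTS = {7, 8, 9, 10, 13, 14, 15, 16}      # elbows, wrists, knees, ankles: high amplitude

# Preset mapping: first-type (low-amplitude) identifications get a larger factor,
# second-type (high-amplitude) identifications get a smaller factor.
MOTION_AMPLITUDE_FACTORS = {
    **{idx: 1.5 for idx in TORSO_KEYPOINTS},
    **{idx: 0.8 for idx in LIMB_KEYPOINTS},
}

def motion_amplitude_influence_factor(keypoint_id, default=1.0):
    """Look up the motion amplitude influence factor for a key point identification."""
    return MOTION_AMPLITUDE_FACTORS.get(keypoint_id, default)
```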
5. The method of claim 1, wherein the association attribute comprises the occlusion condition of the key point;
the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps:
determining the occlusion condition of the key point according to the coordinate relation between the key point and a reference key point symmetrical to the key point;
if the occlusion condition is that the key point is not occluded, determining that the occlusion influence factor of the key point is a first occlusion influence factor;
and if the occlusion condition is that the key point is occluded, determining that the occlusion influence factor of the key point is a second occlusion influence factor, wherein the first occlusion influence factor is smaller than the second occlusion influence factor.
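A sketch of claim 5 under one plausible reading of the coordinate relation: a key point that nearly coincides with its left/right mirror is treated as occluded and receives the larger, second occlusion influence factor. The distance threshold and both factor values are placeholders.

```python
import numpy as np

def occlusion_influence_factor(keypoint, mirror_keypoint, distance_threshold,
                               unoccluded_factor=1.0, occluded_factor=1.5):
    """Estimate occlusion from the distance to the symmetric reference key point."""
    keypoint = np.asarray(keypoint, dtype=float)
    mirror_keypoint = np.asarray(mirror_keypoint, dtype=float)
    occluded = float(np.linalg.norm(keypoint - mirror_keypoint)) < distance_threshold
    # An occluded key point gets the larger factor, i.e. stronger smoothing.
    return occluded_factor if occluded else unoccluded_factor
```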
6. The method according to any one of claims 1-5, wherein the acquiring the key points of the target object in the target image comprises:
determining a detection frame of the target object according to the target image;
and determining the key points according to the detection frame.
7. The method of any one of claims 1-5, wherein the target object comprises a human body; the determining the target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter comprises the following steps:
calculating the product of the original smoothing parameter and the confidence influence factor, the detection frame influence factor, the motion amplitude influence factor and the occlusion influence factor, wherein the product is the target smoothing parameter;
the confidence influence factor is the influence factor corresponding to the confidence, the detection frame influence factor is the influence factor corresponding to the area ratio, the motion amplitude influence factor is the influence factor corresponding to the motion amplitude characteristic, and the occlusion influence factor is the influence factor corresponding to the occlusion condition.
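Claim 7 reduces to a single product; the sketch below transcribes it directly, with illustrative parameter names. Combined with the earlier sketches, the result would be the per-key-point value fed to the smoothing step (clamped to whatever range the chosen filter expects).

```python
def target_smoothing_parameter(original, confidence_factor, frame_factor,
                               motion_factor, occlusion_factor):
    """Target smoothing parameter = original parameter times the four influence factors."""
    return original * confidence_factor * frame_factor * motion_factor * occlusion_factor
```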
8. An image processing apparatus, comprising:
an acquisition unit, configured to acquire key points of a target object in a target image, wherein the target object is a shot object, and the key points are structural feature points of the target object;
a determining unit, configured to determine a smoothing parameter influence factor of the key point according to the target image, wherein the smoothing parameter influence factor is used for indicating the degree of influence of an association attribute of the key point on an original smoothing parameter of the key point, and the association attribute comprises at least one of the following: the confidence of the key point, the area ratio of the detection frame to which the key point belongs to the target image, the motion amplitude characteristic of the key point, and the occlusion condition of the key point; the motion amplitude characteristic is predicted according to the position of the key point in the image region of the target object and the motion characteristic of the target object, and the smoothing parameter influence factor corresponding to the motion amplitude characteristic is a motion amplitude influence factor; the determining the smoothing parameter influence factor of the key point according to the target image comprises the following steps: setting the identification of a key point with a low motion amplitude to a first-type identification, and setting the identification of a key point with a high motion amplitude to a second-type identification; when the identification is the first-type identification, the motion amplitude of the joint points of the trunk of the target object is small because they move little, a high degree of smoothing is required, and the value of the motion amplitude influence factor is set to a larger value; when the identification is the second-type identification, the motion amplitude of the joint points of the limbs of the target object is large because they move frequently, low latency is required, a low degree of smoothing suffices, and the value of the motion amplitude influence factor is set to a smaller value;
The determining unit is further configured to determine a target smoothing parameter of the key point according to the smoothing parameter influence factor and the original smoothing parameter;
and a smoothing unit, configured to smooth the key point according to the target smoothing parameter.
9. An electronic device, comprising:
one or more processors;
one or more memories for storing a program,
wherein the program is configured to be executed by the one or more processors to cause the electronic device to perform the steps in the method of any one of claims 1-7.
10. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202110640062.9A 2021-06-08 2021-06-08 Image processing method and related device Active CN113421196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110640062.9A CN113421196B (en) 2021-06-08 2021-06-08 Image processing method and related device

Publications (2)

Publication Number Publication Date
CN113421196A CN113421196A (en) 2021-09-21
CN113421196B true CN113421196B (en) 2023-08-11

Family

ID=77788086

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110640062.9A Active CN113421196B (en) 2021-06-08 2021-06-08 Image processing method and related device

Country Status (1)

Country Link
CN (1) CN113421196B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296735A (en) * 2016-08-05 2017-01-04 海信集团有限公司 Filter update method, device and intelligent terminal in target following
CN107680128A (en) * 2017-10-31 2018-02-09 广东欧珀移动通信有限公司 Image processing method, device, electronic equipment and computer-readable recording medium
CN111670457A (en) * 2017-12-03 2020-09-15 脸谱公司 Optimization of dynamic object instance detection, segmentation and structure mapping
CN110401784A (en) * 2018-04-24 2019-11-01 展讯通信(天津)有限公司 Motion smoothing method, system and the video equipment of automatic adjusument filtering strength
US10620713B1 (en) * 2019-06-05 2020-04-14 NEX Team Inc. Methods and systems for touchless control with a mobile device
CN110245623A (en) * 2019-06-18 2019-09-17 重庆大学 A kind of real time human movement posture correcting method and system
CN110264430A (en) * 2019-06-29 2019-09-20 北京字节跳动网络技术有限公司 Video beautification method, device and electronic equipment
WO2021034211A1 (en) * 2019-08-16 2021-02-25 Станислав Игоревич АШМАНОВ Method and system of transfer of motion of subject from video onto animated character
JP2021077065A (en) * 2019-11-08 2021-05-20 Kddi株式会社 Image processing apparatus, information processing terminal, server, image processing method, and program
CN111090688A (en) * 2019-12-23 2020-05-01 北京奇艺世纪科技有限公司 Smoothing processing method and device for time sequence data
CN111028346A (en) * 2019-12-23 2020-04-17 北京奇艺世纪科技有限公司 Method and device for reconstructing video object
KR102240403B1 (en) * 2019-12-24 2021-04-14 아주대학교 산학협력단 Image rectification method and image rectification apparatus
CN111327828A (en) * 2020-03-06 2020-06-23 Oppo广东移动通信有限公司 Photographing method and device, electronic equipment and storage medium
CN111523468A (en) * 2020-04-23 2020-08-11 北京百度网讯科技有限公司 Human body key point identification method and device
CN112381837A (en) * 2020-11-12 2021-02-19 联想(北京)有限公司 Image processing method and electronic equipment
CN112488064A (en) * 2020-12-18 2021-03-12 平安科技(深圳)有限公司 Face tracking method, system, terminal and storage medium
CN112800850A (en) * 2020-12-31 2021-05-14 上海商汤智能科技有限公司 Video processing method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on motion retargeting methods for joint coordinate motion data; Zhou Yang; China Excellent Master's Theses Full-text Database, Information Science and Technology; I138-1573 *

Also Published As

Publication number Publication date
CN113421196A (en) 2021-09-21

Similar Documents

Publication Publication Date Title
WO2020103647A1 (en) Object key point positioning method and apparatus, image processing method and apparatus, and storage medium
CN107230225B (en) Method and apparatus for three-dimensional reconstruction
US20180307928A1 (en) Living face verification method and device
CN110909580B (en) Data processing method and device, electronic equipment and storage medium
US7333133B2 (en) Recursive least squares approach to calculate motion parameters for a moving camera
CN111090688B (en) Smoothing processing method and device for time sequence data
CN112562068B (en) Human body posture generation method and device, electronic equipment and storage medium
CN104809687A (en) Three-dimensional human face image generation method and system
US20200380250A1 (en) Image processing method and apparatus, and computer storage medium
KR20230035382A (en) Height measurement method and device, and terminal
CN113421196B (en) Image processing method and related device
CN113984068A (en) Positioning method, positioning apparatus, and computer-readable storage medium
CN115705651A (en) Video motion estimation method, device, equipment and computer readable storage medium
JP6839116B2 (en) Learning device, estimation device, learning method, estimation method and computer program
CN115533906A (en) Robot control method, device, electronic device and storage medium
CN115435790A (en) Method and system for fusing visual positioning and visual odometer pose
CN113489897B (en) Image processing method and related device
CN115223240A (en) Motion real-time counting method and system based on dynamic time warping algorithm
CN114564014A (en) Object information determination method, mobile robot system, and electronic device
CN110827226B (en) Skeleton point smoothing method and device and electronic equipment
CN111684489B (en) Image processing method and device
CN110443887B (en) Feature point positioning method, device, reconstruction method, system, equipment and medium
CN117275089A (en) Character recognition method, device and equipment for monocular camera and storage medium
CN111784733A (en) Image processing method, device, terminal and computer readable storage medium
US20230290101A1 (en) Data processing method and apparatus, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant