CN117788316A - Image processing method, apparatus, electronic device, medium, and computer program product - Google Patents


Info

Publication number
CN117788316A
Authority
CN
China
Prior art keywords
image
layer
images
common
common layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410176399.2A
Other languages
Chinese (zh)
Inventor
方智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202410176399.2A
Publication of CN117788316A


Abstract

The application discloses an image processing method, an image processing apparatus, an electronic device, a medium, and a computer program product, belonging to the technical field of electronic devices. The method includes: determining a common layer from at least two images when the similarity of the at least two images is greater than or equal to a preset similarity; and storing the at least two images according to the common layer and the non-common layers of the at least two images.

Description

Image processing method, apparatus, electronic device, medium, and computer program product
Technical Field
The application belongs to the technical field of electronic equipment, and particularly relates to an image processing method, an image processing device, electronic equipment, a medium and a computer program product.
Background
With the development of electronic technology, electronic devices store more and more images. For example, in daily use or travel photography, to ensure that the most satisfactory photograph is captured, a user often shoots the same scene repeatedly, producing multiple similar images of that scene.
However, as the number of similar images of the same scene stored in the electronic device grows, so does the storage space required to hold them, which can leave the electronic device with insufficient remaining storage space.
Disclosure of Invention
An object of an embodiment of the present application is to provide an image processing method, an apparatus, an electronic device, a medium, and a computer program product, which can increase the remaining storage space of the electronic device.
In a first aspect, an embodiment of the present application provides an image processing method, including: determining a common layer from at least two images when the similarity of the at least two images is greater than or equal to a preset similarity; and storing the at least two images according to the common layer and the non-common layers of the at least two images.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including: a determining module and a storing module; the determining module is used for determining a shared image layer from at least two images under the condition that the similarity of the at least two images is greater than or equal to the preset similarity; and the storage module is used for storing at least two images according to the shared image layer and the non-shared image layer of the at least two images, which are determined by the determination module.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, when the similarity of at least two images is greater than or equal to the preset similarity, a common layer is determined from the at least two images, and the at least two images are stored according to the common layer and the non-common layers of the at least two images. Through this scheme, repeated storage of the same image content in the at least two images can be avoided, the storage space occupied by storing the at least two images is reduced, and the remaining storage space of the electronic device is thereby increased.
Drawings
FIG. 1 is one of the flowcharts of an image processing method provided in an embodiment of the present application;
FIG. 2 is an example schematic diagram of an image stored by an electronic device provided in an embodiment of the present application;
fig. 3 is an example schematic diagram of a layer of an image obtained by an electronic device according to an embodiment of the present application;
fig. 4 is an example schematic diagram of an electronic device for identifying a layer according to an embodiment of the present application;
fig. 5 is a schematic diagram of an example of an electronic device display image according to an embodiment of the present application;
FIG. 6 is a second flowchart of an image processing method according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
fig. 9 is a second schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first," "second," and the like in the description of the present application are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. The objects identified by "first," "second," etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more. In addition, "and/or" in the specification denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The term "at least one" in the description of the present application refers to any one, any two, or any combination of two or more of the objects it covers. For example, at least one of a, b, and c may represent: "a", "b", "c", "a and b", "a and c", "b and c", or "a, b and c", where each of a, b, and c may be singular or plural. Similarly, "at least two" means two or more, with a meaning analogous to that of "at least one".
The image processing method, apparatus, electronic device, medium and computer program product provided in the embodiments of the present application are described in detail below with reference to the accompanying drawings by means of specific embodiments and application scenarios thereof.
The image processing method, the image processing device, the electronic equipment, the medium and the computer program product provided by the embodiment of the application can be applied to a scene for storing images in the electronic equipment.
With the development of the photographing technology of electronic devices, a single image captured by an electronic device has increasingly high definition and, correspondingly, occupies increasingly large storage space, which poses a great challenge to the storage space of the electronic device.
Typically, a single image captured by an electronic device occupies between 1-2 MB and 7-8 MB of storage space. In daily use and travel photography, to ensure that the most satisfactory picture is captured, a user commonly shoots the same scene multiple times and later uses the pictures in a video clip or a social media post.
However, users often do not bother to delete the remaining photos, so seven or eight photos of the same scene may accumulate and occupy considerable storage space, greatly consuming the storage space of the electronic device. Moreover, if the images are compressed before storage, the user has to decompress them synchronously when previewing, which degrades the viewing experience, and frequent decompression also affects battery life. As a result, the remaining storage space of the electronic device may become insufficient.
In the embodiment of the application, the storage space occupied by the same image content in the at least two images can be reduced, the storage space occupied by storing the at least two images is reduced, and the remaining storage space of the electronic equipment is further increased.
The execution subject of the image processing method provided in the embodiment of the present application may be an image processing apparatus. The image processing apparatus may be an electronic device or a component in the electronic device, such as an integrated circuit or a chip, for example. The image processing method provided in the embodiment of the present application will be exemplarily described below by taking an electronic device as an example.
An embodiment of the present application provides an image processing method, and fig. 1 shows a flowchart of the image processing method provided in the embodiment of the present application, where the method may be applied to an electronic device. As shown in fig. 1, the image processing method provided in the embodiment of the present application may include the following steps 101 and 102.
Step 101, the electronic device determines a common image layer from at least two images when the similarity of the at least two images is greater than or equal to a preset similarity.
In some embodiments of the present application, the at least two images may be images in an electronic device. It is understood that the image in the electronic device may include an image captured by the electronic device, an image obtained by the electronic device capturing the content displayed on the screen, an image received by the electronic device from another device, an image downloaded by the electronic device from a network, and an image in a video.
In some embodiments of the present application, the at least two images may belong to the same video or different videos.
In some embodiments of the present application, the same video or different videos to which the at least two images belong may be videos stored in the electronic device, or may be videos acquired by the electronic device from other devices. The embodiments of the present application are not particularly limited.
The at least two images may be at least two frames of images in a video stored in the electronic device, for example.
Thus, the electronic device can improve the flexibility of acquiring at least two images with the similarity greater than or equal to the preset similarity.
In some embodiments of the present application, the similarity of the at least two images being greater than or equal to a preset similarity may indicate: the image content similarity of the at least two images is greater than or equal to a preset similarity. In other words, at least part of the same image content exists in the at least two images, and the at least two images are similar images.
The at least two images may be multiple images captured by the electronic device in the same capture scene.
The at least two images may be, for example, a plurality of images acquired by the electronic device from other devices that contain at least partially identical image content.
In some embodiments of the present application, the preset similarity may be default to the electronic device or may be set by a user. The embodiments of the present application are not particularly limited.
In some embodiments of the present application, the electronic device may calculate, through an image intelligent comparison algorithm, a similarity between image contents of images stored by the electronic device, so as to determine at least two images with image content similarity greater than or equal to a preset similarity.
For example, the electronic device may calculate the similarity between the image contents of the images through a histogram algorithm, thereby determining at least two images having a similarity greater than or equal to a preset similarity.
The electronic device may also calculate a similarity between image contents of the images through a gray scale map algorithm, thereby determining at least two images having a similarity greater than or equal to a preset similarity.
In practical implementation, the electronic device may further calculate the similarity between the image contents of the images through any other image intelligent comparison algorithm, so as to determine at least two images with the similarity greater than or equal to the preset similarity. The embodiments of the present application are not particularly limited.
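As a concrete illustration of the histogram approach mentioned above, the following is a minimal sketch of comparing two grayscale images by histogram intersection. This is not the patent's actual algorithm; the function name, bin count, and intersection metric are illustrative assumptions:

```python
import numpy as np

def histogram_similarity(img_a: np.ndarray, img_b: np.ndarray, bins: int = 32) -> float:
    """Compare two grayscale uint8 images by normalized histogram
    intersection; returns a similarity score in [0, 1]."""
    hist_a, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    hist_b, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    hist_a = hist_a / hist_a.sum()
    hist_b = hist_b / hist_b.sum()
    # Histogram intersection: sum the per-bin minima of the two distributions.
    return float(np.minimum(hist_a, hist_b).sum())
```

Two images of the same scene share most of their tonal distribution and score near 1, so a threshold such as the 50% preset similarity used in the example below could be applied to the returned value.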
For example, assume the preset similarity is 50%. As shown in fig. 2, suppose four images are stored in the electronic device: an image 201, an image 202, an image 203, and an image 204. By comparing the image content of the four images, the electronic device may determine that the image 201, the image 202, and the image 203 each include the image content of sun, snow, and maple leaves, where the image 201, the image 202, and the image 203 are images of the same shooting scene, and the image 204 is an image of another shooting scene. The image intelligent comparison algorithm of the electronic device determines that the similarity between the image 201, the image 202, and the image 203 is 60%, so the electronic device can determine the image 201, the image 202, and the image 203 to be three images with a similarity greater than or equal to the preset similarity. Meanwhile, the image 201 and the image 202 also include a cart and a person at different positions, and the image 203 also includes a person but no cart.
In some embodiments of the present application, the electronic device may acquire at least two images in a preset scene, and calculate the similarity of the images in the electronic device.
For example, assuming that the above-mentioned preset scenario is that the electronic device is idle and has sufficient electric power, the electronic device may acquire at least two images with a similarity greater than or equal to the preset similarity under the condition that the electronic device is idle and has sufficient electric power.
It should be noted that, when acquiring the at least two images, the electronic device may preferentially compare images with similar storage times. As shown in fig. 2, the storage time of the image 201 is 13:23:11 on August 23, 2023, the storage time of the image 202 is 13:23:21 on August 23, 2023, the storage time of the image 203 is 15:33:31 on September 13, 2023, and the storage time of the image 204 is 15:23:11 on August 24, 2023. Since the storage times of the image 201 and the image 202 are close, the electronic device may compare them preferentially to obtain at least two images with a similarity greater than or equal to the preset similarity. It can be understood that the calculation load on the electronic device is moderate in this case, which is suitable for optimizing the storage space of the electronic device in real time.
For example, assuming that the above-mentioned preset scenario is that the electronic device is charging, the electronic device may, while charging, perform a traversal comparison between the images stored within a preset time period and all the images stored in the electronic device, so as to obtain at least two images with a similarity greater than or equal to the preset similarity.
It can be understood that the calculation amount of the electronic device is larger at this time, so that the storage space of the electronic device can be optimized to the maximum extent.
For example, assuming that the above-mentioned preset scenario is that an image is being stored in the electronic device, for example when the electronic device captures an image, the electronic device may perform a traversal comparison between that image and all the images stored in the electronic device, so as to obtain at least two images with a similarity greater than or equal to the preset similarity.
It should be noted that, when the electronic device stores the image, the electronic device may also store the image separately first, and compare the image with other images stored in the electronic device in a preset scene later, so as to obtain at least two images with similarity greater than or equal to the preset similarity. The embodiments of the present application are not particularly limited.
In some embodiments of the present application, the common layer may include the same image content in the at least two images.
In some embodiments of the present application, after the electronic device determines the common image layer from at least two images, the non-common image layer corresponding to each image may also be determined from the at least two images.
It is understood that the non-common layer corresponding to each of the at least two images may include image contents of the at least two images other than the image contents of the common layer.
In some embodiments of the present application, for each of at least two images acquired by an electronic device, the electronic device may perform image layering on the at least two images, and extract the same image content in the at least two images by using an artificial intelligence (Artificial Intelligence, AI) algorithm, so as to separate a common image layer and at least two non-common image layers.
It will be appreciated that the content of the images in the common layer separated by each of the at least two images is the same. The content of the images in the non-shared image layer separated from each of the at least two images is different, and the non-shared image layer may be a private image layer of the corresponding image.
Illustratively, in conjunction with fig. 2, as shown in fig. 3, the electronic device may perform layering processing on the image 201 to obtain a shared layer 301 and a non-shared layer 302; layering the image 202 to obtain a shared image layer 301 and a non-shared image layer 303; the image 203 is layered to obtain a shared layer 301 and a non-shared layer 304. Wherein the common layer 301 comprises sun, snow and maple leaves; the non-common layer 302 includes characters and carts therein; the non-common layer 303 includes a person and a cart, and the positions of the person and the cart in the non-common layer 303 are different from those in the non-common layer 302; only characters are included in the non-common layer 304.
It will be appreciated that non-common layer 302 is a private layer of image 201, non-common layer 303 is a private layer of image 202, and non-common layer 304 is a private layer of image 203; common layer 301 is a common layer corresponding to image 201, image 202, and image 203.
In some embodiments of the present application, the electronic device may indicate the at least two images through the obtained at least two non-common image layers. That is, the common layer obtained by the electronic device may not be counted as a picture, and the presence of the common layer does not change the number of images stored in the electronic device.
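The layer separation described above can be sketched at the pixel level as follows. Note the hedge: the patent relies on an AI extraction algorithm to identify shared content, whereas this toy version simply treats pixels that are identical across all images as the common layer:

```python
import numpy as np

def split_layers(images: list) -> tuple:
    """Toy layer separation: pixels identical in every image form the
    common layer; each image keeps its remaining pixels as a private,
    non-common layer (zero marks 'no content' in this sketch)."""
    stack = np.stack(images)                  # shape (n, H, W)
    same = np.all(stack == stack[0], axis=0)  # True where all images agree
    common = np.where(same, stack[0], 0)      # shared content only
    privates = [np.where(same, 0, img) for img in images]
    return common, privates
```

Each original image can then be recovered as `common + private`, since the two masks are complementary, and only one copy of `common` needs to be stored for the whole group.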
Step 102, the electronic device stores at least two images according to the shared image layer and the non-shared image layer of the at least two images.
In some embodiments of the present application, the electronic device may store a determined common layer in association with the non-common layer of each of the at least two images, thereby storing the at least two images. By extracting the image content that the at least two images have in common into the common layer, the remaining image content of each image is separated into its private non-common layer. Since the common layer needs to be stored only once, the consumption of storage space is greatly reduced, achieving long-term sustainability of the electronic device's storage space while increasing its remaining storage space.
For example, as shown in fig. 2, assume that four images, that is, an image 201, an image 202, an image 203, and an image 204, are stored in the electronic device, and that the image 201 occupies 5MB of storage space, the image 202 occupies 6MB of storage space, the image 203 occupies 4MB of storage space, and the image 204 occupies 7MB of storage space. I.e. the four images occupy a total of 22MB of memory, with 15MB of memory occupied by images 201, 202 and 203. Referring to fig. 2, as shown in fig. 3, the electronic device performs layering processing on an image 201, an image 202, and an image 203, and obtains a 3MB shared layer 301, a 2MB non-shared layer 302, a 3MB non-shared layer 303, and a 1MB non-shared layer 304. After the electronic device stores the obtained common layer 301, the non-common layer 302, the non-common layer 303 and the non-common layer 304 in an associated manner, the storage space occupied by the common layer 301, the non-common layer 302, the non-common layer 303 and the non-common layer 304 is 9MB, which greatly reduces the occupation of the storage space compared with 15MB occupied by the image 201, the image 202 and the image 203.
In some embodiments of the present application, the electronic device may associate one common layer with the non-common layer of each image by adding the same identifier to them, thereby storing the at least two images.
In some embodiments of the present application, the electronic device may further store the at least two images by creating a mapping table to associate the one common layer with the non-common layer in each image.
In a practical implementation, the electronic device may store the at least two images by associating a common image layer with a non-common image layer in each image in any other manner. The embodiments of the present application are not particularly limited.
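The identifier- and mapping-table-based association described above could be sketched as follows. The table layout, file paths, and the tag "#123" are illustrative assumptions, not the patent's actual storage format:

```python
# Hypothetical mapping table: one shared identifier associates a single
# common layer with the non-common layer of every image in the group.
layer_table = {
    "#123": {
        "common_layer": "layers/common_123.png",
        "images": {
            "IMG_201": {"non_common_layer": "layers/201_private.png"},
            "IMG_202": {"non_common_layer": "layers/202_private.png"},
            "IMG_203": {"non_common_layer": "layers/203_private.png"},
        },
    },
}

def find_layers(image_id: str):
    """Return the identifier, common-layer path, and private-layer path
    needed to rebuild the given image, or None if it is not layered."""
    for tag, group in layer_table.items():
        if image_id in group["images"]:
            return tag, group["common_layer"], group["images"][image_id]["non_common_layer"]
    return None
```

A lookup such as `find_layers("IMG_201")` yields everything needed to locate and superimpose the two layers when the image is displayed.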
In some embodiments of the present application, for the image frames of a video stored in the electronic device, the electronic device may perform layer separation processing on at least two image frames of the video to obtain a common layer and at least two non-common layers corresponding to the at least two image frames, and store the at least two image frames in an associated manner. Thus, by avoiding repeated storage of the same image content, the electronic device can reduce the storage space required to store the video.
It will be appreciated that a video itself is made up of image frames; for example, if smooth video playback requires a base frame rate of 20 frames per second, then each second of video contains at least 20 images. In some static scenes in a video, most elements of the image frames within the scene are similar, so the electronic device can perform intelligent layer separation on the plurality of image frames to obtain a common layer shared by the plurality of image frames and a plurality of private non-common layers.
In some embodiments of the present application, the electronic device may store the common layer shared by the plurality of image frames separately, while keeping the plurality of non-common layers private to the plurality of video image frames directly in the original video and marking them as layers rather than original video frames.
According to the image processing method, the storage space occupied by the same image content in the at least two images can be reduced, the storage space occupied by storing the at least two images is reduced, and the remaining storage space of the electronic equipment is further increased.
In some embodiments of the present application, for each of the at least two images, the non-common layer of the image may be associated with image information.
In some embodiments of the present application, the image information may include at least one of: the identification of the shared layer, the storage position of the shared layer, the layer position of the shared layer, the offset information of the shared layer, the shooting time of the image, the shooting place of the image, the shooting parameter of the image and the image parameter of the image.
In some embodiments of the present application, the electronic device associating an identification of a common layer with a non-common layer of an image may indicate that the image is separated into the non-common layer and the common layer indicated by the identification. Therefore, the electronic equipment can quickly find the shared layer based on the identification of the shared layer when displaying or using the image, and the image is obtained based on the shared layer and the non-shared layer.
In some embodiments of the present application, after the electronic device obtains the shared image layer of at least two images and the non-shared image layer of each image of at least two images, the electronic device may add the same and unique identifier to the shared image layer and the non-shared image layer of each image, so that the electronic device may quickly find the shared image layer and the non-shared image layer required for displaying the image when displaying the image.
For example, for multiple images containing the same image content 1, the electronic device may add the identification 1 for one common layer corresponding to the multiple images containing the same image content 1 and for non-common layers in each image. For multiple images containing the same image content 2, the electronic device may add the identifier 2 for one common layer corresponding to the multiple images containing the same image content 2 and for non-common layers in each image.
In some embodiments of the present application, the above-described identification may include, but is not limited to: text labels, graphic combination labels, numbers, etc.
In some embodiments of the present application, the storage location of the common layer may indicate a storage address of the common layer in the electronic device.
In some embodiments of the present application, the layer position of the common layer may indicate a level of the common layer in the image. For example, the layer position of the common layer may indicate that the common layer is the bottom layer of the image, i.e. the layer position is 0.
Illustratively, the identification of the common layer includes a number, the layer position of the common layer is 0, and the layer position of the non-common layers is 1. As shown in fig. 3, after obtaining the common layer 301, the non-common layer 302, the non-common layer 303, and the non-common layer 304, the electronic device may add the same number "#123" to the common layer 301, the non-common layer 302, the non-common layer 303, and the non-common layer 304, set the level indicated by the layer position of the common layer 301 to 0, and set the level indicated by the layer positions of the non-common layer 302, the non-common layer 303, and the non-common layer 304 to 1, as shown in fig. 4. Thus, the electronic device can quickly find the layers based on their levels and number and synthesize the image.
In some embodiments of the present application, the offset information of the common layer may be used to characterize a field of view offset between a non-common layer and a common layer of different images of the at least two images.
In some embodiments of the present application, the above-described field of view offset may represent: after the electronic device determines the common layer from the at least two images, non-common layers of different ones of the at least two images are offset from a field of view of the common layer.
It can be understood that a slight field-of-view offset may exist between multiple images obtained in the same shooting scene, for example because the electronic device shakes while shooting, or because the electronic device uses a wide-angle camera for some images and a telephoto camera for others. Because of these slight offsets, when the common layer and the non-common layers extracted from the multiple images are superimposed, different coordinate origins are needed for the superposition so that the original images can be recovered.
For example, the electronic device superimposes the common layer and a non-common layer using the center of the common layer as the origin of coordinates. Referring to fig. 2, as shown in fig. 4, the electronic device may add the coordinates (0, 0.012) to the non-common layer 302, so that when superimposing the common layer 301 and the non-common layer 302, the electronic device needs to shift the center of the non-common layer 302 to the right by 0.012 units before aligning it with the center of the common layer 301 to obtain the image 201. The electronic device may add the coordinates (0, 0) to the non-common layer 303, indicating that when superimposing the common layer 301 and the non-common layer 303, the electronic device directly aligns the center of the non-common layer 303 with the center of the common layer to obtain the image 202. The electronic device may further add the coordinates (0.001, 0) to the non-common layer 304, so that when superimposing the common layer 301 and the non-common layer 304, the electronic device needs to shift the center of the non-common layer 304 to the left by 0.001 units before aligning it with the center of the common layer to obtain the image 203.
In some embodiments of the present application, the electronic device may store, in association with the common layer, the field-of-view offset between the common layer and the non-common layer of each image, so that when displaying an image, the electronic device may directly superimpose the common layer and the non-common layer based on the field-of-view offset, thereby ensuring that the displayed image is consistent with the original image.
In some embodiments of the present application, the electronic device may acquire the capturing time of an image from its own clock, or from the network time. In actual implementation, the electronic device may also acquire the capturing time of an image in any other way, which is not particularly limited in the embodiments of the present application.
In some embodiments of the present application, the electronic device may acquire the shooting location of an image through its built-in positioning system. In actual implementation, the electronic device may also acquire the shooting location of an image in any other way, which is not particularly limited in the embodiments of the present application.
Illustratively, with reference to fig. 2, as shown in fig. 3, since the image 201, the image 202, and the image 203 are decomposed into the non-common layer 302, the non-common layer 303, the non-common layer 304, and the common layer 301, the electronic device can store the image information originally stored with the image 201, the image 202, and the image 203 in the non-common layer 302, the non-common layer 303, and the non-common layer 304, respectively. For example, the electronic device may store information such as the shooting time and the longitude and latitude of the shooting location of the image 201 in the non-common layer 302.
In some embodiments of the present application, the shooting parameters of the image may include, but are not limited to: aperture, color temperature, white balance, depth of field, sensitivity, shutter speed, exposure compensation, focal length of the photographed image.
The image parameters of the image in some embodiments of the present application may include, but are not limited to: resolution, brightness, contrast, saturation, color, noise of the image.
In this way, the various pieces of image information of an image can be associated with the non-common layer of that image, so that when the image is stored as a common layer and a non-common layer, the image information is retained while the storage space occupied by the stored image is reduced.
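The association above can be sketched as a per-image record. This is an illustrative data structure only, with assumed field names and example values; the patent does not prescribe this representation.

```python
from dataclasses import dataclass, field

@dataclass
class NonCommonLayer:
    """Hypothetical record associating a non-common layer with the image
    information the text lists: the common layer's identification, storage
    position, layer position, and offset information, plus the image's
    shooting time, shooting location, and shooting/image parameters."""
    pixels: dict                       # image content unique to this image
    common_layer_id: str               # identification of the common layer
    common_layer_path: str             # storage position of the common layer
    layer_position: int = 1            # z-order relative to the common layer
    offset: tuple = (0.0, 0.0)         # field-of-view offset information
    shooting_time: str = ""            # shooting time of the image
    shooting_location: tuple = ()      # (latitude, longitude)
    shooting_params: dict = field(default_factory=dict)  # aperture, ISO, ...
    image_params: dict = field(default_factory=dict)     # resolution, ...

# example record for the non-common layer 302 of image 201 (values invented)
layer_302 = NonCommonLayer(
    pixels={(0, 1): "person"},
    common_layer_id="layer-301",
    common_layer_path="/gallery/layers/301.bin",
    offset=(0.0, 0.012),
    shooting_time="2024-02-08 10:30:00",
    shooting_location=(39.9, 116.4),
)
```

Keeping this record alongside each non-common layer preserves the original image information even though the image file itself is no longer stored.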
In some embodiments of the present application, the step 102 may include a step 102a described below.
Step 102a, the electronic device deletes the at least two images, and stores the at least two images according to the common layer and the non-common layers of the at least two images.
In some embodiments of the present application, after obtaining at least two images whose similarity is greater than or equal to the preset similarity, the electronic device may determine the common layer and the non-common layer of each of the at least two images, delete the at least two images, and store the one common layer in association with the non-common layer of each image.
In this way, the electronic device can replace the at least two images with one common layer plus the non-common layer of each image, which reduces the storage space occupied by the image content shared by the at least two images.
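Step 102a can be sketched as follows, under the same assumed dict-of-pixels representation as before; the function name and the savings count are illustrative only.

```python
def split_and_store(images, common):
    """Return the storage that replaces `images`: the one common layer
    plus, per image, only the pixels that differ from the common layer."""
    store = {"common": common, "non_common": []}
    for img in images:
        unique = {k: v for k, v in img.items() if common.get(k) != v}
        store["non_common"].append(unique)
    return store

common = {(0, 0): "sun", (1, 0): "snow", (2, 0): "maple"}
img_a = {**common, (0, 1): "person"}  # same scene, person in foreground
img_b = {**common, (0, 1): "cart"}    # same scene, cart in foreground

store = split_and_store([img_a, img_b], common)
# pixels kept drop from 8 (two full 4-pixel images) to 5 (common + uniques)
kept = len(store["common"]) + sum(len(n) for n in store["non_common"])
```

The shared scene content is stored once instead of once per image, which is where the storage-space saving comes from.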
In some embodiments of the present application, the number of the at least two images is M.
In some embodiments of the present application, the image processing method provided in the embodiments of the present application may further include step 103 described below.
Step 103, the electronic device retains the common layer when it detects a deletion instruction for N of the at least two images and N is less than M.
Wherein M is a positive integer greater than or equal to 2, and N is a positive integer.
In some embodiments of the present application, after the electronic device deletes the at least two images and stores the one common layer in association with the non-common layer of each image, if a user performs a deletion operation on one or more of the at least two images, the electronic device may delete only the non-common layers of those images, but not the common layer.
In some embodiments of the present application, when only one non-common layer remains of the non-common layers corresponding to the at least two images, the electronic device may directly superimpose that non-common layer with the common layer to obtain the image corresponding to it, and store the image directly. It will be appreciated that if this image is then deleted, both the non-common layer and the common layer are deleted.
In this way, since the electronic device can replace the images in the album with one common layer plus the non-common layer of each image, deleting an image only requires deleting that image's non-common layer. Thus, the storage space occupied by storing the images can be reduced without affecting the storage of the other images, thereby increasing the remaining storage space of the electronic device.
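The deletion rule of step 103 can be sketched as follows. This is an assumed bookkeeping scheme, not the patent's implementation: the common layer survives any deletion that leaves at least one image in the group.

```python
def delete_images(store, indices):
    """Remove the non-common layers at `indices`; retain the common layer
    as long as at least one non-common layer still references it."""
    dropped = set(indices)
    store["non_common"] = [layer for i, layer in enumerate(store["non_common"])
                           if i not in dropped]
    if not store["non_common"]:
        store["common"] = None  # last image in the group gone: drop it too
    return store

# M = 3 grouped images sharing common layer 301 (ids are illustrative)
store = {"common": {"id": 301},
         "non_common": [{"id": 302}, {"id": 303}, {"id": 304}]}

delete_images(store, [0])        # N = 1 < M = 3: common layer retained
still_shared = store["common"] is not None

delete_images(store, [0, 1])     # delete the remaining images
fully_deleted = store["common"] is None
```

Only once every image in the group has been deleted does the common layer itself become removable.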
In some embodiments of the present application, the image processing method provided in the embodiments of the present application may further include step 104 described below.
Step 104, the electronic device displays the image according to the shared image layer and the non-shared image layer of the image for each of the at least two images.
In some embodiments of the present application, the electronic device may superimpose the common layer with the non-common layer of each image respectively to obtain the image to be displayed. It will be appreciated that the image obtained by superimposing the common layer and the non-common layer is identical in image content and image quality to the original image before the layering process.
For example, in conjunction with fig. 2, as shown in fig. 5, the electronic device may superimpose the common layer 301 and the non-common layer 302, merge the sun, snow, and maple leaves in the common layer 301 and the characters and carts in the non-common layer 302 into the same image, and obtain and display the image 201.
It will be appreciated that when the user browses or previews an image, the user does not see the common layer and the non-common layer, each of which contains only part of the image content; the electronic device displays the complete image obtained by superimposing the common layer and the non-common layer of that image.
In some embodiments of the present application, when the at least two images are image frames of a video, the electronic device may, while playing the video, pre-process the non-common layers of the next several image frames and superimpose the common layer with each non-common layer to reconstruct the original video frames, so that the original playback effect is achieved when the video is played.
Therefore, the electronic device displays, based on the stored common layer and the non-common layer of at least one image, an image whose display effect and image quality are identical to those of the original image, so that the display effect and image quality are guaranteed while the storage space occupied by the images is reduced.
In some embodiments of the present application, for each of the at least two images, the non-common layers of the image are associated with a layer position of the common layer.
In some embodiments of the present application, before the step 104, the image processing method provided in the embodiments of the present application may further include a step 105 described below; the step 104 may include a step 104a described below.
Step 105, the electronic device receives a first input for adjusting a layer position of a common layer.
Step 104a, the electronic device displays the image according to the non-common layer and the adjusted common layer of the image in response to the first input.
In some embodiments of the present application, a user may adjust a layer position of a common layer, so that the electronic device may obtain and display an image that is not exactly the same as an original image based on a non-common layer of the image and the adjusted common layer, and generate a personalized image copy.
In some embodiments of the present application, the first input may be used to adjust a layer position of the common layer. In other words, the first input may be an adjustment input of a layer position of the common layer.
In some embodiments of the present application, the first input includes, but is not limited to: a touch input on the layer position of the common layer performed by the user through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiments of the present application. The specific gesture in the embodiments of the present application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, and a double-tap gesture; the click input in the embodiments of the present application may be a single-click input, a double-click input, or any number of click inputs, and may also be a long-press input or a short-press input.
The first input may be a user setting input for a layer position of the common layer, so that the electronic device may display the image according to the non-common layer and the adjusted common layer of the image in response to the setting input.
The first input may also be voice input by the user, for example, "set the layer position of the common layer to 1", so that the electronic device may display the image according to the non-common layer and the adjusted common layer of the image in response to the voice input.
Illustratively, taking the first input as a setting input by the user on the layer position of the common layer as an example, with reference to fig. 2 and fig. 4, as shown in fig. 5, the electronic device may superimpose the non-common layer 302, whose layer position indicates layer level 1, with the common layer 301, whose layer position indicates layer level 0, to obtain the image 201 with the common layer 301 as the background and the non-common layer 302 as the foreground. The electronic device may receive a setting input from the user that changes the layer position of the common layer 301 from 0 to 2, and superimpose the non-common layer 302 at layer level 1 with the common layer 301 now at layer level 2, to obtain a copy of the image 201 with the common layer 301 as the foreground and the non-common layer 302 as the background.
Therefore, the image with personalized effect can be obtained based on the adjusted shared image layer and the non-shared image layer by adjusting the image layer position of the shared image layer, so that the flexibility of storing the image of the electronic equipment is improved.
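The layer-position adjustment above can be sketched with a painter's-order rendering rule: layers are painted in ascending layer-position order, so raising the common layer's position moves it from background to foreground. The representation and names are assumptions for illustration only.

```python
def render(layers):
    """Paint (layer_position, pixels) pairs in ascending position order;
    pixels painted later overwrite earlier ones at the same coordinate."""
    canvas = {}
    for _, pixels in sorted(layers, key=lambda layer: layer[0]):
        canvas.update(pixels)
    return canvas

common = {(0, 0): "sun", (1, 1): "snow"}      # common layer content
non_common = {(1, 1): "person"}               # overlaps the snow pixel

# layer position 0 vs 1: common layer is the background (the original image)
original = render([(0, common), (1, non_common)])
# after the first input sets the common layer's position from 0 to 2,
# the common layer becomes the foreground (a personalized copy)
adjusted = render([(2, common), (1, non_common)])
```

At the overlapping coordinate, the original shows the person in front of the snow, while the adjusted copy shows the snow in front of the person — the same stored layers, reordered.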
In some embodiments of the present application, for each of at least two images, the non-common layers of the images are associated with image parameters of the common layer.
In some embodiments of the present application, before the step 104, the image processing method provided in the embodiments of the present application may further include a step 106 described below; the step 104 may include a step 104b described below.
Step 106, the electronic device receives a second input for adjusting image parameters of the common layer.
Step 104b, the electronic device displays the image according to the non-common layer and the adjusted common layer of the image in response to the second input.
In some embodiments of the present application, a user may adjust image parameters of a common image layer, so that the electronic device may obtain and display an image that is not exactly the same as an original image based on a non-common image layer of the image and the adjusted common image layer, and generate a personalized image copy.
In some embodiments of the present application, the second input may be used to adjust an image parameter of the common layer. In other words, the second input may be an adjustment input of an image parameter of the common layer.
In some embodiments of the present application, the second input includes, but is not limited to: a touch input on the image parameters of the common layer performed by the user through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, or another feasible input, which may be determined according to actual use requirements and is not limited in the embodiments of the present application. The specific gesture in the embodiments of the present application may be any one of a single-tap gesture, a sliding gesture, a dragging gesture, a pressure-recognition gesture, a long-press gesture, an area-change gesture, a double-press gesture, and a double-tap gesture; the click input in the embodiments of the present application may be a single-click input, a double-click input, or any number of click inputs, and may also be a long-press input or a short-press input.
The second input may be, for example, a user setting input of an image parameter of the common layer, so that the electronic device may display the image according to the non-common layer and the adjusted common layer of the image in response to the setting input.
The second input may also be voice input by the user, such as "adjust the contrast of the image parameter of the common layer to 50", so that the electronic device may display the image according to the non-common layer and the adjusted common layer of the image in response to the voice input.
Illustratively, taking the second input as a setting input by the user on the saturation of the common layer as an example, the electronic device may receive a setting input from the user that changes the saturation of the common layer from 30 to 45, and superimpose the non-common layer of the image with the adjusted common layer to obtain a copy of the image with higher saturation.
Therefore, the image with personalized effect can be obtained based on the adjusted shared image layer and the adjusted non-shared image layer by adjusting the image parameters of the shared image layer, so that the flexibility of storing the image of the electronic equipment is improved.
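The parameter adjustment of step 104b can be sketched as follows. A key point worth making explicit, under this assumed representation, is that the adjustment produces a copy for display rather than mutating the stored common layer; the field names are illustrative.

```python
def adjust_saturation(layer, saturation):
    """Return a copy of the layer with its saturation parameter replaced;
    the stored layer itself is left untouched."""
    adjusted = dict(layer)
    adjusted["params"] = {**layer["params"], "saturation": saturation}
    return adjusted

# stored common layer with an illustrative image-parameter record
common = {"pixels": {(0, 0): "sun"}, "params": {"saturation": 30}}

# second input: set saturation from 30 to 45 before superposition
display_copy = adjust_saturation(common, 45)

stored_value = common["params"]["saturation"]        # original stays at 30
displayed_value = display_copy["params"]["saturation"]  # copy uses 45
```

Superimposing the non-common layer with `display_copy` instead of `common` yields the personalized, more saturated image while the stored layers remain unchanged.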
The above method embodiments, or the various possible implementations therein, may be performed individually or in any combination of two or more, as determined by actual use requirements; this is not limited in the embodiments of the present application.
The image processing method provided in the embodiment of the present application is exemplarily described below with specific examples.
As shown in fig. 6, the image processing method provided in the present application may include steps 201 to 205 described below.
Step 201, the electronic device acquires at least two images with similarity greater than or equal to preset similarity.
In some embodiments of the present application, the electronic device may determine, according to image content and image information of the images, at least two images having a similarity greater than or equal to a preset similarity.
Step 202, the electronic device determines a common layer from at least two images, and identifies layer positions and coordinates of the common layer and non-common layer of the images.
Step 203, the electronic device stores a shared layer in association with a non-shared layer in each image, and adds the same identifier for the shared layer and each non-shared layer.
And 204, the electronic equipment superimposes the stored shared image layer and the stored non-shared image layer to display images.
Step 205, the electronic device receives and responds to the input of deleting the image by the user, and deletes the non-shared image layer of the image.
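Steps 201 to 203 above can be sketched end to end. This is a deliberately simplified illustration: the similarity measure (fraction of identical pixels) and all names are assumptions, not the patent's actual similarity computation.

```python
def similarity(a, b):
    """Toy similarity: fraction of coordinates with identical content."""
    keys = set(a) | set(b)
    return sum(a.get(k) == b.get(k) for k in keys) / len(keys)

def group_and_store(images, threshold=0.5, group_id="scene-1"):
    # step 201: proceed only if the images meet the preset similarity
    if similarity(images[0], images[1]) < threshold:
        return None
    # step 202: the common layer is the content shared by all the images
    common = {k: v for k, v in images[0].items()
              if all(img.get(k) == v for img in images[1:])}
    # step 203: store the common layer with each non-common layer under
    # one shared identifier
    return {"id": group_id,
            "common": common,
            "non_common": [{k: v for k, v in img.items() if k not in common}
                           for img in images]}

img_a = {(0, 0): "sun", (1, 0): "snow", (0, 1): "person"}
img_b = {(0, 0): "sun", (1, 0): "snow", (0, 1): "cart"}

store = group_and_store([img_a, img_b])
# step 204: displaying an image re-superimposes common + non-common layers
restored = {**store["common"], **store["non_common"][0]}
```

The round trip restores the original image exactly, which is what lets step 204 preview the stored layers with the original display effect.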
According to the image processing method provided in the embodiments of the present application, after multiple similar images are identified, they can be decomposed into their respective private non-common layers and one common layer. Storing the common layer together with each private non-common layer greatly reduces storage-space consumption and increases the remaining storage space of the electronic device. During subsequent previewing, the common layer and a private non-common layer are superimposed so that the original image is displayed or exported. Moreover, because of this pre-classification, a user previewing an image can view the images grouped into the same scene, with their similarity to the previewed image displayed on the device; by comparing similarities, the user can easily identify duplicate images and delete them.
According to the image processing method provided in the embodiments of the present application, the execution subject may be an image processing apparatus. In the embodiments of the present application, the image processing apparatus is described by taking as an example the case where the image processing apparatus performs the image processing method.
Fig. 7 shows a schematic diagram of one possible configuration of an image processing apparatus involved in an embodiment of the present application. As shown in fig. 7, the image processing apparatus 70 may include: a determination module 71 and a storage module 72.
The determining module 71 is configured to determine a common layer from at least two images when the similarity of the at least two images is greater than or equal to a preset similarity; the storage module 72 is configured to store at least two images according to the shared image layer and the non-shared image layer of the at least two images determined by the determination module 71.
In one possible implementation manner, for each of the at least two images, the image information is associated with an image layer that is not shared by the images, and the image information includes at least one of the following: the identification of the shared layer, the storage position of the shared layer, the layer position of the shared layer, the offset information of the shared layer, the shooting time of the image, the shooting place of the image, the shooting parameter of the image and the image parameter of the image.
In one possible implementation manner, the apparatus further includes: a display module;
and the display module is used for displaying the images according to the shared image layer and the non-shared image layer of the images for each image in the at least two images.
In one possible implementation manner, for each of the at least two images, the non-common layers of the images are associated with a layer position of the common layer;
the device further comprises: a receiving module;
the receiving module is configured to receive, before the display module displays, for each of at least two images, the image according to the common layer and the non-common layer of the image, a first input for adjusting a layer position of the common layer;
the display module is specifically configured to display an image according to the non-shared layer of the image and the adjusted shared layer in response to the first input received by the receiving module.
In one possible implementation manner, for each of the at least two images, the non-common image layer of the image is associated with an image parameter of the common image layer;
the receiving module is configured to receive, before the display module displays, for each of the at least two images, the image according to the common layer and the non-common layer of the image, a second input for adjusting an image parameter of the common layer;
The display module is specifically configured to display the image according to the non-shared layer and the adjusted shared layer of the image in response to the second input received by the receiving module.
In one possible implementation manner, the number of the at least two images is M;
the device further comprises: a processing module;
the processing module is configured to, when a deletion instruction of N images in at least two images is detected, and N is less than M, reserve a common layer, where M is a positive integer greater than or equal to 2, and N is a positive integer.
In one possible implementation, the at least two images belong to the same video or different videos.
The embodiment of the application provides an image processing device, which can reduce the storage space occupied by the same image content in at least two images, reduce the storage space occupied by storing the at least two images, and further increase the residual storage space of electronic equipment.
The image processing apparatus in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook, or personal digital assistant (PDA), but may also be a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine, self-service machine, or the like; the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The image processing device provided in the embodiment of the present application can implement each process implemented by the embodiment of the image processing method, so as to achieve the same technical effect, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 8, the embodiment of the present application further provides an electronic device 800, including a processor 801 and a memory 802, where a program or an instruction capable of running on the processor 801 is stored in the memory 802, and the program or the instruction implements each step of the embodiment of the image processing method when executed by the processor 801, and the steps achieve the same technical effects, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: radio frequency unit 1001, network module 1002, audio output unit 1003, input unit 1004, sensor 1005, display unit 1006, user input unit 1007, interface unit 1008, memory 1009, and processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may also include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1010 by a power management system to perform functions such as managing charge, discharge, and power consumption by the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than shown, or may combine certain components, or may be arranged in different components, which are not described in detail herein.
The processor 1010 is configured to determine a common layer from at least two images when the similarity of the at least two images is greater than or equal to a preset similarity; and storing the at least two images according to the common layer and the non-common layer of the at least two images.
In one possible implementation manner, for each of the at least two images, the image information is associated with an image layer that is not shared by the images, and the image information includes at least one of the following: the identification of the shared layer, the storage position of the shared layer, the layer position of the shared layer, the offset information of the shared layer, the shooting time of the image, the shooting place of the image, the shooting parameter of the image and the image parameter of the image.
In a possible implementation manner, the display unit 1006 is configured to display, for each of at least two images, the image according to a common image layer and a non-common image layer of the image.
In one possible implementation manner, for each of the at least two images, the non-common layers of the images are associated with a layer position of the common layer;
the user input unit 1007 is configured to receive a first input for adjusting a layer position of the common layer before the display unit 1006 displays, for each of the at least two images, the image according to the common layer and the non-common layer of the image;
the display unit 1006 is specifically configured to display an image according to the non-common layer and the adjusted common layer of the image in response to the first input received by the user input unit 1007.
In one possible implementation manner, for each of the at least two images, the non-common image layer of the image is associated with an image parameter of the common image layer;
the user input unit 1007 is configured to receive a second input for adjusting an image parameter of the common layer before the display unit 1006 displays, for each of the at least two images, the image according to the common layer and the non-common layer of the image;
The display unit 1006 is specifically configured to display an image according to the non-common layer and the adjusted common layer of the image in response to the second input received by the user input unit 1007.
In one possible implementation manner, the number of the at least two images is M;
the processor 1010 is configured to, when a deletion instruction of N images of at least two images is detected, reserve a common layer if N is less than M, where M is a positive integer greater than or equal to 2, and N is a positive integer.
In one possible implementation, the at least two images belong to the same video or different videos.
The embodiment of the application provides electronic equipment, which can reduce the storage space occupied by the same image content in at least two images, reduce the storage space occupied by storing the at least two images and further increase the residual storage space of the electronic equipment.
It should be understood that in the embodiment of the present application, the input unit 1004 may include a graphics processor (Graphics Processing Unit, GPU) 10041 and a microphone 10042, and the graphics processor 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 can include two portions, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system and application programs or instructions required for at least one function (such as a sound playing function and an image playing function). Further, the memory 1009 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the embodiment of the image processing method, and achieve the same technical effect, so that repetition is avoided, and no redundant description is provided here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
An embodiment of the present application provides a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement each process of the above image processing method embodiments and achieve the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is the preferred implementation. Based on this understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific embodiments, which are merely illustrative and not restrictive. Those of ordinary skill in the art may make many variations without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (15)

1. An image processing method, the method comprising:
determining a common layer from at least two images in a case that a similarity of the at least two images is greater than or equal to a preset similarity;
and storing the at least two images according to the common layer and non-common layers of the at least two images.
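The two steps of claim 1 can be sketched in code. This is an illustrative assumption of one possible realization, not the patent's actual implementation: images are modeled as equal-sized 2D pixel grids, similarity is taken to be the fraction of matching pixels, the common layer keeps the pixels on which all images agree, and each non-common layer keeps only the positions where that image differs. All names (`similarity`, `split_layers`, `preset_similarity`) are hypothetical.

```python
def similarity(img_a, img_b):
    """Fraction of positions where two equally sized images agree."""
    total = sum(len(row) for row in img_a)
    same = sum(a == b for ra, rb in zip(img_a, img_b) for a, b in zip(ra, rb))
    return same / total

def split_layers(images, preset_similarity=0.5):
    """Return (common_layer, non_common_layers), or None if too dissimilar."""
    if any(similarity(images[0], im) < preset_similarity for im in images[1:]):
        return None
    # Common layer: a pixel survives only where every image agrees.
    common = [[px if all(im[r][c] == px for im in images) else None
               for c, px in enumerate(row)]
              for r, row in enumerate(images[0])]
    # Each non-common layer stores only the positions not covered by the
    # common layer, so shared content is stored once.
    non_common = [{(r, c): im[r][c]
                   for r, row in enumerate(common)
                   for c, px in enumerate(row) if px is None}
                  for im in images]
    return common, non_common
```

Under this sketch, storing M similar images costs one common layer plus M sparse difference maps instead of M full images, which is the storage saving the claim is aimed at.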
2. The method of claim 1, wherein for each of the at least two images, image information is associated with a non-common layer of the image, the image information comprising at least one of: an identifier of the common layer, a storage address of the common layer, offset information of the common layer, a shooting time of the image, a shooting place of the image, shooting parameters of the image, and image parameters of the image.
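One plausible way to realize the association in claim 2 is to pair each image's non-common layer with a small metadata record that points back at the shared layer. The field names below are illustrative assumptions, not the patent's own data format.

```python
from dataclasses import dataclass, field

@dataclass
class ImageRecord:
    non_common_layer: dict               # (row, col) -> pixel for differing positions
    common_layer_id: str = ""            # identifier of the common layer
    common_layer_address: str = ""       # storage address of the common layer
    common_layer_offset: tuple = (0, 0)  # offset information of the common layer
    shooting_time: str = ""
    shooting_place: str = ""
    shooting_params: dict = field(default_factory=dict)
    image_params: dict = field(default_factory=dict)
```

Keeping the pointer-style fields (identifier, address, offset) on the non-common layer lets any single record be resolved back into a full image without scanning the other records.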
3. The method according to claim 1, wherein the method further comprises:
for each of the at least two images, displaying the image according to the common layer and a non-common layer of the image.
4. The method according to claim 3, wherein for each of the at least two images, the non-common layer of the image is associated with a layer position of the common layer;
for each of the at least two images, before the displaying the image according to the common layer and the non-common layer of the image, the method further comprises:
receiving a first input for adjusting a layer position of the common layer;
the displaying the image according to the common layer and the non-common layer of the image comprises:
and in response to the first input, displaying the image according to the non-common layer of the image and the adjusted common layer.
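The display step of claims 3 and 4 can be sketched as compositing: start from the common layer, optionally shifted by a user-adjusted position, and write the image's own non-common pixels over it. The shift semantics and function names here are assumptions for illustration only.

```python
def render(common, non_common, dx=0, dy=0):
    """Composite one image from the (shifted) common layer and its diffs."""
    h, w = len(common), len(common[0])
    out = [[None] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            sr, sc = r - dy, c - dx   # sample the common layer at its adjusted position
            if 0 <= sr < h and 0 <= sc < w and common[sr][sc] is not None:
                out[r][c] = common[sr][sc]
    for (r, c), px in non_common.items():
        out[r][c] = px                # the image's own pixels take precedence
    return out
```

Because the common layer is shared, adjusting `dx`/`dy` once changes how every image in the group is displayed, which matches the claim's idea of a single first input affecting the common layer.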
5. The method according to claim 3, wherein for each of the at least two images, the non-common layer of the image is associated with image parameters of the common layer;
for each of the at least two images, before the displaying the image according to the common layer and the non-common layer of the image, the method further comprises:
receiving a second input for adjusting image parameters of the common layer;
the displaying the image according to the common layer and the non-common layer of the image comprises:
and in response to the second input, displaying the image according to the non-common layer of the image and the adjusted common layer.
6. The method of claim 1, wherein the number of the at least two images is M;
the method further comprises the steps of:
and retaining the common layer in a case that a deletion instruction for N images of the at least two images is detected and N is less than M, where M is a positive integer greater than or equal to 2 and N is a positive integer.
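The retention behavior of claim 6 resembles reference counting: the common layer is kept as long as at least one of the M images still references it, and may be dropped only when all of them are gone. The store layout below is an assumed sketch, not the patent's storage format.

```python
def delete_images(store, image_ids):
    """store: {'common': <layer>, 'images': {image_id: non_common_layer}}."""
    for image_id in image_ids:
        store['images'].pop(image_id, None)   # remove each image's own layer
    if not store['images']:                   # N == M: nothing references the layer
        store['common'] = None
    return store
```

With N < M the shared pixels survive the deletion, so the remaining images can still be reconstructed; only deleting the whole group releases the common layer.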
7. The method of claim 1, wherein the at least two images belong to the same video or different videos.
8. An image processing apparatus, characterized in that the apparatus comprises: a determining module and a storage module;
the determining module is configured to determine a common layer from at least two images in a case that a similarity of the at least two images is greater than or equal to a preset similarity;
the storage module is configured to store the at least two images according to the common layer determined by the determining module and non-common layers of the at least two images.
9. The apparatus of claim 8, wherein for each of the at least two images, image information is associated with a non-common layer of the image, the image information comprising at least one of: an identifier of the common layer, a storage address of the common layer, offset information of the common layer, a shooting time of the image, a shooting place of the image, shooting parameters of the image, and image parameters of the image.
10. The apparatus of claim 8, wherein the apparatus further comprises: a display module;
the display module is configured to display, for each of the at least two images, the image according to the common layer and the non-common layer of the image.
11. The apparatus of claim 10, wherein for each of the at least two images, a non-common layer of the image is associated with a layer position of the common layer;
the apparatus further comprises: a receiving module;
the receiving module is configured to receive, for each of the at least two images, a first input for adjusting a layer position of the common layer before the display module displays the image according to the common layer and the non-common layer of the image;
the display module is specifically configured to display, in response to the first input received by the receiving module, the image according to the non-common layer of the image and the adjusted common layer.
12. The apparatus of claim 10, wherein for each of the at least two images, the non-common layer of the image is associated with image parameters of the common layer;
The apparatus further comprises: a receiving module;
the receiving module is configured to receive, for each of the at least two images, a second input for adjusting image parameters of the common layer before the display module displays the image according to the common layer and the non-common layer of the image;
the display module is specifically configured to display, in response to the second input received by the receiving module, the image according to the non-common layer of the image and the adjusted common layer.
13. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image processing method of any one of claims 1 to 7.
14. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any of claims 1 to 7.
15. A computer program product stored in a storage medium, the computer program product being executed by at least one processor to implement the steps of the image processing method of any of claims 1 to 7.
CN202410176399.2A 2024-02-08 2024-02-08 Image processing method, apparatus, electronic device, medium, and computer program product Pending CN117788316A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410176399.2A CN117788316A (en) 2024-02-08 2024-02-08 Image processing method, apparatus, electronic device, medium, and computer program product

Publications (1)

Publication Number Publication Date
CN117788316A true CN117788316A (en) 2024-03-29

Family

ID=90396540

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination