CN113992789A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN113992789A
CN113992789A
Authority
CN
China
Prior art keywords
image
terminal
area
target
adjustment parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111280822.6A
Other languages
Chinese (zh)
Inventor
陈文智 (Chen Wenzhi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202111280822.6A
Publication of CN113992789A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W56/00 Synchronisation arrangements
    • H04W56/001 Synchronization between nodes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an image processing method and device, belonging to the field of communication technologies. The image processing method includes: receiving a target image adjustment parameter sent by a second terminal, where the target image adjustment parameter is an image adjustment parameter of a second image area in a second image; and performing image adjustment on a first image area in a first image according to the target image adjustment parameter to generate a third image, where the first image and the second image contain the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.

Description

Image processing method and device
Technical Field
The present application belongs to the field of communication technologies, and in particular, to an image processing method and apparatus.
Background
As the image adjustment functions on terminal devices (such as cropping and video special effects) become easier to use, users often perform image adjustment operations on an image (such as a photo or a video) after obtaining it, in order to produce a more satisfactory result.
Different users may have different adjustment requirements for the same image, and when images are shared, the versions separately adjusted by different users rarely satisfy everyone. For example, in a group photo, each person typically beautifies only his or her own appearance, so in the photo each person shares, only that person looks good.
In the prior art, if users want an image that satisfies everyone, one user can finish retouching and then send the image to the next user for retouching, and so on until all users have finished. However, this method is time-consuming, labor-intensive, and cumbersome to operate.
Disclosure of Invention
Embodiments of the present application aim to provide an image processing method and device that solve the prior-art problem that obtaining a retouched image satisfying different users is time-consuming, labor-intensive, and cumbersome.
In a first aspect, an embodiment of the present application provides an image processing method, which is applied to a first terminal, and the method includes:
receiving a target image adjusting parameter sent by a second terminal; wherein the target image adjustment parameter is an image adjustment parameter of a second image area in a second image;
according to the target image adjustment parameter, performing image adjustment on a first image area in the first image to generate a third image; wherein the first image and the second image comprise the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
In a second aspect, an embodiment of the present application provides an image processing method, which is applied to a second terminal, and the method includes:
acquiring a target image adjustment parameter of a second image area in a second image;
sending the target image adjusting parameter to a first terminal;
the target image adjustment parameter is used for indicating the first terminal to adjust the image of the first image area in the first image according to the target image adjustment parameter so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
In a third aspect, an embodiment of the present application provides an image processing apparatus, which is applied to a first terminal, and the apparatus includes:
the receiving module is used for receiving the target image adjustment parameters sent by the second terminal; wherein the target image adjustment parameter is an image adjustment parameter of a second image area in a second image;
the processing module is used for carrying out image adjustment on a first image area in a first image according to the target image adjustment parameter received by the receiving module to generate a third image; wherein the first image and the second image comprise the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
In a fourth aspect, an embodiment of the present application provides an image processing apparatus, which is applied to a second terminal, and the apparatus includes:
the acquisition module is used for acquiring target image adjustment parameters of a second image area in a second image;
the sending module is used for sending the target image adjustment parameters acquired by the acquisition module to a first terminal;
the target image adjustment parameter is used for indicating the first terminal to adjust the image of the first image area in the first image according to the target image adjustment parameter so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
In a fifth aspect, the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps in the image processing method according to the first aspect.
In a sixth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps in the image processing method according to the first aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the image processing method according to the first aspect.
In the embodiments of the present application, a terminal device can adjust an image according to image adjustment parameters received from other terminal devices, so that the displayed effect of the image is consistent with those devices' adjustments of it. This approach is simple to operate, fast, and saves time.
Drawings
Fig. 1 is a schematic flowchart of an image processing method applied to a first terminal according to an embodiment of the present application;
fig. 2 is a flowchart illustrating an image processing method applied to a second terminal according to an embodiment of the present application;
FIG. 3 is a first schematic diagram of an example provided by an embodiment of the present application;
FIG. 4 is a second schematic diagram of an example provided by an embodiment of the present application;
FIG. 5 is a third schematic diagram of an example provided by an embodiment of the present application;
FIG. 6 is a fourth schematic diagram of an example provided by an embodiment of the present application;
FIG. 7 is a fifth schematic diagram of an example provided by an embodiment of the present application;
FIG. 8 is a sixth schematic diagram of an example provided by an embodiment of the present application;
FIG. 9 is a seventh schematic diagram of an example provided by an embodiment of the present application;
fig. 10 is a schematic block diagram of an image processing apparatus applied to a first terminal according to an embodiment of the present application;
fig. 11 is a schematic block diagram of an image processing apparatus applied to a second terminal according to an embodiment of the present application;
FIG. 12 is a first schematic block diagram of an electronic device provided by an embodiment of the present application;
FIG. 13 is a second schematic block diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments that a person of ordinary skill in the art can derive from the embodiments given herein fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar objects and do not necessarily describe a particular order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein. Objects distinguished by "first" and "second" are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. In addition, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The image processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a flowchart illustrating an image processing method according to an embodiment of the present application, where the image processing method is applied to a first terminal, and the first terminal may be a terminal device such as a mobile phone and a tablet computer.
The image processing method may include:
step 101: and receiving the target image adjusting parameters sent by the second terminal.
The target image adjustment parameter is an image adjustment parameter of a second image area in the second image, and the second image area may be a partial area or a whole area in the image area of the second image in which the image parameter is adjusted. The second image region may be user selected or may be automatically determined by the system.
In the process of adjusting the second image, the terminal device may record the corresponding image adjustment parameter and the image area for adjusting the image parameter at the same time. For example, when the image area n in the second image is subjected to filter processing, the image area n and filter parameters of the image area n are recorded. In this embodiment, the second terminal may transmit the recorded image adjustment data to the first terminal.
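The recording scheme described above, where each adjustment is stored together with the image area it applies to, can be sketched as follows. All class, field, and value names here are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class RegionAdjustment:
    """One recorded edit: the image area it applies to and its parameters."""
    region: tuple        # (x, y, width, height) in image coordinates
    adjust_type: str     # e.g. "filter", "beauty", "crop" (illustrative names)
    params: dict         # the image adjustment parameters themselves

@dataclass
class EditLog:
    """Per-image log kept while the user adjusts the second image."""
    image_id: str                                   # identification code of the original image
    adjustments: list = field(default_factory=list)

    def record(self, region, adjust_type, params):
        # Record the adjusted area and its parameters at the same time,
        # as described for e.g. filter processing of image area n.
        self.adjustments.append(RegionAdjustment(region, adjust_type, params))

# Example: a filter applied to one area of "photo x"
log = EditLog(image_id="photo-x")
log.record(region=(120, 80, 200, 260), adjust_type="filter",
           params={"name": "soft", "strength": 0.7})
```

The second terminal would then transmit `log.adjustments` (or a subset of it) to the first terminal.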
Step 102: perform image adjustment on the first image area in the first image according to the target image adjustment parameter, to generate a third image.
That the first image and the second image contain the same image content specifically means: when both the first image and the second image have had their image parameters adjusted, the first image and the second image were the same image before adjustment; or, when the second image has had its image parameters adjusted and the first image has not, the second image before adjustment is the same image as the first image.
The first image area and the second image area are image areas at the same position in the first image and the second image. The second terminal may send the target image adjustment parameter, and may also send image area information (i.e., second image area information) corresponding to the target image adjustment parameter to the first terminal, so that the first terminal matches the corresponding first image area in the first image.
In the embodiment of the application, after receiving the target image adjustment parameter, the first terminal can perform image adjustment on the first image area which is the same as the second image area according to the target image adjustment parameter, so that the image effect of the first image area is consistent with that of the second image area. The implementation method is simple to operate, and time-saving and rapid.
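As a minimal sketch of step 102, the snippet below applies a received adjustment only to the pixels of the matching first image area, producing a new "third image" while leaving the rest of the first image untouched. Representing the image as rows of brightness values and using a brightness delta as the adjustment are assumptions chosen for illustration:

```python
def apply_brightness(pixels, region, delta):
    """Adjust only the pixels inside `region` (x, y, w, h) by `delta`.

    `pixels` is a list of rows of ints in 0..255. Returns a new image
    (the "third image"); the original first image is not modified.
    """
    x, y, w, h = region
    out = [row[:] for row in pixels]          # copy the first image
    for r in range(y, y + h):
        for c in range(x, x + w):
            out[r][c] = max(0, min(255, out[r][c] + delta))
    return out

# Received from the second terminal: brighten the matching area by +40
first_image = [[100] * 4 for _ in range(4)]
third_image = apply_brightness(first_image, region=(1, 1, 2, 2), delta=40)
```

After this step the image effect of the first image area matches that of the second image area.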
Optionally, the first image and the second image may be static images (such as photos) or dynamic images (such as animated pictures or videos); that is, the embodiments of the present application apply to retouching both static and dynamic images.
As an optional embodiment, when the number of second terminals is at least two, after step 101 (receiving the target image adjustment parameters sent by the second terminals), the image processing method may further include:
identifying, in the first image, a first image region corresponding to each second image region.
Accordingly, step 102 (performing image adjustment on the first image region in the first image according to the target image adjustment parameter to generate a third image) may include:
receiving a first input from the user on an identified first image region; and, in response to the first input, performing image adjustment on the selected first image region according to the target image adjustment parameter of the second image region corresponding to the region selected by the first input, to generate the third image.
In this embodiment of the application, after receiving the target image adjustment parameters sent by the second terminals, the first terminal may identify, in the first image, the first image areas corresponding to the second image areas. The identified first image areas are in a selectable state, and the user can select a first image area for image adjustment through the first input (such as a tap) as needed, which gives the user more choice and increases the flexibility of image adjustment.
As an optional embodiment, when the number of second terminals is at least two and the second image areas from different second terminals overlap, the corresponding first image area may be adjusted according to the target image adjustment parameter received last for the overlapping second image areas, which avoids erroneous adjustments that would fail to achieve the retouching effect the user requires. Alternatively, the first image area corresponding to the overlapping second image areas may be identified, and the adjustment applied according to the user's selection among the identified areas and the target image adjustment parameter of the second image area corresponding to the selected first image area.
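The last-received-wins rule for overlapping second image areas could be implemented along these lines. The tuple layout and the axis-aligned rectangle overlap test are assumptions for illustration:

```python
def resolve_overlaps(received):
    """Decide which adjustment governs each area when second image areas overlap.

    `received` is a list of (arrival_order, region, params) tuples collected
    from several second terminals; for any overlapping area, the most
    recently received parameter wins.
    """
    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    winners = []
    # Walk newest-first; keep an adjustment only if no newer one overlaps it.
    for order, region, params in sorted(received, key=lambda t: -t[0]):
        if not any(overlaps(region, w[1]) for w in winners):
            winners.append((order, region, params))
    return winners

received = [
    (1, (0, 0, 4, 4), {"filter": "warm"}),    # overlapped by a later edit
    (2, (2, 2, 4, 4), {"filter": "cool"}),    # received later: wins
    (3, (10, 10, 2, 2), {"beauty": 0.5}),     # disjoint area: kept
]
winners = resolve_overlaps(received)
```

The alternative described above (letting the user pick among identified areas) would simply present all overlapping candidates instead of discarding the older ones.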
In summary, in the embodiments of the present application, a terminal device can adjust an image according to image adjustment parameters received from other terminal devices, so that the displayed effect of the image is consistent with those devices' adjustments of it.
Fig. 2 is a flowchart illustrating an image processing method according to an embodiment of the present application, where the image processing method is applied to a second terminal, and the second terminal may be a terminal device such as a mobile phone and a tablet computer.
The image processing method may include:
step 201: and acquiring target image adjustment parameters of a second image area in the second image.
And the target image adjusting parameter is an image adjusting parameter of a second image area in the second image. The second image area may be a partial area or a whole area of the image area in which the image parameter is adjusted in the second image. The second image region may be user selected or may be automatically determined by the system.
In the process of adjusting the second image, the terminal device may record the corresponding image adjustment parameter and the image area for adjusting the image parameter at the same time. For example, when the image area n in the second image is subjected to filter processing, the image area n and filter parameters of the image area n are recorded.
Step 202: send the target image adjustment parameter to the first terminal.
The target image adjustment parameter is used to instruct the first terminal to perform image adjustment on the first image area in the first image according to the target image adjustment parameter, to generate a third image.
The first image and the second image include the same image content, which specifically means: under the condition that the first image and the second image are both images with adjusted image parameters, the images before the image parameters of the first image are not adjusted and the images before the image parameters of the second image are not adjusted are the same images; or, in the case that the second image is the image with the adjusted image parameters and the first image is the image with the unadjusted image parameters, the second image is the same image as the first image before the unadjusted image parameters.
The first image area and the second image area are image areas at the same position in the first image and the second image. The second terminal may send the target image adjustment parameter, and may also send image area information (i.e., second image area information) corresponding to the target image adjustment parameter to the first terminal, so that the first terminal matches the corresponding first image area in the first image.
In the embodiment of the application, the second terminal can send the recorded target image adjustment parameter to the first terminal, and after receiving the target image adjustment parameter, the first terminal can adjust the image of the first image area which is the same as the second image area according to the target image adjustment parameter, so that the image effect of the first image area is consistent with the image effect of the second image area.
Alternatively, the first image and the second image may be static images (such as photos, etc.) or dynamic images (such as dynamic pictures, videos, etc.), that is: the embodiment of the application is not only suitable for the retouching adjustment of the static image, but also suitable for the retouching adjustment of the dynamic image.
As an alternative embodiment, step 202: the sending of the target image adjustment parameter to the first terminal may include:
receiving a dragging operation by the user on the second image or the second image area; and, in response to the dragging operation, determining the terminal device placed in the direction of the dragging operation as the first terminal and sending the target image adjustment parameter to the first terminal.
At least one terminal device is placed within a preset range of the second terminal, the second terminal records the relative position of each terminal device within that range, and the first terminal is one of these devices. For example, if terminal A and terminal B are placed within one meter of the second terminal, the second terminal may record that terminal A lies in the direction 45 degrees from north and terminal B in the direction 60 degrees from south, relative to the second terminal. This is merely an example; the relative positions may be represented and recorded in other ways, and the position information may be acquired through position sharing.
In the embodiment of the application, a plurality of terminal devices may be placed together in advance, generally, all the terminal devices may be placed on the same plane, and then each terminal device obtains the relative position relationship between the other terminal devices and the terminal device itself, so that a user may send the target image adjustment parameter to the target terminal (the target terminal is the first terminal in the embodiment of the application) in a manner of dragging the second image or the second image area to the target terminal direction according to a requirement, and the operation is simple. Each terminal device stores an image with the same image content, for example, as shown in fig. 3, three terminal devices, namely terminal a, terminal B and terminal C, previously obtain the same photo x.
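One possible way to resolve "the terminal device placed in the dragging direction" from the recorded relative positions is to compare the drag gesture's bearing with each peer's recorded bearing. The angular representation and the tolerance value here are assumptions; the patent does not fix a representation for the relative positions:

```python
import math

def terminal_in_drag_direction(drag_vector, peers, tolerance_deg=30.0):
    """Return the peer whose recorded bearing best matches the drag direction.

    `drag_vector` is (dx, dy) from the drag gesture; `peers` maps a terminal
    name to the bearing (degrees) recorded for it relative to this terminal.
    Returns None if no peer lies within `tolerance_deg` of the drag bearing.
    """
    drag_bearing = math.degrees(math.atan2(drag_vector[1], drag_vector[0])) % 360
    best, best_diff = None, tolerance_deg
    for name, bearing in peers.items():
        # Shortest angular distance between the two bearings
        diff = abs((bearing - drag_bearing + 180) % 360 - 180)
        if diff <= best_diff:
            best, best_diff = name, diff
    return best

# Terminal C recorded due east (0 degrees), terminal B due west (180 degrees)
peers = {"terminal C": 0.0, "terminal B": 180.0}
target = terminal_in_drag_direction((1.0, 0.05), peers)  # dragging to the right
```

The target image adjustment parameter would then be sent to `target`, or the send skipped when no device lies in the drag direction.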
When the second image area is dragged to trigger the sending operation of the target image adjustment parameter, the user may first select an image area in the second image, and the selected image area is the second image area.
In order to better understand the transmission scheme of the target image adjustment parameter, the following description is given by way of example.
Example 1
Assume that the image whose image parameters need to be adjusted is photo x.
The terminal devices in this example are terminal A, terminal B, and terminal C, and photo x is stored in all three. Photo x contains two human figures, a boy and a girl; the girl's position in the photo is area 1 and the boy's is area 2. The user of terminal A performs a beauty adjustment on the girl in photo x to generate photo a, and the user of terminal B performs a beauty adjustment on the boy in photo x to generate photo b, as shown in fig. 4.
As shown in fig. 5, terminal A, terminal C, and terminal B are placed horizontally side by side on a desktop. To send the image adjustment parameters of area 1 of photo a to terminal C, the user may first select area 1 in photo a, long-press it so that the image in area 1 is extracted and displayed separately, and then drag the floating area 1 to the right (i.e., toward terminal C). Sending the image adjustment parameters of area 2 of photo b to terminal C follows the same procedure, which is not repeated here.
After receiving the image adjustment parameters sent by the terminal a and the terminal B, the terminal C may perform image adjustment on the photo x.
Optionally, when the first image and the second image are static or dynamic images, the second terminal may determine, when the second image or the second image area is dragged, whether the first terminal in the direction of the dragging operation holds an image with the same image content. The target image adjustment parameter is sent to the first terminal only if it does; otherwise the sending operation is not performed, avoiding invalid transmission.
Optionally, when the first image and the second image are video images, the second terminal may determine, when the second image or the second image area is dragged, whether a video with the same image content is in an open state on the first terminal in the direction of the dragging operation. The target image adjustment parameter is sent only if such a video is open; otherwise the sending operation is not performed, avoiding invalid transmission.
In the foregoing determinations, the decision may be based on the identification code of the image; that is, images with the same image content have the same identification code. The identification code may be set by the user or may be a feature code derived from the image itself.
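Deriving the identification code as a feature code of the image itself could look like the following. Hashing the unadjusted pixel data with SHA-256 is an assumption for illustration, since the patent leaves the code's construction open:

```python
import hashlib

def image_identification_code(pixel_bytes: bytes) -> str:
    """Feature-code sketch: hash the original (unadjusted) image's pixel data.

    Two terminals that obtained the same original image compute the same
    code, so a sender can check "same image content" before transmitting
    adjustment parameters.
    """
    return hashlib.sha256(pixel_bytes).hexdigest()

def should_send(sender_code: str, receiver_code: str) -> bool:
    """Send only when the identification codes match, avoiding invalid sends."""
    return sender_code == receiver_code

original = b"\x10\x20\x30" * 100          # stand-in for the shared photo's pixels
code_a = image_identification_code(original)  # computed on the second terminal
code_b = image_identification_code(original)  # computed on the first terminal
```

A user-set identification code would bypass the hash and compare the user-chosen strings directly.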
Alternatively, the whole image may be operated on directly instead of a local image area; for example, as shown in fig. 6, photos a and b are dragged toward terminal C. After the dragging is completed, the image areas on photo x in terminal C that can receive image adjustment parameters, such as area 1 and area 2 in fig. 6 (i.e., the areas corresponding to the adjusted areas of photo a and photo b), are identified. Area 1 and area 2 of photo x are then in a selectable state, and the user can select the areas whose image parameters need to be adjusted as required; for example, after area 1 is selected, terminal C adjusts area 1 of photo x according to the image adjustment parameters of area 1 in photo a.
Example 2
Assume that the image whose image parameters need to be adjusted is a video image.
The terminal devices in this example are terminal A and terminal B, placed horizontally side by side on a desktop. Video y is stored on both devices and is in an open state on both. Terminal A applies video special effect 3 to video y, generating video z. As shown in fig. 7, to send the parameters of video special effect 3 in video z to terminal B, the user may long-press the image area of video z to which the effect was added to select it, and then slide and drag that area in the direction of terminal B on the same plane; terminal B then adds the video special effect to video y according to the received parameters.
Optionally, in this embodiment of the application, instead of obtaining the relative position relationship between itself and every other terminal device placed on the same plane, each terminal device may obtain only the position of a designated terminal and record the relative position relationship between itself and that designated terminal. For example, if the first terminal described in this embodiment of the application is set as the designated terminal, the second terminal only needs to acquire and record its relative position relationship with the first terminal, which reduces the position data to be recorded and helps reduce power consumption. In this way, after receiving a dragging operation by the user from the second image or the second image area toward the first terminal, the second terminal sends the target image adjustment parameter to the first terminal in response to the dragging operation.
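A sketch of this designated-terminal variant, with hypothetical class and field names: a terminal records a relative position only when the reporting peer is the designated terminal, so only a single position entry is ever kept.

```python
class Terminal:
    """Illustrative sketch: record only the designated terminal's
    relative position instead of tracking every peer on the plane."""
    def __init__(self, name, designated):
        self.name = name
        self.designated = designated      # e.g. the first terminal
        self.relative_position = None     # e.g. "left" / "right" / a bearing

    def record_position(self, peer_name, direction):
        if peer_name == self.designated:  # all other peers are ignored
            self.relative_position = direction

b = Terminal("B", designated="A")
b.record_position("C", "right")  # not the designated terminal: ignored
b.record_position("A", "left")   # recorded
```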
Optionally, a connection relationship may be pre-established among a plurality of terminal devices. For example, the second terminal and the first terminal may have pre-established a connection relationship before the second terminal sends the target image adjustment parameter, so that the parameter can be sent directly and the data transmission time is short. For another example, after the second terminal receives the dragging operation of the user on the second image or the second image area, the second terminal sends a connection request to the first terminal located in the dragging direction, and sends the target image adjustment parameter to the first terminal only if the first terminal receives the connection request and agrees to establish the connection; in this way no standing real-time connection between the devices is needed, which saves power consumption.
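Both connection strategies can be sketched together (the class and method names below are illustrative, not from the patent): reuse a pre-established connection when one exists, otherwise request one in the drag direction and send only on consent.

```python
class SecondTerminal:
    """Toy model of the sending side; 'accepts' simulates whether the
    first terminal agrees to establish the on-demand connection."""
    def __init__(self, connections=(), accepts=True):
        self.connections = set(connections)
        self._accepts = accepts
        self.sent = []

    def request_connection(self, first):
        # On-demand path: the first terminal must agree to connect.
        if self._accepts:
            self.connections.add(first)
        return self._accepts

    def transmit(self, first, params):
        self.sent.append((first, params))
        return True

def send_params(second, first, params):
    if first in second.connections:       # pre-established: send directly
        return second.transmit(first, params)
    if second.request_connection(first):  # request, send only on consent
        return second.transmit(first, params)
    return False

pre = SecondTerminal(connections={"A"})
on_demand = SecondTerminal()
refused = SecondTerminal(accepts=False)
```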
Optionally, in this embodiment of the application, after receiving the dragging operation of the user on the second image or the second image area, the second terminal may trigger not only the sending operation of the target image adjustment parameter but also the obtaining operation of the target image adjustment parameter. For example, the second terminal receives a dragging operation of the user on the second image or the second image area; in response to the dragging operation, it acquires the target image adjustment parameter, determines the terminal device placed in the dragging direction as the first terminal, and sends the target image adjustment parameter to the first terminal. Two processing procedures are thus triggered by a single operation, which simplifies the user operation.
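The single-gesture flow can be sketched as follows, with hypothetical helper names: one drag handler acquires the parameter, resolves the terminal placed in the drag direction, and sends.

```python
class Image:
    """Toy stand-in for the second image: maps a region to the
    adjustment parameters that were applied to it."""
    def __init__(self, params_by_region):
        self._p = params_by_region

    def adjustment_params(self, region):
        return self._p[region]

def on_drag(second_image, region, drag_direction, peers, send):
    """One gesture, two triggered procedures: (1) acquire the target
    image adjustment parameter of the dragged area, (2) resolve the
    terminal placed in the drag direction, (3) send the parameter."""
    params = second_image.adjustment_params(region)   # acquire
    first_terminal = peers.get(drag_direction)        # resolve
    if first_terminal is not None:
        send(first_terminal, params)                  # send
    return first_terminal, params

sent = []
img = Image({"area1": {"beauty": 2}})
peers = {"right": "terminal-B"}                       # recorded positions
target, params = on_drag(img, "area1", "right", peers,
                         lambda t, p: sent.append((t, p)))
```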
It should be noted that, in the case that a plurality of terminal devices lie in the dragging direction, the device identifiers of those terminal devices may be displayed so that the user can select a target terminal (in this embodiment, the target terminal is the first terminal). Alternatively, the terminal device closest to the dragging direction may be determined as the target terminal.
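The "closest to the dragging direction" rule could, for example, compare the drag angle with each peer's recorded bearing and pick the smallest wrap-around deviation (a plausible realization; the patent does not fix the geometry):

```python
def closest_to_direction(drag_angle_deg, peers):
    """peers: device identifier -> bearing (degrees) of that device
    relative to the second terminal. Returns the device whose bearing
    deviates least from the drag direction. (Illustrative rule only;
    with several close candidates, the identifier list could instead
    be shown for the user to pick from.)"""
    def deviation(bearing):
        d = abs(drag_angle_deg - bearing) % 360.0
        return min(d, 360.0 - d)   # wrap-around angular difference
    return min(peers, key=lambda dev: deviation(peers[dev]))
```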
As an alternative embodiment, step 202, the sending of the target image adjustment parameter to the first terminal, may include:
receiving a first input of the user on the displayed device identifier of the first terminal; and in response to the first input, sending the target image adjustment parameter to the first terminal.
The second terminal is connected to at least one terminal device, the device identifier of each connected terminal device is displayed on the second terminal, and the first terminal is one of the at least one terminal device.
Wherein the first input may include, but is not limited to, at least one of: a single-tap touch operation, a double-tap touch operation, a long-press touch operation, a hard-press touch operation, a sliding touch operation, and the like.
In the embodiment of the application, a connection relationship can be pre-established among a plurality of terminal devices, and each terminal device displays the device identifiers of the other terminal devices with which it has established a connection relationship, so that, when the target image adjustment parameter is transmitted, the recipient of the target image adjustment parameter can be chosen by selecting a device identifier. The device identifier may be, for example, the terminal device name.
For example, as shown in fig. 8, the user of the terminal A performs beauty retouching on the girl in the photo and selects, in the photo, the area 1 whose retouched content the user wants to synchronize. The device identifier "B" of the terminal B and the device identifier "C" of the terminal C are displayed below the photo, and the user of the terminal A clicks the device identifier "C", so that the beauty retouching parameter of the area 1 is sent to the terminal C. Likewise, the user of the terminal B performs beauty retouching on the boy in the photo and selects the area 2 whose retouched content the user wants to synchronize. The device identifier "A" of the terminal A and the device identifier "C" of the terminal C are displayed below the photo, and the user of the terminal B clicks the device identifier "C", so that the beauty retouching parameter of the area 2 is sent to the terminal C.
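Seen from terminal C, the fig. 8 scenario amounts to merging region-scoped parameters from two senders; a toy sketch with invented region names and parameter values:

```python
# Hypothetical merge on terminal C: each sender contributes the
# retouching parameters of a different area of the same photo.
received = {}

def on_receive(sender, region, params):
    received[region] = {"from": sender, "params": params}

on_receive("A", "area1", {"beauty": 2})   # girl's area, retouched on A
on_receive("B", "area2", {"beauty": 5})   # boy's area, retouched on B
# Terminal C can now adjust area 1 and area 2 of its own copy so that
# the final photo reflects both users' edits.
```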
Optionally, the device identifiers may also be displayed only after the second terminal receives a display instruction. For example, the second terminal receives a fourth input of the user for the second image and displays, in response to the fourth input, the device identifiers of the terminal devices having a connection relationship with the second terminal.
Wherein, the fourth input may include, but is not limited to, at least one of: a single-tap touch operation, a double-tap touch operation, a long-press touch operation, a hard-press touch operation, and the like.
The fourth input may be performed at an arbitrary position of the second image, or on the second image area of the second image. For example, as shown in fig. 9, the user of the terminal A long-presses the image area where the video special effect 3 is located in the video (i.e., performs the fourth input on the second image area of the second image), and the device identifier "device B" of the terminal B and the device identifier "device C" of the terminal C, which have established connection relationships with the terminal A, are displayed. The user of the terminal A then clicks the device identifier "device B" to send the parameter of the video special effect 3 to the terminal B.
As an alternative embodiment, step 201, the acquiring of a target image adjustment parameter of a second image area in a second image, may include:
receiving a second input of the user on the second image; and in response to the second input, determining the image area selected by the second input as the second image area, and acquiring the target image adjustment parameter.
In the embodiment of the application, the user can select the image area whose image adjustment parameters are to be synchronized; the image adjustment parameters of the user-selected area are then obtained and sent to the first terminal. This gives the user more choice and improves the flexibility of image adjustment.
The second input described here may be a sliding touch operation; for example, the image area selected by the user is determined according to the sliding track.
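One plausible rule for deriving the area from the sliding track (the patent only says the area is determined from the track) is the bounding box of the touch points:

```python
def region_from_track(track):
    """Axis-aligned bounding box of the sliding-track points, as one
    plausible way to turn a slide gesture into a selected image area.
    track: list of (x, y) touch points in image coordinates."""
    xs = [p[0] for p in track]
    ys = [p[1] for p in track]
    return (min(xs), min(ys), max(xs), max(ys))
```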
As an alternative embodiment, step 201, the acquiring of a target image adjustment parameter of a second image area in a second image, may include:
receiving a third input of the user on the second image; and in response to the third input, determining all image areas in the second image whose image parameters have been adjusted as the second image area, and acquiring the target image adjustment parameters.
In an embodiment of the present application, the image area whose image adjustment parameters are to be synchronized may be determined by the system. After receiving a third input of the user on the second image, the second terminal automatically determines the image areas of the second image whose image parameters have been adjusted, and obtains the image adjustment parameters of those areas (i.e., the target image adjustment parameters).
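The system-determined selection can be sketched as filtering an edit log for regions with non-empty adjustments (the log structure is an assumption for illustration):

```python
def adjusted_regions(edit_log):
    """System-side selection: every region whose parameters were
    actually changed becomes part of the second image area; untouched
    regions (empty parameter sets) are skipped.
    edit_log: mapping region -> adjustment parameters applied to it."""
    return {region: params for region, params in edit_log.items() if params}

log = {"area1": {"contrast": 4}, "area2": {}, "area3": {"blur": 1}}
```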
The third input described here may include, but is not limited to, at least one of: a single-tap touch operation, a double-tap touch operation, a long-press touch operation, a hard-press touch operation, a sliding touch operation (e.g., a dragging operation of the second image toward the first terminal), and the like.
To sum up, in the embodiment of the application, a terminal device can adjust an image according to the image adjustment parameters received from other terminal devices, so that the display effect of the image is consistent with the adjustment effect achieved on those other terminal devices. According to the technical scheme provided by the embodiment of the application, different image adjustment parameters applied to the same image on different terminal devices can therefore be synchronized to one terminal device, which performs retouching adjustment on the image according to the image adjustment parameters sent by the other terminal devices, so that an image satisfying the users of different terminals can be obtained quickly and shared conveniently. The implementation is simple to operate, time-saving, and fast.
It should be noted that, in the image processing method provided in the embodiment of the present application, the execution subject may be an image processing apparatus, or a control module in the image processing apparatus for executing the image processing method. The image processing apparatus provided in the embodiment of the present application is described by taking, as an example, an image processing apparatus executing the image processing method.
Fig. 10 is a schematic block diagram of an image processing apparatus provided in an embodiment of the present application, and the image processing apparatus is applied to a first terminal.
As shown in fig. 10, the image processing apparatus may include:
the receiving module 1001 is configured to receive a target image adjustment parameter sent by the second terminal.
And the target image adjusting parameter is an image adjusting parameter of a second image area in the second image.
The processing module 1002 is configured to perform image adjustment on the first image region in the first image according to the target image adjustment parameter received by the receiving module, so as to generate a third image.
Wherein the first image and the second image comprise the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
Optionally, the first image, the second image and the third image are all static images or dynamic images.
Optionally, the apparatus further comprises:
and the identification module is used for identifying the first image area corresponding to each second image area in the first image.
Wherein the processing module 1002 comprises:
a first receiving unit, configured to receive a first input of the identified first image area by a user.
And the processing unit is used for responding to the first input received by the first receiving unit, and performing image adjustment on the selected first image area according to the target image adjustment parameter of the second image area corresponding to the first image area selected by the first input to generate a third image.
In the embodiment of the application, the terminal device can adjust the image according to the image adjustment parameters received from other terminal devices, so that the display effect of the image is consistent with the adjustment effect of other terminal devices on the image. The implementation method is simple to operate, and time-saving and rapid.
Fig. 11 is a schematic block diagram of an image processing apparatus provided in an embodiment of the present application, and the image processing apparatus is applied to a second terminal.
As shown in fig. 11, the image processing apparatus may include:
an obtaining module 1101, configured to obtain a target image adjustment parameter of a second image region in a second image.
A sending module 1102, configured to send the target image adjustment parameter obtained by the obtaining module to a first terminal.
The target image adjustment parameter is used for indicating the first terminal to adjust the image of the first image area in the first image according to the target image adjustment parameter so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
Optionally, the first image, the second image and the third image are static images or dynamic images.
Optionally, the sending module 1102 includes:
a second receiving unit, configured to receive a dragging operation of the second image or the second image area by a user.
And the first sending unit is used for responding to the dragging operation received by the second receiving unit, determining the terminal equipment placed in the dragging operation direction as the first terminal, and sending the target image adjusting parameter to the first terminal.
At least one terminal device is placed in a preset range of the second terminal, the relative position relationship between each terminal device in the preset range and the second terminal is recorded in the second terminal, and the first terminal is one of the at least one terminal device.
Optionally, the sending module 1102 includes:
and the third receiving unit is used for receiving a first input of the user on the displayed device identifier of the first terminal.
The second terminal and at least one terminal device establish a connection relationship, and the second terminal displays the device identification of the connected terminal device; the first terminal is one of the at least one terminal device.
And the second sending unit is used for responding to the first input received by the third receiving unit and sending the target image adjusting parameter to the first terminal.
Optionally, the obtaining module 1101 includes:
and the fourth receiving unit is used for receiving a second input of the second image by the user.
A first obtaining unit, configured to determine, in response to the second input received by the fourth receiving unit, an image area selected by the second input as the second image area, and obtain the target image adjustment parameter.
Optionally, the obtaining module 1101 includes:
and the fifth receiving unit is used for receiving a third input of the second image by the user.
A second obtaining unit, configured to determine, in response to the third input received by the fifth receiving unit, all image areas in the second image in which the image parameters are adjusted as the second image area, and obtain the target image adjustment parameters.
In the embodiment of the application, the terminal device can adjust the image according to the image adjustment parameters received from other terminal devices, so that the display effect of the image is consistent with the adjustment effect of other terminal devices on the image. The implementation method is simple to operate, and time-saving and rapid.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or the like, and the non-mobile electronic device may be a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present application are not specifically limited thereto.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the image processing method embodiments shown in fig. 1 and fig. 2, and for avoiding repetition, details are not repeated here.
Optionally, as shown in fig. 12, an embodiment of the present application further provides an electronic device 1200, including a processor 1201 and a memory 1202, where a program or an instruction stored in the memory 1202 and executable on the processor 1201, when executed by the processor 1201, implements each process of the above image processing method embodiment and can achieve the same technical effect, which is not described here again to avoid repetition.
It should be noted that the electronic device 1200 in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 13 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, and the like.
Those skilled in the art will appreciate that the electronic device 1300 may further comprise a power source (e.g., a battery) for supplying power to the various components; the power source may be logically connected to the processor 1310 via a power management system, so that charging, discharging, and power consumption management functions are managed via the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange components differently, which is not described here again.
Wherein, in case the electronic device is a first terminal, the processor 1310 is configured to: when the input unit 1304 receives a target image adjustment parameter transmitted by the second terminal, the first image area in the first image is adjusted according to the target image adjustment parameter, and a third image is generated.
And the target image adjusting parameter is an image adjusting parameter of a second image area in the second image. The first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
Optionally, the processor 1310 is further configured to: identify, in the first image, the first image areas corresponding to each of the second image areas, and, when the user input unit 1307 receives a first input of the user on the identified first image areas, perform, in response to the first input, image adjustment on the selected first image area according to the target image adjustment parameter of the second image area corresponding to the first image area selected by the first input, so as to generate a third image.
In the case where the electronic device is a second terminal, the processor 1310 is configured to: and acquiring a target image adjusting parameter of a second image area in a second image, and sending the target image adjusting parameter to the first terminal.
The target image adjustment parameter is used for indicating the first terminal to adjust the image of the first image area in the first image according to the target image adjustment parameter so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
Optionally, the processor 1310 is further configured to: when the user input unit 1307 receives a dragging operation of the user on the second image or the second image area, determine, in response to the dragging operation, the terminal device placed in the dragging direction as the first terminal, and send the target image adjustment parameter to the first terminal.
At least one terminal device is placed in a preset range of the second terminal, the relative position relationship between each terminal device in the preset range and the second terminal is recorded in the second terminal, and the first terminal is one of the at least one terminal device.
Optionally, the processor 1310 is further configured to: in a case where the user input unit 1307 receives a first input of the user on the displayed device identifier of the first terminal, send the target image adjustment parameter to the first terminal in response to the first input. The second terminal has established a connection relationship with at least one terminal device and displays the device identifier of each connected terminal device; the first terminal is one of the at least one terminal device.

Optionally, the processor 1310 is further configured to: in a case where the user input unit 1307 receives a second input of the user on the second image, determine, in response to the second input, the image area selected by the second input as the second image area, and acquire the target image adjustment parameter.

Optionally, the processor 1310 is further configured to: in a case where the user input unit 1307 receives a third input of the user on the second image, determine, in response to the third input, all image areas of the second image whose image parameters have been adjusted as the second image area, and acquire the target image adjustment parameters.
In the embodiment of the application, the terminal device can adjust the image according to the image adjustment parameters received from other terminal devices, so that the display effect of the image is consistent with the adjustment effect of other terminal devices on the image. The implementation method is simple to operate, and time-saving and rapid.
It should be understood that, in the embodiment of the present application, the input unit 1304 may include a graphics processing unit (GPU) 13041 and a microphone 13042, where the graphics processing unit 13041 processes image data of still pictures or videos obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 1306 may include a display panel 13061, and the display panel 13061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1307 includes a touch panel 13071 and other input devices 13072. The touch panel 13071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. The other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 1309 may be used to store software programs as well as various data, including but not limited to applications and an operating system. The processor 1310 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, and the like, and a modem processor, which primarily handles wireless communication. It will be appreciated that the modem processor may not be integrated into the processor 1310.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM, RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An image processing method applied to a first terminal is characterized by comprising the following steps:
receiving a target image adjusting parameter sent by a second terminal; wherein the target image adjustment parameter is an image adjustment parameter of a second image area in a second image;
according to the target image adjustment parameter, performing image adjustment on a first image area in the first image to generate a third image; wherein the first image and the second image comprise the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
2. The image processing method according to claim 1, wherein after the receiving the target image adjustment parameter sent by the second terminal, the method further comprises:
identifying a first image region in the first image corresponding to each of the second image regions;
wherein, the image adjusting the first image area in the first image according to the target image adjusting parameter to generate a third image includes:
receiving a first input of a user to the identified first image region;
and responding to the first input, and performing image adjustment on the selected first image area according to the target image adjustment parameter of the second image area corresponding to the first image area selected by the first input to generate a third image.
3. An image processing method applied to a second terminal is characterized by comprising the following steps:
acquiring a target image adjustment parameter of a second image area in a second image;
sending the target image adjusting parameter to a first terminal;
the target image adjustment parameter is used for indicating the first terminal to adjust the image of the first image area in the first image according to the target image adjustment parameter so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image.
4. The image processing method of claim 3, wherein the sending the target image adjustment parameter to the first terminal comprises:
receiving a dragging operation of a user on the second image or the second image area;
responding to the dragging operation, determining the terminal equipment placed in the dragging operation direction as the first terminal, and sending the target image adjusting parameter to the first terminal;
at least one terminal device is placed in a preset range of the second terminal, the relative position relationship between each terminal device in the preset range and the second terminal is recorded in the second terminal, and the first terminal is one of the at least one terminal device.
5. The image processing method of claim 3, wherein the sending the target image adjustment parameter to the first terminal comprises:
receiving a first input of a user to the displayed device identifier of the first terminal; the second terminal and at least one terminal device establish a connection relationship, and the second terminal displays the device identifier of each connected terminal device; the first terminal is one of the at least one terminal device;
and responding to the first input, and sending the target image adjusting parameter to the first terminal.
6. An image processing apparatus applied to a first terminal, the apparatus comprising:
a receiving module, configured to receive a target image adjustment parameter sent by a second terminal; wherein the target image adjustment parameter is an image adjustment parameter of a second image area in a second image;
a processing module, configured to perform image adjustment on a first image area in a first image according to the target image adjustment parameter received by the receiving module, to generate a third image; wherein the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image, respectively.
7. The image processing apparatus according to claim 6, wherein the apparatus further comprises:
an identification module, configured to identify, in the first image, the first image area corresponding to each second image area;
wherein the processing module comprises:
a first receiving unit, configured to receive a first input from a user on an identified first image area;
and a processing unit, configured to, in response to the first input received by the first receiving unit, perform image adjustment on the selected first image area according to the target image adjustment parameter of the second image area corresponding to the first image area selected by the first input, to generate a third image.
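The per-area behavior of claim 7 — several second image areas, each carrying its own parameter, with only the user-selected counterpart adjusted — can be sketched roughly as below; `area_parameters`, `on_area_selected`, and the parameter values are illustrative assumptions.

```python
# Hypothetical sketch: each second image area has its own adjustment
# parameter; the first terminal highlights the corresponding first
# image areas and adjusts only the one the user selects.

area_parameters = {                 # same-position area -> its parameter
    (0, 0, 2, 2): {"brightness": 40},
    (2, 2, 4, 4): {"contrast": 1.2},
}

def on_area_selected(selected_area, adjust):
    """Apply only the parameter of the area picked by the first input;
    the other areas are left untouched."""
    return adjust(selected_area, area_parameters[selected_area])

result = on_area_selected((0, 0, 2, 2), lambda area, p: (area, p))
```

The `adjust` callback would be the same kind of region-local edit as in claim 3; the lookup table is what ties each identified first image area back to the parameter computed for its second image area.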
8. An image processing apparatus applied to a second terminal, the apparatus comprising:
an acquisition module, configured to acquire a target image adjustment parameter of a second image area in a second image;
a sending module, configured to send the target image adjustment parameter acquired by the acquisition module to a first terminal;
wherein the target image adjustment parameter is used to instruct the first terminal to perform image adjustment on a first image area in a first image according to the target image adjustment parameter, so as to generate a third image; the first image and the second image include the same image content, and the first image area and the second image area are image areas at the same position in the first image and the second image, respectively.
9. The image processing apparatus according to claim 8, wherein the sending module comprises:
a second receiving unit, configured to receive a drag operation performed by a user on the second image or the second image area;
a first sending unit, configured to, in response to the drag operation received by the second receiving unit, determine a terminal device located in the direction of the drag operation as the first terminal, and send the target image adjustment parameter to the first terminal;
wherein at least one terminal device is located within a preset range of the second terminal, the second terminal records a relative positional relationship between itself and each terminal device within the preset range, and the first terminal is one of the at least one terminal device.
10. The image processing apparatus according to claim 8, wherein the sending module comprises:
a third receiving unit, configured to receive a first input from a user on a displayed device identifier of the first terminal; wherein the second terminal has established a connection with at least one terminal device and displays the device identifier of each connected terminal device, the first terminal being one of the at least one terminal device;
and a second sending unit, configured to, in response to the first input received by the third receiving unit, send the target image adjustment parameter to the first terminal.
CN202111280822.6A 2021-10-29 2021-10-29 Image processing method and device Pending CN113992789A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111280822.6A CN113992789A (en) 2021-10-29 2021-10-29 Image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111280822.6A CN113992789A (en) 2021-10-29 2021-10-29 Image processing method and device

Publications (1)

Publication Number Publication Date
CN113992789A true CN113992789A (en) 2022-01-28

Family

ID=79745178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111280822.6A Pending CN113992789A (en) 2021-10-29 2021-10-29 Image processing method and device

Country Status (1)

Country Link
CN (1) CN113992789A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101945499A (en) * 2010-09-06 2011-01-12 Shenzhen Coship Electronics Co., Ltd. Method, terminal and system for transferring files
CN102496147A (en) * 2011-11-30 2012-06-13 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Image processing device, image processing method and image processing system
CN104184879A (en) * 2013-05-22 2014-12-03 ZTE Corporation Method and device for realizing file transmission
CN104244022A (en) * 2014-08-29 2014-12-24 Xingshan Technology (Shenzhen) Co., Ltd. Method and system for processing image
CN109712082A (en) * 2018-12-05 2019-05-03 Xiamen Meitu Home Technology Co., Ltd. Method and device for collaborative image retouching
WO2020220873A1 (en) * 2019-04-30 2020-11-05 Vivo Mobile Communication Co., Ltd. Image display method and terminal device
CN112039929A (en) * 2019-05-15 2020-12-04 Alibaba Group Holding Ltd. File editing method and device and electronic equipment
CN112437190A (en) * 2019-08-08 2021-03-02 Huawei Technologies Co., Ltd. Data sharing method, graphical user interface, related device and system


Similar Documents

Publication Publication Date Title
CN113467660A (en) Information sharing method and electronic equipment
CN113794795B (en) Information sharing method and device, electronic equipment and readable storage medium
CN112449110B (en) Image processing method and device and electronic equipment
CN112269522A (en) Image processing method, image processing device, electronic equipment and readable storage medium
CN112486444A (en) Screen projection method, device, equipment and readable storage medium
CN113179205A (en) Image sharing method and device and electronic equipment
CN112929494A (en) Information processing method, information processing apparatus, information processing medium, and electronic device
CN112698762B (en) Icon display method and device and electronic equipment
CN112399010B (en) Page display method and device and electronic equipment
CN112416199B (en) Control method and device and electronic equipment
CN113703634A (en) Interface display method and device
CN111885298B (en) Image processing method and device
CN112734661A (en) Image processing method and device
WO2023155858A1 (en) Document editing method and apparatus
CN113726953B (en) Display content acquisition method and device
WO2023005908A1 (en) Photographing method and apparatus, device, and storage medium
CN113872849B (en) Message interaction method and device and electronic equipment
CN113873081B (en) Method and device for sending associated image and electronic equipment
CN111796733B (en) Image display method, image display device and electronic equipment
CN113992789A (en) Image processing method and device
CN112383708B (en) Shooting method and device, electronic equipment and readable storage medium
CN112202958B (en) Screenshot method and device and electronic equipment
CN113986080A (en) Multimedia file editing method and device and electronic equipment
CN112287131A (en) Information interaction method and information interaction device
CN112416230B (en) Object processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220128
