CN117729415A - Image processing method, device, electronic equipment and medium

Info

Publication number: CN117729415A
Application number: CN202311740950.3A
Authority: CN (China)
Prior art keywords: application, image, camera, input, image element
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 祝贤威
Current Assignee: Vivo Mobile Communication Co Ltd
Original Assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202311740950.3A
Publication of CN117729415A

Abstract

The application discloses an image processing method, an image processing device, electronic equipment and a medium, and belongs to the technical field of image processing. The method includes: in a case where a first application calls a camera to collect an image, receiving a first input of a user to a second application; and in response to the first input, controlling the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controlling the second application to execute a first operation based on the first image element.

Description

Image processing method, device, electronic equipment and medium
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method, an image processing device, electronic equipment and a medium.
Background
Currently, an application in an electronic device may call a camera of the electronic device to perform image acquisition, and perform a corresponding operation according to the acquired image.
For example, the payment application may invoke the camera to capture a two-dimensional code image, such that the payment application may identify the two-dimensional code image and effect payment via an image recognition function. For another example, the communication application may invoke a camera to make a video call.
However, if the communication application is calling the rear camera to make a video call and the user needs to pay through the payment application, the user must first interrupt the video call and then, through an input to the payment application, trigger the payment application to call the rear camera and make the payment. In other words, one application can call the camera only after another application's call to the camera has been interrupted, so the flexibility of calling the camera by applications is poor.
Disclosure of Invention
The embodiment of the application aims to provide an image processing method, an image processing device, electronic equipment and a medium, which can solve the problem of poor flexibility of calling a camera by an application.
In a first aspect, an embodiment of the present application provides an image processing method, including: in a case where a first application calls a camera to collect an image, receiving a first input of a user to a second application; and in response to the first input, controlling the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controlling the second application to execute a first operation based on the first image element.
In a second aspect, an embodiment of the present application provides an image processing apparatus including: the receiving module is used for receiving a first input of a user to a second application under the condition that the first application calls the camera to collect an image; and the control module is used for responding to the first input received by the receiving module, controlling the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controlling the second application to execute a first operation based on the first image element.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiments of the application, in a case where a first application calls a camera to collect an image, a first input of a user to a second application is received; in response to the first input, the second application is controlled to acquire a first image element associated with the second application in the image acquired by the camera, and the second application is controlled to execute a first operation based on the first image element. According to this scheme, while one camera is being called by one application, the user can, through an input, trigger the electronic device to control another application to acquire image elements in the image acquired by the camera and execute a corresponding function, so that multiple applications can share the image elements in the image acquired by the same camera. In this way, the effect of multiple applications calling the same camera at the same time can be achieved, and the flexibility and convenience of calling the camera by applications can be improved.
Drawings
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
Fig. 2A is a first schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 2B is a second schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 3 is a third schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 4 is a fourth schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 5 is a fifth schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 6 is a sixth schematic diagram of an interface to which an image processing method according to an embodiment of the present application is applied;
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
Fig. 8 is a first schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 9 is a second schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first," "second," and the like in the description of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can be implemented in orders other than those illustrated or described herein, and the objects distinguished by "first," "second," and the like are generally of one type and do not limit the number of objects; for example, the first object may be one or more. In addition, "and/or" in the specification means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The term "at least one" and the like in the description of the present application refer to any one, any two, or any combination of two or more of the objects it covers. For example, at least one of a, b, and c may represent: "a", "b", "c", "a and b", "a and c", "b and c", or "a, b and c", where a, b, and c may be singular or plural. Similarly, the term "at least two" means two or more, and its meaning is similar to that of "at least one".
The image processing method, the device, the electronic equipment and the medium provided by the embodiment of the application are described in detail through specific embodiments and application scenes thereof with reference to the accompanying drawings.
The image processing method provided by the embodiment of the application can be applied to a scene of calling the camera to execute the corresponding function.
With the development of camera technology, the frequency of using a camera in an electronic device by a user is increasing, for example, the user uses the camera in the electronic device to take a picture, record a video, scan a code, and the like.
In a practical implementation, the electronic device may control an application in the electronic device to call the camera, so that the application performs a specific function through an image acquired by the camera that the application calls.
For example, the payment application invokes the camera to collect the two-dimensional code image, and then realizes the payment function by identifying the two-dimensional code.
At present, one camera can only be called by one application at the same time, and images acquired by the camera can only be transmitted to a single application, so that the flexibility of the operation of calling the camera by the application is poor.
In the image processing method provided by the embodiments of the application, in a case where the first application calls the camera to collect an image, a first input of the user to a second application is received; in response to the first input, the second application is controlled to acquire a first image element associated with the second application in the image acquired by the camera, and the second application is controlled to execute a first operation based on the first image element. According to this scheme, while one camera is being called by one application, the user can, through an input, trigger the electronic device to control another application to acquire image elements in the image acquired by the camera and execute a corresponding function, so that multiple applications can share the image elements in the image acquired by the same camera. In this way, the effect of multiple applications calling the same camera at the same time can be achieved, and the flexibility and convenience of calling the camera by applications can be improved.
It can be seen that the purpose of the image processing method provided in the embodiments of the present application is to enable multiple applications to call the same camera together, that is, to share image elements in the image acquired by the same camera, so as to make the fullest use of the image.
The execution body of the image processing method provided in the embodiments of the present application may be an electronic device, including a mobile electronic device or a non-mobile electronic device, or may be a functional module or functional entity in the electronic device capable of implementing the image processing method; this may be determined according to actual use requirements and is not limited in the embodiments of the present application. The image processing method provided in the embodiments of the present application is described below by taking an example in which the electronic device executes the image processing method.
Fig. 1 shows a flowchart of an image processing method provided in an embodiment of the present application, and as shown in fig. 1, the image processing method provided in an embodiment of the present application may include the following steps 101 and 102.
Step 101, the electronic device receives a first input of a user to a second application under the condition that the first application calls the camera to collect an image.
In some embodiments, the first application may invoke the camera to take a photograph, record a video, talk a video, pay, etc., which may be specifically determined according to the actual use requirement.
In some embodiments, the first input is used to control the second application to call the camera while the first application keeps calling the camera.
In some embodiments, the first input may be an input of the user to a first shortcut function identifier in a shortcut function interface of the electronic device, the first shortcut function identifier indicating a function associated with images in the second application, such as a payment function.
In some embodiments, the user may first trigger the electronic device to return to the desktop of the electronic device, and then perform an input on an application icon of the second application on the desktop.
In some embodiments, when the first application invokes the camera, the electronic device may display an image display interface corresponding to the camera, where the image display interface includes an image acquired by the camera.
In some embodiments, the first input includes, but is not limited to: a touch input performed by the user on the second application with a finger, a stylus, or another touch device; a voice command input by the user; a specific gesture input by the user; or another feasible input. This may be determined according to actual use requirements and is not limited in the embodiments of the present application.
In some embodiments of the present application, the specific gesture may be any one of a single-click gesture, a swipe gesture, a drag gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-click gesture.
In some embodiments of the present application, the click input may be a single click input, a double click input, or any number of click inputs, and may also be a long press input or a short press input.
Step 102, the electronic device responds to the first input, controls the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controls the second application to execute a first operation based on the first image element.
In some embodiments, the first image element may include one or more image elements.
In some embodiments, the first operation may include any one of: payment operation, image recognition operation, code scanning operation, image processing operation, head portrait setting operation, image sending operation, and the like.
The image processing operation may include: any possible image processing operation such as a portrait beautifying operation, a landscape beautifying operation, a background blurring operation, etc.
It will be appreciated that after the second application has acquired the first image element, the first operation may be performed by the second application based on the first image element.
In some embodiments, the electronic device controlling the second application to obtain the first image element in the image acquired by the camera includes: controlling the second application to acquire the image data collected by the camera, and extracting the image data of the first image element from that image data to obtain the first image element.
In some embodiments, "controlling the second application to acquire the first image element associated with the second application in the image acquired by the camera" in step 102 may include the following step a.
Step A: the electronic device controls the first application to acquire image data of the first image element associated with the second application in the image acquired by the camera, and to transmit the image data to the second application through an underlying camera data channel of the camera.
In some embodiments, the second application passively acquires the first image element. Of course, the second application may also actively acquire the image data of the first image element from the first application through the underlying camera data channel.
In some embodiments, the underlying camera data channel may be understood as a data transfer channel between applications.
In some embodiments, the camera may directly transfer image data of the acquired image to the application that first invoked the camera, i.e., the first application.
Therefore, the first application which successfully calls the camera can firstly acquire the image data of the first image element associated with the second application in the image acquired by the camera, and then the image data of the first image element is transmitted to the second application through the bottom camera data channel, so that the effects that the first application and the second application call the camera and share the image data of the image acquired by the camera can be achieved.
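The data flow of step A can be illustrated with a minimal Kotlin sketch. All names below (ElementType, ImageElement, CameraFrame, CameraDataChannel) are hypothetical stand-ins introduced for illustration rather than a real camera API; the sketch only shows the idea that the first application, which holds the camera, extracts the elements associated with other applications and forwards them over a shared channel.

```kotlin
// Hypothetical types standing in for the patent's concepts; not a real platform API.
enum class ElementType { PERSON, PET, LANDSCAPE, ADDRESS, QR_CODE }

class ImageElement(val type: ElementType, val pixels: ByteArray)
class CameraFrame(val elements: List<ImageElement>)

// Assumed inter-application transfer channel (the "underlying camera data channel").
interface CameraDataChannel {
    fun send(targetApp: String, element: ImageElement)
}

// The first application receives every frame because it is the one that called the camera.
class FirstApplication(private val channel: CameraDataChannel) {
    fun onCameraFrame(frame: CameraFrame, associations: Map<String, Set<ElementType>>) {
        for ((app, types) in associations) {
            // Extract only the image elements associated with each target application
            // and forward their image data through the shared channel.
            frame.elements.filter { it.type in types }
                .forEach { channel.send(app, it) }
        }
    }
}
```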
The image processing method provided in the embodiment of the present application is described below with reference to specific examples.
Example 1: assuming that the first application is application A, as shown in fig. 2A, the electronic device displays an image display interface 20 corresponding to the camera in application A, where the image display interface 20 includes an image acquired by the camera, and the image includes a plurality of different types of image elements, including a character element 21, a dog element 22, and a landscape element 23. If the user needs to make beautification edits to different image elements, and application A in the electronic device is good at beautifying people while application B is good at beautifying scenery, then: the user can, through the first input, trigger the electronic device to control application A and application B to share the image acquired by the same camera, so that application B acquires the landscape element 23 in the image and the landscape element 23 can be beautified by application B.
Example 2: assuming that the first application is application B, as shown in fig. 2B, the electronic device displays an image display interface 24 corresponding to the camera in application B, where the image display interface 24 includes an image acquired by the camera, and the image includes a character element 25, a dog element 26, and a landscape element 27. If the user is not satisfied with the beautification result of application B on the character element 25, the user may perform the first input on application A to trigger the electronic device to control application A to acquire the character element 25 in the image and perform a beautification operation on the character element 25.
In the image processing method provided by the embodiments of the application, while one application is calling one camera, the user can, through an input, trigger the electronic device to control another application to acquire image elements in the image acquired by the camera and execute a corresponding function, so that multiple applications can share the image elements in the image acquired by the same camera. In this way, the effect of multiple applications calling the same camera at the same time can be achieved, and the flexibility and convenience of calling the camera by applications can be improved.
In some embodiments, after the step 101, the image processing method provided in the embodiments of the present application may further include the following step 103.
Step 103, the electronic device controls the first application to acquire a second image element associated with the first application in the image acquired by the camera, and controls the first application to execute a second operation based on the second image element.
In some embodiments, the second image element may be the same as or different from the first image element.
In some embodiments, the second image element may include one or more image elements.
In some embodiments, the second operation and the first operation may be the same or different.
For example, the second operation may be a payment operation and the first operation may be an image processing operation.
For example, the second operation is a person beautification operation, and the first operation is a landscape beautification operation.
The image processing method provided in the embodiment of the present application is described below with reference to specific examples.
Example 3: assuming that the first application is application A, as shown in fig. 2A, the electronic device displays an image display interface 20 corresponding to the camera, where the image display interface 20 includes an image acquired by the camera, and the image includes a plurality of different types of image elements, including a character element 21, a dog element 22, and a landscape element 23. If the user needs to make beautification edits to different image elements, and application A in the electronic device is good at beautifying people while application B is good at beautifying scenery, then: the user can, through the first input, trigger the electronic device to control application A and application B to share the image acquired by the same camera, and have application A acquire the character element 21 in the image and application B acquire the landscape element 23 in the image, so that the character element 21 can be beautified by application A and the landscape element 23 can be beautified by application B.
Example 4: assuming that the first application is application B, as shown in fig. 2B, the electronic device displays an image display interface 24 corresponding to the camera, where the image display interface 24 includes an image acquired by the camera, and the image includes a character element 25, a dog element 26, and a landscape element 27. If the user is not satisfied with the beautification result of application B on the character element 25, the user may perform the first input on application A to trigger the electronic device to control application A to acquire the character element 25 and perform a beautification operation on the character element 25.
In the image processing method provided by the embodiments of the application, while one camera is being called by one application, the user can, through an input, trigger the electronic device to control that application and other applications to respectively acquire image elements in the image acquired by the camera and execute corresponding functions, so that multiple applications can share the image elements in the image acquired by the same camera. In this way, the effect of multiple applications calling the same camera at the same time can be achieved, and the flexibility and convenience of calling the camera by applications can be improved.
In some embodiments, before "controlling the second application to acquire the first image element associated with the second application in the image acquired by the camera" in the above step 102, the image processing method provided in the embodiment of the present application may further include the following step 104.
Step 104, the electronic device determines an association relationship between a third application and image elements according to first information;
wherein the first information includes any one of the following:
Mode 1: an input of the user to the third application;
Mode 2: application characteristics of the third application;
Mode 3: an element type parameter corresponding to the third application, where the element type parameter is used to indicate the element types of the image elements associated with the application.
The third application includes any one of the following: the second application; or the first application and the second application.
In some embodiments, when the third application includes the second application, it indicates that the electronic device has already determined the association relationship between the first application and image elements in advance, so that it does not need to be determined again.
In some embodiments, the element types of the image elements may include any possible element type, such as people, landscapes, food, buildings, or objects.
In this way, the association relationship between the third application and the image element can be determined according to the input of the user to the third application, the application characteristics of the third application, or the element type parameters corresponding to the third application, so that the flexibility and diversity of determining the association relationship between the application and the image element can be improved.
Modes 1 to 3 are described below.
Mode 1
In some embodiments, the first information includes user input to a third application. The above step 101 may include the following steps 101a to 101c; the step 104 may include a step 104a described below.
Step 101a, the electronic device receives a first sub-input of a user under the condition that a first application calls a camera to collect an image.
In some embodiments, the first sub-input may be user input of an application icon of the second application.
Or, the first sub-input may be an input of the user to an image display interface corresponding to the camera in the first application, where the image display interface includes an image acquired by the camera, for example, the first sub-input may be a long press input on the image display interface by the user or an input to a preset control in the image display interface. Alternatively, the first sub-input may be a double-press input of a preset key, such as a volume key, on the electronic device by the user.
For example, assuming that the first application is a camera application and the first application invokes the front-facing camera, the image display interface may be an image display interface corresponding to the front-facing camera in the camera application, such as a preview interface of the camera.
For example, assuming that the first application is an instant messaging application and the first application invokes the front-facing camera, the image display interface may be a video call interface corresponding to the front-facing camera in the instant messaging application.
Step 101b, the electronic device responds to the first sub-input, and displays a first identifier of a third application in an image display interface corresponding to the camera in the first application.
The image display interface may include an image acquired by a camera.
In some embodiments, the first identifier may be an application icon of the third application, a name of the third application, or any identifier that may indicate the third application.
For example, assuming that the first application is APP A, the second application is APP B, and the third application includes the first application and the second application, after receiving the first sub-input of the user, the electronic device may, as shown in fig. 3, display the name 31 of APP A and the name 32 of APP B in the image display interface 30 corresponding to the camera in the first application.
Step 101c, the electronic device receives a second sub-input that the user drags the first identifier to at least one image element in the image acquired by the camera.
Step 104a, the electronic device creates an association between the third application and the at least one image element in response to the second sub-input.
It should be noted that each application included in the third application corresponds to one second sub-input.
For example, if the third application includes the first application and the second application, the first identifier includes the third identifier of the first application and the second identifier of the second application, so that the user may drag the third identifier onto at least one fourth image element in the image acquired by the camera through a second sub-input, to trigger the electronic device to associate the at least one fourth image element with the first application; and, the user may drag the second identifier over at least one fifth image element in the image captured by the camera through another second sub-input to trigger the electronic device to associate the at least one fifth image element with the second application. It will be appreciated that the second image element is comprised in at least one fourth image element and the first image element is comprised in at least one fifth image element.
In some embodiments, the at least one fourth image element and the at least one fifth image element may be partially different or may be entirely different.
It can be understood that the electronic device may control the third application to acquire the image element in the image acquired by the camera according to the association relationship after creating the association relationship between the application and the at least one image element.
In this way, the user can trigger the electronic device to associate the third application with the specific image element through the input, so that the image element associated with the application can be ensured to meet the use requirement of the user.
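A minimal sketch of the bookkeeping behind mode 1 is given below, assuming a hypothetical association store and drop callback; it only records which element type the dragged identifier was dropped on.

```kotlin
// A minimal sketch of mode 1. The association store and the drop callback are
// illustrative assumptions; only the bookkeeping of step 104a is shown.
enum class ElementType { PERSON, PET, LANDSCAPE, ADDRESS }

class AssociationStore {
    // Application name -> element types the application is associated with.
    private val relations = mutableMapOf<String, MutableSet<ElementType>>()

    fun associate(app: String, type: ElementType) {
        relations.getOrPut(app) { mutableSetOf() }.add(type)
    }

    fun elementTypesFor(app: String): Set<ElementType> = relations[app].orEmpty()
}

// Second sub-input: the user drags the first identifier of the third application onto
// an image element in the image display interface; the dropped-on element's type is
// recorded as associated with that application.
fun onIdentifierDropped(store: AssociationStore, appName: String, droppedOnType: ElementType) {
    store.associate(appName, droppedOnType)
}
```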
Mode 2
In some embodiments, in mode 2, the electronic device may automatically determine the association relationship between the application and the image element according to the application characteristics of the application.
In some embodiments, the application characteristics of the application may include at least one of: the function the application has, the type of application, etc.
The functions of an application may include, but are not limited to: an image processing function, an image sending function, a payment function, an image recognition function, or the like;
types of applications may include, but are not limited to: image processing class, image recognition class, payment class, pet class, navigation class, instant messaging class, etc.
In some embodiments, the electronic device may automatically associate the third application with an image element in the image captured by the camera that is compatible with the application characteristic according to the application characteristic of the third application.
For example, as shown in fig. 4, an image element of the "address" type and a "pet" image element exist in the preview image 40 corresponding to the camera. If application A is a navigation application, the electronic device can determine that application A needs to acquire the address image element; if application B is a pet application, application B may acquire "pet" image elements, such as the dog element. The electronic device can therefore automatically associate the image elements according to the characteristics of the applications, and after the association, extract the corresponding image elements from the preview image and distribute them to the associated applications. It will be appreciated that the arrows in fig. 4 and the words "navigation application" and "pet application" are merely used to illustrate the image elements associated with the applications; in an actual implementation, the electronic device may not display the words and arrows.
Therefore, the electronic equipment can automatically determine the association relation between the application and the image element according to the application characteristics of the application without manual determination of a user, so that the automation of determining the association relation between the application and the image element can be improved, and the operation of the user is reduced.
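The automatic association of mode 2 can be sketched as a simple mapping from assumed application categories to element types, mirroring the fig. 4 example (navigation application → address element, pet application → pet element); the categories and the mapping are illustrative assumptions, not a defined API.

```kotlin
// A sketch of mode 2: associations inferred from application characteristics.
enum class ElementType { PERSON, PET, LANDSCAPE, ADDRESS, QR_CODE }
enum class AppCategory { NAVIGATION, PET_CARE, IMAGE_BEAUTIFY, PAYMENT }

// Which element types an application of a given category is interested in.
fun elementTypesFor(category: AppCategory): Set<ElementType> = when (category) {
    AppCategory.NAVIGATION     -> setOf(ElementType.ADDRESS)
    AppCategory.PET_CARE       -> setOf(ElementType.PET)
    AppCategory.IMAGE_BEAUTIFY -> setOf(ElementType.PERSON, ElementType.LANDSCAPE)
    AppCategory.PAYMENT        -> setOf(ElementType.QR_CODE)
}

// The electronic device associates an application with matching elements automatically,
// without any manual input from the user.
fun autoAssociate(appName: String, category: AppCategory): Pair<String, Set<ElementType>> =
    appName to elementTypesFor(category)
```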
Mode 3
In some embodiments, in mode 3, an element type parameter corresponding to the third application may be stored in the electronic device or in a server corresponding to the third application in advance, so that after receiving the first input of the user, the electronic device may call the element type parameter, and determine, according to the element type parameter, an association relationship between the third application and the image element.
For example, after the third application is controlled to call the camera, a camera system in the electronic device may provide a call entry for the element type parameter, and application B (i.e., the third application) may transmit its element type parameter through this entry. After the camera system obtains the element type parameter of application B, it can determine which element types, and in particular which image elements in the image acquired by the camera, are associated with application B, so that the camera system can extract the image elements associated with application B from the image and transmit them to application B.
In some embodiments, multiple parameters may be included in the element type parameter, with different parameters indicating image elements of different element types.
In some embodiments, each of the plurality of parameters may include: numbers or letters.
In this way, the association relationship between the third application and the image element can be determined by the element type parameter corresponding to the third application, so that the convenience of determining the association relationship between the application and the image element can be improved.
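A sketch of mode 3 follows, with the camera system's call entry modeled as a plain registration method; all interfaces here are assumptions for illustration.

```kotlin
// A sketch of mode 3: the invoked application declares an element type parameter
// through a call entry exposed by the camera system.
enum class ElementType { PERSON, PET, LANDSCAPE, QR_CODE }
class ImageElement(val type: ElementType, val pixels: ByteArray)

class CameraSystem {
    // Element type parameters declared by each application through the call entry.
    private val declaredTypes = mutableMapOf<String, Set<ElementType>>()

    fun registerElementTypes(app: String, types: Set<ElementType>) {
        declaredTypes[app] = types
    }

    // Extract from a frame only the elements whose type the application declared,
    // and return them for delivery to that application.
    fun elementsFor(app: String, frameElements: List<ImageElement>): List<ImageElement> =
        frameElements.filter { it.type in declaredTypes[app].orEmpty() }
}
```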
In some embodiments, the first application and the second application are both associated with a third image element in the image acquired by the camera. After the above step 104, the image processing method provided in the embodiments of the present application may further include the following steps 105 to 107, and the "controlling the second application to acquire the first image element associated with the second application in the image acquired by the camera" in step 102 may include the following step B or step C.
Step 105, the electronic device displays a second identifier of the second application and a third identifier of the first application in an image display interface corresponding to the camera in the first application.
In some embodiments, the electronic device may display the second identifier and the third identifier superimposed on the image collected by the camera, or may display the image collected by the camera and the second identifier and the third identifier in regions.
Step 106, the electronic device receives a second input of the second identifier and the third identifier from the user.
In some embodiments, the second input may be an input by the user moving from the second identifier to the third identifier, or from the third identifier to the second identifier.
In some embodiments, the second input may be a click input by the user on the second and third identifications, respectively.
In some embodiments, the second input may be a user sliding input on the second and third identifiers.
Step 107, the electronic device sets a transfer order of the third image element between the first application and the second application in response to the second input.
It will be appreciated that the order of transfer of the third image element between the first application and the second application is an order of operations for instructing the first application and the second application to perform operations based on the third image element.
In some embodiments, the electronic device may set the transfer order of the third image element between the first application and the second application according to a first time sequence, where the first time sequence is the time sequence in which the second input is performed on the second identifier and the third identifier.
For example, the user clicks on the second identifier and then clicks on the third identifier, and then the electronic device may set the transfer order as: from the second application to the first application.
In some embodiments, the electronic device may set a transfer order of the third image element between the first application and the second application in accordance with an input direction of the second input.
For example, assuming that the second input is a gesture input whose input direction points from the second identifier to the third identifier, the electronic device may set the transfer order as: from the first application to the second application; otherwise, the transfer order is set as: from the second application to the first application.
In some embodiments, the first image element includes a third image element, and the second image element also includes a third image element.
Step B: in a case where the transfer order of the third image element is from the first application to the second application, the electronic device controls the first application to acquire the third image element in the image acquired by the camera, and controls the second application to acquire the third image element from the first application.
In some embodiments, the electronic device may control the first application to obtain image data of a third image element in the image captured by the camera.
In some embodiments, the electronic device controlling the second application to obtain the third image element from the first application includes: obtaining, from the first application, the image data of the third image element extracted by the first application, for example, the third image element before the second operation is performed or the third image element after the second operation is performed.
In some embodiments, image data of an image captured by a camera may be acquired by a processor in the electronic device and transferred by the processor to the first application.
In some embodiments, the second application obtaining the third image element from the first application may include: obtaining, from the first application, an operation result obtained after the first application executes the second operation on the third image element; or directly acquiring the third image element on which the first application has not performed the second operation.
Step C: in a case where the transfer order of the third image element is from the second application to the first application, the electronic device controls the second application to acquire the third image element in the image acquired by the camera.
In some embodiments, the electronic device may control the second application to acquire image data of a third image element in the image acquired by the camera.
For example, the second application acquires image data of an image acquired by the camera, and then acquires image data of a third image element from the image data.
In some embodiments, after the electronic device controls the second application to obtain the third image element, the first application may be controlled to obtain the third image element from the second application or obtain an operation result after the second application performs the first operation on the third image element.
It is understood that in steps 105 to 107, the electronic device may control one of the first application and the second application to acquire the third image element and perform a corresponding operation on it, and then transfer the operation result to the other of the first application and the second application. Thus, when the first application and the second application are associated with the same image element, the user can, through an input, trigger the electronic device to set the transfer and processing order of the image element between the two applications, so that the two applications execute corresponding operations on the image element in sequence; the finally obtained image element has been processed by both applications, which improves the display effect of the processed image element.
For example, if the user performs an input on the second identifier first and then on the third identifier, the electronic device may control the first application to acquire the third image element and perform the second operation on it first to obtain image element 1; the electronic device then transfers image element 1 to the second application, and the second application performs the first operation on image element 1 to obtain image element 2.
The image processing method provided in the embodiment of the present application is described below with reference to the accompanying drawings.
Illustratively, take as an example the case where the third image element is a portrait element, both application A and application B are associated with the portrait element, and both the second operation and the first operation are person beautification operations, assuming that application A is good at "skin smoothing" and application B is good at the "young natural" processing of a person. Then:
After determining the association relationships between application A and application B and the image element, the electronic device may, as shown in fig. 5, display the name 51 of application A and the name 52 of application B on the preview image 50 collected by the camera in application A. If the user needs to invoke application A to perform the "skin smoothing" operation on the character element first, and then transfer the smoothed character element to application B so that application B performs the "young natural" operation on it, then, as shown in fig. 5, the user may drag the name 51 of application A onto the name 52 of application B, so as to trigger the electronic device to set the following order: the character element in the preview image 50 is transferred to application A first, and application A then transfers its processing result for the character element to application B.
Therefore, when the first application and the second application are associated with the same image element, the user can, through an input, trigger the electronic device to set the transfer order of the image element between the two applications, that is, one application acquires the image element and shares it with the other application, and the two applications do not each need to acquire the image element from the image acquired by the camera separately, so that the data processing flow for acquiring the image element can be reduced and the power consumption of the electronic device can be saved.
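The transfer order set in step 107 can be sketched as a small pipeline in which the shared element is processed by one application and the result is handed to the other; the operation types and names below are assumptions, and "skin smoothing" / "young natural" are only placeholders for whatever the two applications actually do.

```kotlin
// A sketch of the transfer order between two applications sharing the same image element.
class Element(val pixels: ByteArray)

fun interface ElementOperation {
    fun apply(element: Element): Element
}

enum class TransferOrder { FIRST_TO_SECOND, SECOND_TO_FIRST }

// The shared third image element is processed by one application and the result is
// then handed to the other application, in the order the user set through the second input.
fun processShared(
    element: Element,
    firstAppOperation: ElementOperation,   // the second operation, executed by the first application
    secondAppOperation: ElementOperation,  // the first operation, executed by the second application
    order: TransferOrder
): Element = when (order) {
    TransferOrder.FIRST_TO_SECOND -> secondAppOperation.apply(firstAppOperation.apply(element))
    TransferOrder.SECOND_TO_FIRST -> firstAppOperation.apply(secondAppOperation.apply(element))
}
```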
In some embodiments, the first operation is an image processing operation, and the image processing method provided in the embodiments of the present application may further include step 108 and step 109.
Step 108, the electronic device performs image synthesis processing on the processed first image element and the image acquired by the camera to obtain a composite image.
In some embodiments, the electronic device may directly perform image fusion on the first image element processed by the second application and the original image acquired by the camera, so as to obtain a composite image. Thus, the first image element in the composite image is processed, and the display effect of the image can be improved.
For example, if the first image element is a portrait element and the image processing operation is a portrait beautification operation, the portrait in the composite image has been beautified by the second application, so the display effect of the portrait in the composite image can be improved.
In some embodiments, the electronic device may control the first application to perform the image synthesis processing on the processed first image element and the image acquired by the camera to obtain the composite image; in this case, the first application needs to first obtain the processed first image element from the second application.
In some embodiments, the electronic device may control the second application to perform image synthesis processing on the processed first image element and the image acquired by the camera to obtain a synthesized image, where the second application needs to first acquire image data of the original image acquired by the camera.
In some embodiments, the electronic device may control an application other than the first application and the second application to perform the above image synthesis processing.
Step 109, the electronic device displays the composite image in an image display interface corresponding to the camera in the first application.
In some embodiments, when the first application is not good at or cannot perform the image processing operation on the first image element, the user may, through the first input, trigger the electronic device to control the second application to perform the first operation on the first image element; the electronic device then synthesizes the processed first image element with the original image acquired by the camera, and displays the composite image in the image display interface corresponding to the camera in the first application.
For example, assuming that the image display interface is a camera preview interface corresponding to a camera in a camera application, the electronic device may replace a preview image acquired by the camera with the composite image, thereby implementing replacement of an original image element, and thus, a better image processing effect may be achieved.
For another example, assuming that the image display interface is a video call interface corresponding to the camera in the instant messaging application, the electronic device may share the composite image to the video object through the video call interface after displaying the composite image in the video call interface.
Therefore, even if the first application cannot execute the image processing operation on the image element, the image after image processing can still be displayed in the image display interface corresponding to the camera in the first application, which makes up for the functional deficiency of the first application, so that function borrowing and sharing among different applications can be realized.
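A simplified sketch of the synthesis in steps 108 and 109 is given below: the processed first image element is pasted back over the region it occupied in the original frame. The frame and region model is an assumption (the element is taken to lie inside the frame); a real implementation would also have to handle pixel format, stride, and color space.

```kotlin
// Hypothetical frame/region model used only to illustrate the image synthesis step.
class Region(val x: Int, val y: Int, val width: Int, val height: Int)
class Frame(val width: Int, val height: Int, val pixels: IntArray)
class ProcessedElement(val region: Region, val pixels: IntArray)

fun composite(original: Frame, element: ProcessedElement): Frame {
    val out = original.pixels.copyOf()
    val r = element.region
    for (row in 0 until r.height) {
        for (col in 0 until r.width) {
            // Copy the processed element's pixels over the matching frame pixels.
            out[(r.y + row) * original.width + (r.x + col)] = element.pixels[row * r.width + col]
        }
    }
    return Frame(original.width, original.height, out)
}
```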
In some embodiments, the image processing method provided in the embodiments of the present application may further include the following step 110.
Step 110, the electronic device displays, in an image display interface corresponding to the camera in the first application, an image element with a first element type in an image acquired by the camera.
Wherein the first element type is an element type of an image element associated with the first application.
In some embodiments, after receiving the first input of the user, the electronic device may control the first application to obtain a first image element in the image collected by the camera, and display, in an image display interface corresponding to the camera in the first application, an image element associated with the second application, such as the first image element, so as to facilitate the user to confirm that the second application successfully shares the image element in the image collected by the camera.
In some embodiments, when the camera is called, the second application may transmit the image parameter type corresponding to the second application to the first application, informing the first application to obtain only image elements corresponding to that image parameter type and to display image elements of that type in the image display interface corresponding to the camera in the first application; that is, image elements of other element types in the image acquired by the camera are not displayed. In some embodiments, assuming that the first operation is a photographing operation or a video recording operation, the finally obtained photograph or video likewise includes only image elements of the element type indicated by the image parameter type, and does not include image elements of other element types.
In some embodiments, the first operation and the second operation are both image processing operations, and the image processing method provided in the embodiments of the present application may further include step 111 and step 112.
And 111, the electronic equipment performs image synthesis processing on the processed second image element and the processed first image element to obtain a synthesized image.
In some embodiments, the electronic device may directly stitch the processed second image element and the processed first image element to obtain the composite image.
In some embodiments, the electronic device may combine the processed second image element and the processed first image element into the original image acquired by the camera to obtain the combined image.
For example, according to the areas where the second image element and the first image element are located in the image collected by the camera, the processed second image element and the processed first image element are synthesized into the original image collected by the camera, so as to obtain a synthesized image.
In some embodiments, the electronic device may control the first application to transfer the processed second image element to the second application, and the second application then performs the above image synthesis processing; or the electronic device may control the second application to transfer the processed first image element to the first application, and the first application then performs the image synthesis processing. Alternatively, the electronic device may control the first application to transfer the processed second image element to a fourth application and control the second application to transfer the processed first image element to the fourth application, and the fourth application executes the image synthesis processing.
Step 112, the electronic device displays the composite image.
In some embodiments, the electronic device may display the composite image in an image presentation interface corresponding to the camera in the first application. Alternatively, the electronic device may display the composite image in an image presentation interface corresponding to the camera in the second application.
The image processing method provided in the embodiment of the present application is described below with reference to the accompanying drawings.
For example, assuming that the second application is application A and the first application is application B, referring to fig. 2B, if application B performs beautification processing on the landscape element 27 and the dog element 26 in the image acquired by the camera and application A performs beautification processing on the character element 25 in the image acquired by the camera, the electronic device may control application B to synthesize the landscape element 27' and the dog element 26' processed by application B with the character element 25' processed by application A to obtain a composite image, and display the composite image in the image display interface 24 corresponding to the camera in application B.
Therefore, when the first operation and the second operation are both image processing operations, the electronic device can perform image synthesis processing on the processed second image element and the processed first image element to obtain a composite image and display the composite image, so that the speed of processing the image acquired by the camera can be improved, the image processing functions of different applications can be combined, and diversified image processing effects can be realized.
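Reusing the Frame, ProcessedElement and composite(...) definitions from the previous sketch, the synthesis of steps 111 and 112 amounts to folding each application's processed element back into the original frame; whether the first application, the second application, or a fourth application runs this code only changes where the function executes.

```kotlin
// Assumes the Frame, ProcessedElement and composite(...) definitions from the sketch above.
// Each application contributes the element it processed; the regions identify where the
// elements sat in the original image acquired by the camera.
fun compositeAll(original: Frame, processed: List<ProcessedElement>): Frame =
    processed.fold(original) { frame, element -> composite(frame, element) }
```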
In some embodiments, after the user triggers the electronic device to control the first application to call the camera, the electronic device may, according to the image processing requirement of the first application for the image, display in the image display interface corresponding to the camera the image elements of the image acquired by the camera that meet that requirement, and hide the remaining image elements of the image.
For example, assuming that the purpose of the first application in calling the camera to collect the image, and its image processing requirement, is to acquire a character element as a contact head portrait in the first application, that is, the main purpose is to acquire the character, the electronic device may hide the image elements in the image other than the character, so that the user does not need to manually crop the processed image later, and the operation procedure of setting the head portrait can be simplified.
It will be appreciated that after the electronic device conceals a portion of the image elements in the image, the electronic device may control the second application to acquire the image elements in the image that are in the display state, and the image elements associated with the second application. I.e. the second application does not acquire any image elements in the image that are hidden.
It is to be understood that the foregoing method embodiments, or various possible implementation manners in the method embodiments, may be executed separately, or may be executed in combination with each other on the premise that no contradiction exists, and may be specifically determined according to actual use requirements, which is not limited by the embodiment of the present application.
The execution subject of the image processing method provided by the embodiments of the application may be an image processing apparatus. In the embodiments of the present application, the image processing apparatus provided in the embodiments of the present application is described by taking an example in which the image processing apparatus executes the image processing method.
Fig. 7 shows a schematic structural diagram of an image processing apparatus according to an embodiment of the present application, and as shown in fig. 7, the image processing apparatus 70 may include:
a receiving module 71, configured to receive a first input of a user to a second application when the first application invokes the camera to collect an image;
the control module 72 is configured to, in response to the first input received by the receiving module 71, control the second application to obtain a first image element associated with the second application in the image acquired by the camera, and control the second application to perform a first operation based on the first image element.
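A structural sketch of the apparatus of fig. 7 follows; element acquisition and the first operation are injected as lambdas, which are assumptions standing in for the real application logic rather than any defined interface.

```kotlin
// Control module 72: on the first input, have the second application acquire its
// associated image element and perform the first operation on it.
class ControlModule(
    private val acquireElement: (secondApp: String) -> ByteArray?,          // assumed: first image element data
    private val performFirstOperation: (secondApp: String, element: ByteArray) -> Unit
) {
    fun handleFirstInput(secondApp: String) {
        val element = acquireElement(secondApp) ?: return
        performFirstOperation(secondApp, element)
    }
}

// Receiving module 71: receives the user's first input to the second application while
// the first application is calling the camera, and hands it to the control module.
class ReceivingModule(private val control: ControlModule) {
    fun onFirstInput(secondApp: String) = control.handleFirstInput(secondApp)
}
```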
In some embodiments, the control module is further configured to, after the receiving module receives the first input from the user to the second application, control, in response to the first input, the first application to obtain a second image element associated with the first application in the image acquired by the camera, and control the first application to perform a second operation based on the second image element.
In some embodiments, the apparatus further comprises a determination module;
the determining module is used for determining the association relationship between the third application and the image element according to the first information before the control module controls the second application to acquire the first image element associated with the second application in the image acquired by the camera;
Wherein the first information includes any one of:
user input to the third application;
application characteristics of the third application;
the element type parameter list corresponding to the third application, which is used for indicating the element type of the image element associated with the application (a minimal lookup of such a parameter list is sketched after this list);
wherein the third application comprises any one of: the second application; or the first application and the second application.
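For the element type parameter list case, the association could be derived by a simple lookup, as in the minimal sketch below; the typeListOf map and autoAssociate function are hypothetical names and only illustrate the idea of matching element types against a per-application list.

```kotlin
// Illustrative only: derive application–element associations from an element type parameter list.
enum class ElemType { PERSON, QR_CODE, TEXT }

data class Elem(val id: Int, val type: ElemType)

// Hypothetical element type parameter list: the element types each application is associated with.
val typeListOf: Map<String, Set<ElemType>> = mapOf(
    "payment" to setOf(ElemType.QR_CODE),
    "contacts" to setOf(ElemType.PERSON)
)

// Associate an application with every element in the frame whose type appears in its parameter list.
fun autoAssociate(app: String, frameElements: List<Elem>): List<Elem> =
    frameElements.filter { it.type in typeListOf[app].orEmpty() }

fun main() {
    val frame = listOf(Elem(1, ElemType.QR_CODE), Elem(2, ElemType.PERSON), Elem(3, ElemType.TEXT))
    println(autoAssociate("payment", frame)) // [Elem(id=1, type=QR_CODE)]
}
```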
In some embodiments, the first information includes user input to the third application;
the receiving module is specifically configured to: receive a first sub-input of a user; in response to the first sub-input, display a first identifier of the third application in an image display interface corresponding to the camera in the first application, where the image display interface includes the image acquired by the camera; and receive a second sub-input of the user dragging the first identifier to at least one image element in the image acquired by the camera;
the determining module is specifically configured to create an association relationship between the third application and the at least one image element in response to the second sub-input.
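A rough sketch of this drag-to-associate interaction, using plain hit-testing instead of any real UI toolkit; Rect, DisplayedElement, AssociationStore and onIdentifierDropped are invented names for illustration only.

```kotlin
// Illustrative sketch: dragging an application identifier onto an element creates an association.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
}

data class DisplayedElement(val elementId: Int, val bounds: Rect)

class AssociationStore {
    private val associations = mutableMapOf<String, MutableSet<Int>>()

    // Called when the second sub-input (the drag) ends at (dropX, dropY).
    fun onIdentifierDropped(appName: String, dropX: Int, dropY: Int, elements: List<DisplayedElement>) {
        val hit = elements.firstOrNull { it.bounds.contains(dropX, dropY) } ?: return
        associations.getOrPut(appName) { mutableSetOf() }.add(hit.elementId)
    }

    fun elementsAssociatedWith(appName: String): Set<Int> = associations[appName].orEmpty()
}

fun main() {
    val store = AssociationStore()
    val elements = listOf(DisplayedElement(7, Rect(0, 0, 100, 100)))
    store.onIdentifierDropped("payment", dropX = 40, dropY = 60, elements = elements)
    println(store.elementsAssociatedWith("payment")) // [7]
}
```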
In some embodiments, the first application and the second application are each associated with a third image element in the image captured by the camera;
the device also comprises a display module and a setting module;
the display module is used for displaying a second identifier of the second application and a third identifier of the first application in an image display interface corresponding to the camera in the first application after the determining module determines the association relationship between the third application and the image element according to the first information;
the receiving module is further used for receiving a second input of a user to the second identifier and the third identifier;
the setting module is used for responding to the second input received by the receiving module and setting the transmission sequence of the third image element between the first application and the second application;
the control module is specifically configured to:
controlling the first application to acquire a third image element in the image acquired by the camera and controlling the second application to acquire the third image element from the first application under the condition that the transmission sequence of the third image element is from the first application to the second application;
And controlling the second application to acquire the third image element in the image acquired by the camera under the condition that the transmission sequence of the third image element is from the second application to the first application.
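The two transfer orders can be read as a simple branch: either the first application reads the element and forwards it to the second, or the second application reads it directly from the camera image. The sketch below models this with in-memory endpoints; TransferOrder, CameraSource, AppEndpoint and deliverThirdElement are assumptions made for illustration and do not reflect any concrete camera data path.

```kotlin
// Illustrative branch on the transfer order of a shared (third) image element.
enum class TransferOrder { FIRST_TO_SECOND, SECOND_TO_FIRST }

data class SharedElement(val id: Int, val payload: ByteArray)

interface CameraSource { fun readElement(id: Int): SharedElement }

class AppEndpoint(val name: String) {
    var received: SharedElement? = null
        private set
    fun accept(element: SharedElement) { received = element }
}

fun deliverThirdElement(
    order: TransferOrder,
    camera: CameraSource,
    elementId: Int,
    firstApp: AppEndpoint,
    secondApp: AppEndpoint
) {
    when (order) {
        // The first application acquires the element, then the second application obtains it from the first.
        TransferOrder.FIRST_TO_SECOND -> {
            val element = camera.readElement(elementId)
            firstApp.accept(element)
            secondApp.accept(element) // forwarded from the first application
        }
        // The second application acquires the element directly from the camera image.
        TransferOrder.SECOND_TO_FIRST -> {
            val element = camera.readElement(elementId)
            secondApp.accept(element)
            firstApp.accept(element) // assumption for symmetry: passed on to the first application
        }
    }
}

fun main() {
    val camera = object : CameraSource {
        override fun readElement(id: Int) = SharedElement(id, byteArrayOf(1, 2, 3))
    }
    val first = AppEndpoint("video call")
    val second = AppEndpoint("payment")
    deliverThirdElement(TransferOrder.FIRST_TO_SECOND, camera, elementId = 3, firstApp = first, secondApp = second)
    println("second app received element ${second.received?.id}") // 3
}
```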
In some embodiments, the first operation is an image processing operation;
the device also comprises a synthesis module and a display module;
the synthesis module is used for carrying out synthesis processing on the processed first image element and the image acquired by the camera to obtain a synthesized image;
the display module is used for displaying the synthesized image synthesized by the synthesis module in an image display interface corresponding to the camera in the first application.
With the image processing apparatus provided in the embodiments of the present application, when one application calls a camera, the user can, through input, trigger the apparatus to control another application to acquire image elements in the image collected by that camera and execute the corresponding function, so multiple applications can share the image elements in the image collected by the same camera. This achieves the effect of multiple applications calling the same camera simultaneously and improves the flexibility and convenience of camera calling by applications.
The image processing apparatus in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), an automated teller machine or a self-service machine, etc., which is not specifically limited in the embodiments of the present application.
The image processing apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The image processing device provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 1 to 6, so as to achieve the same technical effects, and in order to avoid repetition, a detailed description is omitted here.
In some embodiments, as shown in fig. 8, an embodiment of the present application further provides an electronic device 800, including a processor 801 and a memory 802. The memory 802 stores a program or instructions executable on the processor 801, and when the program or instructions are executed by the processor 801, the steps of the above image processing method embodiments are implemented and the same technical effects are achieved; to avoid repetition, details are not described here again.
It should be noted that, the electronic device in the embodiment of the present application includes a mobile electronic device and a non-mobile electronic device.
Fig. 9 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: radio frequency unit 701, network module 702, audio output unit 703, input unit 704, sensor 705, display unit 706, user input unit 707, interface unit 708, memory 709, and processor 710.
Those skilled in the art will appreciate that the electronic device 700 may further include a power source (e.g., a battery) for supplying power to the various components. The power source may be logically connected to the processor 710 via a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented via the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or use a different arrangement of components, which is not described in detail here.
The user input unit 707 is configured to receive a first input of a user to a second application when the first application invokes the camera to collect an image;
a processor 710, configured to, in response to the first input received by the user input unit 707, control the second application to acquire a first image element associated with the second application in the image acquired by the camera, and control the second application to perform a first operation based on the first image element.
In some embodiments, the processor 710 is further configured to, after the user input unit 707 receives the first input from the user to the second application, control, in response to the first input, the first application to acquire a second image element associated with the first application in the image acquired by the camera, and control the first application to perform a second operation based on the second image element.
In some embodiments, the processor 710 is configured to determine, according to the first information, an association relationship between a third application and an image element before the processor 710 controls the second application to acquire the first image element associated with the second application in the image acquired by the camera;
Wherein the first information includes any one of:
user input to the third application;
application characteristics of the third application;
the element type parameter list corresponding to the third application is used for indicating the element type of the image element associated with the application;
wherein the third application comprises any one of: the second application; or the first application and the second application.
In some embodiments, the first information includes user input to the third application;
the user input unit 707 is specifically configured to: receive a first sub-input of a user; in response to the first sub-input, display a first identifier of the third application in an image display interface corresponding to the camera in the first application, where the image display interface includes the image acquired by the camera; and receive a second sub-input of the user dragging the first identifier to at least one image element in the image acquired by the camera;
the processor 710 is specifically configured to create an association relationship between the third application and the at least one image element in response to the second sub-input.
In some embodiments, the first application and the second application are each associated with a third image element in the image captured by the camera;
the display unit 706 is configured to display, after the processor 710 determines, according to the first information, an association relationship between a third application and an image element, a second identifier of the second application and a third identifier of the first application in an image display interface corresponding to the camera in the first application;
the user input unit 707 is further configured to receive a second input of a user to the second identifier and the third identifier;
the processor 710 is configured to set a transfer order of the third image element between the first application and the second application in response to the second input received by the user input unit 707;
the processor 710 is specifically configured to:
controlling the first application to acquire a third image element in the image acquired by the camera and controlling the second application to acquire the third image element from the first application under the condition that the transmission sequence of the third image element is from the first application to the second application;
And controlling the second application to acquire the third image element in the image acquired by the camera under the condition that the transmission sequence of the third image element is from the second application to the first application.
In some embodiments, the first operation is an image processing operation;
the processor 710 is configured to perform image synthesis processing on the processed first image element and the image acquired by the camera to obtain a composite image;
the display unit 706 is configured to display, in an image display interface corresponding to the camera in the first application, the composite image synthesized by the processor 710.
In the electronic device provided in the embodiments of the present application, when one application calls a camera, the user can, through input, trigger the electronic device to control another application to acquire image elements in the image collected by that camera and execute the corresponding function, so multiple applications can share the image elements in the image collected by the same camera. This achieves the effect of multiple applications calling the same camera simultaneously and improves the flexibility and convenience of camera calling by applications.
It should be appreciated that in embodiments of the present application, the input unit 704 may include a graphics processor (Graphics Processing Unit, GPU) 7041 and a microphone 7042, with the graphics processor 7041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 707 includes at least one of a touch panel 7071 and other input devices 7072. The touch panel 7071 is also referred to as a touch screen. The touch panel 7071 may include two parts, a touch detection device and a touch controller. Other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function), and the like. Further, the memory 709 may include a volatile memory or a nonvolatile memory, or the memory 709 may include both volatile and nonvolatile memories. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 709 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 710 may include one or more processing units; in some embodiments, processor 710 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 710.
An embodiment of the present application further provides a readable storage medium storing a program or instructions. When the program or instructions are executed by a processor, the processes of the above image processing method embodiments are implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again.
Wherein, the processor is the processor in the electronic device described in the above embodiments. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement the processes of the above image processing method embodiments and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip chip, etc.
The embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the embodiments of the image processing method described above, and achieve the same technical effects, and are not repeated herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, and of course may also be implemented by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), which includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific embodiments, which are merely illustrative and not restrictive. Inspired by the present application, those of ordinary skill in the art may make many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (16)

1. An image processing method, the method comprising:
under the condition that a first application calls a camera to collect an image, receiving a first input of a user to a second application;
and responding to the first input, controlling the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controlling the second application to execute a first operation based on the first image element.
2. The method of claim 1, wherein after receiving the first input from the user to the second application, the method further comprises:
and responding to the first input, controlling the first application to acquire a second image element associated with the first application in the image acquired by the camera, and controlling the first application to execute a second operation based on the second image element.
3. The method according to claim 1 or 2, wherein before the controlling the second application to obtain the first image element associated with the second application in the image acquired by the camera, the method further comprises:
determining an association relationship between a third application and an image element according to first information;
Wherein the first information includes any one of:
user input to the third application;
application characteristics of the third application;
the element type parameter list corresponding to the third application is used for indicating the element type of the image element associated with the application;
wherein the third application comprises any one of: the second application; or the first application and the second application.
4. A method according to claim 3, wherein the first information comprises user input to the third application;
the receiving a first input from a user to a second application includes:
receiving a first sub-input of a user;
responding to the first sub-input, and displaying a first identifier of the third application in an image display interface corresponding to the camera in the first application, wherein the image display interface comprises the image acquired by the camera;
receiving a second sub-input that a user drags the first identifier to at least one image element in an image acquired by the camera;
the determining, according to the first information, the association relationship between the third application and the image element includes:
In response to the second sub-input, an association between the third application and the at least one image element is created.
5. A method according to claim 3, wherein the first application and the second application are each associated with a third image element in the image captured by the camera;
after determining the association relationship between the third application and the image element according to the first information, the method further includes:
displaying a second identifier of the second application and a third identifier of the first application in an image display interface corresponding to the camera in the first application;
receiving a second input of a user to the second identifier and the third identifier;
in response to the second input, setting a transfer order of the third image element between the first application and the second application;
the controlling the second application to acquire the first image element associated with the second application in the image acquired by the camera includes:
controlling the first application to acquire a third image element in the image acquired by the camera and controlling the second application to acquire the third image element from the first application under the condition that the transmission sequence of the third image element is from the first application to the second application;
And controlling the second application to acquire the third image element in the image acquired by the camera under the condition that the transmission sequence of the third image element is from the second application to the first application.
6. The method of claim 1, wherein the first operation is an image processing operation; the method further comprises the steps of:
performing image synthesis processing on the processed first image element and an image acquired by the camera to obtain a synthesized image;
and displaying the synthesized image in an image display interface corresponding to the camera in the first application.
7. The method of claim 1, wherein the controlling the second application to obtain the first image element associated with the second application in the image captured by the camera comprises:
and controlling the first application to acquire image data of the first image element associated with the second application in the image acquired by the camera, and transmitting the image data to the second application through an underlying camera data channel of the camera.
8. The method according to claim 1, wherein the method further comprises:
displaying, in an image display interface corresponding to the camera in the first application, image elements whose element type is a first element type in the image acquired by the camera;
wherein the first element type is an element type of an image element associated with the first application.
9. An image processing apparatus, characterized in that the apparatus comprises:
the receiving module is used for receiving a first input of a user to a second application under the condition that the first application calls the camera to collect an image;
and the control module is used for responding to the first input received by the receiving module, controlling the second application to acquire a first image element associated with the second application in the image acquired by the camera, and controlling the second application to execute a first operation based on the first image element.
10. The apparatus of claim 9, wherein the control module is further configured to, after the receiving module receives a first input from a user to a second application, control the first application to obtain a second image element associated with the first application in the image captured by the camera in response to the first input, and control the first application to perform a second operation based on the second image element.
11. The apparatus according to claim 9 or 10, further comprising a determination module;
the determining module is used for determining the association relationship between the third application and the image element according to the first information before the control module controls the second application to acquire the first image element associated with the second application in the image acquired by the camera;
wherein the first information includes any one of:
user input to the third application;
application characteristics of the third application;
the element type parameter list corresponding to the third application is used for indicating the element type of the image element associated with the application;
wherein the third application comprises any one of: the second application; or the first application and the second application.
12. The apparatus of claim 11, wherein the first information comprises user input to the third application;
the receiving module is specifically configured to: receive a first sub-input of a user; in response to the first sub-input, display a first identifier of the third application in an image display interface corresponding to the camera in the first application, wherein the image display interface comprises the image acquired by the camera; and receive a second sub-input of the user dragging the first identifier to at least one image element in the image acquired by the camera;
The determining module is specifically configured to create an association relationship between the third application and the at least one image element in response to the second sub-input.
13. The apparatus of claim 11, wherein the first application and the second application are each associated with a third image element in an image captured by the camera;
the device also comprises a display module and a setting module;
the display module is used for displaying a second identifier of the second application and a third identifier of the first application in an image display interface corresponding to the camera in the first application after the determining module determines the association relationship between the third application and the image element according to the first information;
the receiving module is further used for receiving a second input of a user to the second identifier and the third identifier;
the setting module is used for responding to the second input received by the receiving module and setting the transmission sequence of the third image element between the first application and the second application;
the control module is specifically configured to:
controlling the first application to acquire a third image element in the image acquired by the camera and controlling the second application to acquire the third image element from the first application under the condition that the transmission sequence of the third image element is from the first application to the second application;
And controlling the second application to acquire the third image element in the image acquired by the camera under the condition that the transmission sequence of the third image element is from the second application to the first application.
14. The apparatus of claim 9, wherein the first operation is an image processing operation;
the device also comprises a synthesis module and a display module;
the synthesis module is used for carrying out synthesis processing on the processed first image element and the image acquired by the camera to obtain a synthesized image;
the display module is used for displaying the synthesized image synthesized by the synthesis module in an image display interface corresponding to the camera in the first application.
15. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image processing method of any of claims 1-8.
16. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any of claims 1-8.
CN202311740950.3A 2023-12-15 2023-12-15 Image processing method, device, electronic equipment and medium Pending CN117729415A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311740950.3A CN117729415A (en) 2023-12-15 2023-12-15 Image processing method, device, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311740950.3A CN117729415A (en) 2023-12-15 2023-12-15 Image processing method, device, electronic equipment and medium

Publications (1)

Publication Number Publication Date
CN117729415A true CN117729415A (en) 2024-03-19

Family

ID=90210228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311740950.3A Pending CN117729415A (en) 2023-12-15 2023-12-15 Image processing method, device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN117729415A (en)

Similar Documents

Publication Publication Date Title
CN108108079A (en) A kind of icon display processing method and mobile terminal
CN111159449A (en) Image display method and electronic equipment
CN112672061A (en) Video shooting method and device, electronic equipment and medium
CN110908638A (en) Operation flow creating method and electronic equipment
CN114153362A (en) Information processing method and device
CN112449110B (en) Image processing method and device and electronic equipment
CN113852540B (en) Information transmission method, information transmission device and electronic equipment
CN115357158A (en) Message processing method and device, electronic equipment and storage medium
CN117729415A (en) Image processing method, device, electronic equipment and medium
CN114374761A (en) Information interaction method and device, electronic equipment and medium
CN117750177A (en) Image display method, device, electronic equipment and medium
CN115412524B (en) Message processing method, device, electronic equipment and medium
CN113691443B (en) Image sharing method and device and electronic equipment
CN117149020A (en) Information management method, device, equipment and medium
CN116156312A (en) File sharing method and device, electronic equipment and readable storage medium
CN115525379A (en) Content sharing method and content sharing device
CN117592098A (en) Image storage method and device
CN114860122A (en) Application program control method and device
CN115712471A (en) Cross-device control method, cross-device control device and electronic device
CN115129219A (en) Function execution method and device
CN114398129A (en) Shared object sharing method and device, electronic equipment and readable storage medium
CN113989421A (en) Image generation method, apparatus, device and medium
CN117251093A (en) Image processing method, image processing device and electronic equipment
CN117376299A (en) Group management method and device and electronic equipment
CN117596326A (en) Control method and device of electronic equipment, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination