CN111130995B - Image control method, electronic device, and storage medium - Google Patents


Info

Publication number: CN111130995B (application number CN201911294231.7A)
Authority: CN (China)
Prior art keywords: image, input, control, electronic device, information
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other versions: CN111130995A (Chinese, zh)
Inventor: 王程刚
Original and current assignee: Vivo Mobile Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911294231.7A; application CN111130995A published; application granted; grant CN111130995B published
Related PCT application: PCT/CN2020/134815 (WO2021121093A1)

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 — User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07 — characterised by the inclusion of specific contents
    • H04L51/10 — Multimedia information
    • H04L51/18 — Commands or executable codes

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention discloses an image control method in the field of communication technology, which addresses the cumbersome process of entering information into an application program according to image content. The method comprises the following steps: receiving a first image sent by a second electronic device, the first image comprising N controls; displaying the first image; receiving a user's first input on a first control of the N controls; and, in response to the first input, performing a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image and an object associated with the first control, where N is a positive integer.

Description

Image control method, electronic device, and storage medium
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an image control method, electronic equipment and a storage medium.
Background
With the development of electronic devices, transmitting or receiving images using electronic devices has become an important part of life and work.
For example, if user A needs user B to share a group member list, user B directly sends a screenshot of the group member interface to user A; after receiving the picture, user A has to open an application program and type in the member names from the picture word by word before the friend-adding operation is complete. Likewise, if user A needs to configure a certain function and asks user B for help, user B captures the setting interface and sends the screenshot to user A; user A then has to open the settings and select the corresponding options one by one according to the content in the image to complete the configuration. In both cases the operation process is cumbersome.
Disclosure of Invention
The embodiment of the invention provides an image control method, an electronic device, and a storage medium, which can solve the problem that the existing process of entering information into an application program according to image content is cumbersome.
In order to solve the technical problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image control method applied to a first electronic device. The method includes the following steps: receiving a first image sent by a second electronic device, the first image comprising N controls; displaying the first image; receiving a user's first input on a first control of the N controls; and, in response to the first input, performing a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image and an object associated with the first control, where N is a positive integer.
In a second aspect, an embodiment of the present invention provides an image control method applied to a second electronic device. The method includes the following steps: receiving a first input of a user; displaying N controls on a first image in response to the first input; and sending the first image to a first electronic device, where N is a positive integer.
In a third aspect, an embodiment of the present invention provides an electronic device, including: the receiving module is used for receiving a first image sent by second electronic equipment, and the first image comprises N controls; a display module for displaying the first image; the receiving module is further used for receiving a first input of a user to a first control in the N controls; an execution module, configured to, in response to the first input, execute a first operation associated with the first control on a first object, where the first object is an object in a first image region of the first image, and the first object is an object associated with the first control; wherein N is a positive integer.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including: the receiving module is used for receiving a first input of a user; a display module to display N controls on a first image in response to the first input; the sending module is used for sending the first image to first electronic equipment; wherein N is a positive integer.
In a fifth aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, and when executed by the processor, the computer program implements the steps of the image control method in the first aspect.
In a sixth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the image control method as in the first aspect.
In the embodiment of the invention, the first electronic device receives a first image sent by a second electronic device, the first image comprising N controls; displays the first image; receives a user's first input on a first control of the N controls; and, in response to the first input, performs a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image and an object associated with the first control, where N is a positive integer. The associated operation can thus be executed directly from the user's input on a control in the image; the user does not need to open an application program and manually enter information item by item according to the image content, which simplifies the operation process.
Drawings
FIG. 1 is a first flowchart of an image control method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of an image control method according to an embodiment of the present invention;
FIG. 3 is a first schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 4 is a second schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 4(a) is a third schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 5 is a fourth schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 6 is a fifth schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 7 is a sixth schematic diagram illustrating display of a first image according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a first electronic device according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a second electronic device according to an embodiment of the present invention;
FIG. 10 is a hardware schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," and "fourth," etc. in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input, the second input, the third input, the fourth input, etc. are used to distinguish between different inputs, rather than to describe a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to serve as examples, illustrations, or descriptions. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
Referring to fig. 1, an embodiment of the present invention provides an image control method applied to a first electronic device, and the method may include steps 111 to 114 described below.
Step 111, receiving a first image sent by a second electronic device, wherein the first image comprises N controls;
Alternatively, the first image may be a screenshot or any other image sent by the second electronic device.
Step 112, displaying the first image;
Optionally, the first image may be displayed together with the N controls, as shown in fig. 4, where the first image is a screenshot of an address list sent by the second electronic device and includes the N controls; the identifier 402 indicates one of the N controls. Alternatively, displaying the first image may mean displaying only the image content; after the first image is displayed, a touch input of the user is received and the N controls are then displayed. Illustratively, as shown in fig. 3, after the user clicks the "acquire picture information" control, the N controls may be displayed, yielding the view shown in fig. 4.
Step 113, receiving a first input of a user to a first control in the N controls; wherein N is a positive integer.
Alternatively, the first input may be a first operation and may include, but is not limited to, a click touch input, a double-click touch input, a slide touch input, a long-press touch input, and the like. For example, the user may click the control indicated by 402 in fig. 4.
And step 114, responding to the first input, and executing a first operation associated with the first control on a first object, wherein the first object is an object in a first image area of the first image, and the first object is an object associated with the first control.
Alternatively, the first image area may be any area of the first image, such as the whole area of the first image, or may be a partial area of the first image, such as an area in any column of the address list in fig. 4.
Alternatively, the first object may include, but is not limited to, pictures, text, numbers, controls, and the like. As shown in the first column of the address list of fig. 4, the first object 401 is a contact and a communication number.
Optionally, the first object is an object associated with the first control, and an association relationship between the first object and the first control may be set by the second electronic device.
Optionally, the first operation associated with the first control is set for the second electronic device.
Illustratively, as shown in fig. 4, the first image is an image sent by the second electronic device, namely a screenshot of the address book. The first image area of the first image may be an area in the first column of the address book list, such as the dashed area indicated by 403 in fig. 4; the first control may be the "add to address book" control indicated by 402; the first object associated with that control may be the entry Tony 13812345678; and the first operation associated with the control may be adding Tony 13812345678 to the address book.
In the embodiment of the invention, the first electronic device receives a first image sent by a second electronic device, the first image comprising N controls; displays the first image; receives a user's first input on a first control of the N controls; and, in response to the first input, performs a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image and an object associated with the first control, where N is a positive integer. Therefore, the associated operation can be executed according to the user's input on a control in the image, without requiring the user to open an application program and manually enter information one by one according to the image content, which simplifies the operation process.
Optionally, the first operation is determined based on an object type of the first object. In this way, the first operation can be determined according to the type of the first object.
Alternatively, the object type of the first object may be acquired first; the first operation is determined based on an object type of the first object. The object type may be a picture, an application, a number, etc., and of course, other object types are also included in the scope of the present invention.
Illustratively, if the type of the first object is a communication number, the first operation may be determined to be adding the number to a communication application, for example to the address book; if the type of the first object is a picture, the first operation may be determined to be a cutout; if the first object is an application icon, the first operation may be determined to be downloading the application. Of course, the invention is not limited to the cases listed above; other cases also fall within its scope.
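The type-to-operation step above can be sketched as a simple lookup. All type names and operation names below are illustrative assumptions for the sketch, not identifiers from the patent:

```python
# Illustrative mapping from the object type of the first object to the
# first operation associated with a control; names are assumptions.
OPERATION_BY_TYPE = {
    "number": "add to address book",        # communication number
    "picture": "cutout",                    # picture -> image extraction
    "application": "download application",  # application icon
}

def determine_first_operation(object_type):
    """Determine the first operation based on the object type of the
    first object, as described in the method above."""
    if object_type not in OPERATION_BY_TYPE:
        raise ValueError("unsupported object type: %s" % object_type)
    return OPERATION_BY_TYPE[object_type]
```

A real implementation would populate the table from the association relationships set by the second electronic device.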
Optionally, performing the first operation associated with the first control on the first object includes:
adding the first object to a target area of a target interface of a target application program, or updating the interface content of the target interface of the target application program. In this way, the first object can be added to the target area of the target interface, or the interface content of the target interface can be updated, without the user opening the target interface of the target application program, which simplifies the user's operation.
Optionally, the target interface of the target application includes, but is not limited to, a main interface, a function interface, a shortcut interface, and the like of the target application.
Optionally, the target area of the target interface of the target application may be a full screen area of the target interface of the target application or may be a partial area of the target interface of the target application. For example, the partial area of the target interface of the target application program can be any column in the contact list, and can be an area where an input box for editing the contact in the contact interface is added.
Optionally, adding the first object to a target area of a target interface of a target application, or updating interface content of the target interface of the target application, may be determined by the second electronic device.
Illustratively, as shown in fig. 4, the first object 401 is a name and a phone number, such as Tony13812345678, and the first operation associated with the first control 402 may be to add the phone number to the list of addresses. The user may click on the first control 402 to add Tony13812345678 to the list of contacts.
Illustratively, as shown in fig. 5, a first image sent by a second electronic device is received, where the first image includes N controls and is a screenshot of a setting interface of the second electronic device, specifically a screenshot of the display-and-brightness setting interface showing options such as "automatically adjust screen brightness", "global eye protection", and "dark mode". The user may click the "add to my settings" control indicated by 502 to set the "automatically adjust screen brightness" option of the first electronic device to the on state, or click the "add all to my settings" control indicated by 503 to make the display-and-brightness settings of the first electronic device consistent with those in the image, that is, to update the content of the display-and-brightness setting interface of the first electronic device. The user therefore does not need to open the display-and-brightness setting interface and apply the settings one by one according to the content of the first image, which simplifies the user's operation.
Illustratively, as shown in fig. 6, a first image sent by a second electronic device is received, where the first image includes N controls, and the first image is a screenshot of a desktop of the second electronic device. The user may click on a first control 602 of the N controls to download the application app5 indicated by 601, i.e., to add the first object to the target area of the desktop of the first electronic device. It should be noted that the target area may be set by default, or may be set by a user, and may be determined according to actual situations.
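The two outcomes described above — adding the first object to a target area, or updating the target interface's content wholesale — can be sketched as two branches over a minimal interface model. The dict layout is an assumption made only for this sketch:

```python
def perform_operation(mode, first_object, target_interface):
    """Either append the first object to the target area of the target
    interface ("add"), or update the interface content ("update")."""
    if mode == "add":
        # e.g. add a contact entry or an app icon to the target area
        target_interface.setdefault("target_area", []).append(first_object)
    elif mode == "update":
        # e.g. apply all display-and-brightness settings from the image
        target_interface["settings"].update(first_object)
    else:
        raise ValueError("unknown mode: %s" % mode)
    return target_interface
```

Whether a given control triggers "add" or "update" would be determined by the second electronic device, as the text notes.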
Optionally, after step 112 of displaying the first image, the method further includes:
step a, receiving a second input of a user to a second image area in the first image;
alternatively, the second input may be a second operation, which may include, but is not limited to, a click touch input, a double click touch input, a slide touch input, a long press touch input, etc., of the user on the second image area in the first image.
Optionally, the second image region in the first image may be any region in the first image, which is not limited in this embodiment of the present invention.
Step b, responding to the second input, and sending first information to second electronic equipment, wherein the first information comprises first position information of a second object of the second image area in the first image or image content of the second image area;
Step c, receiving second information sent by the second electronic device, where the second information is information associated with the second object. In this way, a user input at any position in the first image can automatically trigger the first electronic device to send the first information to the second electronic device, and the first electronic device can then receive the second information that the user needs, sent by the second electronic device based on the first information.
Optionally, the second information is information associated with the second object, and may be obtained by the second electronic device based on the first information. For example, the second information associated with the second object may be obtained by the second electronic device querying according to the association relation table stored in the server based on the first information.
For example, as shown in fig. 6, a first image sent by a second electronic device is received, where the first image is a screenshot of the desktop of the second electronic device. The user may click a second image area of the first image, such as the area where app4 is located, and first information may then be sent to the second electronic device. The first information may be first position information of the second object in the first image, such as the position of app4 in the first image, for example the coordinates of app4 in the coordinate system of the first image; it should be noted that this coordinate system may be established with any point of the first image as the origin. The first information may also be the image content of the second image region, such as the name of app4. Second information sent by the second electronic device may then be received, where the second information is information associated with the second object, such as a download link for app4 or an installation package of app4.
For example, as shown in fig. 7, a first image sent by a second electronic device is received, where the first image is a screenshot of an album interface of the second electronic device. The user may click a second image area of the first image, such as picture 1, and first information may then be sent to the second electronic device. The first information may be first position information of the second object in the first image, such as the position of picture 1 in the first image; this may be coordinate position information of picture 1 in the first image, or the information that picture 1 is in the second row and first column of the first image. The first information may also be the image content of the second image area, such as the image content of picture 1. Second information sent by the second electronic device is then received, and the second information may be the original image of picture 1.
Optionally, after receiving the second information sent by the second electronic device, the second information may be displayed, and if the second information is an original image of the picture 1, the original image of the picture 1 may be displayed. Corresponding operations may also be performed based on the second information, such as the second information being a download link of app4, based on which app4 may be downloaded. Of course, the method is not limited to the above-listed cases, and the specific method can be determined according to actual situations.
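The exchange in steps a–c can be sketched as a small message protocol: the first information carries either the second object's position in the first image or the region's image content, and the second device answers with the associated second information. All field names and the lookup table are assumptions for the sketch:

```python
import json

def build_first_information(region, by_position=True):
    """First information: either first position information of the second
    object in the first image, or the image content of the region."""
    if by_position:
        msg = {"type": "position", "x": region["x"], "y": region["y"]}
    else:
        msg = {"type": "content", "content": region["content"]}
    return json.dumps(msg)

def answer_with_second_information(first_information, association_table):
    """On the second device: look up the information associated with the
    second object (e.g. a download link or an original picture)."""
    msg = json.loads(first_information)
    key = (msg["x"], msg["y"]) if msg["type"] == "position" else msg["content"]
    return association_table.get(key)
```

The association table here stands in for the server-side association relation table the text mentions.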
In the embodiment of the invention, the first electronic device receives a first image sent by a second electronic device, the first image comprising N controls; displays the first image; receives a user's first input on a first control of the N controls; and, in response to the first input, performs a first operation associated with the first control on a first object in a first image region of the first image, where N is a positive integer. The associated operation can thus be executed according to the user's input on a control in the image, which simplifies the operation process. The first operation can be determined according to the type of the first object. The first object can be added to the target area of the target interface of the target application program, or the interface content of the target interface can be updated, so that the user does not need to open the target interface to operate, which simplifies the user's operation. A user input at any position in the first image can automatically trigger sending the first information to the second electronic device, and the second information associated with the second object that the user needs, sent by the second electronic device based on the first information, can be received.
Referring to fig. 2, an embodiment of the present invention provides an image control method applied to a second electronic device, and the method may include steps 201 to 203 described below.
Step 201, receiving a first input of a user;
alternatively, the first input may be a first operation, which may include, but is not limited to, a click touch input, a double click touch input, a swipe touch input, a long press touch input, and the like of the user.
Step 202, responding to the first input, and displaying N controls on a first image; wherein N is a positive integer.
Illustratively, as shown in fig. 4, N controls are displayed on the first image, wherein the identifier 402 is used to indicate one control.
Step 203, sending the first image to a first electronic device.
Optionally, when the first image is sent to the first electronic device, the N controls may be displayed on the first image and sent with it, or the N controls may be hidden before sending, in which case the first electronic device can display the N controls by acquiring the image information. Illustratively, as shown in fig. 3, after the user clicks the "acquire picture information" option, the N controls may be displayed, yielding the view shown in fig. 4.
In the embodiment of the invention, the second electronic device receives a first input of the user; displays N controls on a first image in response to the first input; and sends the first image to the first electronic device, where N is a positive integer. A control can thus be displayed on and attached to the first image.
Optionally, the first input comprises an input for triggering an add control;
alternatively, the first input may be an input for triggering the add control, and may include, but is not limited to, a click touch input, a double click touch input, a swipe touch input, a long press touch input, etc. of the user.
Optionally, before the N controls are displayed on the first image in step 202, the method further includes:
acquiring the object type of an ith object in an ith image area of the first image;
alternatively, the image region may be any region of the first image, may be a whole region, or may be a partial region, and may be determined specifically according to the actual situation.
Alternatively, the object types may include, but are not limited to, numbers, pictures, applications, and the like.
Alternatively, the object type of the ith object in the ith image area of the first image may be obtained by performing image recognition on the ith object in the ith image area of the first image through an image recognition technology. Of course, other acquisition methods are also included in the scope of the present invention.
Determining an ith operation associated with an ith control based on the object type;
optionally, the electronic device may establish and store an association between the control and the operation.
For example, if the ith object type in fig. 4 is a number, it may be determined that the control is associated with an operation of adding the number to the address book; if the ith object type in fig. 5 is a setting, the control may be associated with an operation of updating the setting; if the ith object type in fig. 6 is an application, the control may be associated with the operation of downloading app5; and if the ith object type in fig. 7 is a picture, the control may be associated with an operation of adding the picture to the album.
Optionally, displaying the N controls on the first image in step 202 includes:
displaying an ith control in an ith image area;
wherein i is a positive integer, and i is not more than N. In this way, the ith operation associated with the ith control can be determined according to the object type of the ith object in the ith image area in the first image, that is, the control can be automatically added based on the object type, and the operation associated with the control can be added.
Illustratively, control 402 may be displayed as in FIG. 4; control 502 may be displayed as in FIG. 5; control 602 may be displayed as in FIG. 6; control 702 can be displayed as in fig. 7.
For example, as shown in fig. 4(a), after the user captures a screenshot of the address list, the user may click on the image to trigger adding controls. The second electronic device obtains the object type of the 1st object in the 1st image area of the first image: the dashed area indicated by 403 is the 1st image area, the 1st object is Tony 13812345678 indicated by 401, and the object type of the 1st object is the number class. Based on this object type, the 1st operation associated with the 1st control may be determined, such as adding the 1st object to the address list; the 1st control 402 is then displayed in the 1st image area 403. Similarly, the second electronic device may continue to acquire the object type of the 2nd object in the 2nd image area of the first image: the dashed area indicated by 405 is the 2nd image area, the 2nd object is Mike 13823456789 indicated by 404, and its object type is the number class. Based on this object type, the 2nd operation associated with the 2nd control may be determined, namely adding the 2nd object to the address book; the 2nd control 406 is displayed in the 2nd image area 405.
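The per-region flow above (recognize the ith object's type, pick the ith operation, attach the ith control to the ith image area) can be sketched as one loop on the second device. The recognizer is passed in as a stub, since the text only says image recognition may be used; the type and operation names are assumptions:

```python
# Illustrative mapping from object type to associated operation.
TYPE_TO_OPERATION = {
    "number": "add to address book",
    "setting": "update setting",
    "application": "download application",
    "picture": "add to album",
}

def add_controls(regions, recognize):
    """For each image region i, determine the ith operation from the
    recognized object type and place the ith control in that region."""
    controls = []
    for i, region in enumerate(regions, start=1):
        obj_type = recognize(region)
        controls.append({
            "index": i,
            "region": region,
            "operation": TYPE_TO_OPERATION[obj_type],
        })
    return controls
```

In a real device, `recognize` would be an image-recognition routine over the region's pixels rather than a stub.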
Optionally, the receiving a first input of a user includes:
receiving a first sub-input of a user to an m-th image area in the first image and a second sub-input of the user, wherein the second sub-input is used for creating an m-th control and establishing an association relation between the m-th control and an m-th operation;
optionally, receiving the first sub-input of the user to the mth image area in the first image may include, but is not limited to, a click touch input, a double click touch input, a slide touch input, a long press touch input, etc. of the user to the mth image area in the first image.
Optionally, the second sub-input may include, but is not limited to, a click touch input, a double-click touch input, a slide touch input, a long-press touch input, and the like.
Optionally, the second sub-input is used to create an m-th control and establish an association relationship between the m-th control and the m-th operation. The electronic device can establish and store the association relationship between the control and the operation.
The displaying N controls on a first image in response to the first input, comprising:
displaying the m-th control in the m-th image area in response to the first sub-input and the second sub-input. In this way, controls can be added to the first image and associated with related operations in accordance with user input.
Illustratively, as shown in fig. 4(a), the first image is a screenshot of the address book. The user may click on the 1st image region in the image to put the image into an editable state, and may then click on an "add control and establish an association relationship" control to create the 1st control and establish an association relationship between the 1st control and the 1st operation, for example, adding Tony13812345678 to the address book. After the association relationship is established, the association relationship can be stored in a server.
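The creation and storage of a user-defined control-operation association might be modeled as below. This is a hedged sketch: `ControlRegistry` is an invented name, and a real device could persist the association to a server as described above rather than in memory:

```python
# Hypothetical sketch of the second sub-input's effect: create the
# m-th control and store its association with the m-th operation.
class ControlRegistry:
    def __init__(self):
        self._assoc = {}  # m -> {"region": ..., "operation": ...}

    def create_control(self, m, region, operation):
        """Create the m-th control for the given image region and
        establish (and store) its association with an operation."""
        self._assoc[m] = {"region": region, "operation": operation}

    def operation_of(self, m):
        """Look up the operation associated with the m-th control."""
        return self._assoc[m]["operation"]
```

On the user's second sub-input, the device would call `create_control`; later taps on the control would resolve the operation via `operation_of`.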
Optionally, after the sending the first image to the first electronic device, the method further includes:
step d, receiving first information sent by the first electronic device;
extracting first position information in the first information or image content of the second image area, wherein the first position information is position information of a second object in the second image area in the first image;
step e, acquiring second information related to the first information based on the first position information or the image content of the second image area;
optionally, second information associated with the first information may be acquired based on image content of a second image area; the second information associated with the first information may also be obtained based on the first position information, for example, the image content at the first position may be searched according to the first position information, and then the second information associated with the first information may be obtained based on the image content at the first position.
Alternatively, a query may be performed, based on the first position information or the image content of the second image area, against an association table stored in a server, so as to obtain the second information associated with the first information. It should be noted that the association table may be set by default or set by the user, and may be determined according to actual conditions.
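Such an association-table query, keyed either by the first position information or directly by the image content, might look like the following. This is a hypothetical sketch; the two-level table layout is an assumption for illustration:

```python
# Hedged sketch of step e: resolve second information from an
# association table, by position (resolving content first) or by
# image content directly. The table structure is illustrative.
def lookup_second_info(assoc_table, position=None, content=None):
    if content is None:
        # first position information given: find the image content
        # at that position, then query by content
        content = assoc_table["positions"].get(position)
    return assoc_table["content"].get(content)
```

With a table mapping positions to content names and content names to associated information, both query paths of the description converge on the same result.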
Illustratively, as shown in fig. 6, first information sent by the first electronic device is received. The first information may be first position information of an object in the first image, such as position information of app4 in the first image, or may be the image content of the second image area, such as the name of app4. The second electronic device extracts the first position information in the first information or the image content of the second image area, locates the second object in the first image, and acquires second information associated with the second object, where the second information may be a download link of app4 or an installation package of app4, as determined according to actual conditions.
Illustratively, as shown in fig. 7, first information sent by the first electronic device is received. The first information may be first position information of the object 703 in the first image, such as position information of picture 1 in the first image, or may be the image content of the second image area, such as the image content of picture 1. The second electronic device extracts the first position information in the first information or the image content of the second image area, and acquires second information associated with the first information, where the second information may be the original image of picture 1.
And f, sending the second information to the first electronic device. In this way, by receiving the first information sent by the first electronic device, acquiring the second information based on the first position information in the first information or the image content of the second image area, and sending the second information to the first electronic device, sending of the first information to the second electronic device can be triggered automatically according to the user's input at any position in the first image on the first electronic device, and the second electronic device can return the information, related to the second object, that the user needs based on the first information, thereby facilitating interactive operation between the first electronic device and the second electronic device.
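Steps d-f, together with the first device's second input, can be sketched as a simple two-device exchange. This is hypothetical: the class names are invented, and the in-memory method call stands in for the actual inter-device communication:

```python
# Hedged sketch of the fig. 6/7 exchange: the first device sends
# first information (position or content); the second device resolves
# and returns the associated second information. Names are illustrative.
class SecondDevice:
    def __init__(self, assoc_table):
        self.assoc_table = assoc_table

    def handle_first_info(self, first_info):
        content = first_info.get("content")
        if content is None:
            # resolve the second object from the first position information
            content = self.assoc_table["positions"][first_info["position"]]
        return self.assoc_table["content"][content]

class FirstDevice:
    def __init__(self, peer):
        self.peer = peer

    def on_second_input(self, position=None, content=None):
        # second input anywhere in the first image triggers sending
        # first information; the reply is the second information
        first_info = {"position": position, "content": content}
        return self.peer.handle_first_info(first_info)
```

For the picture example of fig. 7, a tap on picture 1 would return its original image from the second device's table.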
In the embodiment of the invention, a first input of the user is received; N controls are displayed on a first image in response to the first input; and the first image is sent to a first electronic device, where N is a positive integer. That is, by receiving the first input of the user, the N controls are displayed on the first image, and the first image is then sent to the first electronic device, so that controls can be displayed on the first image. The ith operation associated with the ith control can be determined according to the object type of the ith object in the ith image area in the first image; that is, a control can be automatically added based on the object type, and the operation associated with the control can be added. Controls can also be added to the first image and associated with related operations based on user input. Sending of the first information to the second electronic device can be triggered automatically according to the user's input at any position in the first image on the first electronic device, and the second electronic device can send the information related to the second object that the user needs based on the first information, thereby facilitating interactive operation between the first electronic device and the second electronic device.
As shown in fig. 8, an embodiment of the present invention provides an electronic device 120, where the electronic device 120 includes: a receiving module 121, a display module 122 and an execution module 123.
The receiving module 121 is configured to receive a first image sent by a second electronic device, where the first image includes N controls; the display module 122 is configured to display the first image; the receiving module 121 is further configured to receive a first input of a user to a first control of the N controls; the executing module 123 is configured to, in response to the first input, execute a first operation associated with the first control on a first object, where the first object is an object in a first image region of the first image, and the first object is an object associated with the first control; wherein N is a positive integer.
Optionally, the first operation is determined based on an object type of the first object. In this way, the first operation can be determined according to the type of the first object.
Optionally, the electronic device further comprises: and the processing module is used for adding the first object to a target area of a target interface of a target application program or updating the interface content of the target interface of the target application program. Therefore, the first object can be added to the target area of the target interface of the target application program, or the interface content of the target interface of the target application program can be updated, the user does not need to open the target interface of the target application program for operation, and the operation of the user is simplified.
Optionally, the receiving module 121 is further configured to receive a second input of the user to a second image area in the first image; the electronic device further includes: a sending module, configured to send, in response to the second input, first information to a second electronic device, where the first information includes first position information of a second object of the second image area in the first image or image content of the second image area; the receiving module 121 is further configured to receive second information sent by the second electronic device, where the second information is information associated with the second object. In this way, the first electronic device can be automatically triggered to send the first information to the second electronic device according to the input of the first electronic device at any position in the first image, and the second information required by the user and sent by the second electronic device based on the first information can be received.
According to the electronic device provided by the embodiment of the invention, the electronic device can receive a first image sent by a second electronic device, wherein the first image comprises N controls; displaying the first image; receiving a first input of a user to a first control of the N controls; in response to the first input, performing a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image, the first object being an object associated with the first control; wherein N is a positive integer. Therefore, the associated operation can be executed according to the input of the user to the control in the image, the user is not required to open the application program, and then the information is manually input one by one according to the image content, so that the operation process is simplified. The first operation can be determined according to a type of the first object. The first object can be added to the target area of the target interface of the target application program, or the interface content of the target interface of the target application program is updated, so that the user does not need to open the target interface of the target application program for operation, and the operation of the user is simplified. The method can automatically trigger the first information to be sent to the second electronic equipment according to the input of the first electronic equipment at any position in the first image, and can receive the second information which is sent by the second electronic equipment based on the first information and is needed by a user.
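The execution module's response to the first input — performing the first operation associated with the tapped control on the first object — might be sketched as below. This is a hedged illustration; the dictionary-based control and device state are invented stand-ins for the device's real components:

```python
# Hypothetical sketch of the execution module: on the first input to
# a control, perform its associated operation on the first object
# (e.g., add a contact without opening the address book app manually).
def perform_first_operation(control, first_object, state):
    if control["operation"] == "add_to_address_book":
        state.setdefault("address_book", []).append(first_object)
    elif control["operation"] == "add_to_album":
        state.setdefault("album", []).append(first_object)
    # other operations (update setting, download app, ...) omitted
    return state
```

This captures the point of the embodiment: the user taps the control and the operation runs directly, with no manual re-entry of the image content.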
Optionally, as shown in fig. 9, an embodiment of the present invention provides an electronic device 130, where the electronic device 130 includes:
a receiving module 131, configured to receive a first input of a user; a display module 132 for displaying N controls on a first image in response to the first input; a sending module 133, configured to send the first image to a first electronic device; wherein N is a positive integer.
Optionally, the first input comprises an input for triggering an add control; the electronic device further includes: the acquisition module is used for acquiring the object type of the ith object in the ith image area of the first image; the determining module is used for determining the ith operation associated with the ith control based on the object type; the display module 132 is further configured to display an ith control in the ith image area; wherein i is a positive integer, and i is not more than N. In this way, the ith operation associated with the ith control can be determined according to the object type of the ith object in the ith image area in the first image, that is, the control can be automatically added based on the object type, and the operation associated with the control can be added.
Optionally, the receiving the first input of the user includes: receiving a first sub-input of the user to an m-th image area in the first image and a second sub-input of the user, where the second sub-input is used to create an m-th control and establish an association relationship between the m-th control and an m-th operation; the display module 132 is further configured to display the m-th control in the m-th image area in response to the first sub-input and the second sub-input. In this way, controls can be added to the first image and associated with related operations in accordance with user input.
Optionally, the receiving module 131 is further configured to receive first information sent by a first electronic device; the electronic device further includes: an extracting module, configured to extract first position information in the first information or image content of the second image region, where the first position information is position information of a second object in the second image region in the first image; the obtaining module is further configured to obtain second information associated with the first information based on the first position information or image content of the second image area; the sending module 133 is further configured to send the second information to the first electronic device. Therefore, by receiving the first information sent by the first electronic device, acquiring the second information based on the first position information in the first information or the image content of the second image area, and sending the second information to the first electronic device, the first information can be automatically triggered to be sent to the second electronic device according to the input of the user at any position in the first image of the first electronic device, and the second electronic device can send the second information required by the user based on the first information, so that the interactive operation between the first electronic device and the second electronic device is facilitated.
According to the electronic device provided by the embodiment of the invention, the electronic device can receive a first input of a user; display N controls on a first image in response to the first input; and send the first image to a first electronic device, where N is a positive integer. That is, by receiving the first input of the user, the N controls are displayed on the first image, and the first image is then sent to the first electronic device, so that controls can be displayed on the first image. The ith operation associated with the ith control can be determined according to the object type of the ith object in the ith image area in the first image; that is, a control can be automatically added based on the object type, and the operation associated with the control can be added. Controls can also be added to the first image and associated with related operations based on user input. Sending of the first information to the second electronic device can be triggered automatically according to the user's input at any position in the first image on the first electronic device, and the second electronic device can send the second information that the user needs based on the first information, thereby facilitating interactive operation between the first electronic device and the second electronic device.
The electronic device provided by the embodiment of the present invention can implement each process implemented by the electronic device in the above method embodiments, and is not described herein again to avoid repetition.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention. As shown in fig. 10, the electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 10 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
The processor 110 is configured to receive a first image sent by a second electronic device, where the first image includes N controls; a display unit 106 for displaying the first image; the user input unit 107 is further configured to receive a first input of a user to a first control of the N controls; a processor 110, configured to perform a first operation associated with the first control on a first object in response to the first input, where the first object is an object in a first image region of the first image, and the first object is an object associated with the first control; wherein N is a positive integer.
According to the electronic device provided by the embodiment of the invention, the electronic device can receive a first image sent by a second electronic device, wherein the first image comprises N controls; displaying the first image; receiving a first input of a user to a first control of the N controls; in response to the first input, performing a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image, the first object being an object associated with the first control; wherein N is a positive integer. Therefore, by the scheme, the associated operation can be executed according to the input of the user to the control in the image, and the operation process is simplified.
Or, the user input unit 107 may be further configured to receive a first input from a user; a display unit 106 operable to display N controls on a first image in response to the first input; the processor 110 may be configured to transmit the first image to a first electronic device.
According to the electronic device provided by the embodiment of the invention, the electronic device can receive a first input of a user; display N controls on a first image in response to the first input; and send the first image to a first electronic device, where N is a positive integer. That is, by receiving the first input of the user, the N controls are displayed on the first image, and the first image is then sent to the first electronic device, so that controls can be displayed on the first image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process; specifically, it receives downlink data from a base station and forwards the downlink data to the processor 110 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process the sound into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by the user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and a switch key), a trackball, a mouse, and a joystick, and are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 10, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, including the processor 110 shown in fig. 10, the memory 109, and a computer program stored in the memory 109 and executable on the processor 110. The computer program, when executed by the processor 110, implements each process of the image control method shown in any one of fig. 2 to fig. 7 in the above method embodiments and can achieve the same technical effect; details are not described here to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the image control method shown in any one of fig. 2 to 7 in the foregoing method embodiments, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (18)

1. An image control method applied to a first electronic device is characterized by comprising the following steps:
receiving a first image sent by second electronic equipment, wherein the first image comprises N controls;
displaying the first image;
receiving a first input of a user to a first control of the N controls;
in response to the first input, performing a first operation associated with the first control on a first object, the first object being an object in a first image region of the first image, the first object being an object associated with the first control;
wherein N is a positive integer.
2. The method of claim 1, wherein the first operation is determined based on an object type of the first object.
3. The method of claim 1, wherein performing the first operation associated with the first control on the first object comprises:
and adding the first object to a target area of a target interface of a target application program, or updating the interface content of the target interface of the target application program.
4. The method of claim 1, further comprising, after displaying the first image:
receiving a second input from a user to a second image area in the first image;
in response to the second input, sending first information to the second electronic device, wherein the first information comprises first position information of a second object of the second image area in the first image, or image content of the second image area; and
receiving second information sent by the second electronic device, wherein the second information is information associated with the second object.
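Claims 1–4 describe the receiver side: an image arrives carrying N controls, each bound to an image region and an operation, and tapping a control triggers its operation on the object in that region. The patent defines no concrete API, so the following Python sketch is purely illustrative; all class, field, and handler names (`Control`, `FirstImage`, `FirstDevice`, `on_tap`, `copy_text`, `save_contact`) are hypothetical assumptions, not anything the specification names.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional, Tuple

@dataclass
class Control:
    control_id: int
    region: Tuple[int, int, int, int]  # (x, y, w, h) image area the control covers
    operation: str                     # name of the operation associated with the control

@dataclass
class FirstImage:
    controls: List[Control]            # the N controls embedded with the image

class FirstDevice:
    """Receiver side: displays the first image and dispatches control operations."""

    def __init__(self) -> None:
        # Map an operation name to a handler acting on the object in the region.
        self.handlers: Dict[str, Callable[[Tuple[int, int, int, int]], str]] = {
            "copy_text": lambda region: f"copied text from region {region}",
            "save_contact": lambda region: f"saved contact from region {region}",
        }

    def on_tap(self, image: FirstImage, x: int, y: int) -> Optional[str]:
        """Find the control whose region contains the tap and run its operation."""
        for control in image.controls:
            rx, ry, rw, rh = control.region
            if rx <= x < rx + rw and ry <= y < ry + rh:
                return self.handlers[control.operation](control.region)
        return None  # tap landed outside every control region
```

Under these assumptions, `FirstDevice().on_tap(img, 10, 10)` on an image whose first control covers `(0, 0, 100, 40)` with operation `"copy_text"` would dispatch that handler; a tap outside every region returns `None`.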
5. An image control method, applied to a second electronic device, the method comprising:
receiving a first input from a user;
displaying N controls on a first image in response to the first input; and
sending the first image to a first electronic device;
wherein N is a positive integer;
wherein before displaying the N controls on the first image, the method further comprises:
automatically adding a control on the first image based on an object type, and adding an operation associated with the control; or
adding a control on the first image according to an input from the user, and adding an operation associated with the control.
6. The method of claim 5, wherein the first input comprises an input for triggering addition of a control;
before displaying the N controls on the first image, the method further comprises:
acquiring an object type of an ith object in an ith image area of the first image; and
determining an ith operation associated with an ith control based on the object type;
wherein displaying the N controls on the first image comprises:
displaying the ith control in the ith image area;
wherein i is a positive integer, and i is less than or equal to N.
7. The method of claim 5, wherein receiving the first input from the user comprises:
receiving a first sub-input from the user to an mth image area in the first image, and a second sub-input from the user, wherein the second sub-input is used for creating an mth control and establishing an association between the mth control and an mth operation;
wherein displaying the N controls on the first image in response to the first input comprises:
displaying the mth control in the mth image area in response to the first sub-input and the second sub-input;
wherein m is a positive integer, and m is less than or equal to N.
8. The method of claim 5, further comprising, after sending the first image to the first electronic device:
receiving first information sent by the first electronic device;
extracting, from the first information, first position information or image content of a second image area, wherein the first position information is position information of a second object of the second image area in the first image;
acquiring second information associated with the first information based on the first position information or the image content of the second image area; and
sending the second information to the first electronic device.
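Claims 5–8 describe the sender side: controls are attached to the image either automatically (an operation chosen from each detected object's type, claim 6) or by user input, and the sender later answers requests by looking up second information from position info or image content (claim 8). A minimal Python sketch of those two steps, under assumptions the patent leaves open — the `TYPE_TO_OPERATION` mapping, the shape of `detected_objects`, and both function names are hypothetical:

```python
from typing import Dict, List, Optional, Tuple

# Assumed mapping from detected object type to the operation bound to its control;
# the patent does not prescribe any particular types or operations.
TYPE_TO_OPERATION: Dict[str, str] = {
    "text": "copy_text",
    "face": "save_contact",
    "url": "open_link",
}

Region = Tuple[int, int, int, int]  # (x, y, w, h)

def add_controls(detected_objects: List[Tuple[str, Region]]) -> List[dict]:
    """Claim 6: automatically add one control per recognized object, with the
    operation determined by the object's type. Unknown types get no control."""
    controls = []
    for i, (obj_type, region) in enumerate(detected_objects):
        operation = TYPE_TO_OPERATION.get(obj_type)
        if operation is not None:
            controls.append({"id": i, "region": region, "operation": operation})
    return controls

def answer_request(first_information: dict, object_store: Dict[object, str]) -> Optional[str]:
    """Claim 8: acquire second information keyed by the position information
    (or, failing that, the image content) carried in the first information."""
    key = first_information.get("position") or first_information.get("content")
    return object_store.get(key)
```

For example, feeding `add_controls` one `"text"` object and one unrecognized object yields a single control carrying the `"copy_text"` operation, and `answer_request` returns whatever the sender's store associates with the received position.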
9. An electronic device, comprising:
a receiving module, configured to receive a first image sent by a second electronic device, wherein the first image comprises N controls;
a display module, configured to display the first image;
the receiving module being further configured to receive a first input from a user to a first control of the N controls; and
an execution module, configured to, in response to the first input, execute a first operation associated with the first control on a first object, wherein the first object is an object in a first image region of the first image and is an object associated with the first control; wherein N is a positive integer.
10. The electronic device of claim 9, wherein the first operation is determined based on an object type of the first object.
11. The electronic device of claim 9, further comprising:
a processing module, configured to add the first object to a target area of a target interface of a target application program, or update interface content of the target interface of the target application program.
12. The electronic device of claim 9, wherein the receiving module is further configured to receive a second input from a user to a second image area in the first image;
the electronic device further comprising:
a sending module, configured to send, in response to the second input, first information to a second electronic device, wherein the first information comprises first position information of a second object of the second image area in the first image, or image content of the second image area;
the receiving module being further configured to receive second information sent by the second electronic device, wherein the second information is information associated with the second object.
13. An electronic device, comprising:
a receiving module, configured to receive a first input from a user;
a display module, configured to display N controls on a first image in response to the first input; and
a sending module, configured to send the first image to a first electronic device;
wherein N is a positive integer;
wherein the display module is further configured to:
automatically add a control on the first image based on an object type, and add an operation associated with the control; or
add a control on the first image according to an input from the user, and add an operation associated with the control.
14. The electronic device of claim 13, wherein the first input comprises an input for triggering addition of a control;
the electronic device further comprising:
an acquisition module, configured to acquire an object type of an ith object in an ith image area of the first image; and
a determining module, configured to determine an ith operation associated with an ith control based on the object type;
the display module being further configured to display the ith control in the ith image area;
wherein i is a positive integer, and i is less than or equal to N.
15. The electronic device of claim 13, wherein receiving the first input from the user comprises receiving a first sub-input from the user to an mth image area in the first image, and a second sub-input from the user, wherein the second sub-input is used for creating an mth control and establishing an association between the mth control and an mth operation;
the display module being further configured to display the mth control in the mth image area in response to the first sub-input and the second sub-input;
wherein m is a positive integer, and m is less than or equal to N.
16. The electronic device of claim 14, wherein the receiving module is further configured to receive first information sent by a first electronic device;
the electronic device further comprising:
an extraction module, configured to extract, from the first information, first position information or image content of a second image area, wherein the first position information is position information of a second object of the second image area in the first image;
the acquisition module being further configured to acquire second information associated with the first information based on the first position information or the image content of the second image area; and
the sending module being further configured to send the second information to the first electronic device.
17. An electronic device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the image control method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image control method according to any one of claims 1 to 8.
CN201911294231.7A 2019-12-16 2019-12-16 Image control method, electronic device, and storage medium Active CN111130995B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911294231.7A CN111130995B (en) 2019-12-16 2019-12-16 Image control method, electronic device, and storage medium
PCT/CN2020/134815 WO2021121093A1 (en) 2019-12-16 2020-12-09 Image control method, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911294231.7A CN111130995B (en) 2019-12-16 2019-12-16 Image control method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN111130995A CN111130995A (en) 2020-05-08
CN111130995B true CN111130995B (en) 2021-08-10

Family

ID=70499245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911294231.7A Active CN111130995B (en) 2019-12-16 2019-12-16 Image control method, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN111130995B (en)
WO (1) WO2021121093A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111130995B (en) * 2019-12-16 2021-08-10 维沃移动通信有限公司 Image control method, electronic device, and storage medium
CN112843692B (en) * 2020-12-31 2023-04-18 上海米哈游天命科技有限公司 Method and device for shooting image, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0166046A1 (en) * 1984-06-25 1986-01-02 International Business Machines Corporation Graphical display apparatus with pipelined processors
CN103197855A (en) * 2013-04-19 2013-07-10 中国建设银行股份有限公司 Method and device for inputting image files
CN105549888A (en) * 2015-12-15 2016-05-04 芜湖美智空调设备有限公司 Combined control generation method and device
CN105700789A (en) * 2016-01-11 2016-06-22 广东欧珀移动通信有限公司 Image sending method and terminal device
CN108549568A (en) * 2018-04-18 2018-09-18 Oppo广东移动通信有限公司 Using entrance processing method, apparatus, storage medium and electronic equipment
CN108563473A (en) * 2018-04-04 2018-09-21 歌尔股份有限公司 scanning module configuration method and device
CN108769374A (en) * 2018-04-25 2018-11-06 维沃移动通信有限公司 A kind of image management method and mobile terminal
CN109840129A (en) * 2019-01-30 2019-06-04 维沃移动通信有限公司 A kind of display control method and electronic equipment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8468465B2 (en) * 2010-08-09 2013-06-18 Apple Inc. Two-dimensional slider control
CN104636029A (en) * 2014-12-31 2015-05-20 魅族科技(中国)有限公司 Display control method and system for control
CN105843494B (en) * 2015-01-15 2020-06-09 中兴通讯股份有限公司 Method, device and terminal for realizing area screen capture
WO2019104478A1 (en) * 2017-11-28 2019-06-06 华为技术有限公司 Method and terminal for recognizing screenshot text
CN109829070B (en) * 2019-01-29 2021-01-08 维沃移动通信有限公司 Image searching method and terminal equipment
CN110456956A (en) * 2019-08-05 2019-11-15 腾讯科技(深圳)有限公司 Screenshot method, device, computer equipment and storage medium
CN110339568B (en) * 2019-08-19 2024-06-21 网易(杭州)网络有限公司 Virtual control display method and device, storage medium and electronic device
CN111130995B (en) * 2019-12-16 2021-08-10 维沃移动通信有限公司 Image control method, electronic device, and storage medium


Also Published As

Publication number Publication date
CN111130995A (en) 2020-05-08
WO2021121093A1 (en) 2021-06-24

Similar Documents

Publication Publication Date Title
CN110995923B (en) Screen projection control method and electronic equipment
CN110752980B (en) Message sending method and electronic equipment
CN111049979B (en) Application sharing method, electronic equipment and computer readable storage medium
CN108255378B (en) Display control method and mobile terminal
CN107734175B (en) Notification message prompting method and mobile terminal
CN110062105B (en) Interface display method and terminal equipment
CN111026484A (en) Application sharing method, first electronic device and computer-readable storage medium
CN110109593B (en) Screen capturing method and terminal equipment
CN109240577B (en) Screen capturing method and terminal
CN109857297B (en) Information processing method and terminal equipment
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN108334272B (en) Control method and mobile terminal
CN109710349B (en) Screen capturing method and mobile terminal
CN110752981B (en) Information control method and electronic equipment
CN110908750B (en) Screen capturing method and electronic equipment
CN110096203B (en) Screenshot method and mobile terminal
CN108287644B (en) Information display method of application program and mobile terminal
CN111610903A (en) Information display method and electronic equipment
CN111131607A (en) Information sharing method, electronic equipment and computer readable storage medium
CN111399715B (en) Interface display method and electronic equipment
CN110012151B (en) Information display method and terminal equipment
CN109284146B (en) Light application starting method and mobile terminal
CN109597546B (en) Icon processing method and terminal equipment
CN111130995B (en) Image control method, electronic device, and storage medium
CN111061446A (en) Display method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant