CN110941341A - Image control method and electronic equipment

Info

Publication number
CN110941341A
Authority
CN
China
Prior art keywords
image
input
user
electronic device
area
Prior art date
Legal status
Granted
Application number
CN201911202454.6A
Other languages
Chinese (zh)
Other versions
CN110941341B (en)
Inventor
何正宇
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911202454.6A
Publication of CN110941341A
Priority to PCT/CN2020/130776 (WO2021104194A1)
Application granted
Publication of CN110941341B
Legal status: Active


Classifications

    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T19/006: Mixed reality
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G06T2210/61: Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide an image control method and an electronic device, relating to the field of communication technologies, and aim to solve the problem that the AR images displayed by existing electronic devices are relatively cluttered, so that the way electronic devices display AR images is not flexible enough. The method includes: receiving a first input from a user on a target object among N objects, where each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image captured by the electronic device; and, in response to the first input, displaying or canceling display of at least one target AR image corresponding to the target object. The method is applied to scenarios in which an electronic device displays AR images.

Description

Image control method and electronic equipment
Technical Field
Embodiments of the present invention relate to the field of communication technologies, and in particular, to an image control method and an electronic device.
Background
With the rapid development of communication technology, electronic devices are used more and more widely, and users place increasingly high demands on their performance.
Currently, through an augmented reality (AR) function, an electronic device may superimpose an image drawn by a user on the electronic device (hereinafter referred to as image 1) onto an image of a real scene captured by the electronic device (i.e., a live-action image) to obtain an AR image, and store image 1 in a server. When another electronic device uses the AR function at the geographic location corresponding to the real scene, that device may also acquire image 1 and superimpose it on its own captured live-action image to obtain an AR image. For example, when a family member has a birthday, a user may use an electronic device to draw, at home, a plurality of images related to a birthday party, such as balloons, flowers, and ribbons; when the family member returns home, they can see, through another electronic device, the AR image on which these images are superimposed.
However, according to the above method, since the server may store images drawn by multiple users in the same real scene, when an electronic device uses the AR function at the geographic location corresponding to that real scene, it may acquire multiple images, so that too many images are superimposed on the live-action image it captures. That is, the AR image displayed by the electronic device is relatively cluttered, and the way the electronic device displays AR images is not flexible.
Disclosure of Invention
Embodiments of the present invention provide an image control method and an electronic device, to solve the problem that the AR images displayed by existing electronic devices are relatively cluttered, so that the way electronic devices display AR images is not flexible enough.
In order to solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides an image control method, including: receiving a first input from a user on a target object among N objects, where each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image captured by an electronic device; and, in response to the first input, displaying or canceling display of at least one target AR image corresponding to the target object.
In a second aspect, an embodiment of the present invention provides an electronic device, including a receiving module and a display module. The receiving module is configured to receive a first input from a user on a target object among N objects, where each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image captured by the electronic device. The display module is configured to display or cancel display of at least one target AR image corresponding to the target object in response to the first input received by the receiving module.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the image control method in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program that, when executed by a processor, implements the steps of the image control method in the first aspect.
In the embodiments of the present invention, a first input from a user on a target object among the N objects can be received, and, in response to the first input, at least one target AR image corresponding to the target object is displayed or its display is canceled. Each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image captured by the electronic device. With this scheme, because each of the N objects can correspond to at least one AR image, when the electronic device uses the AR function at the geographic location corresponding to a real scene, the user can, according to actual needs, trigger the electronic device through an input on one of the N objects to display or cancel display of the AR images corresponding to that object. The electronic device can thus superimpose on the live-action image only the AR images the user wants to see; that is, the electronic device can display images according to the user's actual needs, so the way the electronic device displays images is flexible.
Drawings
FIG. 1 is a schematic architectural diagram of an android operating system according to an embodiment of the present invention;
FIG. 2 is a first schematic diagram of an image control method according to an embodiment of the present invention;
FIG. 3 is a first schematic interface diagram of an application of the image control method according to an embodiment of the present invention;
FIG. 4 is a second schematic interface diagram of an application of the image control method according to an embodiment of the present invention;
FIG. 5 is a third schematic interface diagram of an application of the image control method according to an embodiment of the present invention;
FIG. 6 is a second schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 7 is a third schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 8 is a fourth schematic interface diagram of an application of the image control method according to an embodiment of the present invention;
FIG. 9 is a fourth schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 10 is a fifth schematic interface diagram of an application of the image control method according to an embodiment of the present invention;
FIG. 11 is a fifth schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 12 is a sixth schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 13 is a seventh schematic diagram of the image control method according to an embodiment of the present invention;
FIG. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 15 is a schematic hardware diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The term "and/or" herein is an association relationship describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, a/B denotes a or B.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, an illustration, or a description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs; rather, these words are intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of processing units means two or more processing units, and the like.
Terms used in the embodiments of the present invention are first explained below.
AR function: a function of calculating, in real time, the position and angle of an image captured from a real scene by a camera module (i.e., a live-action image) and adding a computer-generated image (i.e., a virtual image) to it. The AR function can combine the virtual image with the live-action image and support interaction between them.
AR image: in the embodiments of the present invention, an image drawn by a user through the AR function of an electronic device, i.e., the virtual image described above.
Embodiments of the present invention provide an image control method and an electronic device, capable of receiving a first input from a user on a target object among N objects and, in response to the first input, displaying or canceling display of at least one target AR image corresponding to the target object. Each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image captured by the electronic device. With this scheme, because each of the N objects can correspond to at least one AR image, when the electronic device uses the AR function at the geographic location corresponding to a real scene, the user can, according to actual needs, trigger the electronic device through an input on one of the N objects to display or cancel display of the AR images corresponding to that object, so that the electronic device superimposes on the live-action image only the AR images the user wants to see. The electronic device can thus display images according to the user's actual needs, and the way it displays images is flexible.
The electronic device in the embodiments of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following describes a software environment to which the image control method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is the framework of applications; a developer can develop applications based on the application framework layer, provided the development principles of the framework are followed.
The system runtime layer includes libraries (also called system libraries) and the android operating system runtime environment. The libraries mainly provide the resources required by the android operating system. The runtime environment provides the software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the image control method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the image control method may operate based on the android operating system shown in fig. 1. Namely, the processor or the electronic device can implement the image control method provided by the embodiment of the invention by running the software program in the android operating system.
The electronic device in the embodiments of the present invention may be a mobile electronic device or a non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present invention are not particularly limited.
The execution body of the image control method provided in the embodiments of the present invention may be the electronic device, or a functional module and/or functional entity in the electronic device capable of implementing the method; this may be determined according to actual use requirements and is not limited in the embodiments of the present invention. The following uses an electronic device as an example to describe the image control method provided by the embodiments of the present invention.
In the embodiments of the present invention, when the electronic device uses the AR function, it can capture and display a live-action image of the geographic location where it is located, and can also display, on the live-action image, the AR images corresponding to that live-action image. If the user wants to view certain specific AR images (e.g., AR images drawn by a certain user, or AR images of a certain shape), the user can trigger the electronic device to display only those AR images through an input on the object corresponding to them. Conversely, the user can trigger the electronic device to cancel display of unwanted AR images through an input on the object corresponding to those AR images. The electronic device can therefore superimpose on the live-action image only the AR images the user wants to view; that is, it superimposes AR images according to a rule, making the way it displays images more flexible.
The following describes an exemplary image control method according to an embodiment of the present invention with reference to the drawings.
As shown in fig. 2, an embodiment of the present invention provides an image control method, which may include S201 and S202 described below.
S201, the electronic device receives a first input from a user on a target object among N objects.
Each of the N objects may correspond to at least one AR image, and the AR images corresponding to the N objects may all be AR images corresponding to the live-action image captured by the electronic device.
S202, in response to the first input, the electronic device displays or cancels display of at least one target AR image corresponding to the target object.
In the embodiments of the present invention, when the electronic device uses the AR function, if the user wants to view some of the AR images corresponding to the live-action image captured by the electronic device (i.e., the live-action image corresponding to the geographic location of the electronic device), the user may trigger the electronic device, through a first input on a target object among the N objects, to display the AR images corresponding to the target object; alternatively, the user may trigger the electronic device to cancel display of the AR images corresponding to the target object (i.e., the at least one target AR image). The electronic device can therefore flexibly display the AR images corresponding to the captured live-action image according to the user's actual needs.
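Purely as an illustration of S201 and S202, the following Kotlin sketch models the mapping from objects to AR images and the display/cancel toggle; the types and function names (ArImage, ArScene, onFirstInput) are hypothetical and not part of the patent:

```kotlin
// Minimal sketch of S201/S202: each object maps to at least one AR image,
// and a first input on a target object toggles display of its AR images.
// All names here are illustrative; the patent does not prescribe an API.

data class ArImage(val id: String, val author: String)

class ArScene(
    // Each of the N objects corresponds to at least one AR image.
    private val imagesByObject: Map<String, List<ArImage>>
) {
    private val visible = mutableSetOf<String>() // ids of displayed AR images

    // S201/S202: a first input on `targetObject` displays its AR images
    // if hidden, or cancels their display if currently shown.
    fun onFirstInput(targetObject: String) {
        val targets = imagesByObject[targetObject] ?: return
        val shown = targets.all { it.id in visible }
        if (shown) targets.forEach { visible.remove(it.id) }
        else targets.forEach { visible.add(it.id) }
    }

    fun displayedImages(): List<ArImage> =
        imagesByObject.values.flatten().filter { it.id in visible }
}

fun main() {
    val scene = ArScene(
        mapOf(
            "userA" to listOf(ArImage("moon", "A"), ArImage("cloud", "A")),
            "userB" to listOf(ArImage("lightning", "B"))
        )
    )
    scene.onFirstInput("userA")
    println(scene.displayedImages().map { it.id })  // [moon, cloud]
    scene.onFirstInput("userA")
    println(scene.displayedImages().map { it.id })  // []
}
```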
It should be noted that, in the embodiments of the present invention, the AR images corresponding to the live-action image captured by the electronic device may be: the AR images that users have drawn on the live-action image through the AR function of their electronic devices and saved to the server.
In addition, for convenience of description, the live-action image captured by the electronic device is simply referred to as a live-action image in the following embodiments.
Optionally, in the embodiments of the present invention, the AR images corresponding to the live-action image may be updated in real time; that is, they change as the AR images corresponding to the live-action image stored in the server change.
Optionally, in the embodiments of the present invention, the first input may be any possible input by the user on the target object, such as a single-click input, a double-click input, a long-press input, or a re-press input. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
In the embodiments of the present invention, the long-press input may be an input in which the user presses the target object for a duration greater than or equal to a time threshold, and the re-press input may be an input in which the user presses the target object with a pressure value greater than or equal to a pressure threshold.
Optionally, in the embodiments of the present invention, both the time threshold and the pressure threshold may be values preset in the electronic device by its manufacturer, and may be determined according to actual use requirements; they are not limited in the embodiments of the present invention.
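As a rough, non-normative sketch of how such thresholds might be applied, the Kotlin fragment below classifies a press by duration and pressure; the threshold values and all names are assumptions, since the patent leaves the thresholds to the manufacturer:

```kotlin
// Illustrative only: one way to classify a press as a long press or a
// re-press using the time and pressure thresholds described above.

const val TIME_THRESHOLD_MS = 500L   // hypothetical value preset by the manufacturer
const val PRESSURE_THRESHOLD = 0.8f  // hypothetical value preset by the manufacturer

enum class PressKind { TAP, LONG_PRESS, RE_PRESS }

fun classifyPress(durationMs: Long, pressure: Float): PressKind = when {
    pressure >= PRESSURE_THRESHOLD -> PressKind.RE_PRESS   // pressure at or above threshold
    durationMs >= TIME_THRESHOLD_MS -> PressKind.LONG_PRESS // duration at or above threshold
    else -> PressKind.TAP
}

fun main() {
    println(classifyPress(120, 0.3f))  // TAP
    println(classifyPress(700, 0.3f))  // LONG_PRESS
    println(classifyPress(100, 0.9f))  // RE_PRESS
}
```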
Optionally, in the embodiments of the present invention, the N objects may include at least one of the following: an image determination control, a second AR image, and a user information identifier.
The image determination control may be used to trigger the electronic device to display the AR images corresponding to that control (specifically, among the AR images corresponding to the live-action image, those corresponding to the control); the second AR image may be used to trigger the electronic device to display the AR images drawn by the user who drew the second AR image (specifically, among the AR images corresponding to the live-action image, those drawn by that user); and the user information identifier may be used to trigger the electronic device to display the AR images drawn by the user indicated by the identifier (specifically, among the AR images corresponding to the live-action image, those drawn by the indicated user).
It can be understood that, in the embodiments of the present invention, the N objects may include only one of the image determination control, the second AR image, and the user information identifier, or may include more than one of them; this may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
Optionally, in an embodiment of the present invention, when the target object is the image determination control, the step S202 may be specifically implemented by the step S202b described below.
S202b, the electronic device displays or cancels display of the AR image corresponding to the image determination control in response to a first input to the image determination control.
It should be noted that, in the embodiment of the present invention, in a case that the electronic device does not display the AR image corresponding to the image determination control, the electronic device may display the AR image corresponding to the image determination control in response to the first input to the image determination control; in a case where the electronic device displays an AR image corresponding to the image determination control, the electronic device may cancel displaying the AR image corresponding to the image determination control in response to a first input to the image determination control.
Optionally, in the embodiments of the present invention, the image determination control may be used to trigger the electronic device to display or cancel display of some of the AR images corresponding to the live-action image (the live-action image captured by the electronic device), or of all of them.
Optionally, in an embodiment of the present invention, the image determination control may include an "all selection" control 31 and/or an "intelligent recommendation" control 32 as shown in fig. 3. The "all-selection" control may be used to trigger the electronic device to display or cancel displaying all of the AR images corresponding to the live-action images, and the "intelligent recommendation" control may be used to trigger the electronic device to display or cancel displaying some of the AR images corresponding to the live-action images.
Of course, in actual implementation, the image determination control may also be any other possible control; this may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
Optionally, in the embodiments of the present invention, when the electronic device is not displaying the AR images corresponding to the "intelligent recommendation" control (i.e., the target object) and receives a first input from the user on that control, the electronic device may display, according to historical data of the device owner or big data obtained from the server, the AR images, among those corresponding to the live-action image, that the user may want to view or may like.
For example, assume that when drawing or viewing AR images, the user prefers AR images containing many circular images and few rectangular images; that the AR images corresponding to the live-action image include AR images drawn by user A and AR images drawn by user B; and that user A's AR images contain many circular images and few rectangular images, while user B's AR images contain few circular images and many rectangular images. Then, when the electronic device is not displaying the AR images corresponding to the "intelligent recommendation" control 32 and receives a first input from the user on the "intelligent recommendation" control 32, the electronic device may display user A's AR images in response to the first input.
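The patent does not specify a recommendation algorithm; the following Kotlin sketch illustrates one plausible reading of the example above, scoring each author's drawings by similarity to the viewer's preferred shape mix. All names and the cosine-similarity scoring are assumptions:

```kotlin
import kotlin.math.sqrt

// Illustrative sketch: rank authors by how well their shape mix matches
// the viewer's preferences learned from historical data.

data class ShapeProfile(val circles: Int, val rectangles: Int)

fun similarity(a: ShapeProfile, b: ShapeProfile): Double {
    // cosine similarity between the two shape-count vectors
    val dot = (a.circles * b.circles + a.rectangles * b.rectangles).toDouble()
    val na = sqrt((a.circles * a.circles + a.rectangles * a.rectangles).toDouble())
    val nb = sqrt((b.circles * b.circles + b.rectangles * b.rectangles).toDouble())
    return if (na == 0.0 || nb == 0.0) 0.0 else dot / (na * nb)
}

fun recommend(viewer: ShapeProfile, byAuthor: Map<String, ShapeProfile>): String? =
    byAuthor.entries.maxByOrNull { similarity(viewer, it.value) }?.key

fun main() {
    val viewer = ShapeProfile(circles = 9, rectangles = 1)   // from historical data
    val drawings = mapOf(
        "userA" to ShapeProfile(circles = 8, rectangles = 2),
        "userB" to ShapeProfile(circles = 1, rectangles = 7)
    )
    println(recommend(viewer, drawings))  // userA (closest to the viewer's preference)
}
```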
It should be noted that, in the embodiments of the present invention, after the electronic device enables the AR function, it displays the live-action image it captures; to show the corresponding AR images clearly, the captured live-action image is not drawn in the interface schematic diagrams of the embodiments of the present invention.
Alternatively, in the embodiment of the present invention, when the target object is the second AR image, the step S202 may be specifically implemented by the step S202c described below.
S202c, in response to the first input on the second AR image, the electronic device displays or cancels display of the AR images drawn by the user who drew the second AR image.
In the embodiments of the present invention, the second AR image may be an AR image drawn by any user among the AR images corresponding to the live-action image.
Optionally, in the embodiments of the present invention, when the electronic device is displaying AR images drawn by multiple users, it may, in response to the first input on the second AR image, cancel display of the AR images drawn by other users and continue displaying the AR images drawn by the user who drew the second AR image. When the electronic device is displaying only the AR images drawn by the user who drew the second AR image, it may, in response to a first input on the second AR image, cancel display of that user's AR images other than the second AR image.
In the embodiments of the present invention, when the electronic device receives a first input from the user on the second AR image (i.e., the target object), the electronic device may, in response to the first input, display or cancel display of the AR images drawn by the user who drew the second AR image.
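For illustration, the Kotlin sketch below models the S202c behavior just described: a tap on a drawn AR image keeps only its author's images, and a further tap within a single-author view keeps only the tapped image. All names are hypothetical:

```kotlin
// Illustrative sketch of S202c. Tapping a drawn AR image keeps only its
// author's images; if only that author's images are shown, tapping keeps
// just the tapped image and hides the rest by the same author.

data class DrawnImage(val id: String, val author: String)

fun onTapDrawnImage(displayed: List<DrawnImage>, tapped: DrawnImage): List<DrawnImage> {
    val onlyAuthorShown = displayed.all { it.author == tapped.author }
    return if (!onlyAuthorShown)
        displayed.filter { it.author == tapped.author }  // keep the author's images
    else
        listOf(tapped)                                   // keep only the tapped image
}

fun main() {
    val shown = listOf(
        DrawnImage("moon", "A"), DrawnImage("cloud", "A"),
        DrawnImage("lightning", "B"), DrawnImage("smiling face", "C")
    )
    val step1 = onTapDrawnImage(shown, shown[0])  // tap "moon": user A's images remain
    println(step1.map { it.id })                  // [moon, cloud]
    val step2 = onTapDrawnImage(step1, step1[0])  // tap again: only "moon" remains
    println(step2.map { it.id })                  // [moon]
}
```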
For example, assume the AR images displayed by the electronic device include an AR image "moon" and an AR image "cloud" drawn by user A, an AR image "lightning" and an AR image "love heart" drawn by user B, and an AR image "smiling face" drawn by user C; the second AR image is the AR image "moon"; and the first input is a click on the second AR image. If the user likes user A's AR images and wants to trigger the electronic device to display only those, then, as shown in (a) of FIG. 4, the user may click user A's AR image "moon" (i.e., the first input). In response, the electronic device may cancel display of user B's AR images "lightning" and "love heart" and user C's AR image "smiling face", and, as shown in (b) of FIG. 4, continue displaying user A's AR images "moon" and "cloud".
As another example, assume the AR images displayed by the electronic device include the AR image "moon" and the AR image "cloud" drawn by user A. If the user wants the electronic device to display only the AR image "moon" (i.e., the second AR image), the user may click the AR image "cloud" as shown in (a) of FIG. 5; in response, the electronic device may cancel display of the AR image "cloud", i.e., display only the AR image "moon" as shown in (b) of FIG. 5.
Optionally, in this embodiment of the present invention, when the target object is the user information identifier, the step S202 may be specifically implemented by the step S202d described below.
S202d, the electronic device displays or cancels display of the AR image drawn by the user indicated by the user information identifier in response to the first input to the user information identifier.
It should be noted that, in the embodiment of the present invention, in a case that the electronic device does not display the AR image drawn by the user indicated by the user information identifier, the electronic device may display the AR image drawn by the user indicated by the user information identifier in response to the first input to the user information identifier; in a case where the electronic device displays an AR image drawn by the user indicated by the user information identification, the electronic device may cancel displaying the AR image drawn by the user indicated by the user information identification in response to the first input to the user information identification.
Optionally, in the embodiments of the present invention, a user information identifier may be the user identity information (for example, a nickname or an account) of a user who drew AR images corresponding to the live-action image, a thumbnail of an AR image drawn by such a user, or a combination of the two.
Of course, in actual implementation, a user information identifier may be any other information identifier that can indicate a user; this may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
For example, assuming that the AR images corresponding to the live-action image were drawn by three users (user A, user B, and user C) and that the user information identifier is the user's nickname, the N objects may include three user information identifiers: the user information identifier "A" 33, the user information identifier "B" 34, and the user information identifier "C" 35 shown in FIG. 3.
Optionally, in the embodiments of the present invention, when the N objects include many user information identifiers and the electronic device cannot display all of them at once, the user may trigger the electronic device to display the remaining identifiers through a sliding input (for example, sliding down, left, or right) in the area where the identifiers are displayed.
Optionally, in the embodiments of the present invention, when the electronic device cancels display of the at least one target AR image in response to the first input, the user may first have triggered the electronic device to display the at least one target AR image through an input on the target object (for example, the second input in the embodiments of the present invention). In this way, after receiving the first input, the electronic device can determine that the user no longer wants to view the at least one target AR image, and can therefore cancel its display.
For example, referring to fig. 2, as shown in fig. 6, before S201 described above, the image control method provided in the embodiment of the present invention may further include S203 and S204 described below. Specifically, S202 may be implemented as S202a described below.
S203, the electronic device receives a second input from the user on the target object.
S204, in response to the second input, the electronic device displays the at least one target AR image.
S202a, in response to the first input, the electronic device cancels display of the at least one target AR image.
In the embodiments of the present invention, before the electronic device receives the first input, the user may trigger the electronic device to display the at least one target AR image through a second input on the target object, so that after receiving the first input, the electronic device can cancel display of the at least one target AR image. In this way, the manner in which the electronic device displays AR images can be flexible.
Optionally, in the embodiments of the present invention, the second input may be any possible input by the user on the target object, such as a single-click input, a double-click input, a long-press input, or a re-press input. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
It should be noted that, in the embodiment of the present invention, for the related description of the at least one target AR image and the target object, reference may be specifically made to the detailed description of the at least one target AR image and the target object in the foregoing embodiment, and in order to avoid repetition, details are not described here again.
An embodiment of the present invention provides an image control method. Because each of the N objects can correspond to at least one AR image, when the electronic device uses the AR function at the geographic location corresponding to a real scene, the user can, according to actual needs, trigger the electronic device through an input on one of the N objects to display or cancel display of the AR images corresponding to that object. The electronic device can thus superimpose on the live-action image only the AR images the user wants to see; that is, it can display images according to the user's actual needs, and the way it displays images is flexible.
Optionally, in the embodiments of the present invention, before the electronic device receives the first input on a target object among the N objects, when the electronic device uses its AR function it may display an interface (for example, the first interface in the embodiments of the present invention) that includes the live-action image captured by the electronic device and at least one operation control. The user may then trigger the electronic device to display another interface (for example, the second interface in the embodiments of the present invention) through an input on one of the operation controls (for example, the first operation control in the embodiments of the present invention), where the second interface may include the N objects. In this way, through an input on one of the N objects, the user can trigger the electronic device to display or cancel display of AR images among those corresponding to the captured live-action image.
For example, with reference to fig. 2, as shown in fig. 7, before S201, the image control method according to the embodiment of the present invention may further include S205 to S207.
S205, the electronic device displays a first interface, where the first interface includes at least one operation control and the live-action image captured by the electronic device.
S206, the electronic device receives a third input from the user on a first operation control among the at least one operation control.
S207, in response to the third input, the electronic device displays a second interface, where the second interface includes the N objects.
In the embodiments of the present invention, when the electronic device uses the AR function, it may first display the first interface, and the user may then trigger the electronic device to display the second interface through an input on the first operation control. After the electronic device displays the second interface, the user may trigger the electronic device to display or cancel display of the AR images corresponding to an object through an input on that object among the N objects.
It is understood that the first interface may be a main interface of the electronic device using the AR function.
For example, assuming that the electronic device uses the AR function through an application, the first interface may be a home page of the application.
Optionally, in the embodiments of the present invention, the first interface may further include the AR images corresponding to the live-action image. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
Optionally, in the embodiments of the present invention, the third input may be a single-click input, a double-click input, a long-press input, or a re-press input by the user on the first operation control. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
Optionally, in the embodiments of the present invention, the first operation control may be used to trigger the electronic device to display an interface (i.e., the second interface) that includes the AR images corresponding to the live-action image captured by the electronic device.
For example, the first operation control may be a "view all" control 41 as shown in (a) of fig. 8.
Of course, in actual implementation, the first operation control may also be any other possible control; this may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
For example, assume the first operation control is a "view all" control, the third input is a single-click input, and there are five objects: three user information identifiers and two image determination controls. Then, with the electronic device displaying the first interface shown in (a) of FIG. 8, the user clicks the "view all" control 41 (i.e., the third input), and in response the electronic device displays the second interface shown in (b) of FIG. 8, where the objects in the second interface may be the "all selection" control 31, the "intelligent recommendation" control 32, the user information identifier "A" 33, the user information identifier "B" 34, and the user information identifier "C" 35.
Optionally, in the embodiments of the present invention, the second interface may further include a return control, which may be used to trigger the electronic device to return to the first interface. Specifically, after the electronic device receives an eighth input from the user on the return control, it may cancel display of the second interface and display the first interface.
Optionally, in this embodiment of the present invention, the second interface may further include a selection control, where the selection control may be used to trigger the electronic device to store the AR image displayed in the second interface and return to the first interface.
Illustratively, the return control may be a "return" control 42 as shown in fig. 8 (b), and the selection control may be a "confirm selection" control 43 as shown in fig. 8 (b).
In the embodiments of the present invention, because the first interface may be the main interface of the electronic device's AR function, the user can trigger the electronic device to display the second interface corresponding to the first operation control through an input on that control, and can then view or select images among the AR images corresponding to the live-action image on the electronic device.
Optionally, in the embodiments of the present invention, the at least one operation control may further include a mask control, for example, the "mask all" control 44 shown in (a) of FIG. 8. The mask control may be used to trigger the electronic device to display only the AR images drawn by the user of the device.
Optionally, in the embodiments of the present invention, the at least one operation control may further include an erasing control, for example, the "eraser" control 45 shown in (a) of FIG. 8. The erasing control may be used to trigger the electronic device to cancel display of the AR images displayed in the first interface.
Optionally, in the embodiments of the present invention, after the electronic device displays the first interface, the user may further trigger the electronic device to display another interface (for example, the third interface in the embodiments of the present invention) through an input on another operation control (for example, the second operation control in the embodiments of the present invention) among the at least one operation control, so that the user can draw AR images in that interface. Specifically, the interface may include two areas: one area in which the user draws AR images, and another area in which the electronic device displays the AR images corresponding to the captured live-action image.
For example, in conjunction with fig. 7, as shown in fig. 9, after S205 described above, the image control method provided by the embodiment of the present invention may further include S208 and S209 described below.
It should be noted that, in the embodiments of the present invention, the execution order of S208-S209, S206-S207, and S201-S202 is not limited; S208-S209 may be executed before S206-S207 and S201-S202, or after them. The embodiments of the present invention are described by taking the order S206-S207, S201-S202, and then S208-S209 as an example.
S208, the electronic device receives a fourth input from the user on a second operation control among the at least one operation control.
S209, in response to the fourth input, the electronic device displays a third interface, where the third interface includes a first area and a second area.
The first area may be used to draw AR images, and the second area may be used to display the AR images corresponding to the live-action image captured by the electronic device.
In the embodiments of the present invention, after the electronic device displays the first interface, the user may trigger the electronic device to display the third interface through an input on the second operation control, so that the user can draw AR images in the first area of the third interface and select images, in the second area, among the AR images corresponding to the live-action image captured by the electronic device.
Optionally, in the embodiments of the present invention, the fourth input may be a single-click input, a double-click input, a long-press input, or a re-press input by the user on the second operation control. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
In this embodiment of the present invention, the second operation control may be used to trigger the electronic device to display an interface (i.e., the third interface) for a user to draw an AR image.
For example, the second operation control may be a "draft system" control 51 as shown in (a) of fig. 10.
In the embodiment of the present invention, the user may also directly draw the AR image in the first interface.
Optionally, in the embodiments of the present invention, the first area is a non-networked area; that is, unless triggered by the user, the electronic device does not actively save the AR images drawn by the user in the first area to the server.
For example, it is assumed that the second operation control is a "draft system" control, and the fourth input is a single-click input. As shown in fig. 10 (a), the user clicks the "draft system" control 51 (i.e., a fourth input), and the electronic device displays a third interface as shown in fig. 10 (b) in response to the fourth input, wherein the third interface includes a first area 52 and a second area 53.
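As a minimal Kotlin sketch of this two-area model, assuming a hypothetical Server type: drawings in the first area stay local until the user explicitly confirms them, while the second area shows the AR images of the live-action image. All names are illustrative:

```kotlin
// Illustrative sketch of the third interface: a first (draw) area whose
// content stays local until explicitly confirmed, and a second (reference)
// area showing the scene's AR images.

class Server { val saved = mutableListOf<String>() }

class ThirdInterface(private val server: Server, sceneImages: List<String>) {
    val firstArea = mutableListOf<String>()       // user's local draft, not networked
    val secondArea = sceneImages.toMutableList()  // AR images of the live-action image

    fun draw(image: String) { firstArea += image }  // stays on the device

    // Only an explicit user action uploads the draft to the server.
    fun confirm() { server.saved += firstArea }
}

fun main() {
    val server = Server()
    val ui = ThirdInterface(server, listOf("moon", "cloud"))
    ui.draw("balloon")
    println(server.saved)   // [] (drafts are not auto-saved)
    ui.confirm()
    println(server.saved)   // [balloon]
}
```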
In the embodiments of the present invention, because the second operation control may be used to trigger the electronic device to display an interface in which the user draws AR images (i.e., the third interface), the user can trigger the electronic device to display the third interface through the fourth input, draw AR images in the first area, and refer to the AR images corresponding to the live-action image in the second area. Drawing AR images on the electronic device is therefore convenient.
Optionally, in the embodiments of the present invention, the second area may include at least one image selection control. After the electronic device displays the third interface, if the user wants the electronic device to add some of the AR images corresponding to the live-action image to the AR images being drawn, the user may trigger the electronic device to display, in the first area, the AR images selected by a control (for example, the first image selection control in the embodiments of the present invention) among the at least one image selection control through an input on that control.
Illustratively, in conjunction with fig. 9, as shown in fig. 11, after S209, the image control method according to the embodiment of the present invention may further include S210 and S211 described below.
S210, the electronic device receives a fifth input from the user on a first image selection control among the at least one image selection control.
S211, in response to the fifth input, the electronic device displays, in the first area, the AR images selected by the first image selection control.
In the embodiments of the present invention, after the electronic device displays the third interface, the user may, through a fifth input on the first image selection control, trigger the electronic device to display the AR images selected by that control in the first area. The user can thus quickly trigger the electronic device to add images among the AR images corresponding to the live-action image to the first area, that is, to the AR images the user is drawing.
Optionally, in the embodiments of the present invention, the fifth input may be any possible input by the user on the first image selection control, such as a single-click input, a double-click input, a long-press input, or a re-press input. This may be determined according to actual use requirements and is not limited in the embodiments of the present invention.
Optionally, in the embodiments of the present invention, the at least one image selection control may include at least one of the following: a control for selecting all AR images corresponding to the live-action image, and a control for selecting some of the AR images corresponding to the live-action image.
For example, the above-described control for selecting all AR images corresponding to live-action images may be an "all-add" control 61 as shown in (b) of fig. 10.
Optionally, in the embodiments of the present invention, the control for selecting some of the AR images corresponding to the live-action image may be a combined control. One control in the combined control (hereinafter referred to as the first control) may be used to select some of the AR images corresponding to the live-action image, and another control (hereinafter referred to as the second control) may be used to trigger the electronic device to display the AR images selected by the first control in the first area.
Optionally, in the embodiments of the present invention, the first control may be used to select connected AR images in the second area, or to select the AR images within a sub-area of the second area.
Specifically, when the first control is used to select connected AR images in the second area, after an input on the first control, the user may click one AR image in the second area, and in response the electronic device may select the AR images connected to that image. When the first control is used to select AR images within a sub-area of the second area, after an input on the first control, the user may select a sub-area of the second area, and in response the electronic device may select the AR images within that sub-area.
For example, the first control may be the "selection portion" control 62 shown in (b) of FIG. 10, or the "selection area" control 63 shown in (b) of FIG. 10, and the second control may be the "determination addition" control 64 shown in (b) of FIG. 10. The "selection portion" control 62 is used to select connected AR images in the second area, and the "selection area" control 63 is used to select the AR images within a sub-area of the second area.
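The patent does not define how "connected" AR images are computed; the Kotlin sketch below illustrates one possible interpretation, a breadth-first traversal over an assumed adjacency map of images that touch each other:

```kotlin
// Illustrative sketch of the "selection portion" behavior: starting from
// a clicked image, collect every AR image connected to it. Connectivity is
// modeled here as an explicit adjacency map, which is an assumption.

fun connectedImages(start: String, adjacency: Map<String, List<String>>): Set<String> {
    val selected = mutableSetOf(start)
    val queue = ArrayDeque(listOf(start))  // breadth-first traversal
    while (queue.isNotEmpty()) {
        val current = queue.removeFirst()
        for (neighbor in adjacency[current].orEmpty()) {
            if (selected.add(neighbor)) queue.addLast(neighbor)
        }
    }
    return selected
}

fun main() {
    // "moon" touches "cloud"; "lightning" is drawn elsewhere in the scene.
    val adjacency = mapOf(
        "moon" to listOf("cloud"),
        "cloud" to listOf("moon"),
        "lightning" to emptyList()
    )
    println(connectedImages("moon", adjacency))  // [moon, cloud]
}
```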
In the embodiment of the present invention, because an image selection control may be used to trigger the electronic device to display the AR image selected by that control in the first area, the user can, through an input to the first image selection control, quickly trigger the electronic device to add images from the AR image corresponding to the live-action image into the first area, which improves the convenience of operating the electronic device.
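Illustratively, the behavior of S210 and S211 can be pictured with the following Kotlin sketch. All names in it (ArImage, ThirdInterfaceState, onAddAllClicked, and so on) are hypothetical illustrations introduced here, not the patent's implementation; the sketch only assumes the behavior described above, namely that the "all-add" control copies every AR image of the live-action image into the first area, while the first and second controls commit a user-chosen subset. A fifth input such as a single click would simply invoke the corresponding handler.

// Hypothetical sketch; the patent does not prescribe an implementation.
data class ArImage(val id: String)

class ThirdInterfaceState(
    val firstArea: MutableList<ArImage> = mutableListOf(),   // first area: drawing area
    val secondArea: MutableList<ArImage> = mutableListOf()   // second area: AR images of the live-action image
) {
    private val pendingSelection = mutableListOf<ArImage>()

    // "All-add" control 61: display every AR image of the live-action image in the first area.
    fun onAddAllClicked() {
        firstArea.addAll(secondArea)
    }

    // First control ("select portion" 62 / "select area" 63): record the subset the user chose.
    fun onSubsetSelected(selected: List<ArImage>) {
        pendingSelection.clear()
        pendingSelection.addAll(selected)
    }

    // Second control ("determine addition" 64): display the selected AR images in the first area.
    fun onDetermineAdditionClicked() {
        firstArea.addAll(pendingSelection)
        pendingSelection.clear()
    }
}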
Optionally, in this embodiment of the present invention, after the electronic device displays the third interface, if the user wants to preview the effect of superimposing the AR image drawn in the first area onto the live-action image, the user may, through an input to the preview control, trigger the electronic device to display the drawn AR image in the second area while keeping it displayed in the first area, so that the user can preview, in the second area, the effect of the drawn AR image superimposed on the live-action image.
Illustratively, in conjunction with fig. 9 described above, as shown in fig. 12, after S209 described above, the image control method provided by the embodiment of the present invention may further include S212 and S213 described below.
S212, in a case where a third AR image is displayed in the first area, the electronic device receives a sixth input of the user to the preview control.
The third AR image may be an AR image drawn by the user in the first area.
S213, in response to the sixth input, the electronic device displays the third AR image in the second area and keeps displaying the third AR image in the first area.
In this embodiment of the present invention, after the electronic device displays the third interface, if, while drawing an AR image in the first area, the user wants to preview the effect of adding the drawn AR image (i.e., the third AR image) to the live-action image, the user may, through a sixth input to the preview control, trigger the electronic device to display the third AR image in the second area while keeping it displayed in the first area.
Optionally, in this embodiment of the present invention, the sixth input may be any possible input, such as a single-click input, a double-click input, a long-press input, or a re-press input by the user on the preview control. The specific input may be determined according to actual use requirements; this embodiment of the present invention is not limited thereto.
In this embodiment of the present invention, the third AR image may be an AR image drawn by the user in the first area; that is, the third AR image does not include any image from the AR image corresponding to the live-action image.
It is to be understood that, when the AR image displayed in the first area includes both an AR image drawn by the user in the first area and images from the AR image corresponding to the live-action image, if the electronic device receives the sixth input of the user to the preview control, then in response to the sixth input the electronic device displays only the user-drawn AR image in the second area, while keeping both the user-drawn AR image and the images from the AR image corresponding to the live-action image displayed in the first area.
Optionally, in this embodiment of the present invention, after the electronic device displays the third AR image in the second area, if the user wants to continue drawing the AR image or wants to cancel the preview, the user may, through a ninth input to the preview control, trigger the electronic device to cancel displaying the third AR image in the second area while keeping it displayed in the first area.
Optionally, in this embodiment of the present invention, the preview control may be any possible control, such as the "add preview" control 65 shown in (b) of fig. 10 or an "add" control; it may be determined according to actual use requirements, and this embodiment of the present invention is not limited thereto.
In the embodiment of the present invention, since the preview control may trigger the electronic device to display the third AR image in the second area, the user can, through the sixth input, quickly preview the effect of adding the third AR image to the live-action image, which makes operating the electronic device more convenient.
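Illustratively, S212 and S213, together with the ninth input that cancels the preview, can be modeled as a toggle that adds the user-drawn image to the second area without ever removing it from the first area. The Kotlin sketch below reuses the hypothetical ArImage type from the previous sketch; every name remains an illustration, not the patent's implementation.

// Hypothetical sketch of the preview control; ArImage as in the previous sketch.
class PreviewController(private val secondArea: MutableList<ArImage>) {
    private var previewed: ArImage? = null

    // Sixth input: show the third AR image in the second area; the first area
    // is deliberately left untouched (S213 keeps displaying it there).
    fun onPreviewClicked(thirdArImage: ArImage) {
        val current = previewed
        if (current == null) {
            secondArea.add(thirdArImage)   // superimposed on the live-action image
            previewed = thirdArImage
        } else {
            // Ninth input: cancel the preview; only the second area changes.
            secondArea.remove(current)
            previewed = null
        }
    }
}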
Optionally, in this embodiment of the present invention, after the electronic device displays the third interface, if the user wants to trigger the electronic device to superimpose the AR image drawn in the first area onto the live-action image and store that AR image in the server, the user may, through an input to the add control, trigger the electronic device to display the drawn AR image in the second area and cancel its display in the first area, so that the drawn AR image is superimposed onto the live-action image and stored in the server.
For example, in conjunction with fig. 9, as shown in fig. 13, after S209, the image control method according to the embodiment of the present invention may further include S214 and S215 described below.
S214, in a case where a fourth AR image is displayed in the first area, the electronic device receives a seventh input of the user to the add control.
The fourth AR image may be an AR image drawn by the user in the first area.
S215, in response to the seventh input, the electronic device displays the fourth AR image in the second area and cancels the display of the fourth AR image in the first area.
In this embodiment of the present invention, after the electronic device displays the third interface, if the user has drawn an AR image in the first area and wants to trigger the electronic device to add the drawn AR image (i.e., the fourth AR image) to the live-action image and save the fourth AR image to the server, the user may, through a seventh input to the add control, trigger the electronic device to display the fourth AR image in the second area and cancel displaying the fourth AR image in the first area.
Optionally, in this embodiment of the present invention, the seventh input may be any possible input, such as a single-click input, a double-click input, a long-press input, or a re-press input by the user on the add control. The specific input may be determined according to actual use requirements; this embodiment of the present invention is not limited thereto.
In this embodiment of the present invention, the fourth AR image may be an AR image drawn by the user in the first area; that is, the fourth AR image does not include any image from the AR image corresponding to the live-action image.
It is to be understood that, when the AR image displayed in the first area includes both an AR image drawn by the user in the first area and images from the AR image corresponding to the live-action image, if the electronic device receives the seventh input of the user to the add control, then in response to the seventh input the electronic device displays only the user-drawn AR image in the second area, and saves both the user-drawn AR image and the images from the AR image corresponding to the live-action image to the server.
Optionally, in the embodiment of the present invention, the add control may be any possible control, such as the "add" control 66 shown in (b) of fig. 10; it may be determined according to actual use requirements, and this embodiment of the present invention is not limited thereto.
In the embodiment of the present invention, since the add control may trigger the electronic device to display the fourth AR image in the second area, the user can, through the seventh input, quickly trigger the electronic device to add the fourth AR image to the live-action image and store it in the server, which makes operating the electronic device more convenient.
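Illustratively, S214 and S215 can be sketched in Kotlin as follows. The ArServer interface and its save method are assumptions introduced purely for illustration, since the patent does not specify how the fourth AR image is stored in the server; ArImage is the hypothetical type from the earlier sketches.

// Hypothetical sketch of the add control; ArImage as in the previous sketches.
interface ArServer {
    fun save(image: ArImage)   // assumed persistence hook, not a real API
}

class AddController(
    private val firstArea: MutableList<ArImage>,
    private val secondArea: MutableList<ArImage>,
    private val server: ArServer
) {
    // Seventh input: superimpose the fourth AR image onto the live-action
    // image (second area), cancel its display in the drawing area, and save it.
    fun onAddClicked(fourthArImage: ArImage) {
        secondArea.add(fourthArImage)
        firstArea.remove(fourthArImage)
        server.save(fourthArImage)
    }
}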
Optionally, in this embodiment of the present invention, the first area may further include a return control (for example, the "return to home" control 67 in (b) of fig. 10). Through an input to the return control, the user can quickly trigger the electronic device to cancel displaying the third interface and display the first interface.
Optionally, in this embodiment of the present invention, the first area may further include an erase drawing control (for example, the "erase" control 68 shown in (b) of fig. 10). Through an input to the erase drawing control, the user may trigger the electronic device to cancel displaying an AR image in the first area.
Illustratively, when the user has triggered the electronic device, through an input to the "all-add" control 61 shown in (b) of fig. 10, to display all AR images corresponding to the live-action image in the first area, and then wants to trigger the electronic device to cancel displaying some of those AR images, the user may click the "erase" control 68 shown in (b) of fig. 10 and then click the AR image whose display is to be canceled; in response to that input, the electronic device cancels displaying the clicked AR image.
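Illustratively, the erase drawing control can be sketched as an armed erase mode in which tapping an AR image cancels its display; as before, all names in this Kotlin sketch are hypothetical.

// Hypothetical sketch of the erase drawing control; ArImage as above.
class EraseController(private val firstArea: MutableList<ArImage>) {
    private var eraseMode = false

    // Input to the "erase" control 68 toggles erase mode on and off.
    fun onEraseControlClicked() {
        eraseMode = !eraseMode
    }

    // While erase mode is active, tapping an AR image cancels its display.
    fun onArImageTapped(image: ArImage) {
        if (eraseMode) firstArea.remove(image)
    }
}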
In the embodiment of the present invention, the image control methods shown in the above drawings are each described by way of example with reference to one of the drawings of the embodiments of the present invention. In specific implementation, the image control method shown in any of the above drawings may also be implemented in combination with any other combinable drawings illustrated in the above embodiments; details are not repeated here.
As shown in fig. 14, an embodiment of the present invention provides an electronic device 700, which may include a receiving module 701 and a display module 702. The receiving module 701 is configured to receive a first input of a user to a target object of the N objects. The display module 702 may be configured to display or cancel display of at least one target AR image corresponding to the target object in response to the first input received by the receiving module 701. Each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to the live-action image collected by the electronic device.
Optionally, the display module 702 may be specifically configured to cancel displaying the at least one target AR image in response to the first input; the receiving module 701 may be further configured to receive a second input of the user to the target object of the N objects before receiving the first input of the user to the target object; the display module 702 may be further configured to display the at least one target AR image in response to the second input received by the receiving module 701.
Optionally, the N objects may include at least one of: an image determination control, a second AR image, and a user information identifier. The display module 702 is specifically configured to display or cancel displaying the AR image corresponding to the image determination control in response to a first input to the image determination control; or to display or cancel displaying the AR image drawn by the user who drew the second AR image in response to a first input to the second AR image; or to display or cancel displaying the AR image drawn by the user indicated by the user information identifier in response to a first input to the user information identifier.
Optionally, the display module 702 may be further configured to display a first interface before the receiving module 701 receives the first input of the target object by the user, where the first interface includes at least one operation control and a live-action image captured by the electronic device; the receiving module 701 may further be configured to receive a third input of the user to a first operation control of the at least one operation control displayed by the display module 702; the display module 702 is further configured to display a second interface in response to the third input received by the receiving module 701, where the second interface includes N objects.
Optionally, the receiving module 701 may be further configured to receive, after the displaying module 702 displays the first interface, a fourth input of the user to a second operation control of the at least one operation control; the display module 702 may be further configured to display, in response to the fourth input received by the receiving module 701, a third interface, where the third interface includes a first area and a second area, the first area is used to draw an AR image, and the second area is used to display an AR image corresponding to a live-action image captured by the electronic device.
Optionally, the second area may include at least one image selection control; the receiving module 701 may be further configured to receive a fifth input of the user to a first image selection control of the at least one image selection control after the display module 702 displays the third interface; the display module 702 may be further configured to display the AR image selected by the first image selection control in the first area in response to the fifth input received by the receiving module 701.
Optionally, the first area may include a preview control; the receiving module 701 may be further configured to receive a sixth input of the user to the preview control when the display module 702 displays a third AR image in the first area after the display module 702 displays the third interface, where the third AR image is an AR image drawn by the user in the first area; the display module 702 may be further configured to display the third AR image in the second area and keep displaying the third AR image in the first area in response to the sixth input received by the receiving module 701.
Optionally, the first area may include an add control; the receiving module 701 may be further configured to receive a seventh input of the addition control from the user when the display module 702 displays a fourth AR image in the first area after the display module 702 displays the third interface, where the fourth AR image is an AR image drawn by the user in the first area; the display module 702 may be further configured to display the fourth AR image in the second area and cancel the display of the fourth AR image in the first area in response to the seventh input received by the receiving module 701.
The electronic device provided by the embodiment of the invention can realize each process executed by the electronic device in the embodiment of the image control method, and can achieve the same technical effect, and the details are not repeated here in order to avoid repetition.
An embodiment of the present invention provides an electronic device. Each of the N objects may correspond to at least one AR image, so that when the electronic device uses an AR function at the geographic position corresponding to a certain real scene, the user may, according to actual needs, trigger the electronic device through an input to an object among the N objects to display or cancel displaying the AR image corresponding to that object. The electronic device can thus superimpose only the AR images the user wants to see onto the live-action image of the real scene and display images according to the user's actual use requirements, so the manner in which the electronic device displays images is flexible.
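Illustratively, the module decomposition of fig. 14 can be sketched in Kotlin as a receiving module that forwards the first input to a display module. The display-or-cancel behavior is modeled here as a simple toggle for brevity; TargetObject, DisplayModule, ReceivingModule and the rest are hypothetical names, and ArImage is the type from the earlier sketches, none of them taken from the patent itself.

// Hypothetical sketch of the fig. 14 module decomposition.
data class TargetObject(val arImages: List<ArImage>)

class DisplayModule {
    private val shown = linkedSetOf<ArImage>()

    // Display or cancel display of the target AR images (toggle for brevity:
    // a displayed image is removed, an undisplayed image is added).
    fun toggle(target: TargetObject) {
        for (img in target.arImages) {
            if (!shown.remove(img)) shown.add(img)
        }
    }

    fun currentlyShown(): Set<ArImage> = shown
}

class ReceivingModule(private val displayModule: DisplayModule) {
    // First input of the user to a target object among the N objects.
    fun onFirstInput(target: TargetObject) = displayModule.toggle(target)
}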
Fig. 15 is a hardware schematic diagram of an electronic device implementing various embodiments of the invention. As shown in fig. 15, electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 15 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to control the user input unit 107 to receive a first input of a user to a target object of the N objects, and to control, in response to the first input received by the user input unit 107, the display unit 106 to display or cancel display of at least one target AR image corresponding to the target object. Each of the N objects corresponds to at least one AR image, and the AR images corresponding to the N objects are all AR images corresponding to the live-action image collected by the electronic device.
It can be understood that, in the embodiment of the present invention, the receiving module 701 in the structural schematic diagram of the electronic device (for example, fig. 14) may be implemented by the user input unit 107. The display module 702 in the structural schematic diagram of the electronic device (for example, fig. 14) can be implemented by the display unit 106.
An embodiment of the present invention provides an electronic device. Each of the N objects may correspond to at least one AR image, so that when the electronic device uses an AR function at the geographic position corresponding to a certain real scene, the user may, according to actual needs, trigger the electronic device through an input to an object among the N objects to display or cancel displaying the AR image corresponding to that object. The electronic device can thus superimpose only the AR images the user wants to see onto the live-action image of the real scene and display images according to the user's actual use requirements, so the manner in which the electronic device displays images is flexible.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used to receive and send signals during message transmission or a call. Specifically, the radio frequency unit 101 receives downlink data from a base station and forwards the downlink data to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing the sound into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and then output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 1071 using a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects a signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared type, and a surface acoustic wave type. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, a switch key, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch panel 1071 transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 15 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device; this is not limited herein.
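Illustratively, the touch path just described (touch detection device, then touch controller, then processor 110, then visual output on the display panel 1061) can be pictured with one more Kotlin sketch; the event types and the display callback are invented here for illustration only and do not correspond to any real hardware API.

// Hypothetical sketch of the touch pipeline around the touch panel 1071.
data class TouchPoint(val x: Int, val y: Int)

enum class TouchEventType { CLICK, DOUBLE_CLICK, LONG_PRESS }

class TouchController {
    // Converts raw touch information into touch point coordinates.
    fun toCoordinates(rawX: Int, rawY: Int): TouchPoint = TouchPoint(rawX, rawY)
}

class MainProcessor(private val displayPanel: (String) -> Unit) {
    // Determines the type of the touch event and drives a visual output.
    fun onTouch(point: TouchPoint, type: TouchEventType) {
        displayPanel("visual output for $type at (${point.x}, ${point.y})")
    }
}

fun main() {
    val controller = TouchController()
    val processor = MainProcessor { msg -> println(msg) }
    processor.onTouch(controller.toCoordinates(120, 340), TouchEventType.CLICK)
}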
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like. Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the electronic device; it connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the electronic device as a whole. The processor 110 may include one or more processing units. Optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may also not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which includes the processor 110 shown in fig. 15, the memory 109, and a computer program stored in the memory 109 and capable of being executed on the processor 110, where the computer program, when executed by the processor 110, implements each process of the above-mentioned embodiment of the image control method, and can achieve the same technical effect, and is not described herein again to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor (for example, the processor 110 shown in fig. 15), the processes of the above-mentioned image control method embodiment are implemented, and the same technical effect can be achieved; details are not repeated here to avoid repetition. The computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a/an ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (18)

1. An image control method applied to an electronic device, the method comprising:
receiving a first input of a user to a target object of N objects, wherein each object of the N objects corresponds to at least one Augmented Reality (AR) image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image collected by the electronic device;
displaying or canceling display of at least one target AR image corresponding to the target object in response to the first input.
2. The method of claim 1, wherein the displaying or canceling display of at least one target AR image corresponding to the target object in response to the first input comprises:
canceling display of the at least one target AR image in response to the first input;
before the receiving the first input of the user to the target object of the N objects, the method further comprises:
receiving a second input of the user to the target object;
in response to the second input, displaying the at least one target AR image.
3. The method of claim 1, wherein the N objects comprise at least one of: an image determination control, a second AR image, and a user information identifier;
the displaying or de-displaying, in response to the first input, at least one target AR image corresponding to the target object, comprising:
in response to a first input to the image determination control, displaying or canceling display of an AR image corresponding to the image determination control; or,
in response to a first input to the second AR image, displaying or canceling display of an AR image drawn by a user who drew the second AR image; or,
in response to a first input to the user information identifier, displaying or canceling display of an AR image drawn by the user indicated by the user information identifier.
4. The method of claim 1, wherein prior to receiving the user's first input of the target object of the N objects, the method further comprises:
displaying a first interface, wherein the first interface comprises at least one operation control and a live-action image acquired by the electronic device;
receiving a third input of a user to a first operation control in the at least one operation control;
in response to the third input, displaying a second interface, the second interface including the N objects.
5. The method of claim 4, wherein after displaying the first interface, the method further comprises:
receiving a fourth input of a user to a second operation control in the at least one operation control;
and responding to the fourth input, and displaying a third interface, wherein the third interface comprises a first area and a second area, the first area is used for drawing an AR image, and the second area is used for displaying the AR image corresponding to the real scene image collected by the electronic equipment.
6. The method of claim 5, wherein the second area includes at least one image selection control;
after the displaying the third interface, the method further comprises:
receiving a fifth input by the user to a first image selection control of the at least one image selection control;
in response to the fifth input, displaying, in the first area, the AR image selected by the first image selection control.
7. The method of claim 5, wherein the first area comprises a preview control;
after the displaying the third interface, the method further comprises:
receiving a sixth input of the user to the preview control under the condition that a third AR image is displayed in the first area, wherein the third AR image is an AR image drawn by the user in the first area;
in response to the sixth input, displaying the third AR image in the second area and maintaining display of the third AR image in the first area.
8. The method of claim 5, wherein the first area comprises an add control;
after the displaying the third interface, the method further comprises:
receiving a seventh input of the user to the add control under the condition that a fourth AR image is displayed in the first area, wherein the fourth AR image is an AR image drawn by the user in the first area;
in response to the seventh input, displaying the fourth AR image in the second area and canceling the displaying of the fourth AR image in the first area.
9. An electronic device, comprising a receiving module and a display module;
the receiving module is configured to receive a first input of a user to a target object of N objects, where each object of the N objects corresponds to at least one Augmented Reality (AR) image, and the AR images corresponding to the N objects are all AR images corresponding to a live-action image acquired by the electronic device;
a display module for displaying or canceling display of at least one target AR image corresponding to the target object in response to the first input received by the receiving module.
10. The electronic device of claim 9, wherein the display module is specifically configured to cancel displaying the at least one target AR image in response to the first input;
the receiving module is further configured to receive a second input of the user to the target object before receiving the first input of the user to the target object of the N objects;
the display module is configured to display the at least one target AR image in response to the second input received by the receiving module.
11. The electronic device of claim 9, wherein the N objects comprise at least one of: an image determination control, a second AR image, and a user information identifier;
the display module is specifically configured to display or cancel displaying the AR image corresponding to the image determination control in response to a first input to the image determination control; or,
the display module is specifically configured to display or cancel displaying the AR image drawn by the user who drew the second AR image in response to a first input to the second AR image; or,
the display module is specifically configured to display or cancel displaying the AR image drawn by the user indicated by the user information identifier in response to a first input to the user information identifier.
12. The electronic device of claim 9, wherein the display module is further configured to display a first interface before the receiving module receives a first input of the target object from a user, the first interface including at least one operation control and a live-action image captured by the electronic device;
the receiving module is further configured to receive a third input of a user to a first operation control of the at least one operation control displayed by the display module;
the display module is further configured to display a second interface in response to the third input received by the receiving module, where the second interface includes the N objects.
13. The electronic device of claim 12, wherein the receiving module is further configured to receive a fourth input from the user to a second operation control of the at least one operation control after the display module displays the first interface;
the display module is further configured to display a third interface in response to the fourth input received by the receiving module, where the third interface includes a first area and a second area, the first area is used to draw an AR image, and the second area is used to display an AR image corresponding to a live-action image collected by the electronic device.
14. The electronic device of claim 13, wherein the second region includes at least one image selection control;
the receiving module is further configured to receive a fifth input of the user to a first image selection control of the at least one image selection control after the display module displays the third interface;
the display module is further configured to display, in response to the fifth input received by the receiving module, the AR image selected by the first image selection control in the first area.
15. The electronic device of claim 13, wherein the first area comprises a preview control;
the receiving module is further configured to receive, after the display module displays the third interface, a sixth input of the user to the preview control when the display module displays a third AR image in the first area, where the third AR image is an AR image drawn by the user in the first area;
the display module is further configured to display the third AR image in the second area and keep displaying the third AR image in the first area in response to the sixth input received by the receiving module.
16. The electronic device of claim 13, wherein the first area comprises an add control;
the receiving module is further configured to receive a seventh input of the add control from the user when the display module displays a fourth AR image in the first area after the display module displays the third interface, where the fourth AR image is an AR image drawn by the user in the first area;
the display module is further configured to display the fourth AR image in the second area and cancel displaying the fourth AR image in the first area in response to the seventh input received by the receiving module.
17. An electronic device, comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the image control method according to any one of claims 1 to 8.
18. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image control method according to any one of claims 1 to 8.
CN201911202454.6A 2019-11-29 2019-11-29 Image control method and electronic equipment Active CN110941341B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911202454.6A CN110941341B (en) 2019-11-29 2019-11-29 Image control method and electronic equipment
PCT/CN2020/130776 WO2021104194A1 (en) 2019-11-29 2020-11-23 Image control method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911202454.6A CN110941341B (en) 2019-11-29 2019-11-29 Image control method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110941341A true CN110941341A (en) 2020-03-31
CN110941341B CN110941341B (en) 2022-02-01

Family

ID=69909000

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911202454.6A Active CN110941341B (en) 2019-11-29 2019-11-29 Image control method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110941341B (en)
WO (1) WO2021104194A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5158162B2 (en) * 2010-09-14 2013-03-06 日本電気株式会社 Augmented reality display system, augmented reality display method, and augmented reality display program
JP2014149712A (en) * 2013-02-01 2014-08-21 Sony Corp Information processing device, terminal device, information processing method, and program
CN108037863B (en) * 2017-12-12 2021-03-30 北京小米移动软件有限公司 Method and device for displaying image
US20210102820A1 (en) * 2018-02-23 2021-04-08 Google Llc Transitioning between map view and augmented reality view
CN113112614B (en) * 2018-08-27 2024-03-19 创新先进技术有限公司 Interaction method and device based on augmented reality
CN109120800A (en) * 2018-10-18 2019-01-01 维沃移动通信有限公司 A kind of application icon method of adjustment and mobile terminal
CN110941341B (en) * 2019-11-29 2022-02-01 维沃移动通信有限公司 Image control method and electronic equipment

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150759A (en) * 2013-03-05 2013-06-12 腾讯科技(深圳)有限公司 Method and device for dynamically enhancing street view image
CN107111996A (en) * 2014-11-11 2017-08-29 本特图像实验室有限责任公司 The augmented reality experience of Real-Time Sharing
CN107067295A (en) * 2017-03-13 2017-08-18 联想(北京)有限公司 A kind of information processing method and electronic equipment
US20190213754A1 (en) * 2018-01-08 2019-07-11 Htc Corporation Anchor recognition in reality system
CN110162258A (en) * 2018-07-03 2019-08-23 腾讯数码(天津)有限公司 The processing method and processing device of individual scene image
CN110442813A (en) * 2019-05-24 2019-11-12 广东药科大学 A kind of tourist souvenir information processing system and method based on AR
CN110413719A (en) * 2019-07-25 2019-11-05 Oppo广东移动通信有限公司 Information processing method and device, equipment, storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
安福双的自留地: "CamAR特效视频软件怎么使用操作啊?" ["How do I use the CamAR special-effects video software?"], https://jingyan.baidu.com/article/a17d5285cf7ece8098c8f228.html, 《百度经验》 (Baidu Experience) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021104194A1 (en) * 2019-11-29 2021-06-03 维沃移动通信有限公司 Image control method and electronic device

Also Published As

Publication number Publication date
CN110941341B (en) 2022-02-01
WO2021104194A1 (en) 2021-06-03

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant