CN111093033B - Information processing method and device

Info

Publication number: CN111093033B (granted publication of application CN201911424027.2A)
Other versions: CN111093033A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 张波
Applicant / Current Assignee: Vivo Mobile Communication Co Ltd
Legal status: Active (granted)
Prior art keywords: target, information, electronic device, target information, user

Classifications

    • H04N 23/62: Control of parameters via user interfaces (under H04N 23/00, Cameras or camera modules comprising electronic image sensors and control thereof; H04N 23/60, Control of cameras or camera modules)
    • H04M 1/72439: User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality, with interactive means for internal management of messages, for image or video messaging (under H04M 1/72, Mobile telephones and cordless telephones; H04M 1/724, User interfaces specially adapted for cordless or mobile telephones)
    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to specific conditions
    • H04N 23/80: Camera processing pipelines; components thereof

Abstract

Embodiments of the present invention provide an information processing method and device, applied to the field of communication, for solving the problem that a user's private information is leaked when an electronic device displays information. The information processing method is applied to a first electronic device and includes: displaying a first AR object on a first shooting preview screen; receiving a first input from a user on the first AR object; and, in response to the first input, displaying target information associated with the first AR object. The method is particularly applicable to the process in which an electronic device displays information.

Description

Information processing method and device
Technical Field
Embodiments of the present invention relate to the field of communication technology, and in particular to an information processing method and device.
Background
Currently, after an electronic device receives a message sent by another device (e.g., another electronic device or a server) through a communication application, it may directly display the message content (referred to below as information) in the communication interface of that communication application.
However, if strangers are around the user, for example when the user is in a public place, and the information includes the user's private information (such as a bank card account number or a password), displaying the information directly on the electronic device may cause the user's private information to be leaked.
Disclosure of Invention
Embodiments of the present invention provide an information processing method and device, aiming to solve the problem that a user's private information is leaked when an electronic device displays information.
In order to solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides an information processing method applied to a first electronic device. The method includes: displaying a first Augmented Reality (AR) object on a first shooting preview screen; receiving a first input from a user on the first AR object; and, in response to the first input, displaying target information associated with the first AR object.
In a second aspect, an embodiment of the present invention further provides an information processing method, which is applied to a target device, and the method includes: acquiring target information input by a user; storing the target information and the first AR object in an associated manner; generating an AR message based on the target information and the first AR object; an AR message is sent to the first electronic device.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device is a first electronic device, and the electronic device includes: the display module and the first receiving module; the display module is used for displaying the first AR object on the first shooting preview picture; the first receiving module is used for receiving first input of a user to the first AR object displayed by the display module; the display module is further configured to display target information associated with the first AR object in response to the first input received by the first receiving module.
In a fourth aspect, an embodiment of the present invention further provides an electronic device, where the electronic device is a target device, and the electronic device includes: the device comprises an acquisition module, a storage module, a generation module and a sending module; the acquisition module is used for acquiring target information input by a user; the storage module is used for storing the target information acquired by the acquisition module and the first AR object in an associated manner; the generating module is used for generating an AR message based on the target information and the first AR object stored by the storage module; and the sending module is used for sending the AR message generated by the generating module to the first electronic equipment.
In a fifth aspect, an embodiment of the present invention provides an electronic device, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, and when executed by the processor, the computer program implements the steps of the information processing method according to the first aspect.
In a sixth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the information processing method according to the first aspect.
In a seventh aspect, an embodiment of the present invention provides a target device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when executed by the processor, the computer program implements the steps of the information processing method according to the second aspect.
In an eighth aspect, the embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the information processing method according to the second aspect.
In the embodiments of the present invention, since the target information is associated with the first AR object, the first electronic device does not directly display the target information (e.g., the content of a message). Instead, it displays the first AR object on the first shooting preview screen and displays the target information associated with the first AR object on that screen only when it receives the user's first input on the first AR object. Because users other than the device's owner generally cannot operate the first electronic device, they cannot operate the first AR object and therefore cannot obtain the target information or any private information it contains, which improves the security of displaying the target information. In addition, the target information is displayed on the first shooting preview screen of the first electronic device rather than directly in an interface such as that of the corresponding communication application, so displaying the first AR object first and the target information afterwards also improves the convenience of displaying the target information. In short, the intuitiveness, security, and convenience with which a user leaves messages for others through an electronic device are improved.
Drawings
FIG. 1 is a block diagram of a possible operating system according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of an information processing method according to an embodiment of the present invention;
FIG. 3 is a first schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
FIG. 4 is a second schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
FIG. 5 is a third schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
FIG. 6 is a fourth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
FIG. 7 is a fifth schematic diagram of content displayed by an electronic device according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a target device according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a hardware structure of a target device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" herein means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. "Plurality" means two or more.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first AR object and the second AR object, etc. are for distinguishing different AR objects, rather than for describing a particular order of AR objects.
It should be noted that the information processing method provided by the embodiments of the present invention may be applied to scenarios in which a user leaves a message for another person about an object in the real world, for example a message telling another person that "an apple is in the refrigerator", so that users can leave and view messages conveniently and intuitively. It may also be applied to scenarios in which the information displayed by the user through the electronic device contains the user's private information.
The electronic device (e.g., the first electronic device) in the embodiment of the present invention may be a mobile electronic device or a non-mobile electronic device. The mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), etc.; the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, or the like; the embodiments of the present invention are not particularly limited.
In the embodiments of the present invention, when the electronic device executes the information processing method, the execution subject may be the electronic device itself, a Central Processing Unit (CPU) of the electronic device, or a control module in the electronic device for executing the information processing method. The information processing method provided by the embodiments of the present invention is described below by taking an electronic device executing the method as an example.
Specifically, the electronic device provided in the embodiment of the present invention may be the following first electronic device.
It can be understood that, in the embodiments of the present invention, when the electronic device is an AR device, it may be an electronic device integrated with AR technology. AR technology combines real scenes with virtual scenes. By means of AR technology, human visual perception can be extended, so that a person can experience a real scene and a virtual scene combined together and thus gain a more immersive experience.
The AR device may include a camera, so that it can overlay virtual content on the picture captured by the camera for display and interaction. For example, in the embodiments of the present invention, the AR device may display the first AR object on a shooting preview screen and display the target information associated with the first AR object after the user's input on the first AR object.
The target device in the embodiment of the present invention may be a mobile electronic device or a non-mobile electronic device. The mobile electronic equipment can be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, wearable equipment, a UMPC, a PDA and the like; the non-mobile electronic equipment can be a PC, a TV, a teller machine or a self-service machine and the like; the embodiments of the present invention are not particularly limited.
In the embodiment of the present invention, in a case where the target device executes the information processing method, the execution subject may be the target device itself, or a CPU of the target device, or a control module in the target device for executing the information processing method. In the embodiment of the present invention, an information processing method executed by a target device is taken as an example, and the information processing method provided in the embodiment of the present invention is described.
In the information processing method provided by the embodiments of the present invention, since the target information is associated with the first AR object, the first electronic device does not directly display the target information (e.g., the content of a message). Instead, it displays the first AR object on the first shooting preview screen and displays the target information associated with the first AR object on that screen only when it receives the user's first input on the first AR object. Because users other than the device's owner generally cannot operate the first electronic device, they cannot operate the first AR object and therefore cannot obtain the target information or any private information it contains, which improves the security of displaying the target information. In addition, the target information is displayed on the first shooting preview screen of the first electronic device rather than directly in an interface such as that of the corresponding communication application, so displaying the first AR object first and the target information afterwards also improves the intuitiveness and convenience of displaying the target information. In short, the intuitiveness, security, and convenience with which a user leaves messages for others through an electronic device are improved.
The electronic device and/or the target device in the embodiments of the present invention may have an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present invention are not particularly limited in this respect.
A software environment applied to the information processing method provided by the embodiment of the present invention is described below by taking a possible operating system as an example.
Fig. 1 is a schematic diagram of a possible operating system according to an embodiment of the present invention. In fig. 1, the architecture of the operating system includes 4 layers, respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application layer comprises the various application programs (including system applications and third-party applications) in the operating system.
The application framework layer is the framework of the applications; a developer can develop applications based on the application framework layer while complying with its development principles, for example system applications such as a system settings application, a system chat application, and a system camera application, as well as third-party applications such as a third-party settings application, a third-party camera application, and a third-party chat application.
The system runtime layer includes a library (also referred to as a system library) and an operating system runtime environment. The library mainly provides various resources required by the operating system. The operating system runtime environment is used to provide a software environment for the operating system.
The kernel layer is the operating system layer of the operating system and belongs to the lowest layer of the operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the operating system based on the Linux kernel.
Taking an operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the information processing method provided in the embodiment of the present invention based on the system architecture of the operating system shown in fig. 1, so that the information processing method may be run based on the operating system shown in fig. 1. That is, the processor or the electronic device may implement the information processing method provided by the embodiment of the present invention by running the software program in the operating system.
The following describes in detail an information processing method provided by an embodiment of the present invention with reference to the flowchart of the information processing method shown in fig. 2. Although a logical order of the information processing method provided by the embodiments of the present invention is shown in the method flowchart, in some cases the steps shown or described may be performed in an order different from the one here. For example, the information processing method shown in fig. 2 may include steps 201 to 203:
step 201, the first electronic device displays a first AR object on a first shooting preview screen.
Optionally, the first AR object is stored in the first electronic device, or is acquired by the first electronic device from another electronic device (e.g., a second electronic device) or a server.
The first AR object serves as an entry to the target information.
Specifically, the first electronic device may superimpose an AR object (e.g., the first AR object) obtained based on AR technology on the real-world picture in the first shooting preview screen.
It should be noted that the AR object (e.g., the first AR object) provided in the embodiment of the present invention may be a three-dimensional image, such as a three-dimensional panoramic image, of an object (e.g., a person or an animal).
Optionally, the first AR object may be an AR object of one or more objects, and the following embodiment takes the first AR object as an AR object of one object as an example for description.
Optionally, the target information provided in the embodiment of the present invention may be in a form of a text, an emoticon, a picture, a video, a link, or the like, and the specific form of the target information is not limited in the embodiment of the present invention, and may be in any realizable form.
Optionally, in this embodiment of the present invention, the first electronic device may obtain the first AR object and the target information from another electronic device (e.g., a second electronic device) through a predefined communication application, where the predefined communication application may be a social application or a chat application.
Step 202, the first electronic device receives a first input of the first AR object from the user.
The first electronic device displays the first AR object on the preview image, i.e., it displays the entry to the target information rather than directly displaying the target information itself.
Optionally, the first AR object may be displayed superimposed on the preview image of the first electronic device, or displayed in a floating manner on the preview image of the first electronic device.
It should be noted that Augmented Reality (AR) technology simulates entity information that would otherwise be difficult to experience within the spatial range of the real world and superimposes the resulting virtual content onto the real world, so that the real environment and virtual objects coexist in the same picture and space. In this way, human senses can perceive the virtual objects, providing a sensory experience that goes beyond reality.
It is understood that the first photographing preview screen displayed by the first electronic device may be changed as the real-world photographic scene is changed, and the first AR object may not be changed as the photographic scene is changed. That is, the first electronic device may display a virtual object represented by the first AR object on the first photographed preview screen of the real world. For example, the first electronic device may display a virtual apple on a first shot preview screen of a scene in which a kitchen is located.
The AR object of the embodiment of the present invention may be understood as: the AR device analyzes the real object to obtain feature information of the real object (e.g., type information of the real object, appearance information of the real object (e.g., structure, color, shape, etc.), position information of the real object in space, etc.), and constructs an AR model in the AR device according to the feature information.
Optionally, in the embodiment of the present invention, the AR object may specifically be a virtual image, a virtual pattern, a virtual character, and the like.
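For illustration only, the sketch below models an AR object as a plain data structure holding the kind of feature information described above (type, appearance, and spatial position). The class and field names (ArObject, Appearance, Position) are assumptions introduced here and are not defined by the patent.

```kotlin
// Minimal sketch of an AR object built from a real object's feature information.
// All names here (ArObject, Appearance, Position) are illustrative assumptions.
data class Position(val x: Float, val y: Float, val z: Float)

data class Appearance(
    val structure: String,   // e.g. "solid"
    val color: String,       // e.g. "red"
    val shape: String        // e.g. "sphere"
)

data class ArObject(
    val typeInfo: String,        // type of the real object, e.g. "apple"
    val appearance: Appearance,  // appearance information of the real object
    val position: Position       // position of the real object in space
)

fun main() {
    // An AR model constructed from the feature information of a scanned apple.
    val arApple = ArObject(
        typeInfo = "apple",
        appearance = Appearance(structure = "solid", color = "red", shape = "sphere"),
        position = Position(0.2f, 0.0f, 1.5f)
    )
    println(arApple)
}
```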
The first input may be input of a user to a first AR object displayed on a first photographing preview screen of the first electronic device.
It should be noted that the screen of the electronic device provided in the embodiments of the present invention may be a touch screen, which may be configured to receive an input from a user and, in response, display content corresponding to that input. The first input may be a touch screen input, a fingerprint input, a gravity input, a key input, or the like. A touch screen input is an input such as a press, long-press, slide, click, or hover input (an input made by the user near the touch screen) on the touch screen of the electronic device. A fingerprint input is an input such as a slide, long press, single click, or double click on the fingerprint recognizer of the electronic device. A gravity input is an input such as shaking the electronic device in a specific direction or shaking it a specific number of times. A key input is an input such as a single click, double click, long press, or combination-key input on a key of the electronic device, such as the power key, a volume key, or the Home key. The embodiments of the present invention do not specifically limit the manner of the first input, which may be any realizable manner. For example, the first input is a click input or hover input by the user on the first AR object.
Specifically, in the case that the first input is a hovering input, the user may virtually touch the first AR object by moving a relative position of a finger and the first electronic device so that the finger of the user hovers over the first electronic device.
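Purely as a non-limiting illustration of the enumeration above, any of these input forms can be mapped to the same action of revealing the target information. The enum values and function below are assumptions, not an API defined by the patent.

```kotlin
// Illustrative sketch only: the patent does not prescribe these names.
enum class FirstInputType { TOUCH, FINGERPRINT, GRAVITY, KEY, HOVER }

data class FirstInput(val type: FirstInputType, val targetArObjectId: String)

// Any recognized first input on the first AR object triggers lookup of the associated target information.
fun onFirstInput(input: FirstInput, associatedInfo: Map<String, String>): String? =
    associatedInfo[input.targetArObjectId]

fun main() {
    val associatedInfo = mapOf("ar-apple" to "There is an apple in the refrigerator")
    val input = FirstInput(FirstInputType.HOVER, "ar-apple")
    println(onFirstInput(input, associatedInfo))  // prints the target information
}
```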
Step 203, the first electronic device responds to the first input and displays target information associated with the first AR object.
Optionally, the first electronic device may display the target information in a preset manner.
For example, the preset manner may include at least one of the first effect and the second effect.
Optionally, the first effect is used to indicate the display color, display position, display size, and display style of the target information on the first shooting preview screen. The display style may indicate, for example, that the target information is displayed in a preset virtual frame, that a preset smiley-face image is superimposed on the first shooting preview screen, that the target information is displayed statically on the first shooting preview screen, or that it is displayed dynamically (e.g., shaking).
Optionally, the second effect may be a hiding effect, a displaying effect, a switching effect, a sliding effect, a fade-in/fade-out effect, an animation effect, and the like.
For example, in an embodiment of the present invention, the first electronic device may first display an animation (e.g., an animation that fireworks show) on the first shooting preview screen, and then display the target information on the first shooting preview screen. Alternatively, the first electronic device may directly display the target information on the first photographing preview screen. Wherein the first electronic device can display the target information on the preview image in a dynamic form or a static form.
Optionally, when the target information contains multiple items of content, for example multiple pictures, the first electronic device may display the pictures on the first shooting preview screen at the same time, or display them one after another in a certain order. The manner of displaying the target information is not particularly limited in the embodiments of the present invention and may be any realizable manner.
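The "preset manner" above can be read as a small display configuration combining a first effect (color, position, size, style) with an optional second effect (animation). The sketch below is one hedged way to represent that; none of these field names come from the patent.

```kotlin
// Hypothetical representation of the preset display manner; names are assumptions.
data class FirstEffect(
    val color: String,
    val positionX: Int,
    val positionY: Int,
    val sizePercent: Int,
    val style: String          // e.g. "virtual frame", "static", "shaking"
)

enum class SecondEffect { NONE, HIDE, SHOW, SWITCH, SLIDE, FADE, ANIMATION }

data class DisplayManner(val first: FirstEffect, val second: SecondEffect)

fun describe(info: String, manner: DisplayManner): String =
    "Show \"$info\" in ${manner.first.color} at (${manner.first.positionX}, ${manner.first.positionY}) " +
    "with style '${manner.first.style}' and effect ${manner.second}"

fun main() {
    val manner = DisplayManner(
        FirstEffect(color = "white", positionX = 120, positionY = 300, sizePercent = 100, style = "virtual frame"),
        SecondEffect.ANIMATION   // e.g. a fireworks animation shown before the text
    )
    println(describe("One apple in the refrigerator", manner))
}
```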
For example, the embodiment of the present invention may be applied to a scenario in which a first electronic device communicates with a second electronic device, and a user (denoted as a second user) of the second electronic device leaves a message to inform the user (denoted as a first user) of the first electronic device that an apple is in a refrigerator. The first AR object is used to represent an apple, and the target information may be "there is an apple in the refrigerator". Therefore, the user can input the virtual apple on the first shooting preview picture to know that the refrigerator has one apple, and the intuitiveness, convenience, interestingness and safety of the electronic equipment for displaying the message are improved.
As shown in (a) of fig. 3, the shooting preview screen 31 displayed by the first electronic device A includes the AR apple M, and the shooting preview screen 31 is a shooting preview screen of the scene in which the kitchen is located. After the user (e.g., the first user) performs a click input on the AR apple M shown in (a) of fig. 3, the first electronic device A displays the target information "one apple in the refrigerator" in a static form on the shooting preview screen 31, as shown in (b) of fig. 3.
In the embodiments of the present invention, since the target information is associated with the first AR object, the first electronic device does not directly display the target information (e.g., the content of a message). Instead, it displays the first AR object on the first shooting preview screen and displays the target information associated with the first AR object on that screen only when it receives the user's first input on the first AR object. Because users other than the device's owner generally cannot operate the first electronic device, they cannot operate the first AR object and therefore cannot obtain the target information or any private information it contains, which improves the security of displaying the target information. In addition, the target information is displayed on the first shooting preview screen of the first electronic device rather than directly in an interface such as that of the corresponding communication application, so displaying the first AR object first and the target information afterwards also improves the intuitiveness and convenience of displaying the target information. In short, the intuitiveness, security, and convenience with which a user leaves messages for others through an electronic device are improved.
Optionally, in this embodiment of the present invention, after the step 202, a step 204 is further included, and the corresponding step 202 may be replaced with the step 202 a:
and step 204, the first electronic equipment displays the target AR object on the first shooting preview picture.
Optionally, the target AR object may be an AR object obtained by the first electronic device through real-time scanning, or an AR object that the first user controls the first electronic device to retrieve from among the AR objects it already stores.
For example, the first electronic device may prompt the user to control the first electronic device to scan for an object.
Step 202a, when the target AR object is matched with the second AR object, the first electronic device displays the first AR object on the first shooting preview screen.
And the second AR object is authentication information associated with the target information.
Optionally, the first electronic device may compare the similarity between the scanned target AR object and the second AR object to determine whether the scanned object is the same as the object corresponding to the second AR object. When the similarity between the target AR object and the second AR object is greater than or equal to a certain threshold (e.g., 80%), the first electronic device may determine that the scanned object is the same as the object corresponding to the second AR object, that is, the target AR object is matched with the second AR object.
It can be understood that if the object scanned by the first electronic device is the same as the object corresponding to the second AR object, the authentication is successful; otherwise, the authentication fails.
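A minimal sketch of this matching step follows, assuming the similarity between the scanned target AR object and the stored second AR object is available as a score in [0, 1]; the 0.8 threshold mirrors the 80% figure mentioned above, and the similarity function here is a placeholder, not the patent's actual algorithm.

```kotlin
// Hedged sketch: similarity() is a stand-in for whatever comparison the device really uses.
data class ArFeature(val typeInfo: String, val color: String, val shape: String)

fun similarity(a: ArFeature, b: ArFeature): Double {
    // Placeholder: fraction of matching coarse features.
    val matches = listOf(a.typeInfo == b.typeInfo, a.color == b.color, a.shape == b.shape).count { it }
    return matches / 3.0
}

fun isAuthenticated(scanned: ArFeature, secondArObject: ArFeature, threshold: Double = 0.8): Boolean =
    similarity(scanned, secondArObject) >= threshold

fun main() {
    val secondArObject = ArFeature("refrigerator", "white", "cuboid")   // authentication reference
    val scanned = ArFeature("refrigerator", "white", "cuboid")          // target AR object from scanning
    println(if (isAuthenticated(scanned, secondArObject)) "authentication succeeded" else "authentication failed")
}
```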
For example, in a scenario where the user of the target device (e.g., the second electronic device) informs the user of the first electronic device that there is an apple in the refrigerator, the second AR object is the AR object of the refrigerator. Therefore, the user can know the information that 'one apple is in the refrigerator' only after the target AR object of the refrigerator is obtained through scanning, and the interestingness and the safety of information display are improved.
For example, after the user performs a click input on the AR apple M shown in (a) of fig. 3, prompt information "please scan the refrigerator" 41 and a "confirm" button 42 may be displayed on the shooting preview screen 31, as shown in (a) of fig. 4. After the user performs a click input on the "confirm" button 42 shown in (a) of fig. 4, the first electronic device A may display the scanned AR refrigerator K on the shooting preview screen 31, as shown in (b) of fig. 4. Subsequently, if the first electronic device A determines that the scanned refrigerator is the refrigerator corresponding to the second AR object, it may display the content shown in (b) of fig. 3, i.e., display the target information "one apple in the refrigerator" on the shooting preview screen 31.
It should be noted that, in this embodiment of the present invention, since the target information is also associated with the second AR object, the first electronic device does not display the target information on the first shooting preview screen immediately after receiving the user's input on the first AR object; it displays the target information only when the scanned object is the same as the object represented by the second AR object. Since users other than the device's owner usually cannot operate the first electronic device, they cannot control it to scan the object represented by the second AR object and therefore cannot obtain the target information, which prevents leakage of the user's private information contained in the target information and further improves the security of information display. In particular, the user can quickly and conveniently obtain the information associated with the second AR object (i.e., the target information) by scanning a certain real object (such as the refrigerator).
Optionally, in the embodiment of the present invention, before the step 204, a step 205 may further be included:
step 205: and under the condition that the first electronic equipment displays the first prompt message on the first shooting preview picture, acquiring an image of the first object, and generating a target AR object according to the image of the first object.
The first prompt information is used for prompting a user to control the first electronic device to acquire the image of the first object.
Optionally, the first electronic device may display an interface for scanning the object to support the user to control the first electronic device to scan the image of the first object and generate the target AR object.
The scanning interface is an interface of an object scanned by the first electronic device.
Optionally, the scanning interface may further include scanning prompt information (e.g., first prompt information), where the scanning prompt information is used to prompt that the authentication is completed by scanning an object corresponding to the second AR object. Wherein, the scanning prompt information can be displayed in a floating manner in the scanning interface.
In addition, the scan prompt message may be continuously displayed in the scan interface, or may be canceled after being displayed in the scan interface for a certain period of time (e.g., 2 seconds).
Optionally, the first electronic device may further display the scan prompt information and the confirmation button in response to the first input, and display the scan interface after the user inputs the confirmation button.
In this embodiment of the present invention, before the first electronic device displays the target information, it may use the first prompt information to prompt the user to scan an object so that the target AR object can be generated and displayed; the second AR object, which is the authentication information of the target information, is used to judge whether the current user has permission to view the target information. In this way, the first electronic device displays the target information on the first shooting preview screen only if the object it scans is identical to the object corresponding to the second AR object. The user can thus promptly control the first electronic device to scan and obtain a target AR object that matches the second AR object, which improves human-computer interaction during information display.
Optionally, in the embodiment of the present invention, before the step 202, a step 207 may further be included:
step 207: the first electronic device receives the AR message sent by the target device.
In a first application scenario, the AR message includes a first AR object and target information, and the AR message is used to indicate that the first AR object is associated with the target information.
In a second application scenario, the AR message includes a first AR object, a second AR object, and target information, and the AR message is used to indicate that the first AR object is associated with the target information, and the second AR object is authentication information associated with the target information.
Accordingly, the target device may send the AR message to the first electronic device.
Optionally, in this embodiment of the present invention, when an electronic device (e.g., a first electronic device) receives a message, it may first determine whether target information of the message is associated with an AR object (e.g., a first AR object). If the message is associated with the AR object, the electronic equipment can open the camera and display the corresponding AR object on the corresponding preview image; if the target information is not associated with the AR object, the electronic equipment does not need to open the camera and display the preview image, but directly displays the target information.
Specifically, after receiving the AR message, the first electronic device may first open the camera and display a first shooting preview picture acquired by the camera in real time, and then display the first AR object on the first shooting preview picture.
In the embodiment of the present invention, both the first electronic device and a target device (e.g., a second electronic device) described below support the AR technology, so that both the first electronic device and the target device can read the AR message.
It should be noted that, in the embodiment of the present invention, the AR message received by the first electronic device from the target device includes not only the target information, but also a first AR object associated with the target information, or a first AR object and a second AR object associated with the target information. In this way, the first electronic device may display the target information through the first AR object, or display the target information through the first AR object and the second AR object. Therefore, the safety and convenience of information display of the electronic equipment are improved.
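Putting the two application scenarios together, an AR message can be thought of as the target information plus an optional first AR object and an optional second AR object used for authentication. The sketch below shows the receive-side branching described above; it is only an interpretation of the flow, and the camera and preview operations are stubs rather than real device APIs.

```kotlin
// Sketch of receive-side handling of a message; device operations are stubs, not real APIs.
data class ArMessage(
    val targetInfo: String,
    val firstArObject: String? = null,   // entry to the target information
    val secondArObject: String? = null   // optional authentication reference
)

fun handleIncomingMessage(msg: ArMessage) {
    if (msg.firstArObject == null) {
        // Ordinary message: no AR object is associated, display the content directly.
        println("Display directly: ${msg.targetInfo}")
    } else {
        // AR message: open the camera, show the shooting preview, and overlay the first AR object.
        println("Open camera and show first shooting preview screen")
        println("Overlay first AR object: ${msg.firstArObject}")
        if (msg.secondArObject != null) {
            println("Target info will be shown only after scanning an object matching: ${msg.secondArObject}")
        }
    }
}

fun main() {
    handleIncomingMessage(ArMessage("Meeting at 3 pm"))   // message without an associated AR object
    handleIncomingMessage(ArMessage("One apple in the refrigerator", "AR apple", "AR refrigerator"))
}
```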
Optionally, the information processing method provided in the embodiment of the present invention may further include step 208 to step 211. For example, step 208-step 211 may be included before step 201 described above:
and step 208, the target device acquires the target information input by the user.
Alternatively, in the case where the target device is a second electronic device different from the first electronic device, the second user may input the target information in the second electronic device.
Correspondingly, when the target device is the server, after the user inputs the target information through a second electronic device different from the first electronic device, the second electronic device may be triggered to report the target information to the server.
In the following embodiments, the information processing method provided by the embodiments of the present invention is described by taking a target device as an example of a second electronic device.
Alternatively, in the case where the target device is an electronic device (e.g., the second electronic device), the target information may be input through a predefined communication application installed in the target device.
Alternatively, the target information may be input by the user through a keyboard (e.g., a virtual keyboard or a physical keyboard) of the target device, or the user may select content from content (e.g., text, pictures, videos, etc.) already stored in the target device.
Optionally, the target device may choose whether to associate an AR object with the target information, that is, whether to transmit an AR message including the target information and the AR object associated with it, or a general message including only the target information.
For example, in the embodiment of the present invention, after step 208 and before step 209, step 208a may further be included:
step 208a, the target device determines whether an AR object needs to be associated with the target information.
Specifically, in the case where the target device determines that the AR object needs to be associated with the target information, the target device may perform the following steps 209 to 211. In case the target device determines that it is not necessary to associate an AR object with the target information, the target device does not associate an AR object with the target information, but directly sends the target information itself to the first electronic device, i.e. the target device will not perform steps 209-211 described below.
For example, in this embodiment of the present invention, the target device may provide setting options for choosing whether to associate an AR object with the target information: a first setting option used to indicate that an AR object should be associated with the target information, and a second setting option used to indicate that no AR object needs to be associated with it. The first setting option and the second setting option may be displayed for the target information in the predefined communication application. The user's input on the first setting option or the second setting option then triggers whether the target device associates an AR object with the target information.
Optionally, in this embodiment of the present invention, after step 208 is executed, the target device may display the target information that has been input together with the first setting option and the second setting option on the screen. Alternatively, the target device may provide the first setting option and the second setting option in a settings application, with the first setting option selected there in advance, for example before the AR message is generated.
In this embodiment of the present invention, before sending the AR message, the target device may let the user choose whether to associate an AR object (e.g., the first AR object) with the target information. Thus, in a scenario in which the user does not need the recipient to view an AR message containing the target information and the first AR object (or the first AR object and the second AR object), the target device may send a general message that includes the target information but no AR object to the first electronic device, avoiding the unnecessary operations of generating an AR message from the target information and its associated AR object and transmitting the AR object. Correspondingly, when the first electronic device receives such a message it receives no AR object, displays no AR object, and can directly display the target information, which meets the user's needs.
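The decision in step 208a can be sketched as a simple branch driven by the user's choice of setting option. The option names below echo the "scan" and "cancel" example given later; everything else is an assumption rather than the patent's API.

```kotlin
// Hedged sketch of step 208a: associate an AR object only if the user picked the first setting option.
enum class SettingOption { ASSOCIATE_AR, PLAIN_MESSAGE }   // e.g. "scan" vs. "cancel"

fun buildOutgoingMessage(targetInfo: String, option: SettingOption, firstArObject: String?): String =
    if (option == SettingOption.ASSOCIATE_AR && firstArObject != null)
        "AR message: info=\"$targetInfo\", firstArObject=$firstArObject"
    else
        "Plain message: info=\"$targetInfo\""

fun main() {
    println(buildOutgoingMessage("One apple in the refrigerator", SettingOption.ASSOCIATE_AR, "AR apple"))
    println(buildOutgoingMessage("One apple in the refrigerator", SettingOption.PLAIN_MESSAGE, null))
}
```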
And step 209, the target device stores the target information and the first AR object in an associated manner.
The target device stores the target information and the first AR object in an associated manner, which means that the target device sets an association relationship between the target information and the first AR object.
Illustratively, in conjunction with the above-described embodiments, the target device may perform step 209 if the first setting option is selected.
Step 210, the target device generates an AR message based on the target information and the first AR object.
It is understood that the first AR object may be an AR object of an object selected by the user in real time or an AR object of a preset object.
It should be noted that the first AR object may include one or more AR objects, that is, the first AR object corresponds to one or more objects. Illustratively, the first AR object corresponds to an object.
Step 211, the target device sends the AR message to the first electronic device.
It can be understood that, after the target device sends the AR message to the first electronic device, the first electronic device can read the AR message and display the first AR object contained in it on the first shooting preview screen instead of directly displaying the target information in the AR message.
Optionally, in this embodiment of the present invention, the target device may send the AR message to the first electronic device through a predefined communication application installed in the target device.
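Steps 209 to 211 on the sender side amount to recording the association, packaging it into an AR message, and handing the message to the transport. The sketch below mirrors that sequence under the assumption that sending is an opaque call; none of these class or function names are defined by the patent.

```kotlin
// Hedged sketch of sender-side steps 209-211. sendToDevice() stands in for the real transport.
data class OutgoingArMessage(val targetInfo: String, val firstArObject: String, val secondArObject: String? = null)

class TargetDevice {
    // Step 209: store the target information and the first AR object in an associated manner.
    private val associations = mutableMapOf<String, String>()

    fun associate(targetInfo: String, firstArObject: String) {
        associations[firstArObject] = targetInfo
    }

    // Step 210: generate an AR message based on the stored association.
    fun generateArMessage(firstArObject: String, secondArObject: String? = null): OutgoingArMessage =
        OutgoingArMessage(associations.getValue(firstArObject), firstArObject, secondArObject)

    // Step 211: send the AR message to the first electronic device (transport is a stub here).
    fun sendToDevice(message: OutgoingArMessage, deviceId: String) =
        println("Sending to $deviceId: $message")
}

fun main() {
    val target = TargetDevice()
    target.associate("One apple in the refrigerator", "AR apple")
    val msg = target.generateArMessage("AR apple", secondArObject = "AR refrigerator")
    target.sendToDevice(msg, "first-electronic-device")
}
```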
In the information processing method provided by the embodiments of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object associated with it, rather than the target information alone, the first electronic device does not display the target information directly. It first displays the first AR object on its first shooting preview screen and displays the target information only when it receives the user's first input on the first AR object. This improves the security and convenience of information display, as well as the intuitiveness, security, and convenience with which a user leaves messages for others through an electronic device.
Optionally, in this embodiment of the present invention, the step 209 may be implemented by a step 209a, and correspondingly, the step 210 may be implemented by a step 210 a:
step 209a, the target device stores the target information, the first AR object and the second AR object in association.
And the second AR object is authentication information associated with the target information.
Specifically, the target device associates the first AR object with the target information so that the first AR object serves as the entry to the target information, and it also associates the second AR object with the target information so that the second AR object serves as the authentication information of the target information.
Step 210a, the target device generates an AR message based on the target information, the first AR object, and the second AR object.
In the embodiment of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object and the second AR object associated with the target information, instead of only the target information itself, the first electronic device is enabled not to directly display the target information, but to display the target information through the first AR object and the second AR object on the first shooting preview screen of the first electronic device. Therefore, the safety of the electronic equipment for displaying information is further improved.
Optionally, in this embodiment of the present invention, before the step 209a, a step 212 may further be included:
and 212, acquiring an image of the target object by the target device under the condition that the target device displays the target prompt information on the target shooting preview picture, and generating a first target AR object according to the image of the target object.
In a case where the target object is the first object, the first target AR object is the second AR object; in a case where the target object is the second object, the first target AR object is the first AR object; and in a case where the target object includes the first object and the second object, the first target AR object includes the first AR object and the second AR object. The target prompt information is used to prompt the user to control the target device to acquire the image of the target object.
Illustratively, the target device may scan images of the target object (i.e., capture images of the target object) 360 degrees and generate the first target AR object from these images.
Optionally, when the target object is a plurality of objects, the user may control the target device to scan each object in sequence, and generate an AR object of each object, so as to obtain the first target AR object.
Optionally, while the user is controlling the target device to scan the target object, the target device may output prompt information (i.e., the target prompt information) telling the user how to move the device. For example, the target device may prompt the user to move it so as to scan the photographic subject by displaying prompt information indicating the moving direction, moving speed, and scanning progress; this prompt information may include at least one of text prompt information and image prompt information.
Optionally, in step 212, the target device may further prompt the user to scan the target object, such as displaying a target prompt message "please pick up the device to scan the object" to prompt the user to scan the target object.
In this embodiment of the present invention, the target device allows the user to select the target object in the target shooting preview screen displayed in real time, so the user can choose any object as the target object by moving the target device. The target AR object obtained by scanning therefore meets the user's needs, and so does the display effect of the AR message that includes it. For example, the user may control the target device to capture an image of a refrigerator and an image of an apple to obtain a first AR object representing the apple and a second AR object representing the refrigerator.
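Step 212 can be pictured as collecting images of the target object from multiple angles (e.g., a 360-degree sweep) and building an AR object from them; when several objects are scanned in turn, each yields its own AR object. The sketch below is only schematic: the reconstruction is a placeholder and all names are assumptions.

```kotlin
// Schematic sketch of step 212: images captured around an object are turned into an AR object.
// buildArObject() is a placeholder for the real reconstruction; names are assumptions.
data class CapturedImage(val objectLabel: String, val angleDegrees: Int)

data class GeneratedArObject(val label: String, val imageCount: Int)

fun buildArObject(images: List<CapturedImage>): GeneratedArObject =
    GeneratedArObject(label = images.first().objectLabel, imageCount = images.size)

fun scanObjects(objectLabels: List<String>): List<GeneratedArObject> =
    objectLabels.map { label ->
        // Simulate a 360-degree sweep in 30-degree steps for each object, scanned one after another.
        val sweep = (0 until 360 step 30).map { angle -> CapturedImage(label, angle) }
        buildArObject(sweep)
    }

fun main() {
    // Scanning the refrigerator and the apple yields the second and first AR objects respectively.
    println(scanObjects(listOf("refrigerator", "apple")))
}
```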
Optionally, in this embodiment of the present invention, the step 212 may be implemented by a step 212 a:
step 212a, the target device acquires an image of the target object, and generates a first target AR object according to the image of the target object when the time length of the image of the target object displayed in the target area is greater than or equal to a preset time length.
The target area is an area in the target shooting preview picture.
For example, the target area may be an area for displaying a checkbox in the target shooting preview screen.
Specifically, in the mode 1, the target device may display a target shooting preview picture including a selected frame, and in a case where the time length of the image of one or more objects in the selected frame reaches a preset time length (for example, 1s), the target device may determine that the one or more objects are target objects. The images of the one or more objects can be displayed in the selected frame at the same time, or the images of the one or more objects are sequentially displayed in the selected frame according to a certain sequence, and the time length of the image of each object in the selected frame reaches the preset time length.
Illustratively, in combination with the above embodiments, in a scenario where the user of the target device informs the user of the first electronic device that there is an apple in the refrigerator. As shown in (a) of fig. 5, the target device B displays target information "one apple in the refrigerator" that the user has input, a first setting option "scan" 52 and a second setting option "cancel" 53 in the input box 51 of the communication application 1. The first setting option "scan" 52 is used to indicate that the target device B is the target information "there is an apple in the refrigerator" associated AR object, and the first setting option "cancel" 53 is used to indicate that the target device B does not need to be the target information "there is an apple in the refrigerator" associated AR object.
In one example, in connection with manner 1, after target device B receives a user click input to a first setting option "scan" 52 shown in (a) of fig. 5, as shown in (B) of fig. 5, target device B may turn on the camera and display a target shooting preview screen 54, which includes a checkbox 511 in box 511, which includes an image N of an apple in checkbox 511. Subsequently, in the case where the duration of the image N of the apple at the checkbox 511 reaches 1 second, the target device B may determine that the apple is an object corresponding to the first AR object, and display a scanning interface as shown in (c) of fig. 5. The scanning interface includes the AR apple M scanned and displayed by the target device B and an "ok" button 55. After the user clicks and inputs the "ok" button 55, the target device B may determine that the AR apple M is the first AR object, that is, the entry of the target information "one apple in refrigerator".
Further, the target device B may scan the refrigerator, that is, the object corresponding to the second AR object, to obtain the AR refrigerator K as the second AR object, so as to obtain an AR message including the first AR object and the second AR object.
In this way, the user can conveniently select the image of the target object as needed, and the target device only needs to recognize the target object rather than every other object in the target shooting preview screen, so that the scanned AR object meets the user's requirement while the resource consumption of the target device is reduced.
Optionally, in this embodiment of the present invention, step 212 may be implemented by steps 212b and 212c:
step 212b, the target device acquires an image of the target object, and receives a target input of the user to a first identifier in a case where the first identifier is displayed on the target AR object.
The first identifier is used to mark the image of the target object in the target shooting preview screen.
step 212c, the target device generates the first target AR object according to the image of the target object in response to the target input.
Mode 2: the first electronic device may select an image of the target object through a user input to a first identifier in a case where the first identifier is displayed on the target shooting preview screen, the first identifier being used to mark the image of the target object in the target shooting preview screen.
The first electronic device may recognize the images of all objects in the target shooting preview screen and mark the image of each object separately.
Optionally, in the embodiment of the present invention, the form of the identifier used by the first electronic device to mark the image of the object, for example, the form of the first identifier used to mark the image of the target object, is not particularly limited, and may be any implementable form. For example, the first identifier may be a frame line displayed around the image of the target object.
Specifically, in mode 2, when the target device displays the target shooting preview screen, the target device may recognize the images of one or more objects included in the target shooting preview screen through an object recognition technique, and frame (i.e., mark) the images of these objects with one or more mark frames, respectively. Subsequently, after the user makes an input (e.g., a click input) to some of the mark frames, the target device may determine that the objects in those mark frames are target objects.
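The mode-2 selection step can be pictured with the following minimal Python sketch, which assumes an object-recognition step that already returns labelled mark frames; the Detection structure and the pick_target helper are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    label: str                              # e.g. "apple", "refrigerator"
    box: Tuple[float, float, float, float]  # (x, y, width, height) of the mark frame

def pick_target(detections: List[Detection], tap_x: float, tap_y: float) -> Optional[Detection]:
    """Return the detection whose mark frame contains the user's click input, if any."""
    for det in detections:
        x, y, w, h = det.box
        if x <= tap_x <= x + w and y <= tap_y <= y + h:
            return det   # the object in this mark frame becomes the target object
    return None

if __name__ == "__main__":
    frames = [Detection("apple", (100, 200, 80, 80)),
              Detection("refrigerator", (300, 60, 220, 400))]
    chosen = pick_target(frames, tap_x=130, tap_y=230)
    print(chosen.label if chosen else "no mark frame tapped")   # apple
```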
In another example, after the target device B receives the user's click input to the first setting option "scan" 52 shown in (a) of fig. 5, it may display the content shown in (a) of fig. 6 instead of the content shown in (b) of fig. 5. The content shown in (a) of fig. 6 includes the target shooting preview screen 61; the target shooting preview screen 61 includes a mark frame 611 and a mark frame 612, the mark frame 611 includes an image N of an apple, and the mark frame 612 includes an image W of a refrigerator. Subsequently, after the user makes a click input to the mark frame 611 shown in (a) of fig. 6, the target device B may display a scanning interface as shown in (b) of fig. 6. The scanning interface includes the AR apple M scanned and displayed by the target device B and an "OK" button 62. After the user clicks the "OK" button 62, the target device B may determine that the AR apple M is the first AR object, that is, the entry for the target information "there is an apple in the refrigerator".
In addition, after the user clicks the "OK" button 62 shown in (b) of fig. 6, the target device B may display the target shooting preview screen 61 as shown in (c) of fig. 6, where the target shooting preview screen 61 includes the mark frame 611 and the mark frame 612, the mark frame 611 includes the image N of the apple, and the mark frame 612 includes the image W of the refrigerator. Subsequently, after the target device B receives a user input to the mark frame 612 shown in (c) of fig. 6, it may display a scanning interface as shown in (d) of fig. 6, which includes the AR refrigerator K scanned and displayed by the target device B and an "OK" button 63. After the user clicks the "OK" button 63, the target device B may determine that the AR refrigerator K is the second AR object, that is, the authentication information associated with the target information "there is an apple in the refrigerator".
In this way, since the target device can mark the images of a plurality of objects in the target shooting preview screen, the user can conveniently select the image of the target object, that is, the images of one or more objects, from the target shooting preview screen.
Optionally, in the embodiment of the present invention, before the step 209a, the method may further include steps 213 and 214:
step 213, when the target information includes a preset keyword, the target device obtains a preset AR object that is associated with the preset keyword in advance.
Optionally, an AR object database may be preset in the target device, where the AR object database includes a plurality of preset AR objects, and each preset AR object is associated with one preset keyword.
Optionally, the AR objects in the AR object database may be manually input by the user, or may be AR objects of objects commonly used by the user, which are obtained based on big data statistics.
Specifically, after the target device obtains the target information, it may determine whether the target information includes a preset keyword. In the case where the target information includes certain preset keywords, the target device can search the AR object database for the corresponding preset AR objects according to these preset keywords in turn.
step 214, the target device takes the preset AR object as a second target AR object.
Wherein the second target AR object comprises the first AR object and the second AR object.
For example, in the case where the target information is "there is an apple in the refrigerator", the target device may recognize the keyword "refrigerator" and the keyword "apple". Subsequently, after determining that the keyword "refrigerator" and the keyword "apple" are preset keywords, the target device may find the AR object of the "refrigerator" and the AR object of the "apple" in the AR object database.
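A minimal sketch of such a keyword lookup is given below, assuming the AR object database is a simple mapping from preset keywords to preset AR object identifiers; the dictionary contents and the find_preset_ar_objects name are hypothetical and used only for illustration.

```python
# Hypothetical AR object database: preset keyword -> identifier of a preset AR object.
AR_OBJECT_DB = {
    "refrigerator": "ar_asset_refrigerator",
    "apple": "ar_asset_apple",
}

def find_preset_ar_objects(target_information: str):
    """Return the preset AR objects whose preset keywords appear in the target information."""
    text = target_information.lower()
    return [asset for keyword, asset in AR_OBJECT_DB.items() if keyword in text]

if __name__ == "__main__":
    print(find_preset_ar_objects("There is an apple in the refrigerator"))
    # ['ar_asset_refrigerator', 'ar_asset_apple']: candidates for the second target AR object,
    # i.e. for the first AR object and the second AR object
```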
It can be understood that the first AR object and the second AR object may be AR objects of the objects described in the target information, that is, the first AR object and the second AR object may embody the target information to some extent. For example, the AR object of the refrigerator and the AR object of the apple embody the details of the target information "there is an apple in the refrigerator". Therefore, the appeal and intuitiveness of the target information display can be further improved.
It should be noted that, in the embodiment of the present invention, because the target device can set preset keywords and the associated preset AR objects, the target device can quickly obtain the second target AR object from the preset keywords and the preset AR objects. This helps to reduce the time the target device needs to obtain the first AR object and the second AR object, and thus the time it needs to generate the AR message.
Optionally, in this embodiment of the present invention, step 202 may be implemented by step 202b:
step 202b, the first electronic device displays the first AR object on the target AR object in an overlapping manner.
Specifically, under the condition that the first electronic device displays the first AR object on the target AR object in an overlapping manner, the user can visually see that the object corresponding to the first AR object is displayed on the object corresponding to the target AR object.
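One possible way to place the first AR object over the target AR object on the preview screen is sketched below; the rectangle convention and the overlay_position helper are assumptions made only for illustration, not the disclosed implementation.

```python
def overlay_position(target_box, overlay_size):
    """Place the first AR object so that it appears on top of the target AR object.

    target_box   -- (x, y, width, height) of the target AR object on the preview screen
    overlay_size -- (width, height) of the first AR object
    """
    tx, ty, tw, th = target_box
    ow, oh = overlay_size
    x = tx + (tw - ow) / 2        # horizontally centred on the target AR object
    y = ty + th * 0.25 - oh / 2   # roughly over its upper half, so it reads as "on top of it"
    return x, y

if __name__ == "__main__":
    # e.g. the AR refrigerator K occupies (200, 100, 300, 600); the AR apple M is 80 x 80.
    print(overlay_position((200, 100, 300, 600), (80, 80)))   # (310.0, 210.0)
```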
Illustratively, in conjunction with fig. 4, after the first electronic device displays the content shown in (b) of fig. 4, as shown in (a) of fig. 7, the first electronic device A may display the AR apple M on the AR refrigerator K in the first shooting preview screen 31. Subsequently, after the user makes an input (e.g., the first input) to the AR apple M shown in (a) of fig. 7, the first electronic device A may display the target information "there is an apple in the refrigerator" on the AR apple M, as shown in (b) of fig. 7.
In the embodiment of the invention, the first electronic device can display the first AR object on the target AR object in an overlapping manner; for example, the first electronic device displays the AR object of the apple on the scanned AR object of the refrigerator, so that the user can conveniently and intuitively leave a message through a real object, which improves the intuitiveness of the message.
Optionally, in the embodiment of the present invention, after the step 202, a step 206 may further be included:
in step 206, the first electronic device keeps the display of the first AR object in the first area or cancels the display of the first AR object when the object corresponding to the target AR object is not within the viewing range of the first electronic device.
The first area is an area in the first shooting preview picture.
It is understood that the "canceling the display of the first AR object" in step 206 is a parallel step with step 203.
Specifically, at the last time when the object corresponding to the target AR object is within the viewing range of the first electronic device, the area of the target AR object in the first shooting preview screen is the first area.
It can be understood that when the object corresponding to the target AR object (i.e., the first object) is not within the viewing range of the first electronic device, the first electronic device cannot capture the object corresponding to the target AR object. At this time, the first electronic device may cancel displaying the target AR object as the first object moves out of the viewing range of the first electronic device. This corresponds to a scenario in which the first electronic device leaves the first object, for example, leaves the kitchen in which the refrigerator is located.
Specifically, if the first electronic device keeps displaying the first AR object in the first area, the user can subsequently still trigger the first electronic device to display the target information through the first input to the first AR object.
In the embodiment of the invention, if the first electronic device cancels the display of the first AR object in the first area, the user can no longer trigger the first electronic device to display the target information through the first input to the first AR object. In this way, the leakage of the user's private information in the target information is avoided, and the security of the information displayed by the electronic device is further improved. For example, after the first electronic device leaves the kitchen in which the refrigerator is located, the first electronic device no longer displays the AR object of the apple to the user, and therefore no longer displays the target information "there is an apple in the refrigerator", which improves the security of displaying the target information.
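A simplified sketch of the keep-or-cancel behaviour of step 206 is shown below, assuming a tracker that reports None when the object corresponding to the target AR object leaves the viewing range; the state structure and the keep_when_lost flag are hypothetical names used only for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Box = Tuple[float, float, float, float]

@dataclass
class FirstArObjectState:
    last_area: Optional[Box] = None   # the "first area" in the first shooting preview screen
    visible: bool = False

def update_display(state: FirstArObjectState,
                   target_object_box: Optional[Box],
                   keep_when_lost: bool) -> FirstArObjectState:
    """target_object_box is None when the object corresponding to the target AR object
    is no longer within the viewing range of the first electronic device."""
    if target_object_box is not None:
        state.last_area = target_object_box   # remember where the target AR object was shown
        state.visible = True
    elif keep_when_lost and state.last_area is not None:
        state.visible = True                  # keep the first AR object in the first area
    else:
        state.visible = False                 # cancel the display of the first AR object
    return state

if __name__ == "__main__":
    s = update_display(FirstArObjectState(), (200, 100, 300, 600), keep_when_lost=True)
    s = update_display(s, None, keep_when_lost=True)   # the refrigerator left the view
    print(s.visible, s.last_area)                      # True (200, 100, 300, 600)
```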
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device 80 shown in fig. 8 is a first electronic device, and the electronic device 80 includes: a display module 81 and a first receiving module 82; a display module 81 for displaying the first AR object on the first photographing preview screen; a first receiving module 82, configured to receive a first input of the first AR object displayed by the display module 81 from the user; the display module 81 is further configured to display target information associated with the first AR object in response to the first input received by the first receiving module 82.
In the embodiment of the present invention, since the target information is associated with the first AR object, the first electronic device does not directly display the target information (e.g., the content of a message); instead, it displays the first AR object on the first shooting preview screen, and displays the target information associated with the first AR object on the first shooting preview screen only when it receives the first input of the user to the first AR object. Users other than the local user generally cannot operate the first electronic device, that is, cannot operate the first AR object, so they cannot obtain the target information or the private information of the user contained in it, which improves the security of target information display. In addition, the target information is displayed on the first shooting preview screen of the first electronic device rather than directly on an interface such as that of the corresponding communication application, so by first displaying the first AR object and then displaying the target information, the first electronic device improves the intuitiveness and convenience of target information display. That is, the intuitiveness, security and convenience with which the user leaves messages for others through the electronic device are improved.
Optionally, the display module 81 is further configured to display the target AR object on the first shooting preview screen before displaying the first AR object on the first shooting preview screen; a display module 81, configured to display the first AR object on the first shooting preview screen when the target AR object matches the second AR object; and the second AR object is authentication information associated with the target information.
In the embodiment of the present invention, since the target information is associated with the second AR object, after receiving the user's input to the first AR object, the first electronic device does not directly display the target information on the first shooting preview screen; it displays the target information only when the scanned object is the same as the object represented by the second AR object. Moreover, since users other than the local user usually cannot operate the first electronic device, they cannot control the first electronic device to scan the object represented by the second AR object and therefore cannot obtain the target information, which avoids leakage of the user's private information in the target information. Thus, the security of information display is further improved. In addition, by scanning a real object (such as the refrigerator), the user can quickly and conveniently obtain the information (namely, the target information) associated with the second AR object of that refrigerator.
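The matching-then-revealing flow described above can be illustrated with the following Python sketch, in which the ArMessage structure, the on_scan check and the on_first_input handler are hypothetical names; only the idea of treating the second AR object as authentication information for the target information is taken from the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArMessage:
    target_information: str           # e.g. "There is an apple in the refrigerator"
    first_ar_object: str              # entry for the target information, e.g. "ar_asset_apple"
    second_ar_object: Optional[str]   # authentication information, e.g. "ar_asset_refrigerator"

def on_scan(message: ArMessage, scanned_target_ar_object: str) -> bool:
    """Show the first AR object only if the scanned target AR object matches the
    second AR object (the authentication information), when one is present."""
    if message.second_ar_object is None:
        return True
    return scanned_target_ar_object == message.second_ar_object

def on_first_input(message: ArMessage, first_ar_object_shown: bool) -> Optional[str]:
    """Reveal the target information only after the first AR object has been shown
    and the user has made the first input on it."""
    return message.target_information if first_ar_object_shown else None

if __name__ == "__main__":
    msg = ArMessage("There is an apple in the refrigerator",
                    "ar_asset_apple", "ar_asset_refrigerator")
    shown = on_scan(msg, scanned_target_ar_object="ar_asset_refrigerator")
    print(on_first_input(msg, shown))   # There is an apple in the refrigerator
```

In this sketch, an AR message without a second AR object simply skips the matching step, which corresponds to the first AR-message variant described above.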
Optionally, the display module 81 is further configured to, after the first AR object is displayed on the first shooting preview screen, maintain the display of the first AR object in the first area or cancel the display of the first AR object when the object corresponding to the target AR object is not within the viewing range of the electronic device; the first area is an area in the first shooting preview picture.
In the embodiment of the invention, if the first electronic device cancels the display of the first AR object in the first area, the user can no longer trigger the first electronic device to display the target information through the first input to the first AR object. In this way, the leakage of the user's private information in the target information is avoided, and the security of the information displayed by the electronic device is further improved. For example, after the first electronic device leaves the kitchen in which the refrigerator is located, the first electronic device no longer displays the AR object of the apple to the user, and therefore no longer displays the target information "there is an apple in the refrigerator", which improves the security of displaying the target information.
Optionally, the display module 81 is specifically configured to display the first AR object in an overlapping manner on the target AR object.
In the embodiment of the invention, the first electronic device can display the first AR object on the target AR object in an overlapping manner; for example, the first electronic device displays the AR object of the apple on the scanned AR object of the refrigerator, so that the user can conveniently and intuitively leave a message through a real object, which improves the intuitiveness of the message.
Optionally, the electronic device 80 further includes a generating module. The generating module is configured to, before the target AR object is displayed on the first shooting preview screen, acquire an image of the first object and generate the target AR object according to the image of the first object in a case where the display module 81 displays the first prompt information on the first shooting preview screen. The first prompt information is used to prompt the user to control the first electronic device to acquire the image of the first object.
In the embodiment of the invention, before displaying the first AR object and the target information, the first electronic device can prompt the user, through the first prompt information, to scan an object so as to generate and display the target AR object, which is used to judge whether the current user has the authority to view the target information; the second AR object is the authentication information of the target information. In this way, the first electronic device displays the target information on the first shooting preview screen only if the object scanned by the first electronic device is identical to the object corresponding to the second AR object. Therefore, the user can control the first electronic device in time to scan and obtain a target AR object that matches the second AR object, which improves the human-computer interaction performance of the information display process.
Optionally, the electronic device 80 further includes a second receiving module. The second receiving module is configured to receive, before the display module 81 displays the first AR object on the first shooting preview screen, an AR message sent by the target device. The AR message includes the first AR object and the target information, and the AR message is used to indicate that the first AR object is associated with the target information; or the AR message includes the first AR object, a second AR object, and the target information, the AR message is used to indicate that the first AR object is associated with the target information, and the second AR object is authentication information associated with the target information.
In this embodiment of the present invention, the AR message received by the first electronic device from the target device includes not only the target information, but also the first AR object associated with the target information, or the first AR object and the second AR object associated with the target information. In this way, the first electronic device may display the target information through the first AR object, or display the target information through the first AR object and the second AR object. Therefore, the safety and convenience of information display of the electronic equipment are improved.
The electronic device 80 provided in the embodiment of the present invention can implement each process implemented by the first electronic device in the above method embodiments, and is not described here again to avoid repetition.
Fig. 9 is a schematic structural diagram of a target device according to an embodiment of the present invention. Fig. 9 shows an electronic device 90 including: the device comprises an acquisition module 91, a storage module 92, a generation module 93 and a sending module 94; an obtaining module 91, configured to obtain target information input by a user; a storage module 92, configured to store the target information acquired by the acquisition module 91 in association with the first AR object; a generating module 93, configured to generate an AR message based on the target information and the first AR object stored in the storage module 92; a sending module 94, configured to send the AR message generated by the generating module 93 to the first electronic device.
In the information processing method provided by the embodiment of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object associated with the target information, instead of only the target information itself, the first electronic device does not directly display the target information, but displays the first AR object on the first shooting preview screen of the first electronic device first, and displays the target information on the first shooting preview screen when receiving the first input of the user to the first AR object. Therefore, the safety and convenience of information display are improved, and the intuitiveness, the safety and the convenience of leaving messages for other people through the electronic equipment by the user are improved.
Optionally, the storage module 92 is specifically configured to store the target information, the first AR object, and the second AR object in an associated manner; a generating module 93, specifically configured to generate an AR message based on the target information and the first AR object and the second AR object stored by the storage module 92; and the second AR object is authentication information associated with the target information.
In the embodiment of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object and the second AR object associated with the target information, instead of only the target information itself, the first electronic device is enabled not to directly display the target information, but to display the target information through the first AR object and the second AR object on the first shooting preview screen of the first electronic device. Therefore, the safety of the electronic equipment for displaying information is further improved.
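On the sending side, the association of the target information with the first AR object (and optionally the second AR object) and the generation of the AR message could look roughly like the following sketch; the ArMessage fields, the JSON serialisation and the transport callback are assumptions for illustration, not the disclosed implementation.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ArMessage:
    target_information: str           # e.g. "There is an apple in the refrigerator"
    first_ar_object: str              # AR object associated with the target information
    second_ar_object: Optional[str]   # authentication information, may be omitted

def build_ar_message(target_information: str,
                     first_ar_object: str,
                     second_ar_object: Optional[str] = None) -> ArMessage:
    """Store the target information in association with the AR object(s)."""
    return ArMessage(target_information, first_ar_object, second_ar_object)

def send_ar_message(message: ArMessage, send) -> None:
    """Serialise the stored association and hand it to a transport callback."""
    send(json.dumps(asdict(message)))

if __name__ == "__main__":
    msg = build_ar_message("There is an apple in the refrigerator",
                           first_ar_object="ar_asset_apple",
                           second_ar_object="ar_asset_refrigerator")
    send_ar_message(msg, send=print)   # stand-in for sending to the first electronic device
```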
Optionally, the generating module 93 is further configured to, before the target information, the first AR object, and the second AR object are stored in an associated manner, acquire an image of the target object and generate a first target AR object according to the image of the target object in a case where target prompt information is displayed on the target shooting preview screen. In the case where the target object is a first object, the first target AR object is the second AR object; in the case where the target object is a second object, the first target AR object is the first AR object; in the case where the target object includes the first object and the second object, the first target AR object includes the first AR object and the second AR object. The target prompt information is used to prompt the user to control the target device to acquire the image of the target object.
In the embodiment of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object and the second AR object associated with the target information, instead of only the target information itself, the first electronic device is enabled not to directly display the target information, but to display the target information through the first AR object and the second AR object on the first shooting preview screen of the first electronic device. Therefore, the safety of the electronic equipment for displaying information is further improved.
Optionally, the generating module 93 is specifically configured to acquire an image of the target object, and generate the first target AR object according to the image of the target object when a duration of the image of the target object displayed in the target area is greater than or equal to a preset duration; the target area is an area in the target shooting preview picture.
In this way, the user can conveniently select the image of the target object as needed, and the target device only needs to recognize the target object rather than every other object in the target shooting preview screen, so that the scanned AR object meets the user's requirement while the resource consumption of the target device is reduced.
Optionally, the generating module 93 is specifically configured to acquire an image of the target object, and receive a target input of the first identifier from the user when the first identifier is displayed on the target AR object; generating a first target AR object from an image of a target object in response to a target input; wherein the first mark is used for marking the image of the target object in the target shooting preview picture.
In this way, since the target device can mark the images of a plurality of objects in the target shooting preview screen, the user can conveniently select the image of the target object, that is, the images of one or more objects, from the target shooting preview screen.
Optionally, the obtaining module 91 is further configured to, before the storage module 92 stores the target information, the first AR object, and the second AR object in an associated manner, obtain a preset AR object that is associated in advance with a preset keyword in the target information when the preset keyword is included in the target information; taking the preset AR object as a second target AR object; wherein the second target AR object comprises the first AR object and the second AR object.
In the embodiment of the invention, because the target device can set preset keywords and the associated preset AR objects, the target device can quickly obtain the second target AR object from the preset keywords and the preset AR objects. This helps to reduce the time the target device needs to obtain the first AR object and the second AR object, and thus the time it needs to generate the AR message.
The target device 90 provided in the embodiment of the present invention can implement each process implemented by the target device in the foregoing method embodiments, and for avoiding repetition, details are not described here again.
Fig. 10 is a schematic diagram of a hardware structure of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 10 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
Alternatively, in the embodiment of the present invention, the structure shown in the electronic device 100 may be the structure of the first electronic device, that is, the structure of the electronic device 80.
Specifically, the camera in the first electronic device may be implemented by the sensor 105. At this time, the sensor 105 may be an image sensor.
Wherein the display unit 106 is configured to display the first AR object on the first photographing preview screen; a user input unit 107 for receiving a first input of the first AR object displayed by the display unit 106 by a user; the display unit 106 is further configured to display target information associated with the first AR object in response to the first input received by the user input unit 107.
In the first electronic device provided in the embodiment of the present invention, since the target information is associated with the first AR object, the first electronic device does not directly display the target information (e.g., the content of a message); instead, it displays the first AR object on the first shooting preview screen, and displays the target information associated with the first AR object on the first shooting preview screen only when it receives the first input of the user to the first AR object. Users other than the local user generally cannot operate the first electronic device, that is, cannot operate the first AR object, so they cannot obtain the target information or the private information of the user contained in it, which improves the security of target information display. In addition, the target information is displayed on the first shooting preview screen of the first electronic device rather than directly on an interface such as that of the corresponding communication application, so by first displaying the first AR object and then displaying the target information, the first electronic device improves the intuitiveness and convenience of target information display. That is, the intuitiveness, security and convenience with which the user leaves messages for others through the electronic device are improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and may be capable of processing such sound into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and then output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 10 the touch panel 1071 and the display panel 1061 are shown as two independent components to implement the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power source 111 (such as a battery) for supplying power to each component, and preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Fig. 11 is a schematic diagram of a hardware structure of a target device according to an embodiment of the present invention, where the target device 110 includes, but is not limited to: a radio frequency unit 111, a network module 112, an audio output unit 113, an input unit 114, a sensor 115, a display unit 116, a user input unit 117, an interface unit 118, a memory 119, a processor 120, and a power supply 121. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 11 does not constitute a limitation of electronic devices, which may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted electronic device, a wearable device, a pedometer, and the like.
Alternatively, in the embodiment of the present invention, the structure shown in the target device 110 may be the structure of the target device 90.
Specifically, in the embodiment of the present invention, the target device 110 is taken as an example for description.
The camera in the target device 110 may be implemented by the sensor 115. At this time, the sensor 115 may be an image sensor.
The processor 120 is configured to obtain target information input by a user; a memory 119 for storing the target information acquired by the processor 120 in association with the first AR object; a processor 120, further configured to generate an AR message based on the target information and the first AR object stored in the memory 119; an interface unit 118 for sending the AR message generated by the processor 120 to the first electronic device.
In the target device provided by the embodiment of the present invention, since the AR message sent by the target device to the first electronic device includes not only the target information but also the first AR object associated with the target information, instead of only the target information itself, the first electronic device does not directly display the target information, but first displays the first AR object on the first shooting preview screen of the first electronic device, and displays the target information on the first shooting preview screen when receiving the first input of the user to the first AR object. Therefore, the safety and convenience of information display are improved, and the intuitiveness, the safety and the convenience of leaving messages for other people through the electronic equipment by the user are improved.
Similarly, for the description of each module in the target device 110, reference may be made to the description of each module in the electronic device 100 in the foregoing embodiment, and details are not repeated here.
Optionally, in this embodiment of the present invention, the electronic device in the above embodiment may be an AR device. Specifically, when the electronic device in the above embodiment (for example, the electronic device shown in fig. 8 or fig. 10) is an AR device, the AR device may include all or part of the functional modules in the electronic device. Of course, the AR device may further include a functional module not included in the electronic device.
Optionally, an embodiment of the present invention further provides an electronic device, which includes a processor 110, a memory 109, and a computer program that is stored in the memory 109 and is executable on the processor 110, and when the computer program is executed by the processor 110, the electronic device implements each process of the foregoing method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
Optionally, an embodiment of the present invention further provides a target device, which includes a processor 120, a memory 119, and a computer program that is stored in the memory 119 and is executable on the processor 120, and when the computer program is executed by the processor 120, the computer program implements each process of the foregoing method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (15)

1. An information processing method applied to a first electronic device, the method comprising:
displaying a first AR object on a first shooting preview screen;
receiving a first input of the first AR object by a user;
displaying target information associated with the first AR object in response to the first input;
before displaying the first AR object on the first photographing preview screen, the method further includes:
displaying a target AR object on the first shooting preview picture;
the displaying, on the first photographing preview screen, a first AR object includes:
displaying the first AR object on the first photographing preview screen in a case where the target AR object is matched with a second AR object;
wherein the second AR object is authentication information associated with the target information.
2. The method according to claim 1, wherein after displaying the first AR object on the first photographing preview screen, the method further comprises:
when the object corresponding to the target AR object is not in the view range of the electronic device, keeping the display of the first AR object in a first area, or canceling the display of the first AR object;
wherein the first region is a region in the first photographing preview screen.
3. The method according to claim 1, wherein said displaying the first AR object on the first photographic preview screen comprises:
and displaying the first AR object on the target AR object in an overlapping manner.
4. The method according to claim 1, wherein before displaying a target AR object on the first photographic preview screen, the method further comprises:
under the condition that first prompt information is displayed on the first shooting preview picture, acquiring an image of a first object, and generating the target AR object according to the image of the first object;
the first prompt message is used for prompting a user to control the first electronic device to acquire the image of the first object.
5. The method according to claim 1, wherein before displaying the first AR object on the first photographing preview screen, the method further comprises:
receiving an AR message sent by target equipment;
wherein the AR message includes the first AR object and the target information, and the AR message is used to indicate that the first AR object is associated with the target information;
or the AR message includes the first AR object, a second AR object, and the target information, where the AR message is used to indicate that the first AR object is associated with the target information, and the second AR object is authentication information associated with the target information.
6. An information processing method applied to a target device, the method comprising:
acquiring target information input by a user;
storing the target information and a first AR object in an associated manner;
generating an AR message based on the target information and the first AR object;
sending the AR message to a first electronic device;
the storing the target information in association with the first AR object includes:
storing the target information, the first AR object and the second AR object in an associated manner;
generating an AR message based on the target information and the first AR object, comprising:
generating the AR message based on the target information, the first AR object, and the second AR object;
wherein the second AR object is authentication information associated with the target information.
7. The method of claim 6,
before the storing the target information, the first AR object, and the second AR object in association, the method further includes:
under the condition that target prompt information is displayed on a target shooting preview picture, acquiring an image of a target object, and generating a first target AR object according to the image of the target object;
wherein, in the case that the target object is a first object, the first target AR object is the second AR object;
the first target AR object is the first AR object when the target object is a second object;
in a case where the target object includes the first object and the second object, the first target AR object includes the first AR object and the second AR object;
the target prompt information is used for prompting a user to control the target equipment to acquire the image of the target object.
8. The method of claim 7,
the acquiring an image of a target object and generating a first target AR object from the image of the target object includes:
acquiring an image of the target object, and generating the first target AR object according to the image of the target object under the condition that the time length of the image of the target object displayed in a target area is greater than or equal to a preset time length;
and the target area is an area in the target shooting preview picture.
9. The method of claim 7,
the acquiring an image of a target object and generating a first target AR object from the image of the target object includes:
acquiring an image of the target object, and receiving target input of a user to a first identifier under the condition that the first identifier is displayed on the target AR object;
generating the first target AR object from an image of the target object in response to the target input;
wherein the first mark is used for marking the image of the target object in the target photographing preview screen.
10. The method of claim 6, wherein prior to storing the target information, the first AR object, and the second AR object in association, the method further comprises:
under the condition that the target information comprises preset keywords, acquiring a preset AR object associated with the preset keywords in advance;
taking the preset AR object as a second target AR object;
wherein the second target AR object comprises the first AR object and the second AR object.
11. An electronic device, the electronic device being a first electronic device, the electronic device comprising: the display module and the first receiving module;
the display module is used for displaying a first AR object on a first shooting preview picture;
the first receiving module is used for receiving a first input of the first AR object displayed by the display module by a user;
the display module is further configured to display target information associated with the first AR object in response to the first input received by the first receiving module;
the display module is further configured to display a target AR object on a first shooting preview screen before displaying the first AR object on the first shooting preview screen;
the display module is specifically configured to display the first AR object on the first shooting preview screen under a condition that the target AR object is matched with a second AR object;
wherein the second AR object is authentication information associated with the target information.
12. A target device, the target device comprising: the device comprises an acquisition module, a storage module, a generation module and a sending module;
the acquisition module is used for acquiring target information input by a user;
the storage module is used for storing the target information acquired by the acquisition module and the first AR object in an associated manner;
the generating module is configured to generate an AR message based on the target information and the first AR object stored by the storage module;
the sending module is configured to send the AR message generated by the generating module to a first electronic device;
the storage module is specifically configured to store the target information, the first AR object, and the second AR object in an associated manner;
the generating module is specifically configured to generate the AR message based on the target information, the first AR object, and the second AR object;
wherein the second AR object is authentication information associated with the target information.
13. An electronic device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the information processing method according to any one of claims 1 to 5.
14. A target device, characterized by comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the information processing method according to any one of claims 6 to 10.
15. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when executed by a processor, implements the steps of the information processing method according to any one of claims 1 to 5, or according to any one of claims 6 to 10.
CN201911424027.2A 2019-12-31 2019-12-31 Information processing method and device Active CN111093033B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911424027.2A CN111093033B (en) 2019-12-31 2019-12-31 Information processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911424027.2A CN111093033B (en) 2019-12-31 2019-12-31 Information processing method and device

Publications (2)

Publication Number Publication Date
CN111093033A CN111093033A (en) 2020-05-01
CN111093033B true CN111093033B (en) 2021-08-06

Family

ID=70398807

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911424027.2A Active CN111093033B (en) 2019-12-31 2019-12-31 Information processing method and device

Country Status (1)

Country Link
CN (1) CN111093033B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111556271B (en) * 2020-05-13 2021-08-20 维沃移动通信有限公司 Video call method, video call device and electronic equipment
CN112232897B (en) * 2020-09-25 2022-04-22 北京五八信息技术有限公司 Data processing method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9488488B2 (en) * 2010-02-12 2016-11-08 Apple Inc. Augmented reality maps
WO2017165705A1 (en) * 2016-03-23 2017-09-28 Bent Image Lab, Llc Augmented reality for the internet of things
CN108108012B (en) * 2016-11-25 2019-12-06 腾讯科技(深圳)有限公司 Information interaction method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
捡瓶子的火星人 (Baidu Jingyan user). How to find and hide Alipay AR red packets. Baidu Jingyan, https://jingyan.baidu.com/article/e75aca85159656142edac6af.html, 2016. *
How to find and hide Alipay AR red packets; 捡瓶子的火星人 (Baidu Jingyan user); Baidu Jingyan, https://jingyan.baidu.com/article/e75aca85159656142edac6af.html; 2016-12-24; entire document *

Also Published As

Publication number Publication date
CN111093033A (en) 2020-05-01

Similar Documents

Publication Publication Date Title
CN109525874B (en) Screen capturing method and terminal equipment
CN109218648B (en) Display control method and terminal equipment
CN109215007B (en) Image generation method and terminal equipment
CN110096326B (en) Screen capturing method, terminal equipment and computer readable storage medium
CN109828731B (en) Searching method and terminal equipment
CN111010523B (en) Video recording method and electronic equipment
WO2021057290A1 (en) Information control method and electronic device
US20220286622A1 (en) Object display method and electronic device
US20210320995A1 (en) Conversation creating method and terminal device
CN111158817A (en) Information processing method and electronic equipment
CN111124706A (en) Application program sharing method and electronic equipment
WO2021082772A1 (en) Screenshot method and electronic device
CN111163260A (en) Camera starting method and electronic equipment
CN110866465A (en) Control method of electronic equipment and electronic equipment
CN111079030A (en) Group searching method and electronic device
CN111383175A (en) Picture acquisition method and electronic equipment
CN111083374B (en) Filter adding method and electronic equipment
CN111124231B (en) Picture generation method and electronic equipment
CN111090529B (en) Information sharing method and electronic equipment
CN111093033B (en) Information processing method and device
CN110012151B (en) Information display method and terminal equipment
CN109117037B (en) Image processing method and terminal equipment
CN109166164B (en) Expression picture generation method and terminal
CN108833791B (en) Shooting method and device
CN111178306B (en) Display control method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant