CN114049467A - Display method, display device, display apparatus, storage medium, and program product - Google Patents

Info

Publication number
CN114049467A
Authority
CN
China
Prior art keywords
data
business card
environment
user
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111272893.1A
Other languages
Chinese (zh)
Inventor
李斌 (Li Bin)
刘旭 (Liu Xu)
李颖楠 (Li Yingnan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sensetime Technology Development Co Ltd
Original Assignee
Beijing Sensetime Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sensetime Technology Development Co Ltd filed Critical Beijing Sensetime Technology Development Co Ltd
Priority to CN202111272893.1A priority Critical patent/CN114049467A/en
Publication of CN114049467A publication Critical patent/CN114049467A/en
Withdrawn legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/006 — Mixed reality

Abstract

The application discloses a display method, apparatus, device, storage medium, and program product. The display method includes: in response to a scanning operation on an information code of a physical business card, entering an augmented reality (AR) environment associated with the information code, where the AR environment runs in an applet or a browser web page; loading, in the AR environment, AR data corresponding to user data carried by the physical business card; and displaying, on a display page of the AR environment, at least an AR effect corresponding to the AR data.

Description

Display method, display device, display apparatus, storage medium, and program product
Technical Field
The present application relates to, but is not limited to, terminal technologies, and in particular to a display method, apparatus, device, storage medium, and program product.
Background
In business activities, business cards are important tools for business people. However, a traditional paper business card can carry only limited information, is difficult to preserve over the long term, and makes looking up contact information inconvenient, resulting in a poor user experience.
Disclosure of Invention
In view of the above, embodiments of the present application provide a display method, an apparatus, a device, a storage medium, and a program product.
The technical solutions of the embodiments of the present application are implemented as follows.
An embodiment of the present application provides a display method, including the following steps:
in response to a scanning operation on an information code of a physical business card, entering an augmented reality (AR) environment associated with the information code, where the AR environment runs in an applet or a browser (web) page;
loading, in the AR environment, AR data corresponding to user data carried by the physical business card; and
displaying, on a display page of the AR environment, at least an AR effect corresponding to the AR data.
In some embodiments, entering the augmented reality AR environment of the information code in response to the scanning operation on the information code of the physical business card includes: in response to the scanning operation on the information code of the physical business card, entering a network loading page of the information code; and, in response to prompt information output by the network loading page, jumping to a display page that starts the AR environment. In this way, the AR environment runs on a browser page, and the AR effect bound to the physical business card can be presented on that page.
In some embodiments, loading, in the AR environment, the AR data corresponding to the user data carried by the physical business card includes: identifying, in the AR environment, the information code to obtain the user data bound to the information code; and loading, from a repository, the AR data matching the user data, where the repository stores AR data corresponding to multiple sets of user data according to matching relationships between user data and AR data. Storing the AR data in a repository can improve data security.
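A hypothetical sketch of the repository lookup described above: AR data is stored per user-data record, and the record bound to a scanned information code serves as the lookup key ("one object, one code"). All names and payloads here are illustrative assumptions, not values from the patent.

```python
from typing import Optional

CODE_BINDINGS = {
    # information-code payload -> user data bound to that physical card
    "code-A1": "user-001",
    "code-B2": "user-002",
}

AR_REPOSITORY = {
    # user data -> AR data stored according to the matching relationship
    "user-001": {"video": "intro.mp4", "model": "avatar.glb"},
    "user-002": {"video": "company.mp4", "model": "logo.glb"},
}

def load_ar_data(code_payload: str) -> Optional[dict]:
    """Identify the information code, then load the matching AR data."""
    user_id = CODE_BINDINGS.get(code_payload)
    if user_id is None:
        return None  # unrecognized code: nothing to load
    return AR_REPOSITORY.get(user_id)
```

Keeping the bindings and the AR data in a server-side store, rather than encoding them in the card itself, is what allows the data to be secured and updated independently of the printed card.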
In some embodiments, loading, from the repository, the AR data matching the user data includes: loading, from the repository, real user data and virtual effect data associated with the user data; and obtaining the AR data matching the user data based on the real user data and the virtual effect data. Loading the AR data of the physical business card from the repository allows the AR effect of the physical business card to be presented to the user promptly and conveniently in the AR environment of the display page.
In some embodiments, the method further includes: in response to a scanning operation on a preset identifier of the physical business card, determining position information of the preset identifier; and determining, based on a preset relative position relationship, a spatial display area matching the position information of the preset identifier, where the preset relative position relationship represents the relative position relationship between the position information of the preset identifier and the spatial display area of the AR effect. Displaying at least the AR effect corresponding to the AR data on the display page of the AR environment then includes: displaying at least the AR effect in the spatial display area. Matching the spatial display area in which the AR effect is shown to the position information of the preset identifier lets the user view the AR effect corresponding to the AR data more intuitively and clearly.
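A minimal geometric sketch, under assumed 2-D coordinates, of deriving the spatial display area from the detected marker position plus the preset relative offset, so the AR effect stays anchored relative to the business card. The parameter names are illustrative, not from the patent.

```python
def display_area(marker_xy, relative_offset, size):
    """Return (x, y, width, height) of the area anchored to the marker.

    marker_xy       -- detected position of the preset identifier
    relative_offset -- preset relative position of the area w.r.t. the marker
    size            -- (width, height) of the display area
    """
    x = marker_xy[0] + relative_offset[0]
    y = marker_xy[1] + relative_offset[1]
    return (x, y, size[0], size[1])

# Marker detected at (100, 40); the effect is preset to appear 50 units above:
area = display_area((100, 40), (0, -50), (80, 60))
```

Because the area is computed from the marker's position rather than from screen coordinates, the effect remains attached to the card as the camera moves.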
In some embodiments, displaying, on the display page of the AR environment, at least the AR effect corresponding to the AR data includes: displaying an image corresponding to the user data on the display page; and/or displaying the AR effect obtained by fusing the user data with the AR data. In this way, richer and more comprehensive business card information can be provided to the recipient.
In some embodiments, displaying, on the display page of the AR environment, at least the AR effect corresponding to the AR data includes: in response to an interaction operation on the displayed AR data, determining whether the interaction operation satisfies an interaction trigger condition; and, when the interaction operation satisfies the interaction trigger condition, presenting an interaction effect corresponding to the interaction operation. Presenting an interaction effect matching the user's operation on the display page makes the AR effect more engaging and improves the user experience.
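A hedged sketch of the interaction-trigger check: here the trigger condition is assumed to be "the tap lands inside the displayed AR effect's area", and the interaction effect is a named animation. Both choices are illustrative, not the patent's specific conditions.

```python
def meets_trigger(tap_xy, effect_area):
    """effect_area is (x, y, w, h); return True if the tap hits it."""
    x, y, w, h = effect_area
    return x <= tap_xy[0] <= x + w and y <= tap_xy[1] <= y + h

def on_interaction(tap_xy, effect_area):
    # Present the matching interaction effect only when the condition holds.
    return "greeting_animation" if meets_trigger(tap_xy, effect_area) else None
```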
In some embodiments, after displaying at least the AR effect corresponding to the AR data on the display page of the AR environment, the method further includes: recognizing the AR data, and outputting a recognition result in a preset form. By recognizing the AR data, the recognition result can be output in the preset form, making it convenient for the recipient of the business card to operate on the output.
In some embodiments, recognizing the AR data and outputting the recognition result in a preset form includes: in response to a recognition operation on the display page, recognizing the AR data within a recognition frame to obtain the business card content corresponding to the AR data; filling a preset template with the business card content based on the correspondence between the business card content and the fields of the preset template; and displaying the preset template filled with the business card content on an editable interface of the display page. In this way, the recipient of the business card recognizes the AR data using an optical character recognition (OCR) function, so that editable business card content is presented according to the preset template; the recipient can then operate on the presented content as needed and conveniently save the identity information on the card.
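An illustrative sketch of filling the preset template from OCR output: each recognized line is assigned to a template field by a simple keyword rule. The field names and heuristics are assumptions for illustration, not the patent's actual correspondence rules.

```python
TEMPLATE_FIELDS = ("name", "title", "company", "phone", "email")

def fill_template(ocr_lines):
    """Map OCR'd lines of business-card text onto the preset template."""
    template = dict.fromkeys(TEMPLATE_FIELDS, "")
    for line in ocr_lines:
        if "@" in line:
            template["email"] = line
        elif any(ch.isdigit() for ch in line):
            template["phone"] = line
        elif not template["name"]:
            template["name"] = line      # first plain-text line: assume name
        elif not template["title"]:
            template["title"] = line
        else:
            template["company"] = line
    return template
```

The filled template would then be rendered on the editable interface so the recipient can correct any mis-assigned field before saving.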
In some embodiments, after displaying the preset template filled with the business card content on the editable interface of the display page, the method further includes: in response to an editing operation on the editable interface, editing the business card content in the preset template; and, when the editing is detected to be complete, saving the edited business card content. The recipient can thus generate an electronic business card by editing the content in the preset template and store it, for example in the phone's address book, which makes the card easy to keep and look up later and improves the user experience.
An embodiment of the present application provides a display apparatus, including: a first scanning module configured to, in response to a scanning operation on an information code of a physical business card, enter an augmented reality (AR) environment associated with the information code, where the AR environment runs in an applet or a web page; a first loading module configured to load, in the AR environment, AR data corresponding to user data carried by the physical business card; and a first display module configured to display, on a display page of the AR environment, at least an AR effect corresponding to the AR data.
An embodiment of the present application provides a computer device including a memory and a processor, where the memory stores a computer program executable on the processor, and the processor, when executing the program, implements the steps of the display method described above.
An embodiment of the present application provides a computer storage medium having a computer program stored thereon; when executed by a processor, the computer program implements the steps of the display method described above.
An embodiment of the present application provides a computer program product including a non-transitory computer-readable storage medium storing a computer program; when the computer program is read and executed by a computer, the steps of the above method are implemented.
An embodiment of the present application provides a display method in which, first, an AR environment running in an applet or web page is started and entered by scanning the information code of a physical business card; running the AR environment in an applet or web page makes this fast and convenient. Then, the AR data bound to the physical business card is loaded in the AR environment; finally, the AR effect corresponding to the AR data is displayed on the display page of the applet or web page running the AR environment. Displaying the AR effect on that page provides the user with richer and more comprehensive business card content, is convenient to use, and improves the user experience.
Drawings
Fig. 1A is a schematic diagram of an implementation architecture of a display system provided in an embodiment of the present application;
fig. 1B is a schematic view of an implementation flow of a display method provided in the embodiment of the present application;
fig. 1C is a schematic flow chart of another implementation of the display method according to the embodiment of the present application;
fig. 2 is a schematic flow chart of another implementation of the display method according to the embodiment of the present application;
fig. 3 is a schematic flowchart of another implementation of the display method according to the embodiment of the present application;
fig. 4 is a schematic flowchart of another implementation of the display method according to the embodiment of the present application;
fig. 5 is a schematic flow chart of another implementation of the display method according to the embodiment of the present application;
fig. 6 is a schematic view of an implementation process for making an AR business card according to an embodiment of the present application;
fig. 7 is a schematic view of an application scenario of a display method according to an embodiment of the present application;
fig. 8 is a schematic view of another application scenario of the display method according to the embodiment of the present application;
fig. 9 is a schematic structural diagram of a display device according to an embodiment of the present application;
fig. 10 is a hardware entity diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions, and advantages of the present application clearer, the technical solutions of the present application are described in further detail below with reference to the drawings and embodiments. The described embodiments should not be considered as limiting the present application; all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments", which describe a subset of all possible embodiments; it should be understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and that they may be combined with each other where there is no conflict.
Where the terms "first", "second", and "third" appear in the specification, they are used merely to distinguish similar items and do not imply a particular ordering of those items. It should be understood that "first", "second", and "third" may be interchanged in a particular sequence or order where permitted, so that the embodiments of the application described herein can be performed in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application.
Before the embodiments of the present application are described in further detail, the terms and expressions used in the embodiments are explained as follows.
1) Augmented reality (AR): a technology that integrates real-world information with virtual-world content. Entity information that would otherwise be difficult to experience within the spatial range of the real world is simulated on the basis of computers and related technologies, and virtual content is effectively overlaid onto the real world so that it can be perceived by human senses, producing a sensory experience that goes beyond reality. Once the real environment and virtual objects are superimposed, they exist simultaneously in the same picture and space.
Augmented reality technology can both effectively present real-world content and promote the display of virtual information, the two complementing and overlaying each other. In visual augmented reality, a user wearing a head-mounted display sees computer graphics superimposed on the real world around them. The technology draws mainly on multimedia, three-dimensional modeling, scene fusion, and other techniques; the information content it provides is distinctly different from what humans can perceive unaided.
2) Optical character recognition (OCR): the process by which an electronic device (e.g., a scanner or digital camera) examines characters printed on paper, determines their shapes by detecting patterns of dark and light, and then translates those shapes into computer text using a character recognition method. For printed characters, the text of a paper document is optically converted into a black-and-white dot-matrix image file, and recognition software converts the characters in the image into a text format for further editing by word-processing software.
3) Marker-based augmented reality (Marker-based AR): this approach requires a marker made in advance (e.g., a template card, an image, or an identification code of a certain shape). The marker is placed at a position in the real world, which is equivalent to fixing a plane in the real scene. The camera then identifies the marker and performs pose estimation to determine its position. A coordinate system with the marker's center as its origin, called the template coordinate system (marker coordinates), is established, along with a mapping from the template coordinate system to the screen coordinate system (a point in the template coordinate system is first rotated and translated into the camera coordinate system, and then projected from the camera coordinate system onto the screen coordinate system). Based on this mapping, the AR effect corresponding to the AR data rendered on the screen appears attached to the marker.
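A simplified numeric sketch of the transform chain just described: a point in the template (marker) coordinate system is rotated and translated into the camera coordinate system, then projected to screen coordinates with a pinhole model. The pose and intrinsics below are toy values, not a calibrated camera.

```python
import numpy as np

def template_to_screen(p_template, R, t, fx, fy, cx, cy):
    """R (3x3) and t (3,) map marker coordinates to camera coordinates;
    fx, fy, cx, cy are pinhole intrinsics. Returns (u, v) in pixels."""
    p_cam = R @ np.asarray(p_template, dtype=float) + t  # marker -> camera
    x, y, z = p_cam
    return (fx * x / z + cx, fy * y / z + cy)            # camera -> screen

# Identity pose, point 1 unit in front of the camera, principal point (320, 240):
u, v = template_to_screen([0, 0, 1], np.eye(3), np.zeros(3), 500, 500, 320, 240)
```

With an identity pose, the marker origin projects exactly onto the principal point; a real system would obtain `R` and `t` from the pose-estimation step.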
4) One object, one code: each object is given an intelligent "information code" carrying a digital identity. In the embodiments of the present application, "one object, one code" means that each physical business card is bound to its own information code, and different physical business cards carry different information codes so that they can be distinguished; by scanning the information code on a physical business card, the user data bound to that card can be obtained.
5) H5 loading page: a web page, based on HyperText Markup Language 5 (HTML5), capable of presenting multimedia information on an electronic device. In the embodiments of the present application, the H5 loading page includes at least a network page presenting a page-jump prompt, an applet page, or a web page.
Embodiments of the present application provide a display method, which may be performed by an electronic device, and the electronic device may be implemented as various types of terminals such as a mobile phone, a notebook computer, a tablet computer, a desktop computer, a set-top box, AR glasses, a mobile device (e.g., a portable music player, a personal digital assistant, a dedicated messaging device, and a portable game device). In some embodiments, the display method provided by the embodiment of the present application may be applied to a client application platform of an electronic device. The client application platform may be a network (Web) application platform or an applet. In some embodiments, the display method provided in the embodiments of the present application may also be applied to an application program of an electronic device.
Referring to fig. 1A, fig. 1A is a schematic diagram of an implementation architecture of a display system provided in this embodiment. To implement and support a client application platform, in the display system 10 an electronic device (e.g., a terminal 12) is connected to a server 14 through a network 13, where the network 13 may be a wide area network, a local area network, or a combination of the two. The electronic device, in response to a scanning operation on the information code of the physical business card 11, enters an AR environment running in an applet or web page and transmits the information code to the server 14; the server 14 loads the AR data corresponding to the user data carried by the physical business card in the AR environment; and the AR effect corresponding to the loaded AR data is displayed on the display page 15 of the AR environment of the electronic device. In this way, the AR effect bound to the information code of the physical business card 11 is presented on the display page of the AR environment of the electronic device.
In some embodiments, the server 14 may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a content delivery network (CDN), and big data and artificial intelligence platforms. The terminal and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in the embodiments of the present application.
In the following, the display method provided by the embodiment of the present application will be described in conjunction with an exemplary application and implementation of the electronic device provided by the embodiment of the present application.
As shown in fig. 1B, fig. 1B is a schematic flow chart of an implementation of a display method provided in an embodiment of the present application, and the following description is performed with reference to the steps shown in fig. 1B:
and S101, responding to the scanning operation of the information code of the physical business card, and entering the augmented reality environment of the information code.
Here, the physical business card may be a business card in any form, such as paper, metal, electroplated, plastic, frosted, wooden, or redwood; its specification may be set freely by the user. Such a physical business card carries the user's identity information and an information code bound to that identity information, where the identity information includes the user's name, job title, company, address, telephone number, mailbox, and so on. The information code may be a two-dimensional code, a barcode, or any other scannable code, which is not limited in the present application. A terminal with a scanning function scans the information code, and an AR environment for presenting the AR effect of the information code is started and entered. The AR environment runs in an applet or a web page: after the information code is scanned, the terminal enters a network loading page, automatically jumps to a browser page, and starts the AR environment of the information code in the browser page. The terminal performing the scanning operation may be a handheld device such as a mobile phone or tablet computer; alternatively, the information code may be scanned by a processing device connected to an image acquisition device, for example a mobile phone connected to a robot with a camera, in which case the robot scans the information code and the connected mobile phone starts the augmented reality environment.
The AR environment is implemented by a web page or applet deployed on an AR device; for example, in response to the scanning operation on the information code of the physical business card, the terminal jumps to a web page or applet supporting the AR function, and the AR environment is presented there.
In some embodiments, the terminal may scan the information code of any physical business card and, when a scanned information code is successfully identified, jump to a web page or applet supporting the AR function to start the AR environment. The information code of the physical business card binds the identity information and contact details of the user shown on the card, an introduction to the user's company, descriptive information about the user, and the like.
In some embodiments, the terminal may display a scan prompt page including a scan entry; when receiving a trigger operation by which the user activates the scan entry, the terminal performs scan recognition of the information code through its own image acquisition device (e.g., a camera), and enters the AR environment of the information code when the scanned information code is successfully recognized.
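A hedged sketch of the page flow in step S101: scan page, then the network (H5) loading page of the code, then the display page that starts the AR environment. The page labels are illustrative, not identifiers from the patent.

```python
def page_flow(code_recognized: bool):
    """Return the sequence of pages the terminal passes through."""
    pages = ["scan_prompt_page"]
    if not code_recognized:
        return pages                   # recognition failed: stay on scan page
    pages.append("h5_loading_page")    # network loading page of the code
    pages.append("ar_display_page")    # jump and start the AR environment
    return pages
```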
Step S102: load, in the AR environment, the AR data corresponding to the user data carried by the physical business card.
In some embodiments, the user data carried by the physical business card is the identity information of the user presented on the card. In some possible implementations, when an AR business card is designed, AR data is set for the user data carried by the physical business card and stored in a cloud database at a preset storage address; thus, after the scanning terminal enters the AR environment by scanning the information code on the card, the AR data corresponding to the user data bound to the information code is loaded from the server side in the AR environment. The AR data is data set on the basis of the user data carried by the physical business card and used to present rich user information.
In some embodiments, the AR data may be special-effects data for presenting, in the augmented reality environment, an augmented reality effect corresponding to the user data carried by the physical business card. The AR data may include at least one of: a video clip showing the user's personal information (e.g., a company promotion video or a personal introduction video), an image, a virtual sticker, a virtual animation, a virtual article, an interactive animation, a virtual three-dimensional object, a three-dimensional model of the user, or a three-dimensional model of a building or other object in an image. The video clips and images showing the user's personal information may be captured in a real scene. The virtual sticker, virtual animation, virtual article, and virtual three-dimensional object are virtual additional information added to the real scene image: the virtual sticker may be, for example, a calendar or a holiday-themed virtual picture added to the real scene image in the augmented reality environment; the virtual animation may be a virtual object moving according to a preset action, such as a virtual character, plant, or animal, for example a virtual presenter briefly introducing the user's company; the virtual article may be an ornament for decorating the real scene image, for example virtual clothing, virtual glasses, or a virtual hairstyle added to a building or human figure; the interactive animation may be a responsive action triggered by user input, for example a greeting gesture presented when the user taps a portrait shown in the augmented reality environment; the virtual three-dimensional object may be, for example, a virtual three-dimensional bonsai added to an image of the company environment, or a three-dimensional object matching a holiday; and the three-dimensional model of the user, or of a building or other object in the image, may be a model added onto a real portrait or object in the augmented reality environment.
Step S103: display, on a display page of the AR environment, at least the AR effect corresponding to the AR data.
In some embodiments, the AR data matching the user data, loaded from the repository, is presented on the display page of the AR environment. The display page of the AR environment may be the applet display interface or the browser page that launched the AR environment; that is, at least the AR effect corresponding to the AR data is displayed on the page presenting the AR environment.
In some possible implementations, an image corresponding to the user data is displayed on the display page; and/or the AR effect obtained by fusing the user data with the AR data is displayed. That is, on the display page of the AR environment, the AR effect corresponding to the AR data loaded from the cloud database may be displayed; or the AR effect obtained by fusing that AR data with the user data carried by the physical business card may be displayed; or the image corresponding to the physical business card and the AR effect corresponding to the loaded AR data may be displayed simultaneously. In this way, richer and more comprehensive business card information can be provided to the recipient.
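A toy compositing sketch (an assumption for illustration, not the patent's renderer) of "fusing" the user-data image with the AR data: alpha-blend the virtual effect layer over the business-card image.

```python
import numpy as np

def fuse(card_img, ar_layer, alpha):
    """Blend ar_layer over card_img; both are float arrays in [0, 1]."""
    return (1.0 - alpha) * card_img + alpha * ar_layer

card = np.zeros((2, 2))    # stand-in for the image of the business card
effect = np.ones((2, 2))   # stand-in for the rendered AR effect layer
fused = fuse(card, effect, 0.25)
```

With `alpha = 0`, only the card image is shown; with `alpha = 1`, only the AR effect; intermediate values display both simultaneously, matching the three display options above.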
In the embodiments of the present application, first, the AR environment running in an applet or web page is started and entered by scanning the information code of the physical business card; running the AR environment in an applet or web page is fast and convenient. Then, the AR data bound to the physical business card is loaded in the AR environment; finally, the AR effect corresponding to the AR data is displayed on the display page of the applet or web page running the AR environment. Displaying the AR effect on that page is convenient for the user and improves the user experience.
In some embodiments, before the AR effect is displayed on the display interface, a preset identifier on the physical business card is scanned to determine a spatial display area for the AR effect, so that the AR effect corresponding to the AR data is displayed in that spatial display area. This may be implemented by the steps shown in fig. 1C, another implementation flow diagram of the display method provided in the embodiment of the present application, described below:
step S131, responding to the scanning operation of the preset identification of the physical business card, and determining the position information of the preset identification.
In some embodiments, the position information of the preset identifier includes: the position of the preset identifier on the physical business card and the plane where the preset identifier lies. The preset identifier may be a marker on the physical business card; by scanning the marker, the position of the marker on the physical business card and the plane where it lies are determined.
Step S132, based on a preset relative position relationship, determining a spatial display area matched with the position information of the preset identifier.
In some embodiments, the preset relative position relationship represents the relative position relationship between the position information of the preset identifier and the spatial display area of the AR effect; for example, the spatial display area of the AR effect may be perpendicular or parallel to the plane of the preset identifier. After the position information of the preset identifier is determined, the spatial display area of the AR effect can be determined according to the preset relative position relationship.
Step S133, at least displaying the AR effect in the spatial display area.
In this embodiment of the application, the AR effect corresponding to the AR data is displayed in the spatial display area, so that the spatial display area of the AR effect matches the position information of the preset identifier; no matter where the electronic device moves, the spatial display area of the AR effect does not change. This makes it convenient for the user to view the displayed AR effect from any position, so that the user can view the AR effect corresponding to the AR data more intuitively and clearly.
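As a concrete illustration, the world-locking of the display area described above can be sketched as a small geometric computation; the names, the fixed normal-direction offset, and the TypeScript form below are illustrative assumptions, not part of the claimed method:

```typescript
// Sketch of steps S131–S133: anchoring the AR spatial display area to the
// scanned marker. A real implementation would use the marker pose estimated
// by the AR tracker; all names here are illustrative assumptions.

interface MarkerPose {
  position: [number, number, number]; // marker center in world coordinates
  normal: [number, number, number];   // unit normal of the marker plane
}

interface DisplayArea {
  center: [number, number, number];
  width: number;
  height: number;
}

// Preset relative position relationship: the display area floats a fixed
// distance along the marker's plane normal (i.e. parallel to the marker).
function displayAreaFromMarker(
  pose: MarkerPose,
  offset: number,
  width: number,
  height: number
): DisplayArea {
  const [px, py, pz] = pose.position;
  const [nx, ny, nz] = pose.normal;
  return {
    center: [px + nx * offset, py + ny * offset, pz + nz * offset],
    width,
    height,
  };
}

const area = displayAreaFromMarker(
  { position: [0, 0, 0], normal: [0, 0, 1] },
  0.1, // 10 cm in front of the business card (assumed)
  0.2,
  0.15
);
```

Because the area is computed only from the marker pose, moving the viewing device does not move the area, which is exactly the world-locked behavior of step S133.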
In some embodiments, the information code is scanned to enter a network loading page, which then jumps to a display page (for example, a browser interface) capable of running the AR environment; that is, step S101 may be implemented by the steps shown in fig. 2, another implementation flow diagram of the display method provided in the embodiment of the present application, described below with reference to the steps shown in figs. 1 and 2:
step S201, responding to the scanning operation of the information code of the physical business card, and entering a network loading page of the information code.
In some embodiments, after the scanning terminal scans the information code of the physical business card, the network loading page bound by the information code is entered, and prompt information is presented on the network loading page to prompt a user of the scanning terminal to jump to a browser.
Step S202, responding to prompt information output by the network loading page, and jumping to a display page that starts the AR environment.
In some embodiments, the prompt information is information that prompts a jump. The information code of the physical business card is designed on a "one object, one code" basis; after the information code is scanned, the prompt information is displayed on the loaded H5 page, and the display page of the AR environment is jumped to automatically, so that the AR environment bound to the information code is entered. For example, the browser is jumped to automatically so that the AR business card can be recognized in the AR environment. In this way, after the scanning terminal scans the information code, it enters the H5 loading page, which prompts a jump into the browser, so that the AR environment can run on the browser page and the AR effect bound to the physical business card can be presented there.
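The loading-page-then-jump flow of steps S201 and S202 can be sketched as two URL mappings; the domain, paths, and `cardId` parameter below are illustrative assumptions:

```typescript
// Sketch of steps S201–S202: the scanned information code first resolves to
// an H5 loading page, whose prompt then jumps to the browser display page
// running the AR environment. URL patterns are illustrative assumptions.

function loadingPageUrl(infoCodePayload: string): string {
  // The information code carries a card identifier ("one object, one code").
  return `https://example.com/h5/loading?cardId=${encodeURIComponent(infoCodePayload)}`;
}

function browserDisplayUrl(loadingUrl: string): string {
  // The loading page's prompt, once confirmed, jumps to the AR display page.
  const cardId = loadingUrl.split("cardId=")[1] ?? "";
  return `https://example.com/ar/display?cardId=${cardId}`;
}

const jumpTarget = browserDisplayUrl(loadingPageUrl("card-0001"));
```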
In some embodiments, after the AR environment is entered, the AR data corresponding to the user data carried by the physical business card is loaded from the repository; that is, step S102 may be implemented by the steps shown in fig. 3, a schematic flow chart of another implementation of the display method provided in this embodiment of the present application, described below with reference to the steps shown in figs. 1 and 3:
step S301, in the AR environment, identifying the information code to obtain the user data bound with the information code.
In some embodiments, after the jump to the browser, the AR environment of the information code is entered on the browser interface; in the AR environment, the information code undergoes at least one recognition step among image preprocessing, locating the position detection patterns, locating the alignment patterns, perspective transformation, and decoding, so as to recognize the user data bound to the information code, that is, the identity information of the business card holder corresponding to the physical business card.
Step S302, the AR data matched with the user data is loaded in a repository.
In some embodiments, the repository may be a cloud repository or a local repository. In some possible implementations, after the information code is recognized in the AR environment and the user data bound to the information code is determined, the AR data bound to the user data is loaded from the repository on the display interface running the AR environment. The AR data is stored in the cloud database along a set path during the design of the AR business card for the physical business card; thus, by following the set path, the AR data can be loaded from the repository. The AR data may take different forms such as videos, text, pictures, and two-dimensional or three-dimensional models.
In some embodiments, the repository is configured to store the AR data corresponding to a plurality of pieces of user data according to the matching relationship between user data and AR data. Storing the AR data in the repository in this way can improve data security.
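A minimal sketch of step S302's matching-relationship lookup, with an in-memory `Map` standing in for the cloud repository (the class and field names are illustrative assumptions):

```typescript
// Sketch of step S302: the repository stores AR data per user according to
// the matching relationship between user data and AR data.

interface ARData {
  videos: string[];
  images: string[];
  models: string[]; // two- or three-dimensional models
}

class Repository {
  private store = new Map<string, ARData>();

  // Store AR data under the set path (here simply the user identifier).
  bind(userId: string, data: ARData): void {
    this.store.set(userId, data);
  }

  // Load the AR data matched with the user data decoded from the info code.
  load(userId: string): ARData | undefined {
    return this.store.get(userId);
  }
}

const repo = new Repository();
repo.bind("user-42", { videos: ["intro.mp4"], images: ["portrait.png"], models: [] });
const loaded = repo.load("user-42");
```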
In some possible implementations, the AR data matched with the user is obtained by loading, from the repository, the real user data of the user corresponding to the physical business card and the designed virtual effect data; that is, step S302 may be implemented by steps S321 and S322 (not shown in the figure):
step S321, loading real user data and virtual effect data associated with the user data in the repository.
In some embodiments, the repository may be a cloud repository or a local repository. Taking the cloud repository as an example, it stores images, videos, or three-dimensional models of real scenes associated with the user data, as well as virtual images, virtual animations, or other virtual effects associated with the user data; the real user data and the virtual effect data associated with the user data are loaded from the cloud repository. The real user data is data acquired from real objects, such as a personal image of the user, a personal introduction video, a company scene image, or a company promotional video. The virtual effect data is virtual stickers, two-dimensional animations, virtual three-dimensional models, or other virtual effects (such as a beautification effect, a makeup effect, a background effect, a foreground effect, or a lens effect) added on the basis of the user data.
Step S322, obtaining the AR data matched with the user data based on the real user data and the virtual effect data.
In some embodiments, in the design stage of the AR business card, the real user data, the virtual effect data, and the user data on the physical business card are bound together as the AR data matched with the user data, and are stored in the cloud database along a set storage path. Therefore, after the AR environment is entered, the real user data and the virtual effect data associated with the user data on the physical business card are loaded from the repository along the preset path, so that the AR data of the physical business card is loaded and its AR effect can be displayed to the user promptly and conveniently in the AR environment of the display page.
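Steps S321 and S322 can be sketched as binding the two halves into one record; the field names below are illustrative assumptions:

```typescript
// Sketch of steps S321–S322: the AR data is obtained by binding the real
// user data (images, videos captured from real objects) with the designed
// virtual effect data (stickers, animations, virtual models).

interface RealUserData {
  portraitImage: string;
  introVideo?: string;
  companyVideo?: string;
}

interface VirtualEffectData {
  stickers: string[];
  animations: string[];
  virtualModels: string[];
}

interface MatchedARData {
  real: RealUserData;
  virtual: VirtualEffectData;
}

function buildARData(real: RealUserData, virtual: VirtualEffectData): MatchedARData {
  // In the design stage both parts are bound together and stored under a
  // set path; here the binding is simply a combined record.
  return { real, virtual };
}

const arData = buildARData(
  { portraitImage: "portrait.png", introVideo: "intro.mp4" },
  { stickers: ["holiday"], animations: [], virtualModels: ["bonsai-3d"] }
);
```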
In some embodiments, in response to the user's interactive operation on the AR data, an interaction effect corresponding to the interactive operation can be presented on the display interface; that is, step S103 may further include the following S1031 and S1032 (not shown in the figure):
Step S1031, responding to an interactive operation on the displayed AR data, and determining whether the interactive operation satisfies an interaction trigger condition.
In some embodiments, the interaction triggering condition may be a condition for a background setting to interact with the AR data. For example, the interaction triggering condition may be a triggering operation on a website link in the AR data, a triggering operation on a mailbox or a phone number in the AR data, or a clicking operation on some characters or objects in the AR data; illustratively, the website link in the AR data may be a website of a company corresponding to the physical business card or a webpage link for displaying personal talents of the user.
S1032, responding to the interactive operation meeting the interactive triggering condition, and presenting an interactive effect corresponding to the interactive operation.
In some embodiments, if the interactive operation is any one of a click on a website link, a trigger operation on a mailbox or telephone number in the AR data, or a click on certain character or object models in the AR data, it is determined that the interactive operation satisfies the interaction trigger condition. If the interactive operation is a click on the website link, the interaction effect is a jump to the linked web page; if it is a trigger operation on the mailbox or telephone number in the AR data, the interaction effect is a jump to a mail-editing interface or a dialing interface for the telephone number; if it is a trigger operation on a person image or object image in the AR data, the interaction effect is the presentation of a two-dimensional or three-dimensional model of the person or object, which may be rendered from an image of the person or object.
In the embodiment of the application, by analyzing whether the user's interactive operation satisfies the interaction trigger condition and presenting on the display page an interaction effect matched with the interactive operation, the AR effect becomes more engaging, which improves the user experience.
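The trigger-condition check and effect dispatch of steps S1031 and S1032 can be sketched as follows; the target kinds and effect strings are illustrative assumptions:

```typescript
// Sketch of steps S1031–S1032: checking whether an interaction satisfies a
// trigger condition and mapping it to the corresponding interaction effect.

type InteractionTarget =
  | { kind: "link"; url: string }
  | { kind: "mailbox"; address: string }
  | { kind: "phone"; number: string }
  | { kind: "model"; name: string }
  | { kind: "background" }; // a tap outside any interactive content

function interactionEffect(target: InteractionTarget): string | null {
  switch (target.kind) {
    case "link":
      return `open:${target.url}`;           // jump to the linked web page
    case "mailbox":
      return `compose:${target.address}`;    // jump to a mail-editing interface
    case "phone":
      return `dial:${target.number}`;        // jump to a dialing interface
    case "model":
      return `present-model:${target.name}`; // present a 2D/3D model
    case "background":
      return null; // trigger condition not satisfied: no effect
  }
}

const effect = interactionEffect({ kind: "phone", number: "010-1234" });
```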
In some embodiments, after the AR effect of the physical business card is presented on the display page, the receiver of the business card information may perform a recognition operation on the AR data of the display page and output the recognized data according to a preset template; that is, step S103 may further include the step shown in fig. 4, described below with reference to figs. 1 and 4:
step S401, the AR data are identified, and an identification result is output in a preset form.
In some possible implementations, after the AR data of the physical business card is displayed on the display page, the receiver of the business card may click to recognize the AR effect shown on the display page; the displayed AR data is then recognized through OCR to obtain a recognition result. The recognition result includes the real user data and the virtual effect data contained in the AR data; for example, the real user data includes the user identity information, company website, telephone number, mailbox, user image, company image, and address on the physical business card, and the virtual effect data includes a virtual three-dimensional object (e.g., a three-dimensional human model of the user), a virtual sticker, a virtual animation promoting the company, or another displayed virtual object (e.g., a virtual vehicle in an image of the displayed company). The preset form may be a preset template into which the recognized AR data is automatically filled, module by module, according to the recognized content; alternatively, the preset form is a preset format for outputting the recognition result, for example, outputting the recognition result in an editable form on the display interface. By recognizing the AR data in this way, the recognition result can be output in the preset form, which makes it convenient for the receiver of the business card to operate on the output result.
In some embodiments, the AR data is recognized and the recognition result is output in an editable form according to the preset template; that is, step S401 may be implemented by the steps shown in fig. 5, another implementation flow diagram of the display method provided in the embodiment of the present application, described below with reference to the steps shown in fig. 5:
step S501, in response to the recognition operation on the display page, recognizing the AR data by adopting a recognition frame to obtain business card content corresponding to the AR data.
In some embodiments, the recognition operation is an operation performed by the business card receiver on the AR data displayed on the display interface, for example a click, touch, or drag operation on the AR data, so that the recognition frame is aligned with the AR data for recognition, yielding the business card content corresponding to the AR data. The recognition frame may be an OCR recognition frame, so that OCR recognition of the AR data produces business card content comprising text fields.
Step S502, based on the corresponding relation between the business card content and the field in the preset template, filling the business card content in the preset template.
In some embodiments, the preset template is an editable template for carrying business card content; the template comprises a plurality of areas, different areas carry different business card content, and each area is provided with a field representing the function of the area. For example, from the top edge to the bottom edge of the template, a user personal information area (field: user personal information), a user contact area (field: user contact), a company introduction area (field: company introduction), a user personal image area (field: user image), and the like are arranged in sequence. In some possible implementations, the text in the business card content is recognized and matched with the fields in the preset template, so as to establish the correspondence between the business card content and the fields in the preset template; the business card content is then automatically filled into the corresponding areas of the preset template according to this correspondence.
Step S503, displaying the preset template filled with the business card content on the editable interface of the display page.
In some embodiments, the recognized business card content is automatically filled into the corresponding areas of the preset template, the preset template filled with the business card content is displayed on the display page, and the preset template is editable, so that the user can correct wrongly recognized information and fill in missing information in the preset template. In this way, the business card receiver recognizes the AR data using the OCR recognition function so that editable business card content is presented according to the preset template; the receiver can then operate on the business card content presented in the preset template according to their own needs, which makes it convenient to store the identity information on the business card.
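The field matching of steps S501 to S503 can be sketched with simple heuristics; the regular expressions and field names below are illustrative assumptions, and real OCR output would be noisier:

```typescript
// Sketch of steps S501–S503: matching OCR-recognized business card text to
// the fields of a preset, editable template.

const templateFields = ["name", "phone", "mailbox", "company"] as const;
type Field = typeof templateFields[number];

function classifyLine(line: string): Field {
  if (/@/.test(line)) return "mailbox";
  if (/^[\d\s()+-]{7,}$/.test(line)) return "phone";
  if (/(Co\.|Ltd|Inc|Company)/i.test(line)) return "company";
  return "name"; // fallback for short personal-information lines
}

// Fill each recognized line into the template area whose field it matches;
// unmatched or missing fields stay empty for the user to edit by hand.
function fillTemplate(lines: string[]): Record<Field, string> {
  const filled: Record<Field, string> = { name: "", phone: "", mailbox: "", company: "" };
  for (const line of lines) {
    const field = classifyLine(line.trim());
    if (filled[field] === "") filled[field] = line.trim();
  }
  return filled;
}

const card = fillTemplate([
  "Li Ming",
  "Sense Co., Ltd.",
  "li.ming@example.com",
  "+86 10 1234 5678",
]);
```

Because the template remains editable, misclassified lines can still be corrected by the receiver, matching the editing flow described after step S503.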
In some embodiments, after the business card content is automatically filled into the preset template, the business card receiver may edit the business card content in the preset template presented on the display interface and store it in the receiving terminal; that is, after step S503 above, the method further includes the following steps:
and step one, responding to the editing operation on the editable interface, and editing the business card content in the preset template.
In some embodiments, the editing operation is an operation performed by the business card receiver, on the editable interface presenting the preset template, on the business card content in the preset template. The editing operation may be performed according to the receiver's own needs and includes: modifications to wrongly recognized business card content, additions of missing information, deletions of recognized redundant information, and the like.
And secondly, under the condition that the editing is detected to be finished, storing the edited business card content.
In some embodiments, after the business card receiver finishes editing the business card content in the preset template, the receiver can click or touch an editing-complete button and save the edited electronic business card in the receiver's terminal. In this way, the business card receiver can generate an electronic business card by editing the business card content in the preset template and store it in the mobile phone address book, which makes the card easy to store and to query later, improving the user experience.
An exemplary application of the embodiments of the present application in a practical application scenario will be described below.
The embodiment of the application provides a display method in which the business card receiver scans a marker on the business card with a terminal, and the information of the business card holder is displayed on the terminal through AR content in different forms such as videos, text, pictures, and two-dimensional or three-dimensional models; the user can also recognize the information on the business card as text through OCR recognition, which is convenient for storage.
In some embodiments, before presentation of the AR business card on the web end, the process of making the AR business card may be implemented by the steps shown in fig. 6:
step S601, acquiring information input by the user.
In step S602, a personal ID is extracted from the information.
In some possible implementations, the personal ID extracted in step S602 is transmitted to the background 608, so that the background 608 generates an information code for the business card to be made, implementing a "one object, one code" binding between the user and the business card.
In step S603, the input process of the user information is completed based on the personal ID.
Steps S601 to S603 above complete the input of the user information for making the business card, where the user information includes the user's name, position, department, mailbox address, contact information, and the like, as shown on the front 701 and the back 703 of the business card in fig. 7.
Step S604, a personal photo of the user is acquired.
In step S605, a personal video of the user is acquired.
Step S606, the text introduction of the user is acquired.
Steps S604 to S606 above are combined with step S603 to create the AR business card, and the process proceeds to step S607; among other things, steps S604 to S606 provide the field content required for making the AR business card.
In step S607, an AR business card is generated.
Steps S601 to S607 above create a business card whose face carries the user information.
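The card-making flow of steps S601 to S607 can be sketched as follows; the ID scheme, URL, and field names are illustrative assumptions, not part of the claimed method:

```typescript
// Sketch of steps S601–S607: extract a personal ID from the entered
// information, generate a dedicated information code in the background
// ("one object, one code"), and assemble the AR business card record.

interface UserInput {
  name: string;
  department: string;
  mailbox: string;
  contact: string;
}

interface ARBusinessCard {
  personalId: string;
  infoCode: string; // content of the dedicated two-dimensional code
  info: UserInput;
  photo?: string;
  video?: string;
  intro?: string;
}

// Step S602: derive a personal ID from the entered information (assumed scheme).
function extractPersonalId(input: UserInput): string {
  return `${input.name.toLowerCase().replace(/\s+/g, "-")}@${input.department.toLowerCase()}`;
}

// Steps S603–S607: bind the ID, the optional media fields (photo, video,
// text intro), and the background-generated info code into one card record.
function generateARBusinessCard(
  input: UserInput,
  media: { photo?: string; video?: string; intro?: string }
): ARBusinessCard {
  const personalId = extractPersonalId(input);
  return {
    personalId,
    infoCode: `https://example.com/h5/loading?cardId=${personalId}`,
    info: input,
    ...media,
  };
}

const cardRecord = generateARBusinessCard(
  { name: "Li Ming", department: "Sales", mailbox: "li.ming@example.com", contact: "010-1234" },
  { photo: "portrait.png" }
);
```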
The presentation of the AR business card on the web end can be realized through the following steps:
first, a business card is scanned.
In some possible implementations, each business card has a dedicated two-dimensional code. After the business card receiver scans the two-dimensional code on the business card with a terminal, the terminal enters an H5 loading page, which prompts a jump into the browser; the AR business card is then recognized in the AR environment, and the business card data carried on the business card is loaded. All business card data can be stored in the cloud; cloud storage is more secure than local storage, which safeguards the business card data.
In some possible implementations, the physical business card is shown in fig. 7 and consists of a front surface 701 and a back surface 703, where the front surface 701 includes a two-dimensional code 702, a guidance prompt, and the card information (e.g., a company trademark, name, position, department, company, contact information, etc.), and the back surface 703 may be designed as a recognizable picture, a company poster, or the like.
The user can enter a display interface showing the AR effect of the business card by scanning the two-dimensional code on the front of the card. As shown in fig. 8, the content displayed in display area 803 for the user to click interactively includes: the company website 81, the mailbox 82, and the telephone 83; an Easter egg 84, such as a video of a company leader or employee dancing, is displayed in display area 804.
And secondly, displaying AR content of the scanned business card.
In some possible implementations, after recognizing the AR business card, the business card receiver can see the AR display content on the mobile phone without downloading an application client. The AR business card can be displayed at the web end, which makes it more convenient for the user; through AR content in different forms such as text, pictures, videos, and two-dimensional or three-dimensional models, the AR business card can present the user's information more comprehensively and richly.
And thirdly, identifying the displayed AR content of the AR business card so as to identify and store the content of the business card.
In some possible implementation manners, after displaying the content of the business card AR, the user receiving the business card may select to recognize the content of the business card through OCR and store the content; the method can be realized by the following steps:
firstly, a user clicks the name card identification, and the name card is aligned to the identification frame for identification.
Second, the recognized business card information can be automatically filled into the corresponding module, and the user can correct wrongly recognized information and fill in missing information.
Finally, the user can store the edited electronic business card on the mobile phone, so that the electronic business card is convenient to search in the future.
In the embodiment of the application, first, the user enters the AR environment by scanning the information code on the physical business card; then, a spatial display area for the AR effect is determined by recognizing the position information of the marker on the physical business card; finally, AR effects in different forms such as text, pictures, videos, and two-dimensional or three-dimensional models are displayed in the spatial display area, presenting the personal information of the business card holder, introduction videos, enterprise information, and the like. Meanwhile, the user can use the OCR recognition function to recognize the text information on the business card, generate an electronic business card, and store it in the mobile phone address book. In this way, AR technology is brought to the business card: the information of the business card user is displayed through different forms such as videos, text, pictures, and two-dimensional or three-dimensional models, which overcomes the limited information capacity of paper business cards, while the displayed AR effect also publicizes the card holder and the enterprise or institution to which the holder belongs. The AR business card is displayed at the web end, so the user does not need to download an application, which is more convenient in business scenarios. The security of the business card data is guaranteed through "one object, one code" and cloud storage of the card data. Combined with OCR recognition, the business card content can be recognized and stored directly in the mobile phone address book, making the business card information convenient to store and look up, and solving the problems that paper business cards are difficult to keep and easy to lose.
Based on the foregoing embodiments, the present application provides a display apparatus, which includes units included and modules included in the units, and can be implemented by a processor in a computer device; of course, the implementation can also be realized through a specific logic circuit; in implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 9 is a schematic structural diagram of a display device according to an embodiment of the present application, and as shown in fig. 9, the display device 900 includes: a first scanning module 901, a first loading module 902 and a first display module 903, wherein:
the first scanning module 901 is configured to enter an Augmented Reality (AR) environment of an information code in response to a scanning operation on the information code of a physical business card; the AR environment runs on an applet or a browser web page;
a first loading module 902, configured to load, in the AR environment, AR data corresponding to user data carried by the physical business card;
a first display module 903, configured to display at least an AR effect corresponding to the AR data on a display page of the AR environment.
In some embodiments, the first scanning module 901 includes:
the first scanning sub-module is used for responding to the scanning operation of the information code of the physical business card and entering a network loading page of the information code;
and the first skipping submodule is used for responding to prompt information output by the network loading page and skipping to a display page for starting the AR environment.
In some embodiments, the first loading module 902 includes:
the first identification submodule is used for identifying the information code in the AR environment to obtain the user data bound with the information code;
a first loading submodule, configured to load, in a repository, the AR data that matches the user data; the repository is used for storing AR data corresponding to a plurality of user data according to the matching relation between the user data and the AR data.
In some embodiments, the first load submodule comprises:
a first loading unit, configured to load, in the repository, real user data and virtual effect data associated with the user data;
a first determining unit, configured to obtain, based on the real user data and the virtual effect data, the AR data matched with the user data.
In some embodiments, the apparatus further comprises:
the first determining module is used for responding to scanning operation of a preset identifier of the physical business card and determining position information of the preset identifier;
the second determining module is used for determining a space display area matched with the position information of the preset identifier based on a preset relative position relation; the preset relative position relationship represents the relative position relationship between the position information of the preset identification and the space display area of the AR effect;
the first display module 903 is further configured to:
and at least displaying the AR effect in the space display area.
In some embodiments, the first display module 903 is further configured to:
displaying an image corresponding to the user data on the display page; and/or displaying the AR effect of the user data and the AR data after the user data and the AR data are fused.
In some embodiments, the first display module 903 is further configured to: responding to the interaction operation of the displayed AR data, and determining whether the interaction operation meets an interaction triggering condition; and presenting an interaction effect corresponding to the interaction operation in response to the interaction operation meeting the interaction triggering condition.
In some embodiments, the apparatus further comprises:
and the first identification module is used for identifying the AR data and outputting an identification result in a preset form.
In some embodiments, the first identification module comprises:
the first identification submodule is used for responding to the identification operation on the display page and identifying the AR data by adopting an identification frame to obtain business card content corresponding to the AR data;
the first filling sub-module is used for filling the business card content in a preset template based on the corresponding relation between the business card content and a field in the preset template;
and the first display sub-module is used for displaying the preset template filled with the business card content on an editable interface of the display page.
In some embodiments, the apparatus further comprises:
the first editing module is used for responding to editing operation on the editable interface and editing the business card content in the preset template;
and the first saving module is used for storing the edited business card content when completion of the editing is detected.
The above description of the apparatus embodiments, similar to the above description of the method embodiments, has similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
The present disclosure relates to the field of augmented reality. The user data carried on the physical business card is obtained, and images related to the user data, the user's personal presentation video, the user's company image, and the company promotional video are designed with the help of various vision-related algorithms; this information is bound to the information code on the physical business card in a "one object, one code" manner, so that when the business card receiver scans the information code with a scanning terminal, an AR effect combining virtuality and reality that matches the user data can be obtained.
In the embodiments of the present application, if the display method or the image processing method described above is implemented in the form of a software functional module and sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the portions thereof contributing to the related art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present application provides a computer device, which includes a memory and a processor, where the memory stores a computer program that can be run on the processor, and where the processor, when executing the program, implements the steps of the above method.
Correspondingly, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above method.
Here, it should be noted that the above description of the storage medium and device embodiments is similar to the description of the method embodiments, and these embodiments have beneficial effects similar to those of the method embodiments. For technical details not disclosed in the storage medium and apparatus embodiments of the present application, reference is made to the description of the method embodiments of the present application.
It should be noted that fig. 10 is a schematic diagram of a hardware entity of a computer device according to an embodiment of the present application. As shown in fig. 10, the hardware entity of the computer device 1000 includes a processor 1001, a communication interface 1002, and a memory 1003, wherein:
the processor 1001 generally controls the overall operation of the computer device 1000.
The communication interface 1002 may enable the computer device to communicate with other terminals or servers via a network.
The memory 1003 is configured to store instructions and applications executable by the processor 1001, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 1001 and by modules in the computer device 1000. The memory may be implemented by a flash memory (FLASH) or a Random Access Memory (RAM).
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification do not necessarily all refer to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above processes do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application. The serial numbers of the embodiments of the present application are merely for description and do not represent the relative merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units is only a division by logical function, and there may be other ways of division in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may serve as a single unit separately, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be performed by hardware under the instruction of a program; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units of the present application described above may, if implemented in the form of software functional modules and sold or used as independent products, be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes a removable storage device, a ROM, a magnetic disk, an optical disk, or other media that can store program code.
The above description covers only embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that can readily be conceived by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application.

Claims (14)

1. A method of displaying, the method comprising:
responding to the scanning operation of the information code of the physical business card, and entering an Augmented Reality (AR) environment of the information code; the AR environment runs on a mini-program client or a browser web page client;
loading AR data corresponding to the user data carried by the physical business card in the AR environment;
and at least displaying the AR effect corresponding to the AR data on a display page of the AR environment.
2. The method according to claim 1, wherein the entering into the Augmented Reality (AR) environment of the information code in response to the scanning operation of the information code of the physical business card comprises:
responding to the scanning operation of the information code of the physical business card, and entering a network loading page of the information code;
and responding to prompt information output by the network loading page, and jumping to a display page for starting the AR environment.
3. The method according to claim 1 or 2, wherein the loading, in the AR environment, AR data corresponding to user data carried by the physical business card includes:
in the AR environment, identifying the information code to obtain the user data bound by the information code;
loading the AR data matching the user data in a repository; the repository is used for storing AR data corresponding to a plurality of user data according to the matching relation between the user data and the AR data.
4. The method of claim 3, wherein loading the AR data matching the user data in a repository comprises:
loading real user data and virtual effect data associated with the user data in the repository;
and obtaining the AR data matched with the user data based on the real user data and the virtual effect data.
5. The method according to any one of claims 1 to 4, further comprising:
responding to the scanning operation of the preset identification of the physical business card, and determining the position information of the preset identification;
determining a space display area matched with the position information of the preset identification based on a preset relative position relation; the preset relative position relationship represents the relative position relationship between the position information of the preset identification and the space display area of the AR effect;
the displaying at least the AR effect corresponding to the AR data on the display page of the AR environment comprises: displaying at least the AR effect in the space display area.
6. The method according to any one of claims 1 to 5, wherein displaying at least the AR effect corresponding to the AR data on the display page of the AR environment comprises:
displaying an image corresponding to the user data on the display page; and/or displaying the AR effect of the user data and the AR data after the user data and the AR data are fused.
7. The method according to any one of claims 1 to 6, wherein displaying at least the AR effect corresponding to the AR data on the display page of the AR environment comprises:
responding to the interaction operation of the displayed AR data, and determining whether the interaction operation meets an interaction triggering condition;
and presenting an interaction effect corresponding to the interaction operation in response to the interaction operation meeting the interaction triggering condition.
8. The method of any of claims 1 to 7, wherein after displaying at least the AR effect corresponding to the AR data on the display page of the AR environment, the method further comprises:
and identifying the AR data, and outputting an identification result in a preset form.
9. The method according to claim 8, wherein the recognizing the AR data and outputting the recognition result in a preset form comprises:
responding to the identification operation on the display page, and identifying the AR data by adopting an identification frame to obtain business card content corresponding to the AR data;
filling the business card content in a preset template based on the corresponding relation between the business card content and a field in the preset template;
and displaying the preset template filled with the business card content on an editable interface of the display page.
10. The method of claim 9, wherein after displaying the preset template populated with the business card content in the editable interface of the display page, the method further comprises:
responding to the editing operation on the editable interface, and editing the business card content in the preset template;
and under the condition that the editing is detected to be completed, saving the edited business card content.
11. A display device, comprising:
the first scanning module is used for responding to the scanning operation of the information code of the physical business card and entering an Augmented Reality (AR) environment of the information code; the AR environment runs on a mini-program client or a browser web page client;
the first loading module is used for loading AR data corresponding to the user data carried by the physical business card in the AR environment;
and the first display module is used for at least displaying the AR effect corresponding to the AR data on a display page of the AR environment.
12. An electronic device, comprising:
a display screen; a memory for storing an executable computer program;
a processor for implementing the method of any one of claims 1 to 10 in conjunction with the display screen when executing an executable computer program stored in the memory.
13. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 10.
14. A computer program product comprising a non-transitory computer readable storage medium storing a computer program which, when read and executed by a computer, implements the steps of the method of any one of claims 1 to 10.
CN202111272893.1A 2021-10-29 2021-10-29 Display method, display device, display apparatus, storage medium, and program product Withdrawn CN114049467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111272893.1A CN114049467A (en) 2021-10-29 2021-10-29 Display method, display device, display apparatus, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111272893.1A CN114049467A (en) 2021-10-29 2021-10-29 Display method, display device, display apparatus, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN114049467A true CN114049467A (en) 2022-02-15

Family

ID=80206520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111272893.1A Withdrawn CN114049467A (en) 2021-10-29 2021-10-29 Display method, display device, display apparatus, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN114049467A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022262377A1 (en) * 2021-06-15 2022-12-22 上海商汤智能科技有限公司 Display method and apparatus, and device, computer-readable storage medium and computer program


Similar Documents

Publication Publication Date Title
US11182609B2 (en) Method and apparatus for recognition and matching of objects depicted in images
US9582913B1 (en) Automated highlighting of identified text
CN102822817B (en) For the Search Results of the action taked of virtual query
CN104461318B (en) Reading method based on augmented reality and system
US10186084B2 (en) Image processing to enhance variety of displayable augmented reality objects
JP5951759B2 (en) Extended live view
US20180095734A1 (en) System and method for creating a universally compatible application development system
US20160012136A1 (en) Simultaneous Local and Cloud Searching System and Method
CN105706080A (en) Augmenting and presenting captured data
JP2014524062A5 (en)
CN103988202A (en) Image attractiveness based indexing and searching
CN106130886A (en) The methods of exhibiting of extension information and device
WO2023045964A1 (en) Display method and apparatus, device, computer readable storage medium, computer program product, and computer program
CN111832826A (en) Library management method and device based on augmented reality and storage medium
CN114049467A (en) Display method, display device, display apparatus, storage medium, and program product
CN113127126B (en) Object display method and device
KR102234172B1 (en) Apparatus and method for providing digital twin book shelf
CN111651049B (en) Interaction method, device, computer equipment and storage medium
CN113359985A (en) Data display method and device, computer equipment and storage medium
CN113326709A (en) Display method, device, equipment and computer readable storage medium
US11544921B1 (en) Augmented reality items based on scan
CN111767488A (en) Article display method, electronic device and storage medium
US9965446B1 (en) Formatting a content item having a scalable object
US11699174B2 (en) Media processing techniques for enhancing content
AU2020102644A4 (en) Online Artwork gallery Systems And Methods

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220215