WO2022161289A1 - 身份信息的展示方法、装置、终端、服务器及存储介质 (Method, apparatus, terminal, server and storage medium for displaying identity information) - Google Patents

身份信息的展示方法、装置、终端、服务器及存储介质 (Method, apparatus, terminal, server and storage medium for displaying identity information)

Info

Publication number
WO2022161289A1
WO2022161289A1 (PCT/CN2022/073258)
Authority
WO
WIPO (PCT)
Prior art keywords
information
identity information
augmented reality
terminal
business card
Prior art date
Application number
PCT/CN2022/073258
Other languages
English (en)
French (fr)
Inventor
韩瑞
Original Assignee
腾讯科技(深圳)有限公司
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Publication of WO2022161289A1
Priority to US18/048,562 (published as US20230066708A1)

Classifications

    • G06Q10/10 Office automation; Time management
    • G06T19/006 Mixed reality
    • G06F16/248 Presentation of query results
    • G06F16/29 Geographical information databases
    • G06F3/16 Sound input; Sound output
    • G06Q50/01 Social networking
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2200/24 Indexing scheme for image data processing or generation involving graphical user interfaces [GUIs]
    • G06T2219/024 Indexing scheme for manipulating 3D models or images: multi-user, collaborative environment
    • G06T2219/2004 Indexing scheme for editing of 3D models: aligning objects, relative positioning of parts
    • G06T2219/2016 Indexing scheme for editing of 3D models: rotation, translation, scaling

Definitions

  • the embodiments of the present application relate to the field of augmented reality technologies, and in particular, to a method, device, terminal, server, and storage medium for displaying identity information.
  • Offline interaction is a mode of interaction in which multiple people interact within a specific physical space.
  • Common offline interaction scenarios include industry summits, customer appreciation meetings, dating meetings, and so on.
  • When participating in an offline interaction, the participants may not know each other, and they often need to communicate before learning each other's identities.
  • The organizer of an offline interaction can collect and organize the identity information of the participants in advance (which must include photos) and send the organized identity information to each participant; during the offline interaction, the participants can then determine each other's identities through face comparison.
  • the embodiments of the present application provide a method, device, terminal, server and storage medium for displaying identity information, which can improve the efficiency and accuracy of offline interaction.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for displaying identity information, which is applied in a terminal, and the method includes:
  • Image collection is performed through a camera, and the collected images include people in the environment;
  • The image collected by the camera is displayed based on an augmented reality social mode, where the augmented reality social mode is used to display augmented reality social information for the currently logged-in account; and based on the geographic location of the character in the environment, first identity information is displayed around the character in an augmented reality manner, where the first identity information is used to indicate the social identity of the character.
  • an embodiment of the present application provides a method for displaying identity information, the method comprising:
  • receiving an identity information acquisition request sent by a second terminal, where the identity information acquisition request includes first geographic location information, and the first geographic location information is used to represent the geographic location of the character in the environment;
  • determining first identity information of the character based on the first geographic location information, where the first identity information is augmented reality social information corresponding to the character; and
  • sending the first identity information to the second terminal, where the second terminal is configured to display the first identity information around the character in an augmented reality manner.
  • an embodiment of the present application provides an apparatus for displaying identity information, and the apparatus includes:
  • the image acquisition module is used for image acquisition through the camera, and the acquired images include persons in the environment;
  • the information display module is used to display the image collected by the camera based on the augmented reality social mode, where the augmented reality social mode is used to display augmented reality social information for the currently logged-in account; and is further used to display, based on the geographic location of the character in the environment, first identity information around the character in an augmented reality manner, where the first identity information is used to indicate the social identity of the character.
  • an embodiment of the present application provides an apparatus for displaying identity information, and the apparatus includes:
  • the request receiving module is configured to receive an identity information acquisition request sent by the second terminal, where the identity information acquisition request includes first geographic location information, the first geographic location information is determined by the terminal by performing person recognition on the image collected by the camera, and the first geographic location information is used to represent the geographic location where the character is located in the environment;
  • an information determination module configured to determine first identity information of the character based on the first geographic location information, where the first identity information is augmented reality social information corresponding to the character;
  • an information sending module configured to send the first identity information to the second terminal, and the second terminal is configured to display the first identity information around the character in an augmented reality manner.
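  • For illustration only, the request and response exchanged between the second terminal and the server described above can be sketched as simple message types; the field names below are assumptions made for the sketch and are not taken from the application.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical message types for the identity information acquisition exchange
# (field names are illustrative only, not names used in the application).

@dataclass
class GeoLocation:
    latitude: float    # latitude/longitude coordinate information in the world coordinate system
    longitude: float

@dataclass
class IdentityInfoRequest:
    # first geographic location information of the person recognized in the image
    first_geo_location: GeoLocation
    # identity information of the requesting terminal's own user; a later embodiment
    # uses it so the server can check viewing authority (optional here)
    second_identity_info: Optional[dict] = None

@dataclass
class IdentityInfoResponse:
    # first identity information: augmented reality social information of the person,
    # displayed by the second terminal around the person in an augmented reality manner
    first_identity_info: dict
```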
  • an embodiment of the present application provides a terminal, the terminal includes a processor and a memory, the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the method for displaying identity information on the terminal side as described in the above aspect.
  • an embodiment of the present application provides a server, the server includes a processor and a memory, the memory stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the method for displaying identity information on the server side as described in the above aspect.
  • an embodiment of the present application provides a computer-readable storage medium, where at least one piece of program code is stored in the computer-readable storage medium, and the program code is loaded and executed by a processor to implement the method for displaying identity information on the terminal side as described in the above aspect.
  • an embodiment of the present application provides a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method for displaying identity information provided by the above aspects.
  • When the identity information of a specific person needs to be acquired during an offline interaction, the user only needs to perform image acquisition with the terminal: the terminal can identify the person in the image, determine the geographic location of the person in the real environment, obtain the identity information of the person based on that geographic location, and then display the identity information around the person in an augmented reality manner. With the solution provided by the embodiments of this application, the user does not need to determine a participant's identity through face comparison, which improves the efficiency of obtaining identity information in offline interaction scenarios with a large number of participants; at the same time, obtaining identity information based on geographic location information avoids the problem of large errors in face comparison results, helps improve the accuracy of the obtained identity information, and improves the efficiency of offline interaction.
  • FIG. 1 shows a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application
  • FIG. 2 shows a flowchart of a method for displaying identity information provided by an exemplary embodiment of the present application
  • FIG. 3 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application
  • FIG. 4 is a schematic diagram of the implementation of the process of determining the geographic location information of a person according to an exemplary embodiment of the present application
  • FIG. 5 is a schematic interface diagram showing the display effect of an augmented reality business card according to an exemplary embodiment of the present application
  • FIG. 6 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application
  • FIG. 7 is a schematic interface diagram of an identity information display process provided by an exemplary embodiment of the present application.
  • FIG. 8 is a schematic interface diagram of a viewing permission setting process shown in an exemplary embodiment of the present application.
  • FIG. 9 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application.
  • FIG. 10 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application.
  • FIG. 11 is a flowchart of an interaction process between a terminal and a server according to an exemplary embodiment of the present application.
  • FIG. 12 is a structural block diagram of an apparatus for displaying identity information provided by an exemplary embodiment of the present application.
  • FIG. 13 is a structural block diagram of an apparatus for displaying identity information provided by another exemplary embodiment of the present application.
  • FIG. 14 shows a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • FIG. 15 shows a schematic structural diagram of a server provided by an exemplary embodiment of the present application.
  • FIG. 1 shows a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application.
  • the implementation environment includes a first terminal 110 , a server 120 and a second terminal 130 .
  • data communication is performed between the first terminal 110 and the server 120, and between the second terminal 130 and the server 120 through a communication network.
  • The communication network may be a wired network or a wireless network, and the communication network may be at least one of a local area network, a metropolitan area network, and a wide area network.
  • the first terminal 110 is a terminal used by the first user, and the terminal is an electronic device with image capturing and augmented reality functions, and the electronic device may be a smart phone, a tablet computer, smart glasses, and the like. With the help of image acquisition and augmented reality functions, the first terminal 110 can display the identity information of surrounding persons in an augmented reality manner.
  • For example, a smart phone can capture images through its rear camera, display the captured images on the screen, and display augmented reality business cards (containing identity information) around the characters in the images; smart glasses can collect images through a camera and display the collected images and augmented reality business cards through a lens with a display function (a non-transparent lens, equivalent to a display screen), or project the image of the augmented reality business card onto the lens (a transparent lens) or the human eyeball through a projection component.
  • The first terminal 110 also has a positioning function. Through the positioning function, the first terminal 110 can obtain the geographic location information of its own geographic location in the environment and, based on its own geographic location information, determine the geographic location information of the persons in the collected images. The geographic location information can be latitude and longitude coordinate information, and the positioning function can be implemented by a positioning component.
  • The positioning component may be a Global Positioning System (GPS) component, a Beidou positioning component, etc., which is not limited in this embodiment.
  • the second terminal 130 is a terminal used by the second user, and the terminal is an electronic device with a positioning function, and the electronic device may be a smart phone, a tablet computer, a smart glasses, or the like provided with a positioning component.
  • the second terminal 130 may report the geographic location information to the server 120 in real time, so that the server 120 implements the identity information display function based on the reported geographic location information.
  • the second terminal 130 also has image collection and augmented reality functions, and with the image collection and augmented reality functions, the second terminal 130 can also display the identity information of surrounding persons in an augmented reality manner.
  • The server 120 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, Content Delivery Network (CDN), and big data and artificial intelligence platforms.
  • the server 120 may also be a node in the blockchain system, which is not limited in this embodiment.
  • An application program with an identity information display function is installed in the first terminal 110 and the second terminal 130, and the server 120 is a background server of the application program. The application program may be an instant messaging application, a social application, a friend-making application, a recruitment application, etc., which is not limited in this embodiment.
  • the server 120 stores the identity information reported by each terminal and the geographic location information reported by each terminal in real time, and the identity information corresponding to the terminal and the geographic location information are stored in association.
  • The first terminal 110 used by user A reports its identity information and geographic location information to the server 120; and the second terminal 130 used by user B reports its identity information and geographic location information to the server 120.
  • the first terminal 110 is used to photograph user B.
  • the first terminal 110 determines the first geographic location information of the user B based on the captured image and its own geographic location information, and sends an identity information acquisition request including the first geographic location information to the server 120 .
  • the server 120 determines that the user B is located at the location indicated by the first geographic location information, so as to feed back the identity information of the user B to the first terminal 110 .
  • the first terminal 110 generates an augmented reality business card 111 according to the received identity information, and displays the augmented reality business card 111 around the user B in the image.
  • The second terminal 130 can also be used to photograph user A and determine the second geographic location information of user A, so as to obtain user A's identity information from the server 120 based on the second geographic location information.
  • The above embodiment is described only by taking application to smart phones as an example. When applied to smart glasses, the user only needs to wear the smart glasses and gaze at the person whose identity information needs to be acquired to trigger the identity information acquisition and display process (for example, a gaze duration longer than a duration threshold may serve as the trigger condition), and there is no need to manually trigger the shooting function, which is not limited in this embodiment.
  • The above-mentioned embodiment only schematically illustrates the identity information display process using the first terminal and the second terminal. The solution provided by the embodiments of the present application can also be applied to offline interaction scenarios involving three or more users (that is, including three or more terminals), which is not limited in this embodiment.
  • The shooting action involved in the various embodiments of the present application may be the action of aiming the camera at the target after the camera is turned on, in which case the identity information is displayed in the real-time viewfinder screen; or it may be the action of triggering a shooting control (such as the shutter button) after the camera is turned on, in which case the identity information and the image captured by the camera are recorded in the captured photo or recorded video.
  • the method for displaying identity information provided by the embodiments of the present application is applied to various offline interaction scenarios, and the following description takes several specific offline interaction scenarios as examples.
  • When applied to an offline meeting scenario, the participants can set in advance, through the application program installed in the terminal, the identity information to be displayed externally and enable the augmented reality business card function, and the application program reports the identity information and the real-time geographic location information of the terminal to the server. When a participant wants to know another participant's identity, the participant only needs to use the terminal to photograph that participant: the application identifies the participant in the image, determines the participant's geographic location, obtains from the server, based on the geographic location, the identity information that the participant displays externally, and then displays the augmented reality business card containing the identity information around the participant in the image. Through the displayed augmented reality business card, the participants can quickly learn each other's identity and determine whether further communication is needed, which reduces the communication cost of obtaining the other party's identity information, reduces ineffective communication between participants, and improves communication efficiency between participants.
  • When applied to an offline dating scenario, after users reach the dating venue, they can, with authorization, set the externally displayed dating information (including age, hobbies, dating goals, etc.) through the instant messaging application installed in the terminal and enable the augmented reality business card function of the application; the instant messaging application then reports the dating information and the real-time geographic location information of the terminal to the server. When a user wants to view the dating information of other users, the user only needs to use the terminal to photograph the user to be viewed: the application identifies the user in the image, determines the user's geographic location, obtains from the server, based on the geographic location, the dating information that the user displays externally, and then displays an augmented reality business card containing the dating information around the user in the image. Through the displayed augmented reality business card, the user can quickly learn the hobbies of other users, so as to quickly locate users with the same hobbies and make friends based on the similarity of hobbies, or quickly determine communication topics based on the hobbies to improve communication efficiency between users. In addition, users can also quickly establish a social relationship in the instant messaging application in order to make friends online.
  • When applied to an offline recruitment scenario, recruiters can set company information through the application in advance, and applicants can set personal information through the application in advance. During the offline recruitment, both recruiters and applicants turn on the application's augmented reality business card function. When an applicant wants to know a recruiter's company information, the applicant only needs to use the terminal to photograph the recruiter to obtain the company information, and can then determine, in combination with his or her personal information, whether further communication with the recruiter is required; when a recruiter wants to know an applicant's personal information, the recruiter only needs to use the terminal to photograph the applicant to obtain the personal information, and can then determine, based on the personal information and the recruitment needs, whether further communication with the applicant is necessary, thereby improving the communication efficiency of both applicants and recruiters.
  • FIG. 2 shows a flowchart of a method for displaying identity information provided by an exemplary embodiment of the present application. This embodiment is described by taking the method being used in the first terminal 110 or the second terminal 130 shown in FIG. 1 as an example, and the method includes the following steps.
  • Step 201 Image acquisition is performed through a camera.
  • the captured images include persons in the environment.
  • The camera is turned on to capture images, and the captured images include the person to be recognized.
  • the camera may be a camera applying 3D vision technology, that is, information such as spatial orientation and size of each character in the image can be sensed based on the image collected by the camera.
  • The 3D vision technology may be at least one of Time of Flight (ToF) technology, monocular stereo vision technology, binocular vision technology, multi-eye stereo vision technology, or 3D structured light technology, which is not limited in this embodiment.
  • Step 202 Display the image collected by the camera based on the augmented reality social mode.
  • the augmented reality social mode is used to display augmented reality social information for the currently logged in account.
  • Displaying the image collected by the camera includes at least one of the following methods: 1. displaying the image collected by the camera on the display screen of the terminal in the augmented reality social mode; 2. displaying the image collected by the camera on augmented reality glasses in the augmented reality social mode. This is not limited in this embodiment.
  • Step 203 Based on the geographic location of the character in the environment, the first identity information is displayed around the character by means of augmented reality.
  • the first identity information is used to indicate the social identity of the character.
  • a person in the image is identified, and first geographic location information corresponding to the person is determined, where the first geographic location information is used to represent the geographic location of the person in the real environment.
  • The terminal obtains the identity information of the person to be recognized based on the geographic location of that person in the real environment. Therefore, when capturing images through the camera, the terminal recognizes the person in the image and determines the first geographic location information corresponding to the person.
  • The first geographic location information is geographic location information in a world coordinate system.
  • the first geographic location information may be first latitude and longitude information corresponding to a person.
  • The terminal uses 3D vision technology and determines the first geographic location information corresponding to the character based on its own geographic location in the real environment. Therefore, the application program implementing the identity information display function needs to have permission to access the geographic location information of the terminal.
  • When the image contains multiple persons, the terminal determines the first geographic location information corresponding to each person; or, the terminal determines a target person from the multiple persons, and then determines the first geographic location information corresponding to the target person.
  • the target person may be a person located in the center of the image, or a person in a focus area, or a person designated by the user.
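  • As an illustrative sketch of one of the target-person rules above (choosing the person closest to the image center), assuming person bounding boxes have already been detected in the image; this is not presented as the application's mandated method.

```python
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in pixels

def pick_target_person(boxes: List[Box], image_w: int, image_h: int) -> int:
    """Return the index of the detected person whose bounding-box center
    is closest to the image center (one of the target-person rules above)."""
    cx, cy = image_w / 2.0, image_h / 2.0
    best_idx, best_dist = -1, float("inf")
    for i, (x0, y0, x1, y1) in enumerate(boxes):
        bx, by = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        dist = (bx - cx) ** 2 + (by - cy) ** 2
        if dist < best_dist:
            best_idx, best_dist = i, dist
    return best_idx
```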
  • Based on the first geographic location information, the first identity information of the person is acquired.
  • the terminal sends an identity information acquisition request including the first geographic location information to the server.
  • the identity information reported by each terminal is stored in the server.
  • the server determines the terminal located at the geographic location indicated by the first geographic location information, thereby determining the identity information of the user corresponding to the terminal as the first identity information.
  • In a possible implementation, the server determines the terminal that is closest to the geographic location indicated by the first geographic location information and whose distance from that location is less than a distance threshold (for example, 1 m), and determines the identity information of the user corresponding to that terminal as the first identity information.
  • the identity information stored by the server is the identity information preset by the terminal and authorized to be displayed to the outside world, and the identity information may include at least one of text information, picture information and video information.
  • The information types of the identity information set by different users through their terminals may be the same or different.
  • the identity information set by user A through the terminal may include name, company, and position
  • the identity information set by user B through the terminal may include name, company, and the like.
  • In different offline interaction scenarios, the information types of the identity information may also be different.
  • For example, in an offline meeting scenario, the information types of the identity information may include name, company, and position; in an offline dating scenario, they may include name, age, and hobbies; and in an offline recruitment scenario, they may include name, school graduated from, major, and the like.
  • the first identity information is displayed around the character by means of augmented reality.
  • the terminal displays the first identity information around the character in an augmented reality manner.
  • the terminal may render the first identity information in a specific augmented reality object element, so as to display the augmented reality object element around the character.
  • the terminal can render the first identity information in the augmented reality business card, so as to display the augmented reality business card around the character, so that the user can obtain the identity information of the character through the augmented reality business card corresponding to the character.
  • the first identity information is displayed in a bound manner with the character, that is, the first identity information moves as the character moves on the screen, thereby ensuring the correspondence between the character and the first identity information. Moreover, when the character moves out of the screen, the display of the first identity information is stopped.
  • To sum up, when the identity information of a specific person needs to be acquired during an offline interaction, the user only needs to perform image acquisition with the terminal: the terminal can recognize the person in the image, determine the geographic location of the person in the real environment, obtain the identity information of the person based on that geographic location, and then display the identity information around the person in an augmented reality manner. With the solution provided by this embodiment of the present application, the user does not need to determine a participant's identity through face comparison, which improves the efficiency of obtaining identity information in offline interaction scenarios with a large number of participants; obtaining identity information based on geographic location information also avoids the problem of large errors in face comparison results, which helps improve the accuracy of the obtained identity information and improves the efficiency of offline interaction.
  • The terminal uses an augmented reality business card to display the acquired identity information. In order to improve the efficiency with which the user locates persons of high communication value, the terminal determines the size of the augmented reality business card based on the relevancy between the person to be identified and the terminal's own user, so that users can quickly distinguish the communication value of different persons based on the size of the augmented reality business card. This is described below using exemplary embodiments.
  • FIG. 3 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application. This embodiment is described by taking the method being used in the first terminal 110 or the second terminal 130 shown in FIG. 1 as an example, and the method includes the following steps.
  • Step 301 Image acquisition is performed through a camera.
  • For the implementation of this step, reference may be made to step 201, and details are not described again in this embodiment.
  • Step 302 Identify the person in the image, and determine the person's position information in the camera coordinate system.
  • The terminal has the ability to obtain its own geographic location in the world coordinate system and the ability to determine the position of the person in the image in the camera coordinate system. Therefore, based on the conversion relationship between different coordinate systems, the terminal can determine the geographic location information of the persons in the image in the world coordinate system.
  • the terminal determines the person position information of the person in the camera coordinate system by using a 3D vision technology.
  • the camera coordinate system is a three-dimensional rectangular coordinate system with the focus center of the camera as the coordinate origin and the optical axis as the Z axis.
  • When the terminal is provided with dual cameras, the terminal can determine the position information of the person in the camera coordinate system through binocular stereo vision technology (based on the principle of imaging parallax between the two cameras); when the terminal is equipped with a ToF camera, the terminal can determine the person's position information in the camera coordinate system through ToF depth measurement technology.
  • the terminal may also determine the person's position information by combining various 3D vision technologies, so as to improve the accuracy of the person's position information, which is not limited in this embodiment.
  • As shown in FIG. 4, the user uses the terminal 41 to photograph the person 42 to obtain the position information P1 of the person 42 in the camera coordinate system 43.
  • Step 303 Determine the first geographic location information of the person in the world coordinate system based on the location information of the person and the second geographic location information of the terminal in the world coordinate system.
  • The terminal acquires its own second geographic location information in the world coordinate system in real time, so as to convert the person's position information from the camera coordinate system to the world coordinate system, and determines the first geographic location information of the person in the world coordinate system based on the second geographic location information and the converted position information.
  • the terminal may convert the character position information by using a rotation matrix and a translation matrix, which is not limited in this embodiment.
  • The terminal 41 determines the first geographic location information P3 of the person 42 in the world coordinate system 44 based on its own second geographic location information P2 and the person position information P1.
  • this embodiment only takes the above manner as an example to describe the process of determining the first geographic location information.
  • the first geographic location information of the person in the image may be determined in other possible manners, which is not limited in this embodiment.
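  • The conversion described above can be sketched as follows, assuming the person's position in the camera coordinate system and the terminal's pose (a camera-to-world rotation matrix and a translation derived from the terminal's own geographic location) are available; the numeric values are placeholders, not values from the application.

```python
import numpy as np

def camera_to_world(p_camera: np.ndarray, rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Convert a person's position from the camera coordinate system to the
    world coordinate system using a rotation matrix and a translation vector,
    as suggested in the embodiment: P_world = R @ P_camera + t."""
    return rotation @ p_camera + translation

# Example with placeholder values: the person is 3 m in front of the camera,
# and the terminal's own (second) geographic location provides the translation.
p1_camera = np.array([0.0, 0.0, 3.0])        # person position P1 in camera coordinates (meters)
R = np.eye(3)                                 # camera-to-world rotation (identity as a placeholder)
t = np.array([400.0, 120.0, 1.5])             # terminal position in a local world frame (placeholder)
p3_world = camera_to_world(p1_camera, R, t)   # first geographic location information P3
```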
  • Step 304 Obtain the first identity information from the server based on the first geographic location information and the second identity information of the user corresponding to the terminal, where the second identity information satisfies the viewing authority of the first identity information.
  • In order to prevent the identity information of each user who has enabled the augmented reality business card function from being obtained by anyone simply through photographing, the user may set viewing authority for his or her own identity information, so that the identity information is displayed only to users who have the viewing authority.
  • the terminal uploads the identity information set by the user together with the viewing authority to the server, and the server stores it.
  • When the terminal obtains the first identity information of the person in the image from the server, in addition to the first geographic location information of the person, it also needs to provide the second identity information of its own user, so that the server can determine, based on the second identity information, whether the current terminal has the authority to view the first identity information.
  • the second identity information corresponds to the account logged in by the terminal.
  • The terminal sends an identity information acquisition request including the first geographic location information and the second identity information to the server, so that the server looks up the matching first identity information based on the first geographic location information and determines, based on the second identity information, whether the terminal has the viewing authority of the first identity information.
  • Alternatively, the terminal may send only the first geographic location information and the user identifier of the user corresponding to the terminal to the server, and the server obtains the second identity information based on the user identifier, which is not limited in this embodiment.
  • The viewing authority includes identity information conditions, and the identity information conditions may include industry conditions, company conditions, position conditions, gender conditions, age conditions, hobby conditions, academic qualification conditions, work experience conditions, etc. The specific content of the viewing authority is not limited in this embodiment.
  • If the second identity information satisfies the viewing authority of the first identity information, the server feeds back the first identity information to the terminal; otherwise, the server does not feed back the first identity information to the terminal.
  • Step 305 Generate an augmented reality business card based on the first identity information.
  • After acquiring the first identity information, the terminal renders and generates an augmented reality business card according to the first identity information, where the augmented reality business card may be generated based on the identity information on the basis of a preset business card template.
  • In some embodiments, the augmented reality business cards generated based on different identity information are consistent in size and form; in other embodiments, at least one of the size and the form of the augmented reality business cards generated based on different identity information is different.
  • When the terminal generates an augmented reality business card, the following steps may be included.
  • After acquiring the first identity information based on the first geographic location information, the server acquires the second identity information of the user corresponding to the terminal, determines the person relevancy based on the first identity information and the second identity information, and feeds back the person relevancy to the terminal together with the first identity information.
  • The higher the person relevancy, the higher the communication value of the person to the current terminal user; the lower the person relevancy, the lower the communication value of the person to the current terminal user.
  • When applied to an offline meeting scenario, the server may calculate the person relevancy through a preset acquaintance algorithm; when applied to an offline friendship scenario, the server may calculate the person relevancy through an interest matching algorithm; and when applied to an offline recruitment scenario, the server may calculate the person relevancy through a resume matching algorithm, which is not limited in this embodiment.
  • the step of determining the relevance of the person may also be performed by the terminal, thereby reducing the processing pressure of the server, which is not limited in this embodiment.
  • For different users, the person relevancy degrees with respect to each other may be the same or different. For example, the person relevancy of user A with respect to user B is s1, while the person relevancy of user B with respect to user A is s2, which is not limited in this embodiment.
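  • As an illustration of the interest matching idea mentioned above for the offline friendship scenario, the person relevancy could, for example, be scored from the overlap of two users' hobby lists; this Jaccard-style rule is only an assumed example, not the algorithm claimed by the application.

```python
def interest_match_relevancy(hobbies_a: set, hobbies_b: set) -> float:
    """Person relevancy s in [0, 1] based on hobby overlap (Jaccard similarity).
    Note that this particular rule is symmetric; an asymmetric rule would give
    different values s1 and s2 as in the example above."""
    if not hobbies_a or not hobbies_b:
        return 0.0
    return len(hobbies_a & hobbies_b) / len(hobbies_a | hobbies_b)

s = interest_match_relevancy({"hiking", "photography", "go"}, {"photography", "go", "cooking"})
```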
  • The augmented reality business card has a default business card size. The terminal determines the business card size of the augmented reality business card corresponding to the first identity information according to the person relevancy and the default business card size; that is, the higher the person relevancy, the larger the business card size of the augmented reality business card, and the lower the person relevancy, the smaller the business card size of the augmented reality business card. For example, when the default business card size is a*b and the person relevancy is s, the determined business card size is (s×a)*(s×b).
  • Further, the terminal renders and generates the augmented reality business card based on the first identity information and the determined business card size; or, the terminal generates a default augmented reality business card based on the first identity information and the default business card size, and then scales the default augmented reality business card based on the person relevancy to obtain the augmented reality business card.
  • As shown in FIG. 5, the image captured by the terminal includes a first person 51 and a second person 52. Since the person relevancy of the first person 51 is 0.9 and the person relevancy of the second person 52 is 0.5, the business card size of the first augmented reality business card 53 corresponding to the first person 51 is larger than the business card size of the second augmented reality business card 54 corresponding to the second person 52.
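  • A minimal sketch of the size rule above: with a default business card size of a*b and a person relevancy s, the rendered size is (s×a)*(s×b); the default size values below are assumptions, not values from the application.

```python
def business_card_size(default_w: float, default_h: float, relevancy: float) -> tuple:
    """Scale the default business card size a*b by the person relevancy s,
    giving (s*a, s*b): higher relevancy yields a larger augmented reality card."""
    return (relevancy * default_w, relevancy * default_h)

# With an assumed default size of 0.30 m x 0.18 m:
size_person_51 = business_card_size(0.30, 0.18, 0.9)  # larger card for relevancy 0.9
size_person_52 = business_card_size(0.30, 0.18, 0.5)  # smaller card for relevancy 0.5
```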
  • Step 306 Determine the business card position information of the augmented reality business card in the camera coordinate system based on the person's position information in the camera coordinate system.
  • the terminal In order to simulate the effect of displaying a physical business card around a person in the real world, when displaying an augmented reality business card, the terminal needs to determine the business card display position of the augmented reality business card in the camera coordinate system. In a possible implementation, the terminal determines the target position from a predetermined range around the position indicated by the character position information based on the character position information of the character in the camera coordinate system, so as to determine the target position as the business card position.
  • For example, the business card position indicated by the business card position information may be located at a preset distance below the face (for example, 20 cm below the face), or at a preset distance above the person's head (for example, 10 cm above the head); this embodiment does not limit the position of the business card.
  • Step 307 Display the augmented reality business card at the position indicated by the business card position information in an augmented reality manner, where the position indicated by the business card position information is located around the position indicated by the person position information.
  • the terminal renders and displays the augmented reality business card at the position indicated by the location information of the business card in an augmented reality manner, simulating the display of a physical business card around a character in the real world.
  • As shown in FIG. 5, the terminal displays the first augmented reality business card 53 and the second augmented reality business card 54 respectively 20 cm below the faces of the first person 51 and the second person 52.
  • the position of the business card of the augmented reality business card changes with the position of the character, so as to achieve the effect that the business card moves with the character.
  • Step 308 In response to a change in the face orientation of the character, adjust the business card orientation of the augmented reality business card based on the face orientation, where the business card orientation is consistent with the face orientation.
  • the business card orientation of the augmented reality business card (that is, the side containing the identity information) is consistent with the face orientation of the character.
  • Therefore, when the face orientation of the character changes, the terminal adjusts the business card orientation of the augmented reality business card in real time based on the face orientation, so as to improve the realism of the augmented reality business card display.
  • In some embodiments, the augmented reality business card includes a business card front and a business card back. In some cases, the terminal displays the back of the augmented reality business card; the content displayed on the back of the business card can be set by the user, and, for example, the back of the business card can display a company logo.
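  • The placement and orientation behaviour described above can be sketched as follows, assuming the person's face position and face orientation (yaw) are known in the camera coordinate system; the 20 cm offset is the example value given above, everything else is illustrative.

```python
import numpy as np

def business_card_pose(face_position: np.ndarray, face_yaw_rad: float,
                       offset_below_face_m: float = 0.20):
    """Place the augmented reality business card a preset distance below the face
    (20 cm in the example above) and orient it to match the face orientation, so
    that the card turns as the person's face turns and moves as the person moves."""
    card_position = face_position + np.array([0.0, -offset_below_face_m, 0.0])
    card_yaw_rad = face_yaw_rad  # business card orientation kept consistent with face orientation
    return card_position, card_yaw_rad
```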
  • The terminal determines the geographic location information of the person in the world coordinate system based on the person's position information in the camera coordinate system and the terminal's own location information in the world coordinate system, which improves the accuracy of the determined geographic location information and thereby improves the accuracy of the subsequently obtained identity information.
  • In addition, the terminal determines the person relevancy based on the correlation between the pieces of identity information, and then determines the display size of the augmented reality business card based on the person relevancy, so that the augmented reality business card corresponding to a person of high communication value is larger than the augmented reality business card corresponding to a person of low communication value, which improves the efficiency with which users select persons to communicate with based on the augmented reality business cards.
  • In a possible implementation, when a trigger operation on the augmented reality business card is received and the first identity information includes social information, the terminal displays a social relationship establishment control; when receiving a trigger operation on the social relationship establishment control, the terminal sends a social relationship establishment request to the social server based on the social information, requesting to establish a social relationship.
  • the terminal may determine whether a trigger operation for the augmented reality business card is received based on the pixel coordinates of the augmented reality business card in the image.
  • In some embodiments, the terminal displays social relationship establishment controls respectively corresponding to different social applications, and, when receiving a trigger operation on the social relationship establishment control corresponding to a target social application, sends a social relationship establishment request to the social server through the target social application.
  • When the application to which the solution provided by the embodiments of the present application is applied and the target social application are the same application, the application directly sends the social relationship establishment request to the social server based on the social information; when they are different applications, the application sends a social relationship establishment request including the social information to the target social application by calling the application programming interface (API) provided by the target social application, and the target social application sends the social relationship establishment request to the social server based on the social information.
  • As shown in FIG. 5, the terminal displays a social relationship establishment control 55, and the user can quickly establish a social relationship by clicking the social relationship establishment control 55.
  • In a possible implementation, when the terminal receives a trigger operation on the augmented reality business card and the first identity information includes contact information, the terminal displays a contact adding control; when receiving a trigger operation on the contact adding control, the terminal automatically creates a contact based on the contact information.
  • As shown in FIG. 5, the terminal displays a contact adding control 56, and the user can trigger the terminal to automatically add a contact to the address book by clicking the contact adding control 56.
  • FIG. 6 shows a flowchart of an identity information setting process provided by an exemplary embodiment of the present application. This embodiment is described by taking the method being used in the first terminal 110 or the second terminal 130 shown in FIG. 1 as an example, and the method includes the following steps.
  • Step 601 Display an information display setting interface, where the information display setting interface is used to set the second identity information to be displayed externally.
  • the information display setting interface may be manually triggered and displayed by the user, or may be automatically triggered and displayed by the terminal.
  • When receiving an augmented reality social mode activation instruction (triggered through a function entry in the application), the terminal displays the information display setting interface; or, when it is detected that the augmented reality social mode activation conditions are currently met, the terminal displays prompt information and displays the information display setting interface when receiving a trigger operation on the prompt information.
  • For example, when an itinerary is set in the terminal, the terminal obtains the itinerary information (including the itinerary location and travel time) of a target itinerary (an offline activity itinerary) and detects whether the current geographic location and the current time are within the target itinerary; if the current geographic location and the current time indicate that the target itinerary is underway, the terminal displays an information display reminder to remind the user to turn on the augmented reality social mode, and, in response to a trigger operation on the information display reminder, displays the information display setting interface.
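  • A minimal sketch of the itinerary check described above, assuming the itinerary information provides a venue latitude/longitude and a travel time window; the 200 m venue radius is an assumed value, not one given in the application.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

def _distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def should_show_ar_reminder(current_latlon, now: datetime,
                            itinerary_latlon, start: datetime, end: datetime,
                            venue_radius_m: float = 200.0) -> bool:
    """Return True when the current time falls within the target itinerary's travel
    time and the terminal is within an assumed radius of the itinerary location,
    in which case the terminal shows the information display reminder."""
    in_time = start <= now <= end
    in_place = _distance_m(*current_latlon, *itinerary_latlon) <= venue_radius_m
    return in_time and in_place
```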
  • In some embodiments, the information display setting interface is a blank setting interface, and the user can customize the identity information items to be displayed externally according to identity display requirements; in other embodiments, in order to improve the efficiency of setting the identity information, the information display setting interface contains pre-entered identity information, and the user can quickly select the identity information to be displayed this time through the information display setting interface.
  • As shown in FIG. 7, the user enters the identity information 72 in the identity information entry interface 71 in advance and saves it. When the terminal detects that the user is in the target itinerary, the terminal displays an information display reminder 73 to remind the user to set the identity information to be displayed externally and to enable the augmented reality social mode. The terminal then displays the information display setting interface 74, and the information display setting interface 74 includes the identity information 72 pre-entered in the identity information entry interface 71.
  • Step 602 In response to the information setting operation in the information display setting interface, the augmented reality social mode is enabled, and the second identity information and the second geographic location information of the terminal in the world coordinate system are reported to the server.
  • the information setting operation may be an upload operation, an input operation, or a check operation.
  • the information setting operation may further include a typesetting operation, where the typesetting operation is used to adjust the display position of the second identity information on the augmented reality business card.
  • the user can select the preset identity information 72 in the information display setting interface 74 as the second identity information to be displayed externally this time.
  • the second identity information checked by the user will be displayed in the augmented reality business card preview area 75 , so that the user can adjust the display position of each piece of second identity information in the augmented reality business card preview area 75 .
  • the terminal After completing the setting of the second identity information, the terminal starts the augmented reality social mode, and uploads the second identity information and its own second geographic location information to the server, and the server associates and stores the identity information and the geographic location information.
  • the terminal Since the geographic location where the terminal is located will change, during the process of enabling the augmented reality social mode, the terminal reports the second geographic location information to the server in real time, so as to improve the accuracy of subsequent identity information display.
  • Step 603 In response to the permission setting operation in the information display setting interface, send permission information to the server, where the permission information is used to indicate the viewing permission of the second identity information.
  • The information display setting interface can also be used to set the viewing authority of the identity information; that is, the identity information set by the user can only be viewed by terminals (users) that have the viewing authority for that identity information. The terminal determines the permission information corresponding to the second identity information based on the received permission setting operation, where the permission information is used to indicate the conditions that a user with viewing authority needs to meet, or the conditions that a shielded user meets. The terminal then reports the determined permission information to the server, and the server associates and stores the permission information and the identity information.
  • As shown in FIG. 8, the user can click the permission setting control 82 to set the viewing authority. After the permission setting control 82 is clicked, a shielding option 83 is displayed, and the user can select the industries and positions to be shielded through the shielding option 83; that is, users belonging to a shielded industry or a shielded position will not be able to view the second identity information.
  • The terminal also displays a business card viewing prompt, prompting the user that the business cards of other users can be obtained by photographing. As shown in FIG. 7, the terminal displays a business card viewing prompt 76; when receiving a click operation on the business card viewing prompt 76, the terminal performs image collection and displays, in the image, the augmented reality business card 77 of the user "Zhang San", where the augmented reality business card 77 contains the second identity information set by the user "Zhang San" through the information display setting interface 74.
  • users with viewing authority may also be divided, and different viewing content may be set for users with different viewing authority levels.
  • for example, all identity information may be set to be visible to users with first-level viewing authority, while only part of the identity information (such as name and company) is set to be visible to users with second-level viewing authority, which is not limited in this embodiment.
  • in this embodiment, the terminal determines whether it is in the target itinerary based on the current geographic location and the current time, so that when the terminal is in the target itinerary, it automatically prompts the user to set the identity information and enable the augmented reality social mode, thereby avoiding the problem that augmented reality business cards cannot be displayed because the user forgot to turn on the augmented reality social mode.
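As a rough illustration of this check, the sketch below treats "being in the target itinerary" as the current time falling inside the itinerary window and the current position lying within an assumed radius of the itinerary venue; the radius value and field names are assumptions.

```python
# Hedged sketch of the "in target itinerary" test used to trigger the prompt.
from math import radians, sin, cos, asin, sqrt

VENUE_RADIUS_M = 200  # assumed proximity threshold, not given in the application

def haversine_m(lat1, lng1, lat2, lng2):
    """Great-circle distance between two latitude/longitude points, in meters."""
    lat1, lng1, lat2, lng2 = map(radians, (lat1, lng1, lat2, lng2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def in_target_itinerary(now, here, itinerary):
    """now: datetime; here: (lat, lng); itinerary: {'start', 'end', 'lat', 'lng'}."""
    in_time = itinerary["start"] <= now <= itinerary["end"]
    dist = haversine_m(here[0], here[1], itinerary["lat"], itinerary["lng"])
    return in_time and dist <= VENUE_RADIUS_M
```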
  • FIG. 9 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application. This embodiment is described by taking the method used in the server shown in FIG. 1 as an example, and the method includes the following steps.
  • Step 901 Receive an identity information acquisition request sent by a second terminal, where the identity information acquisition request includes first geographic location information, the first geographic location information is determined by the terminal by performing person recognition on images collected by the camera, and the first geographic location information is used to characterize the geographic location of the person in the real environment.
  • in a possible implementation, after enabling the augmented reality social mode, the second terminal performs image collection through a camera, identifies the person in the image, and determines the first geographic location information of the person in the world coordinate system, and then sends an identity information acquisition request including the first geographic location information to the server to acquire the first identity information of the person.
  • optionally, after receiving the identity information acquisition request sent by the second terminal, the server detects whether the second terminal has enabled the augmented reality social mode; if the second terminal has not enabled the augmented reality social mode, the server prompts it to enable the augmented reality social mode; if the second terminal has enabled the augmented reality social mode, the subsequent process is performed.
  • the server may determine whether the second terminal enables the augmented reality social mode by detecting whether the identity information and geographic location information corresponding to the second terminal are stored.
  • Step 902 Based on the first geographic location information, determine the first identity information of the person.
  • in this embodiment of the application, the server stores the geographic location information and identity information corresponding to each terminal that has enabled the augmented reality social mode. After receiving the identity information acquisition request, the server calculates the distance between the position indicated by the first geographic location information and the positions indicated by each piece of stored geographic location information, determines, based on the distances, the person located at the position indicated by the first geographic location information, and then obtains the first identity information corresponding to that person.
  • illustratively, the server determines, as the target person, the person whose distance from the position indicated by the first geographic location information is smaller than a distance threshold and is the smallest, so as to obtain the first identity information corresponding to the target person.
  • in order to improve processing speed, the server may perform regional division on the stored geographic location information, so as to determine a target area based on the first geographic location information and then determine the target person from within the target area, which is not limited in this embodiment.
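The lookup described in this step can be sketched as a nearest-match search over the stored records, subject to a distance threshold; the threshold value and record layout below are illustrative assumptions (the 1 m figure is only an example value mentioned in the description).

```python
# Hedged sketch of step 902: pick, among terminals that enabled the AR social
# mode, the stored record closest to the reported first geographic location,
# provided it falls within an assumed distance threshold.
from math import radians, sin, cos, asin, sqrt

DIST_THRESHOLD_M = 1.0  # assumed threshold for "same position"

def dist_m(p, q):
    """Distance in meters between two (lat, lng) tuples."""
    p_lat, p_lng, q_lat, q_lng = map(radians, (*p, *q))
    a = sin((q_lat - p_lat) / 2) ** 2 + cos(p_lat) * cos(q_lat) * sin((q_lng - p_lng) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def find_first_identity(first_location, records):
    """records: iterable of (stored_location, identity_info) pairs."""
    best = None
    for loc, identity in records:
        d = dist_m(first_location, loc)
        if d < DIST_THRESHOLD_M and (best is None or d < best[0]):
            best = (d, identity)
    return best[1] if best else None
```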
  • optionally, when determining the first identity information, the server may also determine, based on the first geographic location information and the second geographic location information of the second terminal, whether the user corresponding to the second terminal is located in the same scene as the person in the image, and determine the first identity information of the person only when they are located in the same scene.
  • Step 903 Send the first identity information to the second terminal, and the second terminal is used to display the first identity information around the character in an augmented reality manner.
  • the server feeds back the determined first identity information to the second terminal, so that the second terminal can display the first identity information in an augmented reality manner; for the manner in which the second terminal displays the first identity information, reference may be made to the foregoing embodiments, and details are not repeated here.
  • to sum up, in this embodiment of the application, when the identity information of a specific person needs to be acquired during offline interaction, the user only needs to perform image collection with the terminal; the terminal can recognize the person in the image and determine the geographic location of the person in the real environment, the identity information of the person is obtained based on that geographic location, and the identity information is then displayed around the person by means of augmented reality. With the solution provided by this embodiment, the user does not need to determine the identity of a participant through face comparison, which improves the efficiency of acquiring identity information in offline interaction scenarios involving a large number of participants; at the same time, obtaining identity information based on geographic location information avoids the problem of large errors in face comparison results, which helps to improve the accuracy of the acquired identity information and the efficiency of offline interaction.
  • FIG. 10 shows a flowchart of a method for displaying identity information provided by another exemplary embodiment of the present application. This embodiment is described by taking the method used in the server shown in FIG. 1 as an example, and the method includes the following steps.
  • Step 1001 Receive geographic location information and identity information reported by each terminal.
  • in a possible implementation, after the terminal turns on the augmented reality social mode, it reports the identity information to be displayed externally to the server, and, while the augmented reality social mode remains on, reports real-time geographic location information to the server.
  • after the server receives the information reported by the terminal, it stores and updates the information.
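The storage and update behavior of this step can be sketched with a simple keyed store; the record layout and account keys below are assumptions for illustration only.

```python
# Illustrative sketch of the server-side store behind Table 1: each reporting
# terminal/account maps to its latest identity information and geographic
# location, and repeated reports simply overwrite (update) the stored entry.
identity_store = {}  # account_id -> {"identity": {...}, "location": (lat, lng)}

def handle_report(account_id, identity_info=None, location=None):
    entry = identity_store.setdefault(account_id, {"identity": None, "location": None})
    if identity_info is not None:
        entry["identity"] = identity_info
    if location is not None:
        entry["location"] = location

# Example reports producing Table 1-style records:
handle_report("terminal_3", identity_info={"name": "Li Si", "company": "YYY Co., Ltd.",
                                           "position": "Project Manager"})
handle_report("terminal_3", location=(22.5431, 114.0579))  # later real-time location update
```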
  • the corresponding relationship between the geographic location information and the identity information stored in the server is shown in Table 1.
  • Step 1002 Receive an identity information acquisition request sent by a second terminal, where the identity information acquisition request includes first geographic location information.
  • for the implementation of this step, reference may be made to step 901, and details are not described herein again in this embodiment.
  • Step 1003 Determine a first terminal located at the geographic location indicated by the first geographic location information.
  • in a possible implementation, the server determines the distance between the person to be identified and each terminal based on the first geographic location information and the stored geographic location information of each terminal, so as to determine the terminal closest to the person to be identified as the first terminal.
  • Step 1004 Determine the identity information reported by the first terminal as the first identity information.
  • optionally, based on the correspondence between geographic location information and identity information, when the received identity information acquisition request includes the first geographic location information P0, and the server calculates that the distance between P0 and P1 is 20 meters, the distance between P0 and P2 is 100 meters, the distance between P0 and P3 is 0.5 meters, and the distance between P0 and P4 is 50 meters, the server determines that the person to be identified is "Li Si", so as to obtain the first identity information of "Li Si": Name: Li Si; Company: YYY Co., Ltd.; Position: Project Manager.
  • Step 1005 Acquire second identity information corresponding to the second terminal and authority information corresponding to the first identity information, where the authority information is used to indicate the viewing authority of the first identity information.
  • in a possible implementation, after acquiring the first identity information, the server detects whether viewing authority is set for the first identity information; if viewing authority is set, the server determines whether the second terminal has the authority to obtain the first identity information; if no viewing authority is set, the subsequent process is directly executed.
  • optionally, while sending the identity information to the server, the terminal sends the configured permission information to the server, and correspondingly, the server associates and stores the permission information and the identity information.
  • the corresponding relationship among geographic location information, identity information and authority information is shown in Table 2.
  • the server detects whether the second identity information corresponding to the second terminal satisfies the viewing authority indicated by the authority information, and if so, executes the following step 1006, and if not, does not feed back the first identity information to the second terminal.
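A minimal sketch of this check is given below, assuming the permission information is expressed as blocked industries and positions as in the shielding example earlier; the field names are illustrative.

```python
# Hedged sketch: decide whether the requester's second identity information
# satisfies the viewing authority of the first identity information. Field
# names ("industry", "position") are assumptions for illustration.
def satisfies_viewing_authority(second_identity, permission_info):
    """permission_info: {'blocked_industries': set, 'blocked_positions': set} or None."""
    if not permission_info:  # no viewing authority configured: visible to everyone
        return True
    if second_identity.get("industry") in permission_info.get("blocked_industries", set()):
        return False
    if second_identity.get("position") in permission_info.get("blocked_positions", set()):
        return False
    return True
```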
  • Step 1006 In response to the second identity information satisfying the viewing authority of the first identity information, determine the person relevancy based on the first identity information and the second identity information reported by the second terminal.
  • for a second terminal with viewing authority, in order to improve the efficiency with which the user locates people of high communication value based on the displayed augmented reality business cards, the server determines, based on the first identity information and the second identity information corresponding to the second terminal, the person relevancy between the user of the first terminal and the user of the second terminal.
  • optionally, when applied to an offline meeting scenario, the server may calculate the person relevancy through an acquaintance algorithm; when applied to an offline dating scenario, the server may calculate the person relevancy through an interest matching algorithm; when applied to an offline recruitment scenario, the server may calculate the person relevancy through a resume matching algorithm, which is not limited in this embodiment.
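The application leaves the concrete relevancy formulas open; as one hedged illustration only, an interest-matching score could be the overlap of declared hobbies, as sketched below.

```python
# Illustrative stand-in for an interest matching algorithm: Jaccard overlap of
# hobby sets, kept strictly positive so a business card is still rendered.
# This is not the application's algorithm, only one plausible instantiation.
def interest_relevancy(first_identity, second_identity):
    a = set(first_identity.get("hobbies", []))
    b = set(second_identity.get("hobbies", []))
    if not a or not b:
        return 0.1                      # assumed floor value
    return max(0.1, len(a & b) / len(a | b))
```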
  • Step 1007 Send the first identity information and the person relevancy to the second terminal, and the second terminal is used to determine the business card size based on the person relevancy, and generate an augmented reality business card based on the first identity information and the business card size.
  • in order to enable the second terminal to highlight the augmented reality business cards corresponding to people of high communication value, the server sends the determined person relevancy to the second terminal while sending the first identity information.
  • after receiving the information fed back by the server, the second terminal determines the business card size of the augmented reality business card according to the person relevancy, and then displays the augmented reality business card of that size around the person.
  • in this embodiment, the person relevancy is determined based on the correlation between the pieces of identity information, and the display size of the augmented reality business card is then determined based on the person relevancy, so that the augmented reality business card corresponding to a person with high communication value is larger than that corresponding to a person with low communication value, which improves the user's efficiency in selecting people to communicate with based on the augmented reality business cards.
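Earlier in the description the card size is given as the default size scaled by the relevancy s, i.e. (s x a) * (s x b); a small sketch of that rule follows, with the default dimensions chosen arbitrarily for the example.

```python
# Sketch of the size rule: relevancy s in (0, 1] scales the default card size.
def business_card_size(relevancy, default_w, default_h):
    s = max(0.0, min(1.0, relevancy))   # clamp to a sane range
    return s * default_w, s * default_h

# A person with relevancy 0.9 gets a visibly larger card than one with 0.5.
print(business_card_size(0.9, 400, 240))  # (360.0, 216.0)
print(business_card_size(0.5, 400, 240))  # (200.0, 120.0)
```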
  • in an illustrative example, during the identity information display process, the interaction process between the terminal and the server is as shown in FIG. 11.
  • Step 1101 The terminal acquires the identity information entered by user A, and reports it to the server.
  • While reporting the identity information of user A, the terminal also reports geographic location information.
  • Step 1102 The server stores the identity information and geographic location information of user A.
  • Step 1103 Based on the itinerary information of user A, the server pushes a prompt when determining that user A is in the target itinerary.
  • Step 1104 The terminal determines whether the augmented reality mode is enabled, where the augmented reality mode (also called the augmented reality social mode) is used to indicate a mode in which identity information is presented in an augmented reality manner.
  • Step 1105 The terminal determines the identity information that user A chooses to display externally, and reports it to the server.
  • Step 1106 The server stores the identity information displayed externally by user A.
  • Step 1107 The terminal acquires an image, captured by user A using the terminal, that includes user B in the environment.
  • Step 1108 The terminal determines the geographic location information of user B, and sends a request to the server.
  • Step 1109 The server obtains the identity information of user B based on the geographic location information.
  • Step 1110 The server determines whether user A has the authority to obtain the identity information of user B.
  • Step 1111 The server determines the person relevancy between user A and user B.
  • Step 1112 The terminal displays an augmented reality business card based on the identity information and the person relevancy.
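To make the FIG. 11 exchange concrete, the sketch below shows one possible shape for the request sent in step 1108 and the reply used in step 1112; all field names and values are illustrative assumptions, not a defined protocol.

```python
# Hypothetical message shapes for the terminal-server exchange of FIG. 11.
identity_acquisition_request = {
    "requester_account": "user_A",
    "first_geo_location": {"lat": 22.5431, "lng": 114.0579},  # user B's estimated position
}

identity_acquisition_response = {
    "identity_info": {"name": "Li Si", "company": "YYY Co., Ltd.",
                      "position": "Project Manager"},
    "person_relevancy": 0.9,  # used by the terminal to size the AR business card
}
```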
  • FIG. 12 is a structural block diagram of an apparatus for displaying identity information provided by an exemplary embodiment of the present application. As shown in FIG. 12 , the apparatus includes:
  • the image acquisition module 1201 is used for image acquisition through a camera, and the acquired images include persons in the environment;
  • the information display module 1204 is configured to display the image collected by the camera based on the augmented reality social mode, where the augmented reality social mode is used to display augmented reality social information for the currently logged-in account; and to display, based on the geographic location of the person in the environment, first identity information around the person in an augmented reality manner, where the first identity information is used to indicate the social identity of the person.
  • the device further includes:
  • a location determination module 1202, configured to identify the person in the image, and determine first geographic location information corresponding to the person, where the first geographic location information is used to represent the geographic location of the person in the environment;
  • an information acquisition module 1203, configured to acquire the first identity information of the character based on the first geographic location information
  • the information display module 1204 is configured to display the first identity information around the character by means of augmented reality.
  • the information display module 1204 includes:
  • a business card generating unit configured to generate an augmented reality business card based on the first identity information
  • a business card position determination unit configured to determine the business card position information of the augmented reality business card under the camera coordinate system based on the character position information of the person under the camera coordinate system;
  • a business card display unit configured to display the augmented reality business card at the position indicated by the business card position information by means of augmented reality, wherein the position indicated by the business card position information is located around the position indicated by the person position information .
  • optionally, the business card generating unit is configured to:
  • acquire a person relevancy, where the person relevancy is determined based on the first identity information and second identity information of the account corresponding to the terminal;
  • determine a business card size based on the person relevancy, where the business card size is positively correlated with the person relevancy;
  • generate the augmented reality business card based on the first identity information and the business card size.
  • the device further includes:
  • An orientation adjustment module configured to adjust the business card orientation of the augmented reality business card based on the face orientation in response to the change of the face orientation of the person, wherein the business card orientation is consistent with the face orientation.
  • the device further includes:
  • a setting interface display module configured to display an information display setting interface, where the information display setting interface is used to set second identity information to be displayed externally, and the second identity information corresponds to the account logged in by the terminal;
  • a reporting module configured to turn on the augmented reality social mode in response to the information setting operation in the information display setting interface, and report the second identity information and the second geographic location information of the terminal in the world coordinate system to the server .
  • the information display setting interface is also used to set viewing authority
  • the reporting module is also used to:
  • in response to the permission setting operation in the information display setting interface, send permission information to the server, where the permission information is used to indicate the viewing permission of the second identity information.
  • optionally, the setting interface display module includes:
  • a reminder display unit, configured to display an information display reminder in response to the current geographic location and the current time indicating that the terminal is in the target itinerary;
  • a setting interface display unit configured to display the information display setting interface in response to a triggering operation of the information display reminder.
  • the first identity information is set with viewing authority
  • the information acquisition module 1203 is used for:
  • the first identity information is acquired from the server based on the first geographic location information and the second identity information of the account corresponding to the terminal, where the second identity information satisfies the viewing authority of the first identity information.
  • the location determination module 1202 is used for:
  • identify the person in the image, and determine person position information of the person in the camera coordinate system;
  • determine the first geographic location information of the person in the world coordinate system based on the person position information and second geographic location information of the terminal in the world coordinate system.
  • FIG. 13 is a structural block diagram of an apparatus for displaying identity information provided by another exemplary embodiment of the present application. As shown in FIG. 13 , the apparatus includes:
  • the request receiving module 1301 is configured to receive an identity information acquisition request sent by a second terminal, where the identity information acquisition request includes first geographic location information, the first geographic location information is determined by the terminal by performing person recognition on images collected by a camera, and the first geographic location information is used to represent the geographic location of the person in the environment;
  • an information determination module 1302 configured to determine first identity information of the character based on the first geographic location information, where the first identity information is augmented reality social information corresponding to the character;
  • the information sending module 1303 is configured to send the first identity information to the second terminal, and the second terminal is configured to display the first identity information around the character in an augmented reality manner.
  • the device further includes:
  • a receiving module used for receiving geographic location information and identity information reported by each terminal
  • the information determination module 1302 includes:
  • a terminal determining unit configured to determine a first terminal located at the geographic location indicated by the first geographic location information
  • an information determining unit configured to determine the identity information reported by the first terminal as the first identity information.
  • the device further includes:
  • a relevancy determination module configured to determine a person relevancy based on the first identity information and the second identity information reported by the second terminal
  • the information sending module 1303 is also used for:
  • send the first identity information and the person relevancy to the second terminal, where the second terminal is configured to determine a business card size based on the person relevancy, and generate an augmented reality business card based on the first identity information and the business card size.
  • the first identity information is set with viewing authority
  • the information sending module 1303 is also used for:
  • acquire second identity information corresponding to the second terminal and permission information corresponding to the first identity information, where the permission information is used to indicate the viewing permission of the first identity information;
  • in response to the second identity information satisfying the viewing permission of the first identity information, send the first identity information to the second terminal.
  • it should be noted that the apparatus provided in the foregoing embodiments is illustrated only by the division of the foregoing functional modules; in practical applications, the foregoing functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
  • the apparatus and method embodiments provided in the above embodiments belong to the same concept, and the implementation process thereof is detailed in the method embodiments, which will not be repeated here.
  • FIG. 14 shows a structural block diagram of a terminal 1400 provided by an exemplary embodiment of the present application.
  • the terminal 1400 may be a portable mobile terminal, such as a smartphone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, or a Moving Picture Experts Group Audio Layer IV (MP4) player.
  • Terminal 1400 may also be referred to as user equipment, a portable terminal, or by other names.
  • the terminal 1400 includes: a processor 1401 and a memory 1402 .
  • the processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 1401 may be implemented in at least one hardware form of digital signal processing (DSP), a field-programmable gate array (FPGA), or a programmable logic array (PLA).
  • the processor 1401 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the wake-up state, also called a central processing unit (CPU); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1401 may be integrated with a graphics processor (Graphics Processing Unit, GPU), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1401 may further include an artificial intelligence (Artificial Intelligence, AI) processor for processing computing operations related to machine learning.
  • Memory 1402 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1402 may also include high-speed random access memory, as well as non-volatile memory, such as one or more disk storage devices, flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, where the at least one instruction is used to be executed by the processor 1401 to implement the methods provided by the embodiments of the present application.
  • in some embodiments, the terminal 1400 may optionally include other components. Those skilled in the art can understand that the structure shown in FIG. 14 does not constitute a limitation on the terminal 1400, and the terminal may include more or fewer components, a combination of certain components, or a different arrangement of components.
  • FIG. 15 shows a schematic structural diagram of a server provided by an exemplary embodiment of the present application. Specifically, the server 1500 includes a central processing unit (CPU) 1501, a system memory 1504 including a random access memory 1502 and a read-only memory 1503, and a system bus 1505 connecting the system memory 1504 and the central processing unit 1501.
  • the server 1500 also includes a basic input/output system (I/O system) 1506 that facilitates the transfer of information between various devices in the computer, and a mass storage device 1507 used to store an operating system 1513, application programs 1514, and other program modules 1515.
  • the basic input/output system 1506 includes a display 1508 for displaying information and input devices 1509 such as a mouse, keyboard, etc., for user input of information.
  • the display 1508 and the input device 1509 are both connected to the central processing unit 1501 through the input and output controller 1510 connected to the system bus 1505 .
  • the basic input/output system 1506 may also include an input output controller 1510 for receiving and processing input from a number of other devices such as a keyboard, mouse, or electronic stylus.
  • input output controller 1510 also provides output to a display screen, printer, or other type of output device.
  • the mass storage device 1507 is connected to the central processing unit 1501 through a mass storage controller (not shown) connected to the system bus 1505 .
  • the mass storage device 1507 and its associated computer-readable media provide non-volatile storage for the server 1500. That is, the mass storage device 1507 may include a computer-readable medium (not shown) such as a hard disk or a drive.
  • Computer-readable media can include computer storage media and communication media.
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
  • Computer storage media include random access memory (RAM), read-only memory (ROM), flash memory or other solid-state storage technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices.
  • the computer storage medium is not limited to the above-mentioned ones.
  • the system memory 1504 and the mass storage device 1507 described above may be collectively referred to as memory.
  • the memory stores one or more programs, which are configured to be executed by the one or more central processing units 1501 and contain instructions for implementing the foregoing methods; the central processing unit 1501 executes the one or more programs to implement the server-side methods in each of the foregoing method embodiments.
  • according to various embodiments of the present application, the server 1500 may also run by being connected, through a network such as the Internet, to a remote computer on the network. That is, the server 1500 may be connected to the network 1512 through the network interface unit 1511 connected to the system bus 1505, or the network interface unit 1511 may be used to connect to other types of networks or remote computer systems (not shown).
  • the memory further includes one or more programs, the one or more programs are stored in the memory, and the one or more programs include instructions for performing the steps performed by the server in the method provided by the embodiments of the present application.
  • Embodiments of the present application further provide a computer-readable storage medium, where at least one piece of program code is stored in the computer-readable storage medium, and the program code is loaded and executed by a processor to implement the method for displaying identity information described in the above aspects.
  • Embodiments of the present application provide a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the method for displaying identity information described in the foregoing embodiments.
  • the medium may be a computer-readable storage medium included in the memory in the above-mentioned embodiments; it may also be a computer-readable storage medium that exists independently and is not assembled into the terminal.
  • the computer-readable storage medium stores at least one instruction, at least one piece of program, code set or instruction set, and the at least one instruction, the at least one piece of program, the code set or the instruction set is loaded and executed by a processor to implement The method described in any of the above method embodiments.
  • the computer-readable storage medium may include a ROM, a RAM, a solid state drive (SSD), an optical disc, or the like.
  • the RAM may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM).

Abstract

一种身份信息的展示方法、装置、终端、服务器及存储介质,涉及增强现实技术领域。该方法包括:步骤201,显示通过摄像头采集的图像;步骤202,基于增强现实社交模式显示摄像头采集的图像,增强现实社交模式用于为当前登录的帐号展示增强现实社交信息。步骤203,基于人物在环境中所处的地理位置,通过增强现实方式在人物的周围展示第一身份信息。

Description

身份信息的展示方法、装置、终端、服务器及存储介质
本申请要求于2021年01月28日提交的申请号为202110120989.X、发明名称为“身份信息的展示方法、装置、终端、服务器及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及增强现实技术领域,特别涉及一种身份信息的展示方法、装置、终端、服务器及存储介质。
背景技术
线下交互是一种多人在特定物理空间内进行的交互模式,常见的线下交互场景包括行业峰会、客户答谢会、交友会等等。
参加线下交互时,参与者之间可能互不相识,往往需要经过一番沟通交流后才能知悉对方的身份。为了提高线下交互时,参与者之间的交互效率,相关技术中,线下交互的举办方可以预先收集整理参与者的身份信息(必须包含照片),并将整理后的身份信息下发给各个参与者;线下交互过程中,参与者即可通过人脸比对的方式确定对方身份。
然而,当线下交互场景中包含大量参与者时,人为进行人脸比对需要花费大量时间,且人脸比对结果的误差较大,导致线下交互的效率较低,且准确性较差。
发明内容
本申请实施例提供了一种身份信息的展示方法、装置、终端、服务器及存储介质,能够提高线下交互的效率以及准确性。所述技术方案如下:
一方面,本申请实施例提供了一种身份信息的展示方法,应用于终端中,所述方法包括:
通过摄像头进行图像采集,采集到的图像中包括处于环境中的人物;
基于增强现实社交模式显示所述摄像头采集的图像,所述增强现实社交模式用于为当前登录的帐号展示增强现实社交信息;
基于所述人物在环境中所处的地理位置,通过增强现实方式在所述人物的周围展示第一身份信息,所述第一身份信息用于指示所述人物的社交身份。
另一方面,本申请实施例提供了一种身份信息的展示方法,所述方法包括:
接收第二终端发送的身份信息获取请求,所述身份信息获取请求中包含第一地理位置信息,所述第一地理位置信息由终端对摄像头采集到的图像进行人物识别并确定得到,所述第一地理位置信息用于表征人物在环境中所处的地理位置;
基于所述第一地理位置信息,确定所述人物的第一身份信息,所述第一身份信息为所述人物对应的增强现实社交信息;
向所述第二终端发送所述第一身份信息,所述第二终端用于通过增强现实方式,在所述人物的周围展示所述第一身份信息。
另一方面,本申请实施例提供了一种身份信息的展示装置,所述装置包括:
图像采集模块,用于通过摄像头进行图像采集,采集到的图像中包括处于环境中的人物;
信息展示模块,用于基于增强现实社交模式显示所述摄像头采集的图像,所述增强现实社交模式用于为当前登录的帐号展示增强现实社交信息;基于所述人物在环境中所处的地理位置,通过增强现实方式在所述人物的周围展示第一身份信息,所述第一身份信息用于指示所述人物的社交身份。
另一方面,本申请实施例提供了一种身份信息的展示装置,所述装置包括:
请求接收模块,用于接收第二终端发送的身份信息获取请求,所述身份信息获取请求中 包含第一地理位置信息,所述第一地理位置信息由终端对摄像头采集到的图像进行人物识别并确定得到,所述第一地理位置信息用于表征人物在环境中所处的地理位置;
信息确定模块,用于基于所述第一地理位置信息,确定所述人物的第一身份信息,所述第一身份信息为所述人物对应的增强现实社交信息;
信息发送模块,用于向所述第二终端发送所述第一身份信息,所述第二终端用于通过增强现实方式,在所述人物的周围展示所述第一身份信息。
另一方面,本申请实施例提供了一种终端,所述终端包括处理器和存储器,所述存储器中存储有至少一条指令,所述至少一条指令由所述处理器加载并执行以实现如上述方面终端侧的身份信息的展示方法。
另一方面,本申请实施例提供了一种服务器,所述服务器包括处理器和存储器,所述存储器中存储有至少一条指令,所述至少一条指令、所述至少一段程序由所述处理器加载并执行以实现如上述方面服务器侧的身份信息的展示方法。
另一方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条程序代码,所述程序代码由处理器加载并执行以实现如上述方面终端侧的身份信息的展示方法,或,实现如上述方面服务器侧的身份信息的展示方法。
另一方面,本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述方面提供的身份信息的展示方法。
本申请实施例提供的技术方案带来的有益效果至少包括:
当需要在线下交互时获取特定人物的身份信息时,只需要使用终端进行图像采集,终端即可识别图像中的人物,并确定该人物在真实环境中的地理位置,从而基于该地理位置获取该人物的身份信息,进而采用增强现实方式将身份信息显示在人物周围;采用本申请实施例提供的方案,用户无需通过人脸比对确定参与者的身份,在包含大量参与者的线下交互场景下,能够提高获取身份信息的效率;同时,基于地理位置信息获取身份信息,能够避免人脸比对结果误差较大的问题,有助于提高获取到的身份信息的准确性,提高了线下交互的效率。
附图说明
图1示出了本申请一个示例性实施例提供的实施环境的示意图;
图2示出了本申请一个示例性实施例提供的身份信息的展示方法的流程图;
图3示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图;
图4是本申请一个示例性实施例示出的人物的地理位置信息确定过程的实施示意图;
图5是本申请一个示例性实施例示出的增强现实名片显示效果的界面示意图;
图6示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图;
图7是本申请一个示例性实施例提供的身份信息展示过程的界面示意图;
图8是本申请一个示例性实施例示出的查看权限设置过程的界面示意图;
图9示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图;
图10示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图;
图11是本申请一个示例性实施例示出的终端与服务器交互过程的流程图;
图12是本申请一个示例性实施例提供的身份信息的展示装置的结构框图;
图13是本申请另一个示例性实施例提供的身份信息的展示装置的结构框图;
图14示出了本申请一个示例性实施例提供的终端的结构框图;
图15示出了本申请一个示例性实施例提供的服务器的结构示意图。
具体实施方式
图1示出了本申请一个示例性实施例提供的实施环境的示意图。该实施环境中包括第一终端110、服务器120和第二终端130。其中,第一终端110与服务器120之间,第二终端130与服务器120之间通过通信网络进行数据通信,可选的,通信网络可以是有线网络也可以是无线网络,且该通信网络可以是局域网、城域网以及广域网中的至少一种。
第一终端110是第一用户使用的终端,该终端是具有图像采集以及增强现实功能的电子设备,该电子设备可以是智能手机、平板电脑、智能眼镜等。借助图像采集以及增强现实功能,第一终端110可以通过增强现实方式展示周围人物的身份信息。
比如,当第一终端110为智能手机时,智能手机可以通过后置摄像头进行图像采集,从而通过屏幕显示采集到的图像,并在图像中人物的周围显示增强现实名片(包含身份信息);当第一终端110为智能眼镜时,智能眼镜可以通过摄像头进行图像采集,并通过具有显示功能的镜片(非透明镜片,相当于显示屏)显示采集到的图像以及增强现实名片,或者,通过投影组件将增强现实名片的影像投影在镜片(透明镜片)或者人眼眼球上。
本申请实施例中,第一终端110还具有定位功能,通过该定位功能,第一终端110可以获取在环境中所处地理位置的地理位置信息,并基于自身的地理位置信息,确定所采集图像中人物的地理位置信息,其中,该地理位置信息可以为经纬度坐标信息,该定位功能可以通过定位组件实现。该定位组件可以为全球定位系统(Global Positioning System,GPS)组件、北斗定位组件等等,本实施例对此不作限定。
第二终端130是第二用户使用的终端,该终端是具有定位功能的电子设备,该电子设备可以是设置有定位组件的智能手机、平板电脑、智能眼镜等。通过定位功能获取到地理位置信息后,第二终端130可以向服务器120实时上报地理位置信息,以便服务器120基于上报的地理位置信息实现身份信息展示功能。
可以理解的是,在本申请的具体实施方式中,涉及到定位信息、地理位置信息、图像数据等相关的数据,当本申请以上实施例运用到具体产品或技术中时,需要获得用户许可、授权或者同意,且相关数据的收集、使用和处理需要遵守相关国家和地区的相关法律法规和标准。
可选的,第二终端130同样具有图像采集以及增强现实功能,借助图像采集以及增强现实功能,第二终端130同样能够通过增强现实方式展示周围人物的身份信息。
服务器120可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(Content Delivery Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。可选地,服务器120还可以是区块链系统中的节点,本实施例对此不加以限定。
可选的,第一终端110和第二终端130中安装有具有身份信息展示功能的应用程序,服务器120是该应用程序的后台服务器,该应用程序可以是即时通信应用程序、社交类应用程序、交友类应用程序、招聘应用程序等等,本实施例对此不作限定。
本申请实施例中,服务器120中存储有各个终端上报的身份信息,以及各个终端实时上报的地理位置信息,且终端对应的身份信息和地理位置信息关联存储。
在线下交互场景下,如图1所示,用户A使用的第一终端110向服务器120上报身份信息以及地理位置信息;以及,用户B使用的第二终端130向服务器120上报身份信息以及地理位置信息。当用户A想要获取用户B的身份信息时,使用第一终端110拍摄用户B。第一终端110基于拍摄的图像以及自身的地理位置信息,确定出用户B的第一地理位置信息,从而向服务器120发送包含第一地理位置信息的身份信息获取请求。服务器120接收到该请求后,确定用户B位于第一地理位置信息所指示的位置,从而将用户B的身份信息反馈给第一终端110。第一终端110根据接收到的身份信息,生成增强现实名片111,并将增强现实名片111显示在图像中用户B的周围。
类似的,当用户B想要获取用户A的身份信息时,同样可以使用第二终端130拍摄用户 A,并确定出用户A的第二地理位置信息,从而基于第二地理位置信息从服务器120处获取用户B的身份信息,并进行增强现实展示。
上述实施例仅以应用于智能手机为例进行说明,当应用于智能眼镜时,用户只需要佩戴智能眼镜并注视需要获取身份信息的人物即可触发身份信息获取展示流程(可以以注视时长超过时长阈值为触发条件),无需手动触发拍摄功能,本实施例对此并不构成限定。
需要说明的是,上述实施例仅通过第一终端和第二终端对身份信息的展示过程进行示意性说明,在实际应用过程中,本申请实施例提供的方案可以应用于三个及以上用户的线下交互场景(即包含三个及三个以上终端),本实施并不对此构成限定。
此外,本申请各个实施例中涉及的拍摄动作可以是摄像头开启后将摄像头对准目标的动作,相应的,身份信息显示在实时的取景画面中;或者,拍摄动作可以是摄像头开启后触发拍摄控件(比如快门按钮)的动作,相应的,身份信息与摄像头采集到影像共同记录在拍摄的照片或者录制的视频中。
本申请实施例提供的身份信息的展示方法应用于各类线下交互场景,下面以几种具体的线下交互场景为例进行说明。
1、线下会议场景
应用于线下会议场景时,参会者到达会议现场后,可以在授权后通过终端中安装的应用程序设置对外展示的个人信息(包括公司、姓名等等),并开启应用程序的增强现实名片功能,由应用程序将身份信息和终端的实时地理位置信息上报至服务器。当参会者需要查看其他参会者的身份信息时,只需要使用终端拍摄待查看的参会者,应用程序即对图像中的参会者进行识别,并确定该参会者所处的地理位置,从而基于该地理位置,从服务器处获取参会者对外展示的身份信息,进而在图像中该参会者的周围显示包含身份信息的增强现实名片。
通过上述方式,参与者之间可以快速知悉对方的身份,从而确定是否需要进行进一步沟通,降低获取对方身份信息的沟通成本,减少参与者之间的无效沟通,提高参与者之间的沟通效率。
2、线下交友场景
应用于线下交友场景时,用户达到交友场地后,可以在授权后通过终端中安装的即时通信应用程序设置对外展示的交友信息(包括年龄、爱好、交友目标等等),并开启即时通信应用程序的增强现实名片功能,由即时通信应用程序将交友信息和终端的实时地理位置信息上报至服务器。当某一用户想要查看其他用户的交友信息时,只需要使用终端拍摄待查看的用户,应用程序即对图像中的用户进行识别,并确定该用户所处的地理位置,从而基于该地理位置,从服务器处获取该用户对外展示的交友信息,进而在图像中该用户的周围显示包含交友信息的增强现实名片。用户通过显示的增强现实名片,能够快速了解其他用户的爱好,从而基于该爱好之间的相似度,快速定位爱好相同的用户并进行交友,或者,基于该爱好快速确定沟通话题,提高用户之间的交友效率。
在一些可能的实施方式中,基于显示的增强现实名片,用户之间还可以在即时通信应用中快速建立社交关系,以便进行线上交友。
3、线下招聘场景
应用于线下招聘场景时,招聘者可以预先通过应用程序设置公司信息,应聘者则可以预先通过应用程序设置个人信息。当达到招聘现场时,招聘者和应聘者均开启应用程序的增强现实名片功能。当应聘者想要了解招聘者的公司信息时,只需要使用终端拍摄招聘者,即可获取招聘者的公司信息,从而结合个人信息确定是否需要进一步与招聘者进行沟通;当招聘者想要了解应聘者的个人信息时,只需要使用终端拍摄应聘者,即可获取应聘者的个人信息,从而根据该个人信息与招聘需求,确定是否需要进一步与应聘者进行沟通,提高应聘者和招聘者双方的沟通效率。
当然,除了上述几种线下交互场景外,本申请实施例提供的方案还可以应用于其他线下 交互场景,本实施例并不对具体的应用场景构成限定。
图2示出了本申请一个示例性实施例提供的身份信息的展示方法的流程图。本实施例以该方法用于图1所示的第一终端110或第二终端130为例进行说明,该方法包括如下步骤。
步骤201,通过摄像头进行图像采集。
可选地,采集到的图像中包括处于环境中的人物。
在一种可能的实施方式中,终端中的应用程序开启增强现实名片功能后,当接收到增强现实名片显示指令时,即开启摄像头进行图像采集,采集包含待识别人物的图像。
其中,该摄像头可以是应用3D视觉技术的摄像头,即基于摄像头采集的图像能够感知图像中各个人物的空间方位、尺寸等信息。该3D视觉技术可以是飞行时间(Time of Flight,ToF)技术、单目立体视觉技术、双目视觉技术、多目立体视觉技术或者3D结构光技术中的至少一种,本实施例对此不作限定。
步骤202,基于增强现实社交模式显示摄像头采集的图像。
其中,增强现实社交模式用于为当前登录的帐号展示增强现实社交信息。
在一些实施例中,显示摄像头采集的图像包括如下方式中的至少一种:1、在终端显示屏中以增强现实社交模式显示通过摄像头采集的图像;2、在增强现实眼镜上以增强现实社交模式显示摄像头采集的图像,本实施例对此不加以限定。
步骤203,基于人物在环境中所处的地理位置,通过增强现实方式在人物的周围展示第一身份信息。
第一身份信息用于指示所述人物的社交身份。可选地,识别图像中的人物,并确定人物对应的第一地理位置信息,第一地理位置信息用于表征人物在真实环境中所处的地理位置。
本申请实施例中,终端基于待识别人物在真实环境中所处的地理位置,获取待识别人物对应的身份信息,因此通过摄像头进行图像采集时,终端识别图像中的人物,并确定人物对应的第一地理位置信息。其中,该第一地理位置信息为世界坐标系下的地理位置信息,比如,该第一地理位置信息可以是人物对应的第一经纬度信息。
在一种可能的实施方式中,终端通过3D视觉技术,并基于自身在真实环境中所处的地理位置确定人物对应的第一地理位置信息,因此实现身份信息展示功能的应用程序需要具备终端地理位置信息的获取权限。
可选的,当识别出图像中包含多个人物时,终端确定各个人物对应的第一地理位置信息;或者,终端从多个人物中确定目标人物,从而确定目标人物对应的第一地理位置信息。其中,该目标人物可以是位于图像画面中央的人物,或者焦点区域的人物,或者用户指定的人物。
可选的,基于第一地理位置信息,获取人物的第一身份信息。
在一种可能的实施方式中,终端向服务器发送包含第一地理位置信息的身份信息获取请求。服务器中存储有各个终端上报的身份信息,接收到请求后,服务器确定位于该第一地理位置信息所指示地理位置的终端,从而将该终端对应用户的身份信息确定为第一身份信息。
可选的,由于确定出的第一地理位置信息存在一定的误差,因此服务器确定与第一地理位置信息所指示地理位置相距最近,且间距小于距离阈值(比如1m)的终端,从而将该终端对应用户的身份信息确定为第一身份信息。
其中,服务器存储的身份信息为终端预先设置的,并授权允许对外展示的身份信息,该身份信息可以包括文字信息、图片信息和视频信息中的至少一种。用户通过终端设置的身份信息的信息类型可以相同,也可以不同。比如,用户A通过终端设置的身份信息中可以包括姓名、公司和职位,而用户B通过终端设置的身份信息中可以包括姓名、公司等。
并且,不同应用场景下,该身份信息的信息类型也可能不同。比如,线下会议场景下,身份信息的信息类型可以包括姓名、公司和职位;线下交友场景下,身份信息的信息类型可以包括姓名、年龄和爱好;线下招聘场景下,身份信息的信息类型可以包括姓名、毕业院校和专业等等。
可选地,通过增强现实方式,在人物的周围展示第一身份信息。
可选的,终端获取到第一身份信息后,通过增强现实方式将第一身份信息显示在人物的周围。其中,终端可以将第一身份信息渲染在特定的增强现实对象元素中,从而将增强现实对象元素显示在人物的周围。比如,终端可以将第一身份信息渲染在增强现实名片中,从而在人物周围显示增强现实名片,使用户可以通过人物对应的增强现实名片获取其身份信息。
可选的,第一身份信息与人物绑定显示,即第一身份信息随着人物在画面中移动而移动,从而保证人物与第一身份信息的对应性。并且,当该人物移动至画面之外时,该第一身份信息停止展示。
综上所述,本申请实施例中,当需要在线下交互时获取特定人物的身份信息时,只需要使用终端进行图像采集,终端即可识别图像中的人物,并确定该人物在真实环境中的地理位置,从而基于该地理位置获取该人物的身份信息,进而采用增强现实方式将身份信息显示在人物周围;采用本申请实施例提供的方案,用户无需通过人脸比对确定参与者的身份,在包含大量参与者的线下交互场景下,能够提高获取身份信息的效率;同时,基于地理位置信息获取身份信息,能够避免人脸比对结果误差较大的问题,有助于提高获取到的身份信息的准确性,提高了线下交互的效率。
在一种可能的实施方式中,终端采用增强现实名片对获取到的身份信息进行展示,且为了提高用户定位高沟通价值人物的效率,终端基于待识别人物与自身用户之间的相关度,确定增强现实名片的尺寸,使用户基于增强现实名片的尺寸即可快速分辨不同人物的沟通价值。下面采用示例性的实施例进行说明。
图3示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图。本实施例以该方法用于图1所示的第一终端110或第二终端130为例进行说明,该方法包括如下步骤。
步骤301,通过摄像头进行图像采集。
本步骤的实施方式可以参考步骤201,本实施例在此不再赘述。
步骤302,识别图像中的人物,并确定人物在相机坐标系下的人物位置信息。
在一种可能的实施方式中,终端具备获取自身在世界坐标系下所处地理位置的能力,以及确定图像中人物在相机坐标系下所处位置的能力,因此,终端可以基于不同坐标系下地理位置信息之间的转换关系,确定图像中人物在世界坐标系下的地理位置信息。
可选的,终端识别出图像中的人物后,通过3D视觉技术,确定该人物在相机坐标系下的人物位置信息。其中,该相机坐标系是以摄像头的聚焦中心为坐标原点、以光轴为Z轴的三维直角坐标系。
在一种可能的实施方式中,当终端设置有双摄像头时,终端可以通过双目立体视觉技术,确定人物在相机坐标系下的人物位置信息(基于双目之间的成像差原理);当终端设置有ToF摄像头时,终端可以通过ToF深度测量技术,确定人物在相机坐标系下的人物位置信息。当然,终端也可以结合多种3D视觉技术确定人物位置信息,以此提高人物位置信息的准确性,本实施例对此不作限定。
示意性的,如图4所示,用户使用终端41拍摄人物42,得到人物42在相机坐标系43下的人物位置信息P 1
步骤303,基于人物位置信息以及终端在世界坐标系下的第二地理位置信息,确定人物在世界坐标系下的第一地理位置信息。
在一种可能的实施方式中,启用增强现实名片功能后,终端实时获取自身在世界坐标系下的第二地理位置信息,从而对人物位置信息进行相机坐标系到世界坐标系的转化,基于第二地理位置信息以及转化后的人物位置信息,确定该人物在世界坐标系下的第一地理位置信息。
其中,终端可以通过旋转矩阵和平移矩阵对人物位置信息进行转换,本实施例对此不作限定。
示意性的,如图4所示,终端41基于自身的第二地理位置信息P 2,以及人物位置信息P 1,确定人物42在世界坐标系44下的第一地理位置信息P 3
需要说明的是,本实施例仅以上述方式为例,对第一地理位置信息的确定过程进行说明,在其他可能的实施方式中,终端还可以通过测量与人物之间的距离以及相对方位,或者其他可能的方式确定图像中人物的第一地理位置信息,本实施例对此并不进行限定。
步骤304,基于第一地理位置信息以及终端对应用户的第二身份信息,从服务器处获取第一身份信息,其中,第二身份信息满足第一身份信息的查看权限。
在一种可能的实施方式中,开启增强现实名片功能后,通过拍摄获取各个开启增强现实名片功能的用户的身份信息。
然而,在部分应用场景下,用户需要对自身的身份信息进行获取权限的配置。因此,在另一种可能的实施方式中,用户可以为自身的身份信息设置查看权限,实现仅向具有查看权限的用户展示身份信息。可选的,终端将用户设置的身份信息以及查看权限共同上传至服务器,由服务器进行存储。关于查看权限的具体设置过程,下述实施例将进行详述。
相应的,终端从服务器处获取图像中人物的第一身份信息时,除了需要提供该人物的第一地理位置信息外,还需要提供自身对应用户的第二身份信息,以便服务器基于第二身份信息,确定当前终端是否具有第一身份信息的查看权限。其中,第二身份信息与终端登录的帐号对应。
在一种可能的实施方式中,终端向服务器发送包含第一地理位置信息以及第二身份信息的身份信息获取请求,以便服务器基于第一地理位置信息查看匹配的第一身份信息,并基于第二身份信息确定该终端是否具有第一身份信息的查看权限。当然,终端可以仅向服务器发送第一地理位置信息以及终端对应用户的用户标识,由服务器基于用户标识获取第二身份信息,本实施例对此不作限定。
可选的,该查看权限包括身份信息条件,该身份信息条件可以包括行业条件、公司条件、职位条件、性别条件、年龄条件、爱好条件、学历条件、工作经验条件等等,本实施例并不对查看权限的具体内容进行限定。
当第二身份信息满足第一身份信息的查看权限时,服务器即向终端反馈第一身份信息;否则,服务器不会向终端反馈第一身份信息。
步骤305,基于第一身份信息生成增强现实名片。
获取到第一身份信息后,终端即根据第一身份信息渲染生成增强现实名片,其中,该增强现实名片可以基于身份信息在预设名片模板的基础上生成。
在一种可能的实施方式中,基于不同身份信息生成的增强现实名片的尺寸、形式一致。在另一种可能的实施方式中,为了提高用户定位高沟通价值用户的效率,基于不同身份信息生成的增强现实名片的尺寸和形式中的至少一项不同。可选的,终端生成增强现实名片时可以包括如下步骤。
一、获取人物相关度,人物相关度基于第一身份信息以及终端对应用户的第二身份信息确定得到。
在一种可能的实施方式中,服务器基于第一地理位置信息获取到第一身份信息后,获取终端对应用户的第二身份信息,并基于第一身份信息和第二身份信息确定人物相关度,从而将人物相关度与第一身份信息一同反馈至终端。其中,人物相关度越高,表明该人物对于当前终端用户的沟通价值越高,人物相关度越低,表明该人物对于当前终端用户的沟通价值越低。
可选的,当应用于线下会议场景时,服务器可以通过预设的熟人算法计算人物相关度;当应用于线下交友场景时,服务器可以通过兴趣匹配算法计算人物相关度;当应用于线下招聘场景时,服务器可以通过简历匹配算法计算人物相关度,本实施例对此不作限定。
当然,确定人物相关度的步骤也可以由终端执行,从而降低服务器的处理压力,本实施例对此不作限定。
需要说明的是,对于两个用户而言,用户各自对应的人物相关度可能相同,也可能不同。比如用户A对用户B的人物相关度为s 1,而用户B对用户A的人物相关度为s 2,本实施例对此不作限定。
二、基于人物相关度确定名片尺寸,名片尺寸与人物相关度呈正相关关系。
在一种可能的实施方式中,增强现实名片具有默认名片尺寸,终端即根据人物相关度和默认名片尺寸确定第一身份信息对应增强现实名片的名片尺寸,即人物相关度越高,增强现实名片的名片尺寸越大,人物相关度越低,增强现实名片的名片尺寸越小。
比如,当第一身份信息对应的人物相关度为s(0<s≤1),且默认名片尺寸为a*b时,则确定出的名片尺寸为(s×a)*(s×b)。
三、基于第一身份信息和名片尺寸生成增强现实名片。
在一种可能的实施方式中,终端基于第一身份信息和名片尺寸渲染生成增强现实名片,或者,终端基于第一身份信心和默认名片尺寸渲染生成默认增强现实名片,并基于人物相关度对默认增强现实名片进行缩放处理,得到增强现实名片。
示意性的,如图5所示,终端拍摄的图像中包含第一人物51以及第二人物52,由于第一人物51的人物相关度为0.9,而第二人物52的人物相关度为0.5,因此第一人物51对应第一增强现实名片53的名片尺寸大于第二人物52对应第二增强现实名片54的名片尺寸。
需要说明的是,上述实施例仅以不同人物相关度对应不同名片尺寸为例进行说明,在其他可能的实施方案中,可以为不同人物相关度设置不同的名片形式(比如颜色、特效等等)以标识沟通价值的高低,本实施例对此并不进行限定。
步骤306,基于人物在相机坐标系下的人物位置信息,确定增强现实名片在相机坐标系下的名片位置信息。
为了模拟出在真实世界中人物的周围显示实体名片的效果,在显示增强现实名片时,终端需要确定增强现实名片在相机坐标系下的名片显示位置。在一种可能的实施方式中,终端基于人物在相机坐标系下的人物位置信息,从人物位置信息所指示位置周围预定范围内确定出目标位置,从而将目标位置确定为名片位置。
可选的,该名片位置信息所指示的名片位置位于人脸面部下方预设距离(比如人脸下方20cm)处、位于人物头顶预设距离(比如人物头顶10cm)处,本实施例并不对具体名片位置进行限定。
步骤307,通过增强现实方式,在名片位置信息所指示的位置展示增强现实名片,其中,名片位置信息所指示的位置位于人物位置信息所指示位置的周围。
可选地,终端通过增强现实方式,将增强现实名片渲染显示在名片位置信息所指示的位置处,模拟出在真实世界中人物的周围显示实体名片。
示意性的,如图5所示,终端分别在第一人物51和第二人物52的人脸下方20cm处显示第一增强现实名片53以及第二增强现实名片54。
需要说明的是,该增强现实名片的名片位置随着人物位置变化,达到名片跟随人物移动的效果。
步骤308,响应于人物的面部朝向变化,基于面部朝向调整增强现实名片的名片朝向,其中,名片朝向与面部朝向保持一致。
在一种可能的实施方式中,增强现实名片的名片朝向(即包含身份信息的一面)与人物的面部朝向保持一致,相应的,当人物的面部朝向发生变化时,终端基于面部朝向对增强现实名片的名片朝向进行实时调整,提高增强现实名片显示的真实性。
可选的,增强现实名片包括名片正面和名片背面,当人物背对用户时,终端显示增强现实名片的名片背面,该名片背面显示的内容可以由用户自定设置。比如,名片背面可以为公司logo。
本实施例中,终端基于图像中人物在相机坐标系下的人物位置信息,以及终端在世界坐标系下的地理位置信息,确定出人物在世界坐标系下的地理位置信息,提高了确定出的地理 位置信息的准确性,进而提高了后续获取到的身份信息的准确性。
此外,本实施例中,终端基于身份信息之间的相关性,确定出人物相关度,进而基于人物相关度确定增强现实名片的显示尺寸,使具有高沟通价值人物对应的增强现实名片尺寸大于低沟通价值人物对应的增强现实名片尺寸,提高用户基于增强现实名片选择沟通人物的效率。
在一种可能的实施方式中,为了提高用户间建立社交关系的效率,通过增强现实方式展示增强现实名片后,当接收到对增强现实名片的触发操作,且第一身份信息中包含社交信息时,终端显示社交关系建立控件;当接收到对社交关系建立控件的触发操作时,终端基于该社交信息向社交服务器发送社交关系建立请求,请求建立社交关系。其中,终端可以基于增强现实名片在图像中的像素坐标,确定是否接收到对增强现实名片的触发操作。
可选的,当第一身份信息中包含至少两种社交应用对应的社交信息时,终端显示不同社交应用各自对应的社交关系建立控件,并在接收到对目标社交应用对应社交关系建立控件的触发操作时,基于目标社交应用对应的社交信息,通过目标社交应用向社交服务器发送社交关系建立请求。
其中,当本申请实施例提供方案所应用的应用程序与目标社交应用为同一应用程序时,该应用程序直接基于社交信息向社交服务器发送请求;而当本申请实施例提供方案所应用的应用程序与目标社交应用为不同应用程序时,该应用程序通过调用目标社交应用提供的应用程序接口(Application Programming Interface,API),向目标社交应用发送包含社交信息的社交关系建立请求,由目标社交应用基于社交信息向社交服务器发送社交关系建立请求。
示意性的,如图5所示,当接收到对第一增强现实名片53的长按操作时,由于第一增强现实名片53中包含社交帐号,因此终端显示社交关系建立控件55,用户通过点击社交关系建立控件55即可快速建立社交关系。
在其他可能的实施方式中,当接收到对增强现实名片的触发操作,且第一身份信息中包含联系方式信息时,终端显示联系人添加控件;当接收到对联系人添加控件的触发操作时,终端基于联系方式信息自动创建联系人。
示意性的,如图5所示,当接收到对第二增强现实名片54的长按操作时,由于第二增强现实名片54中包含电话,因此终端显示联系人添加控件56,用户通过点击联系人添加控件56即可触发终端自动向通讯录中添加联系人。
上述实施例对身份信息的显示过程进行了说明,下面采用示意性的实施例对身份信息的设置过程进行说明。
图6示出了本申请一个示例性实施例提供的身份信息设置过程的流程图。本实施例以该方法用于图1所示的第一终端110或第二终端130为例进行说明,该方法包括如下步骤。
步骤601,显示信息展示设置界面,信息展示设置界面用于设置对外展示的第二身份信息。
其中,该信息展示设置界面可以由用户手动触发显示,也可以由终端自动触发显示。
在一种可能的实施方式中,当接收到增强现实社交模式启动指令时(通过应用程序内的功能入口触发),终端显示信息展示设置界面,或者,当检测到当前满足增强现实社交模式启动条件时,终端显示提示信息,并在接收到对提示信息的触发操作时,显示信息展示设置界面。
可选的,当终端中设置有行程时,终端获取目标行程(线下活动行程)的行程信息(包括行程地点和行程时间),并检测当前地理位置和当前时间是否处于该目标行程,若当前地理位置和当前时间指示处于目标行程,终端则显示信息展示提醒,提醒用户开启增强现实社交模式。响应于对信息展示提醒的触发操作,终端显示信息展示设置界面。
在一些实施例中,该信息展示设置界面为空白设置界面,用户可以根据身份展示需求, 自定设置对外展示的身份信息条目;在另一些实施例中,为了提高用户设置身份信息的效率,该信息展示设置界面中包含预先设置的身份信息,用户通过信息展示设置界面可以快速选择本次对外展示的身份信息。
示意性的,如图7所示,用户预先在身份信息录入界面71中录入身份信息72并保存。当用户到达目标行程指示的行程地点时,终端显示信息展示提醒通知73,提醒用户设置对外展示的身份信息,并开启增强现实社交模式。当接收到对信息展示提醒通知73的点击操作时,终端显示信息展示设置界面74,信息展示设置界面74中包含在身份信息录入界面71预先录入的身份信息72。
步骤602,响应于信息展示设置界面内的信息设置操作,开启增强现实社交模式,并向服务器上报第二身份信息以及终端在世界坐标系下的第二地理位置信息。
其中,该信息设置操作可以是上传操作、输入操作或勾选操作。可选的,该信息设置操作还可以包括排版操作,该排版操作用于调整第二身份信息在增强现实名片上的显示位置。
示意性的,如图7所示,用户可以信息展示设置界面74中勾选预先设置的身份信息72作为本次对外展示的第二身份信息。并且,用户勾选的第二身份信息将显示在增强现实名片预览区域75中,以便用户在增强现实名片预览区域75中调整各条第二身份信息的显示位置。
完成第二身份信息设置后,终端即开启增强现实社交模式,并将第二身份信息以及自身的第二地理位置信息上传至服务器,由服务器对身份信息和地理位置信息进行关联存储。
由于终端所处的地理位置会发生变化,因此在开启增强现实社交模式过程中,终端向服务器实时上报第二地理位置信息,提高后续身份信息展示的准确性。
步骤603,响应于信息展示设置界面内的权限设置操作,向服务器发送权限信息,权限信息用于指示第二身份信息的查看权限。
信息展示设置界面除了能够设置身份信息外,还可以设置身份信息的查看权限,即用户设置的身份信息仅能够被具有身份信息查看权限的终端(用户)查看。
可选的,终端基于接收到的权限设置操作,确定第二身份信息对应的权限信息,该权限信息用于指示具有查看权限的用户所需满足的条件,或者,用于指示屏蔽用户所需满足的条件。
可选地,终端将确定出的权限信息上报至服务器,由服务器对权限信息和身份信息进行关联存储。
示意性的,如图8所示,用户在信息展示设置界面81内勾选第二身份信息后,可以点击权限设置控件82进行查看权限设置。终端接收到对权限设置控件82的点击操作后,显示屏蔽选项83,用户可以从屏蔽选项83中勾选需要屏蔽的行业以及职位,即属于该屏蔽行业以及屏蔽职位的用户将无法查看该第二身份信息。
在一种可能的实施方式中,完成身份信息设置后,终端显示名片查看提示,提示用户可以通过拍摄获取其他用户的名片。示意性的,如图7所示,用户“李四”完成身份信息设置后,终端显示名片查看提示76,当接收到对名片查看提示76的点击操作时,终端即进行图像采集,并在图像中显示用户“张三”的增强现实名片77,该增强现实名片77中即包含用户“张三”通过信息展示设置界面74设置的第二身份信息。
在其他可能的实施方式中,还可以对具有查看权限的用户进行划分,为不同查看权限等级的用户设置不同的查看内容,比如为一级查看权限的用户设置身份信息均可见,为二级查看权限的用户设置仅部分身份信息(比如姓名和公司)可见,本实施例对此不作限定。
本实施例中,终端基于当前地理位置和当前时间,确定是否处于目标行程,从而在处于目标行程时,自动提示用户进行身份信息设置并开启增强现实社交模式,避免因用户忘记开启增强现实社交模式导致增强现实名片无法正常显示的问题。
并且,在设置身份信息的同时,为身份信息设置查看权限,提高身份信息展示的针对性,避免身份信息被随意获取导致的个人信息泄露。
图9示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图。本实施例以该方法用于图1所示的服务器为例进行说明,该方法包括如下步骤。
步骤901,接收第二终端发送的身份信息获取请求,身份信息获取请求中包含第一地理位置信息,第一地理位置信息由终端对摄像头采集到的图像进行人物识别并确定得到,第一地理位置信息用于表征人物在真实环境中所处的地理位置。
在一种可能的实施方式中,第二终端开启增强现实社交模式后,通过摄像头进行图像采集,对图像中的人物进行识别,并确定出人物在世界坐标系下的第一地理位置信息,从而向服务器发送包含第一地理位置信息的身份信息获取请求,以获取该人物的第一身份信息。
可选的,接收到第二终端发送的身份信息获取请求后,服务器检测第二终端是否开启增强现实社交模式,若第二终端未开启增强现实社交模式,则提示开启增强现实社交模式;若第二终端已开启增强现实社交模式,则执行后续流程。其中,服务器可以通过检测是否存储有第二终端对应的身份信息以及地理位置信息来确定第二终端是否开启增强现实社交模式。
步骤902,基于第一地理位置信息,确定人物的第一身份信息。
本申请实施例中,服务器中存储有各个(开启增强现实社交模式)终端对应的地理位置信息以及身份信息。接收到身份信息获取请求后,服务器计算第一地理位置信息与各个已存储地理位置信息所指示位置之间的距离,从而基于距离确定位于第一地理位置信息所指示位置处的人物,进而获取该人物对应的第一身份信息。
示意性的,服务器将与第一地理位置信息所指示位置之间距离小于距离阈值,且距离最近的人物确定为目标人物,从而获取目标人物对应的第一身份信息。
为了提高服务器的处理速度,服务器可以对存储的地理位置信息进行区域划分,从而基于第一地理位置信息确定目标区域,进而从目标区域中确定目标人物,本实施例对此不作限定。
可选的,在确定第一身份信息时,服务器还可以基于第一地理位置信息和第二终端的第二地理位置信息,第二终端对应的用户是否与图像中的人物位于同一场景中,并在位于同一场景时,确定人物的第一身份信息。
步骤903,向第二终端发送第一身份信息,第二终端用于通过增强现实方式,在人物的周围展示第一身份信息。
服务器向终端反馈确定出的第一身份信息,以便第二终端通过增强现实方式对第一身份信息进行展示,其中,第二终端展示第一身份信息的方式可以参考上述实施例,本实施例在此不再赘述。
综上所述,本申请实施例中,当需要在线下交互时获取特定人物的身份信息时,只需要使用终端进行图像采集,终端即可识别图像中的人物,并确定该人物在真实环境中的地理位置,从而基于该地理位置获取该人物的身份信息,进而采用增强现实方式将身份信息显示在人物周围;采用本申请实施例提供的方案,用户无需通过人脸比对确定参与者的身份,在包含大量参与者的线下交互场景下,能够提高获取身份信息的效率;同时,基于地理位置信息获取身份信息,能够避免人脸比对结果误差较大的问题,有助于提高获取到的身份信息的准确性,提高了线下交互的效率。
图10示出了本申请另一个示例性实施例提供的身份信息的展示方法的流程图。本实施例以该方法用于图1所示的服务器为例进行说明,该方法包括如下步骤。
步骤1001,接收各个终端上报的地理位置信息以及身份信息。
在一种可能的实施方式中,终端开启增强现实社交模式后,即向服务器上报对外展示的身份信息,并在增强现实社交模式开启后,向服务器上报实时的地理位置信息。服务器接收到终端上报的信息后,即对信息进行存储更新。示意性的,服务器中存储的地理位置信息与身份信息的对应关系如表一所示。
表一
步骤1002,接收第二终端发送的身份信息获取请求,身份信息获取请求中包含第一地理位置信息。
本步骤的实施方式可以参考步骤901,本实施例在此不再赘述。
步骤1003,确定位于第一地理位置信息所指示地理位置的第一终端。
在一种可能的实施方式中,服务器基于第一地理位置信息以及存储的各个终端的地理位置信息,确定待识别人物与各个终端之间的距离,从而将距离待识别人物最近的终端确定为第一终端。
步骤1004,将第一终端上报的身份信息确定为第一身份信息。
可选地,服务器基于地理位置信息域身份信息之间的对应关系,将第一上报身份信息获取请求,其中包含第一地理位置信息为P 0,服务器计算得到P 0与P 1之间的距离为20米,P 0与P 2之间的距离为100米,P 0与P 3之间的距离为0.5米,P 0与P 4之间的距离为50米,确定待识别人物为“李四”,从而获取“李四”应的第一身份信息:姓名:李四;公司:YYY有限公司;职位:项目经理。
步骤1005,获取第二终端对应的第二身份信息以及第一身份信息对应的权限信息,权限信息用于指示第一身份信息的查看权限。
在一种可能的实施方式中,获取到第一身份信息后,服务器检测第一身份信息是否设置有查看权限,若设置有查看权限,则确定第二终端是否具有获取第一身份信息的权限,若未设置查看权限,则直接执行后续流程。
可选的,终端向服务器发送身份信息的同时,向服务器发送设置的权限信息,相应的,服务器将权限信息与身份信息进行关联存储。示意性的,地理位置信息、身份信息与权限信息三者之间的对应关系如表二所示。
表二
服务器检测第二终端对应的第二身份信息是否满足权限信息所指示的查看权限,若满足,则执行下述步骤1006,若不满足,则不向第二终端反馈第一身份信息。
步骤1006,响应于第二身份信息满足第一身份信息的查看权限,基于第一身份信息以及第二终端上报的第二身份信息,确定人物相关度。
对于具有查看权限的第二终端,为了提高用户基于显示的增强现实名片定位高沟通价值人物的效率,服务器基于第一身份信息以及第二终端对应的第二身份信息,确定第一终端用户与第二终端用户之间的人物相关度。
可选的,当应用于线下会议场景时,服务器可以通过熟人算法计算人物相关度;当应用于线下交友场景时,服务器可以通过兴趣匹配算法计算人物相关度;当应用于线下招聘场景时,服务器可以通过简历匹配算法计算人物相关度,本实施例对此不作限定。
步骤1007,向第二终端发送第一身份信息以及人物相关度,第二终端用于基于人物相关度确定名片尺寸,并基于第一身份信息和名片尺寸生成增强现实名片。
为了使第二终端能够突出显示高沟通价值人物对应的增强现实名片,服务器向第二终端方第一身份信息的同时,将确定出的人物相关度发送至第二终端。第二终端接收到服务器反馈的信息后,即根据人物相关度确定增强现实名片的名片尺寸,进而将采用该名片尺寸的增强现实名片显示在人物的周围。其中,显示增强现实名片的具体方式可以参考上述实施例,本实施例在此不再赘述。
本实施例中,基于身份信息之间的相关性,确定出人物相关度,进而基于人物相关度确定增强现实名片的显示尺寸,使具有高沟通价值人物对应的增强现实名片尺寸大于低沟通价值人物对应的增强现实名片尺寸,提高用户基于增强现实名片选择沟通人物的效率。
并且,在设置身份信息的同时,为身份信息设置查看权限,提高身份信息展示的针对性,避免身份信息被随意获取导致的个人信息泄露。
在一个示意性的例子中,身份信息展示过程中,终端与服务器之间的交互过程如图11所示。
步骤1101,终端获取用户A录入身份信息,并上报至服务器。
终端在上报用户A的身份信息的同时,上报地理位置信息。
步骤1102,服务器对用户A的身份信息以及地理位置信息进行存储。
1103,服务器基于用户行程信息,确定用户A处于目标行程时,进行提示推送。
1104,终端是否开启增强现实模式。
增强现实模式,或称为增强现实社交模式,用于指示以增强现实方式展示身份信息的模式。
1105,终端确定用户A选择对外展示的身份信息,并上报服务器。
1106,服务器存储用户A对外展示的身份信息。
1107,终端获取用户A使用终端拍摄包含用户B的图像。
图像中包括处于环境中的用户B。
1108,终端确定用户B的地理位置信息,并向服务器发送请求。
1109,服务器基于地理位置信息,获取用户B的身份信息。
1110,服务器判断用户A是否具有用户B身份信息的获取权限。
1111,服务器确定用户A与用户B之间的人物相关度。
1112,终端基于身份信息和人物相关度显示增强现实名片。
图12是本申请一个示例性实施例提供的身份信息的展示装置的结构框图,如图12所示,该装置包括:
图像采集模块1201,用于通过摄像头进行图像采集,采集到的图像中包括处于环境中的人物;
信息展示模块1204,用于基于增强现实社交模式显示所述摄像头采集的图像,所述增强现实社交模式用于为当前登录的帐号展示增强现实社交信息;基于所述人物在环境中所处的地理位置,通过增强现实方式在所述人物的周围展示第一身份信息,所述第一身份信息用于指示所述人物的社交身份。
在一个可选的实施例中,该装置还包括:
位置确定模块1202,用于识别所述图像中的所述人物,并确定所述人物对应的第一地理位置信息,所述第一地理位置信息用于表征所述人物在环境中所处的地理位置;
信息获取模块1203,用于基于所述第一地理位置信息,获取所述人物的第一身份信息;
信息展示模块1204,用于通过增强现实方式,在所述人物的周围展示所述第一身份信息。
可选的,信息展示模块1204,包括:
名片生成单元,用于基于所述第一身份信息生成增强现实名片;
名片位置确定单元,用于基于所述人物在相机坐标系下的人物位置信息,确定所述增强现实名片在所述相机坐标系下的名片位置信息;
名片展示单元,用于通过增强现实方式,在所述名片位置信息所指示的位置展示所述增强现实名片,其中,所述名片位置信息所指示的位置位于所述人物位置信息所指示位置的周围。
可选的,名片生成单元,用于:
获取人物相关度,所述人物相关度基于所述第一身份信息以及终端对应帐号的第二身份信息确定得到;
基于所述人物相关度确定名片尺寸,所述名片尺寸与所述人物相关度呈正相关关系;
基于所述第一身份信息和所述名片尺寸生成所述增强现实名片。
可选的,所述装置还包括:
朝向调整模块,用于响应于所述人物的面部朝向变化,基于所述面部朝向调整所述增强现实名片的名片朝向,其中,所述名片朝向与所述面部朝向保持一致。
可选的,所述装置还包括:
设置界面显示模块,用于显示信息展示设置界面,所述信息展示设置界面用于设置对外展示的第二身份信息,所述第二身份信息与所述终端登录的帐号对应;
上报模块,用于响应于所述信息展示设置界面内的信息设置操作,开启增强现实社交模式,并向所述服务器上报所述第二身份信息以及终端在世界坐标系下的第二地理位置信息。
可选的,所述信息展示设置界面还用于设置查看权限;
上报模块,还用于:
响应于所述信息展示设置界面内的权限设置操作,向所述服务器发送权限信息,所述权限信息用于指示所述第二身份信息的查看权限。
可选的,设置界面显示模块,包括:
提醒展示单元,用于响应于当前地理位置和当前时间指示处于目标行程,显示信息展示提醒;
设置界面显示单元,用于响应于对所述信息展示提醒的触发操作,显示所述信息展示设置界面。
可选的,所述第一身份信息设置有查看权限;
信息获取模块1203,用于:
基于所述第一地理位置信息以及终端对应帐号的第二身份信息,从所述服务器处获取所述第一身份信息,其中,所述第二身份信息满足所述第一身份信息的查看权限。
可选的,所述位置确定模块1202,用于:
识别图像中的人物,并确定所述人物在相机坐标系下的人物位置信息;
基于所述人物位置信息以及终端在世界坐标系下的第二地理位置信息,确定所述人物在世界坐标系下的所述第一地理位置信息。
图13是本申请另一个示例性实施例提供的身份信息的展示装置的结构框图,如图13所示,该装置包括:
请求接收模块1301,用于接收第二终端发送的身份信息获取请求,所述身份信息获取请 求中包含第一地理位置信息,所述第一地理位置信息由终端对摄像头采集到的图像进行人物识别并确定得到,所述第一地理位置信息用于表征人物在环境中所处的地理位置;
信息确定模块1302,用于基于所述第一地理位置信息,确定所述人物的第一身份信息,所述第一身份信息为所述人物对应的增强现实社交信息;
信息发送模块1303,用于向所述第二终端发送所述第一身份信息,所述第二终端用于通过增强现实方式,在所述人物的周围展示所述第一身份信息。
可选的,所述装置还包括:
接收模块,用于接收各个终端上报的地理位置信息以及身份信息;
信息确定模块1302,包括:
终端确定单元,用于确定位于所述第一地理位置信息所指示地理位置的第一终端;
信息确定单元,用于将所述第一终端上报的所述身份信息确定为所述第一身份信息。
可选的,所述装置还包括:
相关度确定模块,用于基于所述第一身份信息以及所述第二终端上报的第二身份信息,确定人物相关度;
信息发送模块1303,还用于:
向所述第二终端发送所述第一身份信息以及所述人物相关度,所述第二终端用于基于所述人物相关度确定名片尺寸,并基于第一身份信息和所述名片尺寸生成增强现实名片。
可选的,所述第一身份信息设置有查看权限;
所述信息发送模块1303,还用于:
获取所述第二终端对应的第二身份信息以及所述第一身份信息对应的权限信息,所述权限信息用于指示所述第一身份信息的查看权限;
响应于所述第二身份信息满足所述第一身份信息的查看权限,向所述第二终端发送所述第一身份信息。
需要说明的是:上述实施例提供的装置,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的装置与方法实施例属于同一构思,其实现过程详见方法实施例,这里不再赘述。
请参考图14,其示出了本申请一个示例性实施例提供的终端1400的结构框图。该终端1400可以是便携式移动终端,比如:智能手机、平板电脑、动态影像专家压缩标准音频层面3(Moving Picture Experts Group Audio Layer III,MP3)播放器、动态影像专家压缩标准音频层面4(Moving Picture Experts Group Audio Layer IV,MP4)播放器。终端1400还可能被称为用户设备、便携式终端等其他名称。
通常,终端1400包括有:处理器1401和存储器1402。
处理器1401可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器1401可以采用数字信号处理(Digital Signal Processing,DSP)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)、可编程逻辑阵列(Programmable Logic Array,PLA)中的至少一种硬件形式来实现。处理器1401也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称中央处理器(Central Processing Unit,CPU);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1401可以在集成有图像处理器(Graphics Processing Unit,GPU),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1401还可以包括人工智能(Artificial Intelligence,AI)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1402可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器1402还可包括高速随机存取存储器,以及非易失性存储器,比如一 个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1402中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1401所执行以实现本申请实施例提供的方法。
在一些实施例中,终端1400还可选包括)其他组件,本领域技术人员可以理解,图14中示出的结构并不构成对终端1400的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
请参考图15,其示出了本申请一个示例性实施例提供的服务器的结构示意图。具体来讲:所述服务器1500包括中央处理单元(Central Processing Unit,CPU)1501、包括随机存取存储器1502和只读存储器1503的系统存储器1504,以及连接系统存储器1504和中央处理单元1501的系统总线1505。所述服务器1500还包括帮助计算机内的各个器件之间传输信息的基本输入/输出系统(Input/Output,I/O系统)1506,和用于存储操作系统1513、应用程序1514和其他程序模块1515的大容量存储设备1507。
所述基本输入/输出系统1506包括有用于显示信息的显示器1508和用于用户输入信息的诸如鼠标、键盘之类的输入设备1509。其中所述显示器1508和输入设备1509都通过连接到系统总线1505的输入输出控制器1510连接到中央处理单元1501。所述基本输入/输出系统1506还可以包括输入输出控制器1510以用于接收和处理来自键盘、鼠标、或电子触控笔等多个其他设备的输入。类似地,输入输出控制器1510还提供输出到显示屏、打印机或其他类型的输出设备。
所述大容量存储设备1507通过连接到系统总线1505的大容量存储控制器(未示出)连接到中央处理单元1501。所述大容量存储设备1507及其相关联的计算机可读介质为服务器1500提供非易失性存储。也就是说,所述大容量存储设备1507可以包括诸如硬盘或者驱动器之类的计算机可读介质(未示出)。
不失一般性,所述计算机可读介质可以包括计算机存储介质和通信介质。计算机存储介质包括以用于存储诸如计算机可读指令、数据结构、程序模块或其他数据等信息的任何方法或技术实现的易失性和非易失性、可移动和不可移动介质。计算机存储介质包括随机存取记忆体(RAM,Random Access Memory)、只读存储器(ROM,Read Only Memory)、闪存或其他固态存储其技术,只读光盘(Compact Disc Read-Only Memory,CD-ROM)、数字通用光盘(Digital Versatile Disc,DVD)或其他光学存储、磁带盒、磁带、磁盘存储或其他磁性存储设备。当然,本领域技术人员可知所述计算机存储介质不局限于上述几种。上述的系统存储器1504和大容量存储设备1507可以统称为存储器。
存储器存储有一个或多个程序,一个或多个程序被配置成由一个或多个中央处理单元1501执行,一个或多个程序包含用于实现上述方法的指令,中央处理单元1501执行该一个或多个程序实现上述各个方法实施例中服务器侧的方法。
根据本申请的各种实施例,所述服务器1500还可以通过诸如因特网等网络连接到网络上的远程计算机运行。也即服务器1500可以通过连接在所述系统总线1505上的网络接口单元1511连接到网络1512,或者说,也可以使用网络接口单元1511来连接到其他类型的网络或远程计算机系统(未示出)。
所述存储器还包括一个或者一个以上的程序,所述一个或者一个以上程序存储于存储器中,所述一个或者一个以上程序包含用于进行本申请实施例提供的方法中由服务器所执行的步骤。
本申请实施例还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条程序代码,所述程序代码由处理器加载并执行以实现如上述方面所述的身份信息的展示方法。
本申请实施例提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程 序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述实施例所述的身份信息的展示方法。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,该计算机可读存储介质可以是上述实施例中的存储器中所包含的计算机可读存储介质;也可以是单独存在,未装配入终端中的计算机可读存储介质。该计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由处理器加载并执行以实现上述任一方法实施例所述的方法。
可选地,该计算机可读存储介质可以包括:ROM、RAM、固态硬盘(SSD,Solid State Drives)或光盘等。其中,RAM可以包括电阻式随机存取记忆体(ReRAM,Resistance Random Access Memory)和动态随机存取存储器(DRAM,Dynamic Random Access Memory)。上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的可选的实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (25)

  1. 一种身份信息的展示方法,应用于终端中,所述方法包括:
    通过摄像头进行图像采集,采集到的图像中包括处于环境中的人物;
    基于增强现实社交模式显示所述摄像头采集的图像,所述增强现实社交模式用于为当前登录的帐号展示增强现实社交信息;
    基于所述人物在环境中所处的地理位置,通过增强现实方式在所述人物的周围展示第一身份信息,所述第一身份信息用于指示所述人物的社交身份。
  2. 根据权利要求1所述的方法,其中,所述基于所述人物在环境中所处的地理位置,通过增强现实方式在所述人物的周围展示第一身份信息,包括:
    识别所述图像中的所述人物,并确定所述人物对应的第一地理位置信息,所述第一地理位置信息用于表征所述人物在环境中所处的地理位置;
    基于所述第一地理位置信息,获取所述人物的所述第一身份信息;
    通过增强现实方式,在所述人物的周围展示所述第一身份信息。
  3. 根据权利要求2所述的方法,其中,所述通过增强现实方式在所述人物的周围展示所述第一身份信息,包括:
    基于所述第一身份信息生成增强现实名片;
    基于所述人物在相机坐标系下的人物位置信息,确定所述增强现实名片在所述相机坐标系下的名片位置信息;
    通过增强现实方式,在所述名片位置信息所指示的位置展示所述增强现实名片,其中,所述名片位置信息所指示的位置位于所述人物位置信息所指示位置的周围。
  4. 根据权利要求3所述的方法,其中,所述基于所述第一身份信息生成增强现实名片,包括:
    获取人物相关度,所述人物相关度基于所述第一身份信息以及所述终端对应帐号的第二身份信息确定得到;
    基于所述人物相关度确定名片尺寸,所述名片尺寸与所述人物相关度呈正相关关系;
    基于所述第一身份信息和所述名片尺寸生成所述增强现实名片。
  5. 根据权利要求3所述的方法,其中,所述通过增强现实方式,在所述名片位置信息所指示的位置展示所述增强现实名片之后,所述方法还包括:
    响应于所述人物的面部朝向变化,基于所述面部朝向调整所述增强现实名片的名片朝向,其中,所述名片朝向与所述面部朝向保持一致。
  6. 根据权利要求1至5任一所述的方法,其中,所述基于增强现实社交功能显示所述摄像头采集的图像之前,所述方法还包括:
    显示信息展示设置界面,所述信息展示设置界面用于设置对外展示的第二身份信息,所述第二身份信息与所述终端登录的帐号对应;
    响应于所述信息展示设置界面内的信息设置操作,开启所述增强现实社交模式,并向所述服务器上报所述第二身份信息以及终端在世界坐标系下的第二地理位置信息。
  7. 根据权利要求6所述的方法,其中,所述方法还包括:
    响应于所述信息展示设置界面内的权限设置操作,向所述服务器发送权限信息,所述权 限信息用于指示所述第二身份信息的查看权限。
  8. 根据权利要求6所述的方法,其中,所述显示信息展示设置界面,包括:
    响应于当前地理位置和当前时间指示处于目标行程,显示信息展示提醒;
    响应于对所述信息展示提醒的触发操作,显示所述信息展示设置界面。
  9. 根据权利要求2至5任一所述的方法,其中,所述第一身份信息设置有查看权限;
    所述基于所述第一地理位置信息,获取所述人物的所述第一身份信息,包括:
    基于所述第一地理位置信息以及终端对应帐号的第二身份信息,从服务器处获取所述第一身份信息,其中,所述第二身份信息满足所述第一身份信息的查看权限。
  10. 根据权利要求2至5任一所述的方法,其中,所述识别所述图像中的人物,并确定所述人物对应的第一地理位置信息,包括:
    识别所述图像中的人物,并确定所述人物在相机坐标系下的人物位置信息;
    基于所述人物位置信息以及终端在世界坐标系下的第二地理位置信息,确定所述人物在世界坐标系下的所述第一地理位置信息。
  11. 一种身份信息的展示方法,应用于服务器中,所述方法包括:
    接收第二终端发送的身份信息获取请求,所述身份信息获取请求中包含第一地理位置信息,所述第一地理位置信息由终端对摄像头采集到的图像进行人物识别并确定得到,所述第一地理位置信息用于表征人物在环境中所处的地理位置;
    基于所述第一地理位置信息,确定所述人物的第一身份信息,所述第一身份信息为所述人物对应的增强现实社交信息;
    向所述第二终端发送所述第一身份信息,所述第二终端用于通过增强现实方式,在所述人物的周围展示所述第一身份信息。
  12. An identity information presentation apparatus, the apparatus comprising:
    an image acquisition module, configured to perform image acquisition through a camera, an acquired image comprising a person in an environment; and
    an information presentation module, configured to: display the image acquired by the camera based on an augmented reality social mode, the augmented reality social mode being used for presenting augmented reality social information for a currently logged-in account; and present, based on a geographic location of the person in the environment, first identity information around the person in an augmented reality manner, the first identity information being used for indicating a social identity of the person.
  13. The apparatus according to claim 12, wherein the apparatus further comprises:
    a location determining module, configured to recognize the person in the image and determine first geographic location information corresponding to the person, the first geographic location information being used for representing the geographic location of the person in the environment; and
    an information obtaining module, configured to obtain the first identity information of the person based on the first geographic location information,
    wherein the information presentation module is configured to present the first identity information around the person in an augmented reality manner.
  14. The apparatus according to claim 13, wherein the information presentation module comprises:
    a business card generation unit, configured to generate an augmented reality business card based on the first identity information;
    a business card location determining unit, configured to determine, based on person location information of the person in a camera coordinate system, business card location information of the augmented reality business card in the camera coordinate system; and
    a business card presentation unit, configured to present the augmented reality business card in an augmented reality manner at a location indicated by the business card location information, wherein the location indicated by the business card location information is around a location indicated by the person location information.
  15. The apparatus according to claim 14, wherein the business card generation unit is configured to:
    obtain a person relevance, the person relevance being determined based on the first identity information and second identity information of an account corresponding to a terminal;
    determine a business card size based on the person relevance, the business card size being positively correlated with the person relevance; and
    generate the augmented reality business card based on the first identity information and the business card size.
  16. The apparatus according to claim 14, wherein the apparatus further comprises:
    an orientation adjustment module, configured to adjust, in response to a change in a facial orientation of the person, a card orientation of the augmented reality business card based on the facial orientation, wherein the card orientation is kept consistent with the facial orientation.
  17. The apparatus according to any one of claims 12 to 16, wherein the apparatus further comprises:
    a setting interface display module, configured to display an information presentation setting interface, the information presentation setting interface being used for setting second identity information to be presented externally, the second identity information corresponding to an account logged in on the terminal; and
    a reporting module, configured to enable the augmented reality social mode in response to an information setting operation in the information presentation setting interface, and report the second identity information and second geographic location information of the terminal in a world coordinate system to a server.
  18. The apparatus according to claim 17, wherein the reporting module is further configured to:
    send permission information to the server in response to a permission setting operation in the information presentation setting interface, the permission information being used for indicating a viewing permission of the second identity information.
  19. The apparatus according to claim 17, wherein the setting interface display module comprises:
    a reminder presentation unit, configured to display an information presentation reminder in response to a current geographic location and a current time indicating that the terminal is on a target trip; and
    a setting interface display unit, configured to display the information presentation setting interface in response to a trigger operation on the information presentation reminder.
  20. The apparatus according to any one of claims 13 to 16, wherein the first identity information is provided with a viewing permission; and
    the information obtaining module is configured to:
    obtain the first identity information from a server based on the first geographic location information and second identity information of an account corresponding to the terminal, wherein the second identity information satisfies the viewing permission of the first identity information.
  21. The apparatus according to any one of claims 13 to 16, wherein the location determining module is configured to:
    recognize the person in the image, and determine person location information of the person in a camera coordinate system; and
    determine the first geographic location information of the person in a world coordinate system based on the person location information and second geographic location information of the terminal in the world coordinate system.
  22. An identity information presentation apparatus, the apparatus comprising:
    a request receiving module, configured to receive an identity information obtaining request sent by a second terminal, the identity information obtaining request comprising first geographic location information, the first geographic location information being determined by the second terminal by performing person recognition on an image acquired by a camera, and the first geographic location information being used for representing a geographic location of a person in an environment;
    an information determining module, configured to determine first identity information of the person based on the first geographic location information, the first identity information being augmented reality social information corresponding to the person; and
    an information sending module, configured to send the first identity information to the second terminal, the second terminal being configured to present the first identity information around the person in an augmented reality manner.
  23. A terminal, comprising a processor and a memory, the memory storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the identity information presentation method according to any one of claims 1 to 10.
  24. A server, comprising a processor and a memory, the memory storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the identity information presentation method according to claim 11.
  25. A computer-readable storage medium storing at least one piece of program code, the program code being loaded and executed by a processor to implement the identity information presentation method according to any one of claims 1 to 10.
PCT/CN2022/073258 2021-01-28 2022-01-21 Identity information presentation method and apparatus, terminal, server, and storage medium WO2022161289A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/048,562 US20230066708A1 (en) 2021-01-28 2022-10-21 Identity information presentation method and apparatus, terminal, server, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110120989.XA CN114820992A (zh) 2021-01-28 2021-01-28 Identity information presentation method and apparatus, terminal, server, and storage medium
CN202110120989.X 2021-01-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/048,562 Continuation US20230066708A1 (en) 2021-01-28 2022-10-21 Identity information presentation method and apparatus, terminal, server, and storage medium

Publications (1)

Publication Number Publication Date
WO2022161289A1 true WO2022161289A1 (zh) 2022-08-04

Family

ID=82525571

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/073258 WO2022161289A1 (zh) 2021-01-28 2022-01-21 身份信息的展示方法、装置、终端、服务器及存储介质

Country Status (3)

Country Link
US (1) US20230066708A1 (zh)
CN (1) CN114820992A (zh)
WO (1) WO2022161289A1 (zh)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180165848A1 (en) * 2016-12-08 2018-06-14 Bank Of America Corporation Facilitating Dynamic Across-Network Location Determination Using Augmented Reality Display Devices
CN111045510A (zh) * 2018-10-15 2020-04-21 中国移动通信集团山东有限公司 Augmented reality-based human-computer interaction method and system
CN111428549A (zh) * 2019-10-31 2020-07-17 深圳市睿洋图志科技有限公司 Person information analysis method and system based on social activity image big data

Also Published As

Publication number Publication date
US20230066708A1 (en) 2023-03-02
CN114820992A (zh) 2022-07-29

Similar Documents

Publication Publication Date Title
US11783862B2 (en) Routing messages by message parameter
US11372608B2 (en) Gallery of messages from individuals with a shared interest
US9854219B2 (en) Gallery of videos set to an audio time line
TWI669634B (zh) Virtual object allocation method and apparatus based on augmented reality
US10719989B2 (en) Suggestion of content within augmented-reality environments
US20200066046A1 (en) Sharing and Presentation of Content Within Augmented-Reality Environments
CN108108012B (zh) Information interaction method and apparatus
US20120195464A1 (en) Augmented reality system and method for remotely sharing augmented reality service
US10218898B2 (en) Automated group photograph composition
JP2017526032A (ja) Location-based information processing method and apparatus
CN106063256A (zh) Creating connections and shared spaces
TWI706329B (zh) Graphic code generation method, resource sending and receiving methods, apparatus, and electronic device
CN115867882A (zh) Travel-based augmented reality content for images
WO2022161289A1 (zh) Identity information presentation method and apparatus, terminal, server, and storage medium
US20150358318A1 (en) Biometric authentication of content for social networks
TW201814646A (zh) Method for establishing virtual portal coordinates for social group data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 22745164
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.12.2023)