CN108551519B - Information processing method, device, storage medium and system - Google Patents

Information processing method, device, storage medium and system

Info

Publication number
CN108551519B
Authority
CN
China
Prior art keywords
information
data
image
terminal
target
Prior art date
Legal status
Active
Application number
CN201810180098.1A
Other languages
Chinese (zh)
Other versions
CN108551519A (en)
Inventor
廖戈语
钟庆华
卢锟
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201810180098.1A
Publication of CN108551519A
Application granted
Publication of CN108551519B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 Detection; Localisation; Normalisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/07 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail, characterised by the inclusion of specific contents
    • H04L 51/10 Multimedia information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Environmental & Geological Engineering (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiment of the invention discloses an information processing method, device, storage medium and system. In the embodiment of the invention, a terminal collects an image to be processed; extracts target feature information from the image to be processed and sends the target feature information to a server; receives an account identifier obtained by the server by matching the target feature information, together with data information associated with the account identifier; displays the data information and acquires gravity sensing variation information of the terminal; and adjusts the display of the data information according to the gravity sensing variation information. The flexibility and diversity of information processing are thereby greatly improved.

Description

Information processing method, device, storage medium and system
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an information processing method, apparatus, storage medium, and system.
Background
With the continuous popularization and development of terminals, users rely on them more and more. Various applications can be installed on a terminal, among which Instant Messaging (IM) applications are widely used: through an IM application, a user can communicate and exchange information with friends, for example by viewing a friend's information.
In the prior art, a terminal displays a friend's personal information from multiple aspects in a friend-information display interface. The friend's data information is a personalized presentation of that friend, and by viewing it a user can quickly gain a preliminary understanding of the friend.
In the research and practice of the prior art, the inventors of the present invention found that the display of friend information is uniform and mostly limited to simple text or pictures, so the interaction is very restricted and the information processing mode is single.
Disclosure of Invention
The embodiment of the invention provides an information processing method, an information processing device, a storage medium and an information processing system, and aims to improve the flexibility and diversity of information processing.
In order to solve the above technical problems, embodiments of the present invention provide the following technical solutions:
an information processing method comprising:
collecting an image to be processed;
extracting target characteristic information from the image to be processed, and sending the target characteristic information to a server;
receiving an account identification obtained by the server according to the target characteristic information in a matching manner and data information associated with the account identification;
displaying the data information and acquiring the gravity sensing variation information of the terminal;
and adjusting the display of the data information according to the gravity sensing variation information.
An information processing apparatus comprising:
the acquisition unit is used for acquiring an image to be processed;
the sending unit is used for extracting target characteristic information from the image to be processed and sending the target characteristic information to a server;
the receiving unit is used for receiving an account identification obtained by the server according to the target characteristic information in a matching manner and data information associated with the account identification;
the display unit is used for displaying the data information and acquiring the gravity sensing variation information of the terminal;
and the adjusting unit is used for adjusting the display of the data information according to the gravity sensing variation information.
In some embodiments, the display subunit is further specifically configured to:
acquiring augmented reality model information in the data information;
analyzing the augmented reality model information through an augmented reality control, and determining a corresponding augmented reality display style;
acquiring data card information in the data information;
loading the data card information through the augmented reality display style to generate target data card information;
acquiring a current image through a camera;
analyzing the current image to identify feature points on the current image;
and determining a display plane according to the feature points, and displaying the target data card information on the display plane.
In some embodiments, the sending unit includes:
the first analysis subunit is used for analyzing the image to be processed and identifying a two-dimensional code image in the image to be processed;
the first determining subunit is used for extracting two-dimension code characteristic information in the two-dimension code image and determining the two-dimension code characteristic information as target characteristic information;
and the first sending subunit is used for sending the target characteristic information to a server.
In some embodiments, the sending unit further includes:
the second analysis subunit is used for analyzing the image to be processed and identifying a target face area image in the image to be processed;
the second determining subunit is used for extracting the face feature information in the target face region image and determining the face feature information as target feature information;
and the second sending subunit is used for sending the target characteristic information to a server.
In some embodiments, the second analysis subunit is specifically configured to:
analyzing an image to be processed to determine a face region image on the image to be processed;
when the number of the face area images is detected to be single, determining the face area images as target face area images;
and when the number of the face area images is not single, generating prompt information, and receiving a target face area image selected by a user according to the prompt information.
In some embodiments, the information processing apparatus further includes a detection unit, a message control generating unit, and an adding control generating unit;
the detection unit is used for detecting whether the account identification exists on the terminal;
a message control generating unit, configured to generate a message control when it is detected that the account identifier exists on the terminal, where the message control is used to send a message to a client associated with the account identifier;
and the adding control generating unit is used for generating an adding control when the account identification does not exist on the terminal, and the adding control is used for sending a friend adding request to the client associated with the account identification.
A storage medium storing a plurality of instructions, the instructions being suitable for a processor to load so as to execute the steps of the information processing method.
An information handling system, the system comprising: a terminal and a server;
the terminal comprises the information processing device;
the server is configured to: the method comprises the steps of receiving target characteristic information sent by a terminal, matching according to the target characteristic information to obtain an account identification corresponding to the target characteristic information and data information associated with the account identification, and sending the account identification and the data information associated with the account identification to the terminal.
In the embodiment of the invention, an image to be processed is collected; target feature information is extracted from the image to be processed and sent to a server; an account identifier obtained by the server by matching the target feature information, together with data information associated with the account identifier, is received; the data information is displayed and gravity sensing variation information of the terminal is acquired; and the display of the data information is adjusted according to the gravity sensing variation information. This scheme can quickly identify the target feature information in the image to be processed, acquire the data information related to it, display that data information, and adjust the display in real time according to the gravity sensing variation information of the terminal. Compared with the existing scheme, in which data information is displayed uniformly and mostly as simple text or pictures, the flexibility and diversity of information processing are improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present invention, and that other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic diagram of a scenario of an information handling system provided by an embodiment of the present invention;
FIG. 2 is a flow chart of an information processing method according to an embodiment of the present invention;
FIG. 3 is another schematic flow chart of an information processing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an image processing interface provided by an embodiment of the present invention;
FIG. 5 is another schematic diagram of an image processing interface provided by an embodiment of the invention;
FIG. 6 is another schematic diagram of an image processing interface provided by an embodiment of the invention;
FIG. 7 is another schematic diagram of an image processing interface provided by an embodiment of the invention;
FIG. 8 is a timing diagram illustrating an information processing method according to an embodiment of the present invention;
FIG. 9a is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention;
FIG. 9b is a schematic diagram of another structure of an information processing apparatus according to an embodiment of the present invention;
FIG. 9c is a schematic diagram of another structure of an information processing apparatus according to an embodiment of the present invention;
FIG. 9d is a schematic diagram of another structure of an information processing apparatus according to an embodiment of the present invention;
FIG. 9e is a schematic diagram of another structure of an information processing apparatus according to an embodiment of the present invention;
FIG. 9f is a schematic diagram of another structure of an information processing apparatus according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of an information processing system according to an embodiment of the present invention.
Fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an information processing method, an information processing device, a storage medium and an information processing system.
Referring to fig. 1, fig. 1 is a schematic view of a scenario of an information processing system according to an embodiment of the present invention, including: the terminal 10 and the server 20 may be connected through a communication network, and the communication network may include a wireless network and a wired network, wherein the wireless network includes one or more of a wireless wide area network, a wireless local area network, a wireless metropolitan area network, and a wireless personal area network. The network includes network entities such as routers, gateways, etc., which are not shown in the figure. The terminal 10 may interact with the server 20 via a communication network, for example, by downloading an application (e.g., an instant messaging application) from the server 20.
The information processing system may include an information processing apparatus, which may be integrated in a terminal with computing capability that has a storage unit and a microprocessor, such as a tablet computer, a mobile phone, a notebook computer or a desktop computer. In fig. 1 this terminal is the terminal 10, on which various applications required by a user may be installed, such as an instant messaging application with an information interaction function. The terminal 10 may be configured to collect an image to be processed, extract target feature information from the image to be processed, send the target feature information to the server 20, receive an account identifier obtained by the server 20 by matching the target feature information together with data information associated with the account identifier, display the data information, obtain gravity sensing variation information of the terminal 10, adjust the display of the data information according to the gravity sensing variation information, and the like.
The information processing system may further include a server 20, configured to receive the target feature information sent by the terminal 10, perform matching according to the target feature information to obtain an account id corresponding to the target feature information and profile information associated with the account id, and send the account id and the profile information associated with the account id to the terminal 10. The information processing system may further include a memory for storing an information base including an association relationship between the account id and the target feature information, an association relationship between the account id and the profile information, and the like, so that the server may acquire the account id and the profile information associated with the account id from the memory and transmit the acquired account id and profile information to the terminal 10.
It should be noted that the scenario diagram of the information processing system shown in fig. 1 is only an example, and the information processing system and the scenario described in the embodiment of the present invention are for more clearly illustrating the technical solution of the embodiment of the present invention, and do not form a limitation on the technical solution provided in the embodiment of the present invention.
The following are detailed below. The numbers in the following examples are not intended to limit the order of preference of the examples.
First embodiment
In the present embodiment, description will be made from the perspective of an information processing apparatus, which may be integrated in a terminal with computing capability that has a storage unit and a microprocessor, such as a tablet computer or a mobile phone.
An information processing method comprising: collecting an image to be processed; extracting target characteristic information from the image to be processed and sending the target characteristic information to a server; receiving an account identification obtained by the server according to the target characteristic information in a matching manner and data information associated with the account identification; displaying data information and acquiring the gravity sensing change information of the terminal; and adjusting the display of the data information according to the gravity sensing variation information.
Referring to fig. 2, fig. 2 is a flowchart illustrating an information processing method according to an embodiment of the invention. The information processing method includes:
in step 101, an image to be processed is acquired.
The image to be processed may be an image from a video stream acquired in real time by a camera, or an image cached or stored on the terminal. The format of the image may be Joint Photographic Experts Group (JPEG), Graphics Interchange Format (GIF), or the like.
In some embodiments, the image to be processed may be acquired as follows. A local client on the terminal, such as an instant messaging client, is opened and an account identifier and a password are entered, after which the local client enters its main display interface, i.e. the first interface displayed after logging in with the account identifier and password. The main interface includes a shooting control, which is a shortcut entry for triggering acquisition of the image to be processed. When it is detected that the user has clicked the shooting control, the camera assembly is called to collect the image to be processed, and the collected image is displayed on the display screen. Optionally, the acquisition interface may further include a camera switching control and an album control. The camera switching control is a shortcut entry for switching between the front camera and the rear camera: the user may switch between them by clicking this control to obtain the image to be processed. The album control is a shortcut entry for calling the album on the terminal: the user may open the album by clicking this control and select an image in the album as the image to be processed.
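The following Kotlin sketch illustrates one way these controls could be wired up on an Android terminal. It is an illustration only: the control names, the CameraFacade and AlbumFacade interfaces, and the onImageReady callback are assumptions for this sketch, not identifiers from the patent or from any particular SDK.

```kotlin
// Illustrative sketch only: the facades and callback below are assumptions,
// not identifiers from the patent or any specific camera/album API.
import android.graphics.Bitmap
import android.widget.ImageButton

class CaptureScreen(
    private val shootButton: ImageButton,   // "shooting control": quick entry to trigger capture
    private val switchButton: ImageButton,  // "camera switching control": front/rear toggle
    private val albumButton: ImageButton,   // "album control": pick an image stored on the terminal
    private val camera: CameraFacade,       // hypothetical wrapper around the camera assembly
    private val album: AlbumFacade          // hypothetical wrapper around the local album
) {
    var onImageReady: (Bitmap) -> Unit = {}

    fun bind() {
        // Clicking the shooting control collects the image to be processed.
        shootButton.setOnClickListener { camera.capture { bmp -> onImageReady(bmp) } }
        // Clicking the switching control toggles between front and rear camera.
        switchButton.setOnClickListener { camera.toggleFacing() }
        // Clicking the album control lets the user pick a stored image instead.
        albumButton.setOnClickListener { album.pick { bmp -> onImageReady(bmp) } }
    }
}

// Hypothetical facades standing in for the real camera/album components.
interface CameraFacade { fun capture(cb: (Bitmap) -> Unit); fun toggleFacing() }
interface AlbumFacade { fun pick(cb: (Bitmap) -> Unit) }
```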
In step 102, target feature information is extracted from the image to be processed and sent to a server.
The image to be processed may be analyzed so that the two-dimensional code information in it is extracted and sent to the server as the target feature information; alternatively, the image to be processed may be analyzed so that the face feature information in it is extracted and sent to the server as the target feature information.
In some embodiments, the step of extracting the target feature information from the image to be processed may include:
(1) analyzing the image to be processed, and identifying a two-dimensional code image in the image to be processed;
(2) and extracting the two-dimension code characteristic information in the two-dimension code image, and determining the two-dimension code characteristic information as target characteristic information.
It should be noted that the two-dimensional code is a new generation of bar code technology that records data symbol information with a black-and-white rectangular matrix in which specific geometric figures are distributed on a plane (in two dimensions) according to certain rules. It consists of a two-dimensional code matrix figure, a two-dimensional code number and descriptive characters below them, and it is characterized by a large information capacity, strong error correction, a high reading speed and omnidirectional readability.
By analyzing the image to be processed, the terminal determines the two-dimensional code image in it according to its specific shape features, decodes the two-dimensional code image to extract the two-dimensional code feature information, and determines the two-dimensional code feature information as the target feature information. Optionally, the two-dimensional code feature information may be a combination of binary digit sequences.
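As an illustration only, the two-dimensional-code branch could be implemented with the open-source ZXing library, which the patent does not name; the extractQrFeature helper and its return convention are assumptions of this sketch.

```kotlin
// Sketch of the QR-code branch using the open-source ZXing library; the patent
// does not prescribe a decoder, so this is an assumed implementation.
import android.graphics.Bitmap
import com.google.zxing.BinaryBitmap
import com.google.zxing.RGBLuminanceSource
import com.google.zxing.common.HybridBinarizer
import com.google.zxing.qrcode.QRCodeReader

fun extractQrFeature(image: Bitmap): String? {
    // Copy the pixels of the image to be processed into an int buffer.
    val pixels = IntArray(image.width * image.height)
    image.getPixels(pixels, 0, image.width, 0, 0, image.width, image.height)
    val source = RGBLuminanceSource(image.width, image.height, pixels)
    return try {
        // Locate and decode the two-dimensional code region; the decoded payload
        // serves as the target feature information sent to the server.
        QRCodeReader().decode(BinaryBitmap(HybridBinarizer(source))).text
    } catch (e: Exception) {
        null // no two-dimensional code image found in the image to be processed
    }
}
```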
In some embodiments, the step of extracting the target feature information from the image to be processed may include:
(1.1) analyzing the image to be processed, and identifying a target face area image in the image to be processed;
and (1.2) extracting the face characteristic information in the target face region image, and determining the face characteristic information as target characteristic information.
The face image contains rich pattern features, such as histogram features, color features, template features, structural features, and Haar features (a Haar feature reflects gray-level changes in an image as differences between the pixel sums of adjacent rectangular regions). Feature scanning can therefore be performed on the image to be processed to determine the target face region image in it. Optionally, the target face region image may be highlighted with a circular frame on the image to be processed.
Further, the terminal can perform image preprocessing such as gray level correction and noise filtering on the target face region image, and further extract face feature information from the preprocessed target face region image. The face feature information may include geometric description information of local constituent points such as eyes, nose, mouth, and chin. And determining the face feature information as target feature information.
In some embodiments, the step of analyzing the image to be processed and identifying the target face region image in the image to be processed may include:
(2.1) analyzing the image to be processed to determine a face region image on the image to be processed;
(2.2) when the number of the detected face region images is single, determining the face region images as target face region images;
and (2.3) when the number of the face area images is not single, generating prompt information, and receiving a target face area image selected by the user according to the prompt information.
It should be noted that, since the image to be processed may include a plurality of face region images, the feature information on the image to be processed needs to be scanned first to determine all face region images on the image to be processed.
When the number of detected face region images is one, that single face region image is determined as the target face region image. When the number of face region images is more than one, prompt information is generated to prompt the user to select a target face region image, and the target face region image selected by the user according to the prompt information is received. Optionally, a pop-up window may be generated to remind the user to determine the target face region image, and the user may manually click one face region image in the image to be processed to determine it as the target face region image.
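A minimal sketch of this single-versus-multiple-face branch is given below, using Android's legacy android.media.FaceDetector as a stand-in detector; the patent does not name a face-detection component, and the FaceSelection result type is an assumption introduced only for illustration.

```kotlin
// Sketch of the face-region branch; android.media.FaceDetector is an assumed
// stand-in for the unnamed detector in the patent.
import android.graphics.Bitmap
import android.media.FaceDetector

sealed class FaceSelection {
    data class Single(val face: FaceDetector.Face) : FaceSelection()
    data class NeedsPrompt(val faces: List<FaceDetector.Face>) : FaceSelection()
    object None : FaceSelection()
}

fun detectTargetFace(image: Bitmap, maxFaces: Int = 5): FaceSelection {
    // FaceDetector requires an RGB_565 bitmap.
    val rgb565 = image.copy(Bitmap.Config.RGB_565, false)
    val faces = arrayOfNulls<FaceDetector.Face>(maxFaces)
    val found = FaceDetector(rgb565.width, rgb565.height, maxFaces).findFaces(rgb565, faces)
    val detected = faces.filterNotNull().take(found)
    return when {
        detected.isEmpty() -> FaceSelection.None
        detected.size == 1 -> FaceSelection.Single(detected.first()) // single face: use it directly
        else -> FaceSelection.NeedsPrompt(detected)                  // several faces: prompt the user to pick one
    }
}
```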
In step 103, an account id obtained by the server according to the matching of the target feature information and the profile information associated with the account id are received.
The account identifier may include information such as an account number of the instant messaging client, an International Mobile Equipment Identity (IMEI) code, or a mailbox account number.
It is understood that the server may hold the installation package file of a client; for ease of description, the installation package file of an instant messaging client is taken as an example. The terminal can download this installation package file from the server, decompress it, and install it to generate a local client on the terminal. The display information of the local client is provided by the server. When the local client is opened for the first time, the user enters a login interface, where an account identifier (such as an account number) and a password can be input for verification and login; if the user has no account identifier, the user can register, and the account identifier and password are stored in the server. Each account identifier is associated with data information corresponding to that account, which may include the user's personal profile information, tag information, and the like. Optionally, the data information may include memory fragment information, personality tag information, medal wall information, and so on. The memory fragment information consists of several historical images of the user; by browsing these historical images, the user's experiences can be quickly understood. The personality tag can be understood colloquially as a "friend impression": through this function a user can evaluate friends with keywords, and friends can likewise evaluate the user. All the evaluations received by a user are then gathered together and displayed to other friends as the personality tag information. The account identifier (such as an account number) and the data information associated with it are stored in the server.
Based on this, in some embodiments, before the step of acquiring the image to be processed, the method may further include:
(1) collecting target characteristic information;
(2) acquiring an account identifier associated with a local client;
the account identifier may include information such as an account of the instant messaging client, an international mobile equipment identity code, or a mailbox account.
(3) And binding the target characteristic information with the account identification, and sending the bound target characteristic information and the account identification to a server.
The user can perform a pre-binding operation by opening the local client on the terminal, that is, by selecting target feature information to bind with the account identifier logged in on the local client. It should be particularly noted that the target feature information may be two-dimensional code feature information or the user's own face feature information.
Then, the terminal obtains an account identifier associated with the local client, for example, account information currently logged in the local client (i.e., account information of the user himself), binds the account identifier with the target feature information, and packages and sends the bound target feature information and the account identifier to the server, so that the server stores the binding relationship between the target feature information and the account identifier.
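A minimal sketch of this pre-binding request is shown below. The endpoint URL and JSON field names are assumptions made for illustration; the patent only states that the bound account identifier and target feature information are packaged and sent to the server.

```kotlin
// Sketch of the pre-binding step; the URL and field names are assumptions.
import org.json.JSONObject
import java.net.HttpURLConnection
import java.net.URL

fun bindFeatureToAccount(accountId: String, targetFeature: String) {
    val payload = JSONObject()
        .put("accountId", accountId)   // account identifier logged in on the local client
        .put("feature", targetFeature) // two-dimensional code or face feature information
        .toString()

    // Hypothetical binding endpoint; replace with the real server address.
    val conn = URL("https://example.com/api/bind").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Content-Type", "application/json")
    conn.doOutput = true
    conn.outputStream.use { it.write(payload.toByteArray()) }
    check(conn.responseCode in 200..299) { "binding failed: HTTP ${conn.responseCode}" }
    conn.disconnect()
}
```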
Further, after extracting the target feature information, the terminal sends it to the server. As described above, the server pre-stores the binding relationship between feature information and account identifiers; therefore, after receiving the target feature information, the server performs matching according to it to obtain the account identifier (such as an account number) bound to the target feature information and the data information associated with that account identifier, and returns the account identifier and the associated data information to the terminal.
In step 104, the data information is displayed and the gravity sensing variation information of the terminal is obtained.
After receiving the data information, the terminal can display it in an augmented reality form and acquire the gravity sensing variation information of the terminal. The gravity sensing variation information may include displacement change data and angle change data, so that the display of the data information can be adjusted according to this variation information.
It should be noted that Augmented Reality (AR) is a new technology that seamlessly integrates real-world information with virtual-world information. Entity information (visual information, sound, taste, touch, etc.) that would otherwise be difficult to experience within a certain range of time and space in the real world is simulated by computer and other technologies, and the resulting virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience beyond reality. The real environment and the virtual objects are superimposed onto the same picture or space in real time and exist simultaneously.
In some embodiments, the data information may include data card information and corresponding augmented reality model information. Optionally, the data card information may include a plurality of data cards, each presenting one facet of the user's personality; the data cards may include memory fragment data cards, personality tag data cards, medal wall data cards, and the like. The augmented reality model information defines the augmented reality display style; it may include several augmented reality display styles, and the user can select a preferred style. The step of displaying the data information may include:
(1) acquiring augmented reality model information in the data information;
(2) analyzing the augmented reality model information through the augmented reality control, and determining a corresponding augmented reality display style;
(3) acquiring data card information in the data information;
(4) loading the information of the data card through an augmented reality display style to generate target information of the data card;
(5) the information of the target data card is displayed in an augmented reality form.
It should be noted that a Software Development Kit (SDK) with an augmented reality function may be integrated on the terminal; the augmented reality control can be loaded through this software development kit, and the data information can be analyzed through the augmented reality control.
First, the augmented reality model information in the data information is obtained, and the augmented reality control in the terminal analyzes it to determine the corresponding augmented reality display style. The augmented reality display style may include a display background style, a display frame style, and a display effect style.
Then, the data card information in the data information is obtained, and the data card information is loaded through the augmented reality display style, so that the data card information is embedded in the augmented reality display style to generate the target data card information.
Further, the step of displaying the target data card information in the form of augmented reality through the augmented reality control may include:
(1.1) acquiring a current image through a camera;
(1.2) analyzing the current image to identify feature points on the current image;
(1.3) determining a display plane according to the characteristic points, and displaying the information of the target data card on the display plane.
A current image is captured by the camera and its feature points are analyzed: specifically, the feature points of the objects in the captured image are obtained, a plane in three-dimensional space, i.e. the display plane of the current image, is determined from several of these feature points, and the target data card information is then displayed and rendered on that display plane in an augmented reality form.
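The sketch below illustrates this plane-detection step using Google's ARCore as a stand-in for the unnamed augmented reality SDK in the patent; the renderCard callback is a hypothetical hook for drawing the target data card.

```kotlin
// Sketch only: ARCore stands in for the unnamed AR SDK; renderCard is hypothetical.
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

fun placeTargetDataCard(frame: Frame, renderCard: (Anchor) -> Unit) {
    // ARCore derives planes from feature points detected in the camera image.
    val plane = frame.getUpdatedTrackables(Plane::class.java)
        .firstOrNull { it.trackingState == TrackingState.TRACKING }
        ?: return // no display plane recognised yet in the current image
    // Anchor the target data card information to the centre of the detected plane
    // so the card stays registered to the real-world surface while it is rendered.
    renderCard(plane.createAnchor(plane.centerPose))
}
```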
In some embodiments, the step of acquiring the gravity sensing variation information of the terminal may include:
(2.1) detecting displacement change data of the terminal through a gravity sensor;
and (2.2) detecting angle change data of the terminal through a gyroscope sensor.
It should be noted that the gravity sensor belongs to a new sensor technology: an elastic sensing element is used to make a cantilever-type displacement device, and an energy-storage spring made of the elastic sensing element drives an electrical contact, so that changes in gravity are converted into an electrical signal. The gyroscope sensor is a simple and easy-to-use positioning and control device based on free-space movement and gestures; originally applied to model helicopters, it is now widely used in mobile portable devices such as mobile phones.
The displacement change data of the terminal can be detected by the gravity sensor; it indicates the change of the terminal's position in three-dimensional space, and may specifically consist of position change data on the X axis, the Y axis, and the Z axis. The angle change data of the terminal can be detected by the gyroscope sensor; similarly, it may include angle change data on the X axis, the Y axis, and the Z axis.
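On Android, these two readings could be obtained as sketched below; the sensor delay and the two callback names are illustrative assumptions, and TYPE_ACCELEROMETER (or TYPE_GRAVITY) merely stands in for the "gravity sensor" named in the patent.

```kotlin
// Sketch of reading the two sensors; callback names and delays are assumptions.
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class MotionTracker(
    private val sensorManager: SensorManager,
    private val onDisplacement: (x: Float, y: Float, z: Float) -> Unit,
    private val onRotation: (x: Float, y: Float, z: Float) -> Unit
) : SensorEventListener {

    fun start() {
        // "Gravity sensor": per-axis acceleration used to derive displacement change data.
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        // Gyroscope sensor: per-axis rotation used to derive angle change data.
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = Triple(event.values[0], event.values[1], event.values[2])
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> onDisplacement(x, y, z)
            Sensor.TYPE_GYROSCOPE -> onRotation(x, y, z)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```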
In some embodiments, after the step of displaying the data information and acquiring the gravity sensing variation information of the terminal, the method may further include:
(3.1) detecting whether an account identification exists on the terminal;
When it is detected that the account identifier exists on the terminal, step (3.2) is executed; when the account identifier does not exist on the terminal, step (3.3) is executed.
(3.2) generating a message control, wherein the message control is used for sending a message to the account identification-associated client;
and (3.3) generating an adding control, wherein the adding control is used for sending a friend adding request to the client side associated with the account identification.
When it is detected that the account identifier exists on the terminal, this indicates that the account identifier and the login account identifier associated with the local client are in a friend relationship; step (3.2) is executed to generate a message control, and the user can send a message to the client corresponding to the account identifier by clicking the message control. When it is detected that the account identifier does not exist on the terminal, this indicates that the account identifier is not in a friend relationship with the login account identifier associated with the local client; step (3.3) is executed to generate an adding control, and the user can send a friend adding request to the client corresponding to the account identifier by clicking the adding control.
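A minimal sketch of this branch is given below; the contact store and the two send callbacks are hypothetical interfaces, since the patent does not name them.

```kotlin
// Sketch of the friend-relationship check; the contact set and callbacks are assumptions.
import android.widget.Button

fun configureActionControl(
    accountId: String,
    localContacts: Set<String>,          // account identifiers already stored on the terminal
    actionButton: Button,
    sendMessage: (String) -> Unit,       // hypothetical: open a chat with the associated client
    sendFriendRequest: (String) -> Unit  // hypothetical: send a friend adding request
) {
    if (accountId in localContacts) {
        // Account identifier exists on the terminal: generate the message control.
        actionButton.text = "Message"
        actionButton.setOnClickListener { sendMessage(accountId) }
    } else {
        // Not yet a friend: generate the adding control instead.
        actionButton.text = "Add friend"
        actionButton.setOnClickListener { sendFriendRequest(accountId) }
    }
}
```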
In step 105, the display of the data information is adjusted according to the gravity-sensitive variation information.
The terminal can adjust the display of the data information according to the displacement change data and the angle change data, and optionally, the displayed data information can be switched according to the displacement change data of the terminal. The terminal can adjust the angle of the display of the data information according to the angle change data.
In some embodiments, adjusting the display of the data information according to the displacement change data and the angle change data may include:
(1) performing display switching adjustment on the display of the data information according to the displacement change data;
(2) and adjusting the display angle of the data information according to the angle change data.
The displacement direction of the terminal, such as displacement to the left, to the right, downward, or upward, may be determined according to the displacement change data of the terminal. Optionally, a displacement to the left is set to trigger switching of the target data card information to the previous card, and a displacement to the right is set to trigger switching to the next card.
Furthermore, the orientation of the terminal, such as whether it is laid flat or tilted, and its tilt angle, can be judged according to the angle change data of the terminal. Optionally, different rendering and display effects of the data information may be set depending on whether the terminal is laid flat or tilted, and different effects may also be set for different tilt angles, such as displaying the data information at different scaling sizes. This improves the diversity of information processing.
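The mapping from gravity sensing variation information to the card display could look like the sketch below; the displacement threshold and the scaling formula are illustrative assumptions only, not values from the patent.

```kotlin
// Sketch of mapping displacement and tilt onto the card display; thresholds are assumptions.
class CardDisplayController(private val cards: List<String>) {
    var index = 0
        private set
    var scale = 1.0f
        private set

    // Displacement change data: left/right movement switches the displayed card.
    fun onDisplacement(dx: Float, threshold: Float = 0.05f) {
        when {
            dx <= -threshold -> index = (index - 1 + cards.size) % cards.size // previous card
            dx >= threshold -> index = (index + 1) % cards.size               // next card
        }
    }

    // Angle change data: the tilt of the terminal drives the rendered card size.
    fun onTilt(tiltDegrees: Float) {
        scale = 1.0f + (tiltDegrees / 90f).coerceIn(-0.5f, 0.5f)
    }

    fun current(): String = cards[index]
}
```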
In view of the above, in the embodiment of the present invention an image to be processed is collected; target feature information is extracted from the image to be processed and sent to a server; an account identifier obtained by the server by matching the target feature information, together with data information associated with the account identifier, is received; the data information is displayed and gravity sensing variation information of the terminal is acquired; and the display of the data information is adjusted according to the gravity sensing variation information. This scheme can quickly identify the target feature information in the image to be processed, acquire the data information related to it, display that data information, and adjust the display in real time according to the gravity sensing variation information of the terminal. Compared with the existing scheme, in which data information is displayed uniformly and mostly as simple text or pictures, the flexibility and diversity of information processing are improved.
Second embodiment
The method described in the first embodiment is further illustrated by way of example.
In this embodiment, an example will be described in which the client is an instant messaging client and the information processing apparatus is specifically integrated in a terminal.
Referring to fig. 3, fig. 3 is another schematic flow chart of an information processing method according to an embodiment of the invention. The method flow can comprise the following steps:
in step 201, the terminal collects an image to be processed, analyzes the image to be processed, and identifies a two-dimensional code image in the image to be processed.
It should be noted that after the terminal downloads the installation package corresponding to the instant messaging client and decompresses and installs it, an instant messaging client is generated on the terminal; the instant messaging client installed on the terminal is also called the local client, and an account identifier can be logged in on it. The data information stored in the server for that account identifier can include social information corresponding to the account, and the social information can include the user's personal information, tag information, and the like. Optionally, the data information may include memory fragment information, personality tag information, medal wall information, and so on. The memory fragment information consists of several historical images of the user; by browsing these historical images, the user's experiences can be quickly understood. The personality tag can be understood colloquially as a "friend impression": through this function a user can evaluate friends with keywords, and friends can likewise evaluate the user. All the evaluations received by a user are then gathered together and displayed to other friends as the personality tag information. The account identifier (such as an account number) and the data information associated with it are stored in the server.
The user can operate the terminal to open the local client, call the camera assembly to collect the image to be processed, and determine the two-dimensional code image in the image to be processed according to the specific shape characteristics in the image to be processed. Alternatively, three identical squares may be provided at all two-dimensional code corners, so that the camera assembly can be positioned according to the three identical squares to identify the two-dimensional code image in the image to be processed.
In some embodiments, before the step of collecting the image to be processed, analyzing it, and identifying the two-dimensional code image in it, the method may further include: collecting a preset image, analyzing the preset image, identifying a two-dimensional code image in the preset image, extracting the two-dimensional code feature information in that two-dimensional code image, binding the two-dimensional code feature information with the account identifier logged in on the local client, and sending the bound account identifier and two-dimensional code feature information to the server, so that the server stores the binding relationship between the account identifier and the two-dimensional code feature information.
For example, a user can open a local client through a terminal, call a camera assembly to acquire a current image, perform switching operation of a front camera or a rear camera by clicking a camera switching control, scan shape features of the current image through the terminal, determine a two-dimensional code image in the current image, and extract two-dimensional code feature information in the two-dimensional code image. The terminal acquires an account identifier logged in by a local client, such as the account identifier '131456', binds the account identifier '131456' with the two-dimensional code characteristic information, and sends the bound account identifier '131456' and the two-dimensional code characteristic information to a server. The user can associate the account identification with the two-dimension code characteristic information.
It is understood that the server stores the binding relationship in the server after receiving the account id "131456" and the two-dimensional code feature information.
In step 202, the terminal extracts two-dimensional code feature information in the two-dimensional code image, determines the two-dimensional code feature information as target feature information, and sends the target feature information to the server.
When the two-dimensional code image has been determined, the terminal decodes each shape in it: the shapes in the two-dimensional code image are processed and converted back into a binary digit sequence consisting of 0s and 1s. This binary digit sequence combination is determined as the target feature information and sent to the server.
For example, in the two-dimensional code image a white small square corresponds to 0 and a black small square corresponds to 1; the small squares are filled into a large square in preset groups (such as groups of 8), producing a complete two-dimensional code pattern that the terminal can recognize. While scanning the two-dimensional code image, the terminal decodes each shape in it, converts each small square back into a digit, and forms a digit combination such as "10110001". It determines "10110001" as the target feature information and sends "10110001" to the server.
In step 203, the terminal receives the account id and the profile information associated with the account id, which are obtained by the server according to the matching of the target feature information.
After receiving the target feature information, the server matches the digit combination "10110001" in the target feature information against the digit combinations in the feature information stored in the server to find the stored feature information identical to "10110001", and obtains, according to the binding relationship, the account identifier (such as the account number) "131456" bound to that feature information and the data information associated with the account identifier "131456". The data information can include data card information and corresponding augmented reality model information. Optionally, the data card information may include a plurality of data cards, each presenting one facet of the user's personality; the data cards may include personal profile data cards, memory fragment data cards, personality tag data cards, medal wall data cards, and the like. The augmented reality model information defines the augmented reality display style; it may include several augmented reality display styles, and the user can select a preferred style.
In step 204, the terminal obtains the augmented reality model information in the data information, and analyzes the augmented reality model information through the augmented reality control to determine a corresponding augmented reality display style.
It should be noted that a software development kit with an augmented reality function may be integrated on the terminal, and the augmented reality control may be loaded through the software development kit, and the data information may be analyzed through the augmented reality control.
The terminal can acquire the augmented reality model information in the data information, and the augmented reality control in the terminal analyzes the augmented reality model information to determine the augmented reality display style corresponding to the augmented reality model information. The augmented reality display style may include a display background style, a display frame style, and a display effect style.
In step 205, the terminal obtains the data card information in the data information, and loads the data card information through the augmented reality display pattern to generate the target data card information.
The terminal can obtain the data card information in the data information and load it through the augmented reality display style, so that the data card information is embedded in the augmented reality display style to generate the target data card information. Optionally, the target data card information may include a personal profile data card, a memory fragment data card, a personality tag data card, and a medal wall data card loaded through the augmented reality display style.
In one embodiment, the personality tag data card may be stored in a table format, such as that shown in Table one.
Table one:
From Table one above, it can be seen that there are 4 tags; the tag contents are "eat one happy", "black belly", "natural dull", and "select phobia"; the display arrangement patterns are "transverse arrangement", "longitudinal arrangement", "transverse arrangement", and "transverse arrangement"; the bold-and-highlight display values are "no" and "yes"; the display font sizes are "10 pixels" and "14 pixels"; and the display font parameter is "regular font" in every case. It should be noted that this example does not limit the present invention, and the personality tag data card may take other formats.
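The sketch below shows one way the fields of such a personality tag data card could be modelled; the class and field names are assumptions, only the tag contents are taken from Table one, and the style columns are left at placeholder defaults because the original table is reproduced only as an image and the text does not unambiguously pair each style value with a tag.

```kotlin
// Sketch of a data model for the personality tag data card; names are assumptions.
data class TagEntry(
    val content: String,
    val arrangement: String = "transverse arrangement", // or "longitudinal arrangement"
    val boldHighlight: Boolean = false,
    val fontSizePx: Int = 10,
    val font: String = "regular font"
)

// Tag contents as listed in the description of Table one; the arrangement,
// highlight, and font-size values per tag are not recoverable from the text,
// so defaults are used here.
val personalityTagCard = listOf(
    TagEntry("eat one happy"),
    TagEntry("black belly"),
    TagEntry("natural dull"),
    TagEntry("select phobia")
)
```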
In step 206, the terminal collects the current image through the camera, analyzes the current image to identify the feature points on the current image, determines the display plane according to the feature points, and displays the information of the target data card on the display plane.
The terminal can capture a current image through the camera and analyze its feature points. Specifically, the feature points of the objects in the captured image are obtained; if the captured image shows several buildings, for example, the feature points on the building surfaces are obtained. A plane, i.e. the display plane of the image, is determined from several of these feature points, and the target data card information is then displayed on that display plane, realizing three-dimensional rendering and achieving an augmented reality display effect. Optionally, the personal profile data card loaded through the augmented reality display style is presented first as the target data card information.
As shown in fig. 4, the terminal 10 displays the personal profile data card 11 loaded through the augmented reality display style on the display screen in the form of augmented reality, and the personal profile data card 11 may include introduction information such as "welcome to the virtual world of jack" edited by the user.
In step 207, the terminal detects displacement change data of the terminal through the gravity sensor, and detects angle change data of the terminal through the gyroscope sensor.
The displacement change data of the terminal can be detected through the gravity sensor, and the displacement change data indicates position change of the terminal in a three-dimensional space position, and specifically can be position change data on an X axis, position change data on a Y axis, and position change data on a Z axis. The angle change data of the terminal can be detected through the gyroscope sensor, and similarly, the angle change data can also include angle change data on an X axis, angle change data on a Y axis, and angle change data on a Z axis.
In step 208, the terminal performs display switching adjustment on the display of the data information according to the displacement change data, and performs display angle adjustment on the display of the data information according to the angle change data.
The displacement direction of the terminal, such as displacement to the left, to the right, downward, or upward, may be determined according to the displacement change data of the terminal. Optionally, a leftward displacement may be set to trigger switching the target data card information to the previous data card, and a rightward displacement may be set to trigger switching the target data card information to the next data card.
Furthermore, the orientation of the terminal, such as whether the terminal is laid flat or tilted, and the tilt angle, can be judged according to the angle change data of the terminal. Optionally, different rendering and display effects of the data information can be set depending on whether the terminal is flat or tilted, and different rendering and display effects of the target data card can be set for different tilt angles, for example scaling the displayed target data card differently according to the angle. This improves the diversity of information processing.
For example, as shown in fig. 5, the user can hold the terminal 10 and shake it to the right. The terminal 10 detects the displacement change data through the gravity sensor and determines from it that the terminal has been shaken to the right; correspondingly, the personal profile data card 11 loaded through the augmented reality display style is quickly switched to the memory fragment data card 12 loaded through the augmented reality display style. The memory fragment data card 12 includes a plurality of images of the user; the user can manually click an image in the memory fragment data card 12 to enlarge and view it, and can also switch the displayed image by sliding.
Similarly, as shown in fig. 6, the user can hold the terminal 10 and shake it to the right again. The terminal 10 detects the displacement change data through the gravity sensor and determines from it that the terminal has been shaken to the right; correspondingly, the memory fragment data card 12 loaded through the augmented reality display style is quickly switched to the personality label data card 13 loaded through the augmented reality display style. The personality label data card 13 includes a plurality of label descriptions of the user, and a user can add a label to this user by manually clicking a label description.
In the same way, as shown in fig. 7, the user can hold the terminal 10 and shake it to the right once more. The terminal 10 detects the displacement change data through the gravity sensor and determines from it that the terminal has been shaken to the right; correspondingly, the personality label data card 13 loaded through the augmented reality display style is quickly switched to the medal wall data card 14 loaded through the augmented reality display style. The medal wall data card 14 includes a plurality of medal descriptions of the user, and the user's interests can be learned quickly by viewing these medal descriptions.
Likewise, the user can shake the terminal 10 to the left while holding it; the terminal 10 detects the displacement change data through the gravity sensor, determines from it that the terminal has been shaken to the left, and correspondingly performs the operation of switching the target data card information to the previous data card.
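A minimal sketch of the shake-to-switch behaviour described above might look as follows; the DataCardCarousel class and ShakeDirection enum are illustrative names, and the card order follows figs. 4 to 7.

```kotlin
// Sketch (assumption): mapping the detected displacement direction to card switching.
enum class ShakeDirection { LEFT, RIGHT, NONE }

class DataCardCarousel(private val cards: List<String>) {
    private var index = 0
    val current: String get() = cards[index]

    fun onShake(direction: ShakeDirection) {
        when (direction) {
            // A rightward displacement switches to the next data card.
            ShakeDirection.RIGHT -> if (index < cards.lastIndex) index++
            // A leftward displacement switches back to the previous data card.
            ShakeDirection.LEFT -> if (index > 0) index--
            ShakeDirection.NONE -> Unit
        }
    }
}

fun main() {
    val carousel = DataCardCarousel(
        listOf("personal profile", "memory fragment", "personality label", "medal wall")
    )
    carousel.onShake(ShakeDirection.RIGHT)   // profile -> memory fragment
    carousel.onShake(ShakeDirection.RIGHT)   // memory fragment -> personality label
    println(carousel.current)                // "personality label"
}
```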
In step 209, the terminal detects whether an account id exists on the terminal.
When the terminal detects that the account identifier exists on the terminal, step 210 is executed; when the terminal detects that the account identifier does not exist on the terminal, step 211 is executed.
In step 210, the terminal generates a message control.
When the terminal detects that the account identifier exists on the terminal, this indicates that the account identifier and the login account identifier associated with the local client are in a friend relationship; a message control is generated, and the user can quickly send a message to the client corresponding to the account identifier by clicking the message control.
In step 211, the terminal generates an add control.
When the terminal detects that the account identifier does not exist on the terminal, this indicates that the account identifier and the login account identifier associated with the local client are not in a friend relationship; an adding control is generated, and the user can quickly send a friend-adding request to the client corresponding to the account identifier by clicking the adding control.
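The branch of steps 209 to 211 can be sketched as follows; the control types and the boolean friend check are illustrative names only.

```kotlin
// Sketch (assumption): choosing which control to generate after the account check.
sealed class ProfileControl {
    object MessageControl : ProfileControl()   // shown when the scanned account is already a friend
    object AddControl : ProfileControl()       // shown when it is not yet a friend
}

fun controlFor(accountIdExistsOnTerminal: Boolean): ProfileControl =
    if (accountIdExistsOnTerminal) ProfileControl.MessageControl
    else ProfileControl.AddControl
```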
Therefore, when the user of the local client on the terminal happens to meet another user in a real scene, the local client can be started and its two-dimensional code scanning function used: the camera assembly captures the two-dimensional code image in the other user's image to be processed, the two-dimensional code feature information of the other user is extracted, the account identifier and data information of the other user are obtained from the server by using this two-dimensional code feature information, the obtained data information is displayed in an augmented reality manner, and the displayed data information can be adjusted according to the displacement change data and angle change data of the terminal. Compared with existing schemes that display data information uniformly as simple text or pictures, this improves the flexibility and diversity of information processing.
Example three,
The method described in example two is further illustrated in detail below by way of example.
Referring to fig. 8, fig. 8 is a timing diagram illustrating an information processing method according to an embodiment of the invention. The method flow can comprise the following steps:
In step S1, the terminal acquires an image to be processed, and extracts target feature information from the image to be processed.
The user can operate the terminal to open the local client, call the camera assembly to collect the image to be processed, and determine the two-dimensional code image in the image to be processed according to specific shape features in the image. Optionally, three identical squares may be provided at three corners of the two-dimensional code, so that the camera assembly can use these three squares for positioning and identify the two-dimensional code image in the image to be processed.
Further, each shape in the two-dimensional code image is encoded: the shapes in the two-dimensional code image are processed by an optimization algorithm and decoded into a binary digit sequence consisting of 0s and 1s, and this binary digit sequence, such as "10110001", is determined as the target feature information.
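As a rough illustration, the decoded black and white modules of the two-dimensional code could be turned into such a binary digit sequence as sketched below; how the module matrix is obtained from the camera image is left to a barcode library and is not part of this sketch.

```kotlin
// Sketch (assumption): flattening a decoded module matrix into the binary digit
// sequence used as target feature information ('1' for a dark module, '0' for light).
fun modulesToFeatureString(modules: Array<BooleanArray>): String =
    buildString {
        for (row in modules) {
            for (dark in row) append(if (dark) '1' else '0')
        }
    }

fun main() {
    val modules = arrayOf(
        booleanArrayOf(true, false, true, true),
        booleanArrayOf(false, false, false, true)
    )
    println(modulesToFeatureString(modules))  // "10110001"
}
```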
In step S2, the terminal transmits the target feature information to the server.
Wherein the terminal sends the binary digit sequence combination "10110001" (target characteristic information) to the server.
In step S3, the server obtains the account id and the profile information associated with the account id according to the matching of the target feature information.
After receiving the target feature information, the server matches the digit combination "10110001" in the target feature information against the digit combinations in the feature information stored on the server to determine the stored feature information identical to "10110001", and obtains, according to the binding relationship, the account identifier (for example, an account number) "131456" bound to that feature information and the data information associated with the account identifier "131456", where the data information may include data card information and corresponding augmented reality model information.
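A server-side sketch of this matching step is given below for illustration; the in-memory maps stand in for whatever storage the server actually uses, and the class names are assumptions.

```kotlin
// Sketch (assumption): matching target feature information against stored bindings.
data class ProfileData(val dataCards: List<String>, val arModel: String)
data class MatchResult(val accountId: String, val profile: ProfileData)

class BindingStore {
    private val bindings = mutableMapOf<String, String>()       // feature information -> account identifier
    private val profiles = mutableMapOf<String, ProfileData>()  // account identifier -> data information

    fun bind(featureInfo: String, accountId: String, profile: ProfileData) {
        bindings[featureInfo] = accountId
        profiles[accountId] = profile
    }

    fun match(targetFeatureInfo: String): MatchResult? {
        val accountId = bindings[targetFeatureInfo] ?: return null
        val profile = profiles[accountId] ?: return null
        return MatchResult(accountId, profile)
    }
}

fun main() {
    val store = BindingStore()
    store.bind(
        featureInfo = "10110001",
        accountId = "131456",
        profile = ProfileData(
            dataCards = listOf("personal profile", "memory fragment", "personality label", "medal wall"),
            arModel = "ar-model-default"   // hypothetical model identifier
        )
    )
    println(store.match("10110001")?.accountId)  // 131456
}
```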
In step S4, the server sends the account id and the profile information associated with the account id to the terminal.
The server sends the account identifier (such as the account number) "131456", together with the data card information associated with the account identifier "131456" and the corresponding augmented reality model information, to the terminal.
In step S5, the terminal displays the data information and obtains the gravity sensing variation information of the terminal.
The terminal can acquire the augmented reality model information in the data information; the augmented reality control in the terminal analyzes the augmented reality model information and determines the augmented reality display pattern corresponding to it; the data card information is then loaded through the augmented reality display pattern, so that the data card information is embedded in the augmented reality display pattern and the target data card information is generated; and the target data card information is displayed in an augmented reality manner.
Furthermore, displacement change data of the terminal is detected through a gravity sensor, and angle change data of the terminal is detected through a gyroscope sensor.
In step S6, the terminal adjusts the display of the data information according to the gravity sensing variation information.
The terminal performs display switching adjustment on the display of the data information according to the displacement change data, and performs display angle adjustment on the display of the data information according to the angle change data.
Example four,
In order to better implement the information processing method provided by the embodiment of the present invention, an embodiment of the present invention further provides an apparatus based on the information processing method. The terms are the same as those in the above-described information processing method, and details of implementation may refer to the description in the method embodiment.
Referring to fig. 9a, fig. 9a is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention, where the information processing apparatus may include a collecting unit 301, a sending unit 302, a receiving unit 303, a display unit 304, an adjusting unit 305, and the like.
The acquisition unit 301 is configured to acquire an image to be processed.
The image to be processed may be an image in a video stream acquired in real time by a camera, or may be a cached image or an image stored on a terminal.
In some embodiments, the acquisition unit 301 may invoke the camera assembly to acquire the image to be processed and display the acquired image on the display screen. Optionally, the interface for the image to be processed may further include a camera switching control and an album control. The camera switching control is a shortcut entry for switching between the front camera and the rear camera; specifically, the user can switch between the front camera and the rear camera by clicking the camera switching control to obtain the image to be processed. The album control is a shortcut entry for calling the album on the terminal; specifically, the user can call the album on the terminal by clicking the album control and select an image in the album as the image to be processed.
A sending unit 302, configured to extract target feature information from the image to be processed, and send the target feature information to a server.
The sending unit 302 may analyze the image to be processed, extract two-dimensional code information in the image to be processed, and send the two-dimensional code information as target feature information to the server. Or analyzing the image to be processed, extracting the face feature information in the image to be processed, and sending the face feature information serving as target feature information to a server.
In some embodiments, as shown in fig. 9d, the sending unit 302 may comprise a first analyzing subunit 3021, a first determining subunit 3022 and a first sending subunit 3023, as follows:
the first analysis subunit 3021 is configured to analyze the image to be processed and identify a two-dimensional code image in the image to be processed;
a first determining subunit 3022, configured to extract two-dimensional code feature information in the two-dimensional code image, and determine the two-dimensional code feature information as target feature information;
a first sending subunit 3023, configured to send the target feature information to the server.
The first analysis subunit 3021 analyzes the image to be processed and determines the two-dimensional code image in it according to specific shape features; the first determining subunit 3022 performs encoding processing on the two-dimensional code image to extract the two-dimensional code feature information and determines it as the target feature information; the target feature information is then sent to the server through the first sending subunit 3023.
In some embodiments, as shown in fig. 9e, the sending unit 302 may further include a second analyzing subunit 3021, a second determining subunit 3022, and a second sending subunit 3023, as follows:
a second analysis subunit 3021, configured to analyze the image to be processed, and identify a target face region image in the image to be processed;
a second determining subunit 3022, configured to extract face feature information in the target face region image, and determine the face feature information as target feature information;
a second sending subunit 3023, configured to send the target feature information to the server.
The pattern features contained in the face image are rich, such as histogram features, color features, template features, structural features, Haar features and the like. The second analysis subunit 3021 may perform feature scanning on the image to be processed to determine a target face region image in the image to be processed.
Further, the second determining subunit 3022 may perform image preprocessing such as gray-scale correction and noise filtering on the target face region image, and then extract face feature information from the preprocessed target face region image. The face feature information may include geometric description information of local constituent parts such as the eyes, nose, mouth, and chin. The face feature information is determined as the target feature information and sent to the server through the second sending subunit 3023.
In some embodiments, the second analysis subunit is specifically configured to: analyze the image to be processed to determine the face region images on the image to be processed; when the number of face region images is detected to be single, determine that face region image as the target face region image; and when the number of face region images is detected not to be single, generate prompt information and receive the target face region image selected by the user according to the prompt information.
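A compact sketch of this selection logic, under the assumption that face detection has already produced a list of candidate regions, might look as follows.

```kotlin
// Sketch (assumption): choosing the target face region depending on how many faces were found.
data class FaceRegion(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun selectTargetFaceRegion(
    detected: List<FaceRegion>,
    promptUserToChoose: (List<FaceRegion>) -> FaceRegion
): FaceRegion? = when {
    detected.isEmpty() -> null                  // no face found; nothing to process
    detected.size == 1 -> detected.first()      // single face: use it directly as the target region
    else -> promptUserToChoose(detected)        // several faces: prompt and take the user's choice
}
```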
A receiving unit 303, configured to receive an account id obtained by matching the target feature information by the server and data information associated with the account id.
The account identifier may include information such as an account of the instant messaging client, an international mobile equipment identity code, or a mailbox account.
The user can perform a pre-binding operation by opening a local client on the terminal, that is, by selecting target feature information to bind with a login account id on the local client, it should be particularly noted that the target feature information may be two-dimensional code feature information or face feature information of the user himself.
Then, the terminal obtains an account identifier associated with the local client, for example, account information currently logged in the local client (i.e., account information of the user himself), binds the account identifier with the target feature information, and packages and sends the bound target feature information and the account identifier to the server, so that the server stores the binding relationship between the target feature information and the account identifier.
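For illustration, the packaged binding request could be sketched as below; the JSON field names and the wire format are assumptions, since the embodiment does not specify how the bound data is serialized.

```kotlin
// Sketch (assumption): packaging the bound target feature information and the account
// identifier before sending them to the server, which stores the binding relationship.
data class BindingRequest(val featureInfo: String, val accountId: String) {
    fun toJson(): String =
        """{"featureInfo":"$featureInfo","accountId":"$accountId"}"""
}

fun main() {
    val request = BindingRequest(featureInfo = "10110001", accountId = "131456")
    println(request.toJson())  // body of the request sent to the server
}
```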
Further, after extracting the target feature information, the terminal sends the target feature information to the server, and as can be known from the above, the server pre-stores the binding relationship between the feature information and the account id, and then, after receiving the target feature information, the server performs matching according to the target feature information to obtain the account id (e.g., the account) bound to the target feature information and the data information associated with the account id, and returns the account id and the data information associated with the account id to the receiving unit 303.
The display unit 304 is used for displaying the data information and acquiring the gravity sensing variation information of the terminal.
After the receiving unit 303 receives the data information, the display unit 304 may display the data information in an augmented reality manner, and acquire the gravity sensing variation information of the terminal, where the gravity sensing variation information may include displacement variation data and angle variation data, so as to perform display adjustment on the displayed data information according to the gravity sensing variation information.
In some embodiments, as shown in fig. 9b, the display unit 304 may include a display subunit 3041, a first detection subunit 3042, and a second detection subunit 3043, as follows:
a display subunit 3041 for displaying the data information;
a first detecting subunit 3042, configured to detect displacement change data of the terminal through a gravity sensor;
and a second detecting subunit 3043, configured to detect angle change data of the terminal through a gyro sensor.
The first detecting subunit 3042 may invoke a gravity sensor to detect displacement change data of the terminal, where the displacement change data indicates a position change in a three-dimensional spatial position of the terminal, and specifically may be position change data in an X axis, position change data in a Y axis, and position change data in a Z axis. The second detecting subunit 3043 may invoke the gyroscope sensor to detect the angle variation data of the terminal, and similarly, the angle variation data may also include the angle variation data on the X axis, the angle variation data on the Y axis, and the angle variation data on the Z axis.
In some embodiments, the data information includes data card information and corresponding augmented reality model information, and the display subunit 3041 is specifically configured to: acquiring augmented reality model information in the data information, analyzing the augmented reality model information through an augmented reality control, determining a corresponding augmented reality display pattern, acquiring data card information in the data information, loading the data card information through the augmented reality display pattern, generating target data card information, and displaying the target data card information in an augmented reality mode.
Based on this, the display subunit 3041 first obtains the augmented reality model information in the data information, and analyzes the augmented reality model information through the augmented reality control in the terminal, and determines the augmented reality display style corresponding to the augmented reality model information. The augmented reality display style may include a display background style, a display frame style, and a display effect style.
Then, the display subunit 3041 obtains the data card information in the data information, and loads the data card information by calling the augmented reality display pattern, so as to embed the data card information in the augmented reality display pattern to generate the target data card information.
Further, the display subunit 3041 is specifically configured to: acquiring augmented reality model information in the data information, analyzing the augmented reality model information through an augmented reality control, determining a corresponding augmented reality display pattern, acquiring data card information in the data information, loading the data card information through the augmented reality display pattern, generating target data card information, acquiring a current image through a camera, analyzing the current image to identify characteristic points on the current image, determining a display plane according to the characteristic points, and displaying the target data card information on the display plane.
The display subunit 3041 may call the camera to shoot the current image, perform feature point analysis on the current image, specifically, perform feature point acquisition on an object in the shot image, determine a plane in a three-dimensional space according to a plurality of feature points, that is, a plane (display plane) of the current image, and then display the target data card information on the display plane in an augmented reality manner and perform rendering.
An adjusting unit 305, configured to adjust the display of the data information according to the gravity-sensitive variation information.
The adjusting unit 305 may adjust the display of the data information according to the displacement change data and the angle change data. Optionally, the adjusting unit 305 may switch the displayed data information according to the displacement change data of the terminal, and may adjust the display angle of the data information according to the angle change data.
In some embodiments, as shown in fig. 9c, the adjusting unit 305 may include a first adjusting sub-unit 3051 and a second adjusting sub-unit 3052, as follows:
a first adjustment subunit 3051, configured to perform display switching adjustment on the display of the data information according to the displacement change data;
a second adjusting subunit 3052, configured to adjust a display angle of the data information according to the angle change data.
The first adjusting subunit 3051 may determine, according to the displacement change data of the terminal, the displacement direction of the terminal, such as displacement to the left, to the right, downward, or upward. Optionally, the first adjusting subunit 3051 may set a leftward displacement to trigger switching the target data card information to the previous data card, and a rightward displacement to trigger switching to the next data card.
Further, the second adjusting subunit 3052 may determine, according to the angle change data of the terminal, the orientation of the terminal, such as whether it is laid flat or tilted, and the tilt angle. Optionally, the second adjusting subunit 3052 may set different rendering and display effects of the data information depending on whether the terminal is flat or tilted, and may also set different rendering and display effects for different tilt angles, such as different display sizes of the data information. This improves the diversity of information processing.
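One possible way to vary the display size with the tilt angle is sketched below; the 60 degree cut-off and the 0.5 to 1.0 scale range are arbitrary illustrative choices, not values from the embodiment.

```kotlin
import kotlin.math.abs

// Sketch (assumption): deriving a display scale for the target data card from the tilt angle.
fun scaleForTiltAngle(tiltDegrees: Float): Float {
    val clamped = abs(tiltDegrees).coerceAtMost(60f)
    // A flat terminal (0 degrees) renders the card at full size; at 60 degrees it is halved.
    return 1.0f - 0.5f * (clamped / 60f)
}

fun main() {
    println(scaleForTiltAngle(0f))    // 1.0
    println(scaleForTiltAngle(30f))   // 0.75
    println(scaleForTiltAngle(90f))   // 0.5 (clamped at 60 degrees)
}
```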
In some embodiments, as shown in fig. 9f, the information processing apparatus may further include a detection unit 306, a message control generation unit 307, and an addition control generation unit 308.
A detecting unit 306, configured to detect whether the account id exists on the terminal;
a message control generating unit 307, configured to generate a message control when it is detected that the account identifier exists on the terminal, where the message control is configured to send a message to a client associated with the account identifier;
an adding control generating unit 308, configured to generate an adding control when it is detected that the account id does not exist on the terminal, where the adding control is used to send a request for adding a friend to a client associated with the account id.
When the message control generating unit 307 detects that the account id exists on the terminal, it indicates that the account id is in a friend relationship with a login account id associated with the local client, and generates a message control, so that the user can send a message to the client corresponding to the account id by clicking the message control. When the adding control generating unit 308 detects that the account id does not exist on the terminal, it indicates that the account id and the login account id associated with the local client are not in a friend relationship, and generates an adding control, so that the user can send a friend adding request to the client corresponding to the account id by clicking the adding control.
The specific implementation of each unit can refer to the previous embodiment, and is not described herein again.
As can be seen from the above, in the embodiment of the present invention, the image to be processed is acquired by the acquisition unit 301; the sending unit 302 extracts target feature information from the image to be processed and sends it to the server; the receiving unit 303 receives the account identifier obtained by the server by matching according to the target feature information and the data information associated with the account identifier; the display unit 304 displays the data information and acquires the gravity sensing variation information of the terminal; and the adjusting unit 305 adjusts the display of the data information according to the gravity sensing variation information. The scheme can quickly identify the target feature information on the image to be processed, acquire the data information related to the target feature information, display the data information, and adjust the display in real time according to the gravity sensing variation information of the terminal. Compared with existing schemes that display data information uniformly as simple text or pictures, the flexibility and diversity of information processing are improved.
Example five,
Accordingly, referring to fig. 10, an information processing system according to an embodiment of the present invention includes an information processing apparatus and a server, where the information processing apparatus may be integrated in a terminal, and the information processing apparatus is any one of the information processing apparatuses provided in the embodiments of the present invention, specifically, refer to the fourth embodiment. For example, taking as an example that the information processing apparatus is specifically integrated in a terminal, then:
the terminal is used for collecting an image to be processed, extracting target characteristic information from the image to be processed, sending the target characteristic information to the server, receiving the account identification obtained by the server according to the target characteristic information in a matching mode and the information related to the account identification, displaying the information, obtaining the gravity sensing variation information of the terminal, and adjusting the display of the information according to the gravity sensing variation information.
The server is used for receiving target characteristic information sent by the terminal, matching the target characteristic information according to the target characteristic information to obtain an account identifier corresponding to the target characteristic information and data information associated with the account identifier, and sending the account identifier and the data information associated with the account identifier to the terminal.
For example, after receiving the target feature information, the server matches the two-dimensional code feature information or the face feature information in the target feature information with the two-dimensional code feature information or the face feature information stored in the server, determines the two-dimensional code feature information or the face feature information matched with the two-dimensional code feature information or the face feature information in the target feature information, obtains account identifiers (such as account numbers) bound to the two-dimensional code feature information or the face feature information and information associated with the account identifiers according to the binding relationship, and sends the account identifiers and the information associated with the account identifiers to the terminal.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Since the information processing system may include any information processing apparatus provided in the embodiment of the present invention, the beneficial effects that can be achieved by any information processing apparatus provided in the embodiment of the present invention can be achieved, and detailed descriptions are given in the foregoing embodiment and are not repeated herein.
Example six,
An embodiment of the present invention further provides a terminal, as shown in fig. 11, the terminal may include a Radio Frequency (RF) circuit 601, a memory 602 including one or more computer-readable storage media, an input unit 603, a display unit 604, a sensor 605, an audio circuit 606, a Wireless Fidelity (WiFi) module 607, a processor 608 including one or more processing cores, and a power supply 609. Those skilled in the art will appreciate that the terminal structure shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 601 may be used for receiving and transmitting signals during a message transmission or communication process, and in particular, for receiving downlink messages from a base station and then processing the received downlink messages by one or more processors 608; in addition, data relating to uplink is transmitted to the base station. In general, the RF circuit 601 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 601 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 602 may be used to store software programs and modules, and the processor 608 executes various functional applications and information processing by operating the software programs and modules stored in the memory 602. The memory 602 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the terminal, etc. Further, the memory 602 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 602 may also include a memory controller to provide the processor 608 and the input unit 603 access to the memory 602.
The input unit 603 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one particular embodiment, input unit 603 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 608, and can receive and execute commands sent by the processor 608. In addition, touch sensitive surfaces may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 603 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 604 may be used to display information input by or provided to the user and various graphical user interfaces of the terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 604 may include a Display panel, and optionally, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 608 to determine the type of touch event, and the processor 608 then provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 11 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The terminal may also include at least one sensor 605, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal, detailed description is omitted here.
Audio circuitry 606, a speaker, and a microphone may provide an audio interface between the user and the terminal. The audio circuit 606 may transmit the electrical signal converted from the received audio data to a speaker, and convert the electrical signal into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electric signal, which is received by the audio circuit 606 and converted into audio data, which is then processed by the audio data output processor 608, and then transmitted to, for example, another terminal via the RF circuit 601, or the audio data is output to the memory 602 for further processing. The audio circuit 606 may also include an earbud jack to provide communication of peripheral headphones with the terminal.
WiFi belongs to short-distance wireless transmission technology, and the terminal can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 607, and provides wireless broadband internet access for the user. Although fig. 11 shows the WiFi module 607, it is understood that it does not belong to the essential constitution of the terminal, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 608 is a control center of the terminal, connects various parts of the entire handset using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 602 and calling data stored in the memory 602, thereby performing overall monitoring of the handset. Optionally, processor 608 may include one or more processing cores; preferably, the processor 608 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 608.
The terminal also includes a power supply 609 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 608 via a power management system that may be used to manage charging, discharging, and power consumption. The power supply 609 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the terminal may further include a camera, a bluetooth module, and the like, which will not be described herein. Specifically, in this embodiment, the processor 608 in the terminal loads the executable file corresponding to the process of one or more application programs into the memory 602 according to the following instructions, and the processor 608 runs the application programs stored in the memory 602, thereby implementing various functions:
collecting an image to be processed; extracting target characteristic information from the image to be processed and sending the target characteristic information to a server; receiving account identification obtained by the server according to the target characteristic information in a matching manner and data information associated with the account identification; displaying the information and acquiring the gravity sensing variation information of the terminal; and adjusting the display of the data information according to the gravity sensing variation information.
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the information processing method, and are not described herein again.
As can be seen from the above, the terminal according to the embodiment of the present invention can acquire an image to be processed; extract target feature information from the image to be processed and send it to the server; receive the account identifier obtained by the server by matching according to the target feature information and the data information associated with the account identifier; display the data information and acquire the gravity sensing variation information of the terminal; and adjust the display of the data information according to the gravity sensing variation information. The scheme can quickly identify the target feature information on the image to be processed, acquire the data information related to the target feature information, display the data information, and adjust the display in real time according to the gravity sensing variation information of the terminal. Compared with existing schemes that display data information uniformly as simple text or pictures, the flexibility and diversity of information processing are improved.
Example seven,
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present invention provides a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute the steps in any one of the information processing methods provided by the embodiments of the present invention. For example, the instructions may perform the steps of:
collecting an image to be processed; extracting target characteristic information from the image to be processed and sending the target characteristic information to a server; receiving account identification obtained by the server according to the target characteristic information in a matching manner and data information associated with the account identification; displaying the information and acquiring the gravity sensing variation information of the terminal; and adjusting the display of the data information according to the gravity sensing variation information.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any information processing method provided in the embodiment of the present invention, the beneficial effects that can be achieved by any information processing method provided in the embodiment of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The above detailed description is provided for an information processing method, an information processing apparatus, a storage medium, and an information processing system according to embodiments of the present invention, and a specific example is applied in the present disclosure to explain the principles and embodiments of the present invention, and the description of the above embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for those skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in view of the above, the content of the present specification should not be construed as a limitation to the present invention.

Claims (12)

1. An information processing method, applied to a terminal, the method comprising:
identifying the image to be processed through the instant messaging client;
when the identification is successful, acquiring augmented reality model information in the data information, analyzing the augmented reality model information through an augmented reality control, and determining a corresponding augmented reality display style, wherein the augmented reality display style comprises a display background style, a display frame style and a display effect style;
acquiring data card information in the data information, and loading the data card information through the augmented reality display pattern to enable the data card information to be nested in the augmented reality display pattern so as to generate target data card information;
collecting a current image through a camera, analyzing the current image to identify a plurality of feature points on the current image, determining, according to the plurality of feature points, a display plane of the current image in three-dimensional space, and displaying and rendering the target data card information on the display plane in an augmented reality form, wherein the target data card is one of a plurality of data cards included in the data card information, and the data information is:
data information comprising a plurality of data cards that are associated with the account identifier of a target user and are obtained by the server through matching according to target feature information, wherein the target feature information is extracted by the terminal from the image to be processed and sent to the server;
when a shaking operation on the terminal is detected, switching to and displaying, in an augmented reality mode, data cards other than the target data card among the plurality of data cards, based on the gravity sensing variation information of the shaking operation.
2. The processing method of claim 1, wherein the gravity sensing variation information comprises displacement change data and angle change data, and the processing method further comprises:
detecting displacement change data of the terminal through a gravity sensor;
and detecting angle change data of the terminal through a gyroscope inductor.
3. The processing method of claim 2, further comprising:
and adjusting the display angle of the display of the data information according to the angle change data.
4. The processing method according to claim 1, wherein the step of extracting target feature information from the image to be processed includes:
analyzing an image to be processed, and identifying a two-dimensional code image in the image to be processed;
extracting two-dimension code characteristic information in the two-dimension code image, and determining the two-dimension code characteristic information as target characteristic information.
5. The processing method according to claim 1, wherein the step of extracting target feature information from the image to be processed includes:
analyzing an image to be processed, and identifying a target face area image in the image to be processed;
and extracting the face characteristic information in the target face area image, and determining the face characteristic information as target characteristic information.
6. The processing method according to claim 5, wherein the step of analyzing the image to be processed and identifying the target face region image in the image to be processed comprises:
analyzing an image to be processed to determine a face region image on the image to be processed;
when the number of the face area images is detected to be single, determining the face area images as target face area images;
and when the number of the face area images is not single, generating prompt information, and receiving a target face area image selected by a user according to the prompt information.
7. The processing method of any one of claims 1 to 6, wherein the step of switching to and displaying, in the augmented reality mode, data cards other than the target data card among the plurality of data cards further comprises:
detecting whether the account identification exists on the terminal;
when the account identification is detected to exist on the terminal, generating a message control, wherein the message control is used for sending a message to a client associated with the account identification;
and when detecting that the account identification does not exist on the terminal, generating an adding control, wherein the adding control is used for sending a friend adding request to the client associated with the account identification.
8. An information processing apparatus characterized by comprising:
the acquisition unit is used for identifying the image to be processed through the instant messaging client;
the display unit is used for acquiring augmented reality model information in the data information when the identification is successful, analyzing the augmented reality model information through an augmented reality control and determining a corresponding augmented reality display style, wherein the augmented reality display style comprises a display background style, a display frame style and a display effect style; acquire data card information in the data information, through augmented reality shows the pattern loading data card information makes data card information nestification is in the augmented reality shows the pattern to generate target data card information, gather current image through the camera, it is right current image carries out the analysis, in order to discern a plurality of characteristic points on the current image, according to a plurality of characteristic points determine current image display plane in three-dimensional space will target data card information shows with augmented reality's form show and play up on the display plane, target data card does one of a plurality of data cards that data information includes, data information is:
the server obtains a plurality of data cards in the data information associated with the account identification of the target user according to the target characteristic information in a matching manner, wherein the target characteristic information is obtained by extracting the target characteristic information from the image to be processed by the terminal and sending the target characteristic information to the server;
and the adjusting unit is used for, when a shaking operation on the terminal is detected, switching to and displaying, in an augmented reality mode, data cards other than the target data card among the plurality of data cards, based on the gravity sensing variation information of the shaking operation.
9. The information processing apparatus according to claim 8, characterized in that the apparatus further comprises:
the first detection subunit is used for detecting displacement change data of the terminal through a gravity sensor;
and the second detection subunit is used for detecting the angle change data of the terminal through a gyroscope sensor.
10. The information processing apparatus according to claim 9, characterized in that the apparatus further comprises:
and the second adjusting subunit is used for adjusting the display angle of the display of the data information according to the angle change data.
11. A storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the steps of the information processing method according to any one of claims 1 to 7.
12. An information processing system, the system comprising: a terminal and a server;
the terminal includes the information processing apparatus according to any one of claims 8 to 10;
the server is configured to: the method comprises the steps of receiving target characteristic information sent by a terminal, matching according to the target characteristic information to obtain an account identification corresponding to the target characteristic information and data information associated with the account identification, and sending the account identification and the data information associated with the account identification to the terminal.
CN201810180098.1A 2018-03-05 2018-03-05 Information processing method, device, storage medium and system Active CN108551519B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810180098.1A CN108551519B (en) 2018-03-05 2018-03-05 Information processing method, device, storage medium and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810180098.1A CN108551519B (en) 2018-03-05 2018-03-05 Information processing method, device, storage medium and system

Publications (2)

Publication Number Publication Date
CN108551519A CN108551519A (en) 2018-09-18
CN108551519B true CN108551519B (en) 2021-03-19

Family

ID=63516280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810180098.1A Active CN108551519B (en) 2018-03-05 2018-03-05 Information processing method, device, storage medium and system

Country Status (1)

Country Link
CN (1) CN108551519B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111198608B (en) * 2018-11-16 2021-06-22 广东虚拟现实科技有限公司 Information prompting method and device, terminal equipment and computer readable storage medium
CN111223220B (en) * 2018-11-26 2022-07-12 阿里巴巴集团控股有限公司 Image correlation method and device and server
CN109949928B (en) * 2019-03-06 2021-09-07 智美康民(珠海)健康科技有限公司 Three-dimensional pulse wave display method and device, computer equipment and storage medium
CN109935318B (en) * 2019-03-06 2021-07-30 智美康民(珠海)健康科技有限公司 Three-dimensional pulse wave display method and device, computer equipment and storage medium
CN110442295A (en) * 2019-07-09 2019-11-12 努比亚技术有限公司 Wearable device control method, wearable device and computer readable storage medium
CN110380955B (en) * 2019-07-23 2022-01-28 北京字节跳动网络技术有限公司 Message processing method and device and electronic equipment
CN111125798A (en) * 2019-12-12 2020-05-08 维沃移动通信有限公司 Certificate display method and electronic equipment
CN113470189A (en) * 2021-06-30 2021-10-01 北京市商汤科技开发有限公司 Special effect display method and device, electronic equipment and storage medium
CN113628545A (en) * 2021-09-06 2021-11-09 越达光电科技(浙江)有限公司 Comprehensive display wearable device and watch

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104932687A (en) * 2009-09-30 2015-09-23 联想(北京)有限公司 Mobile terminal and method for displaying information on mobile terminal
JP5527811B2 (en) * 2010-04-20 2014-06-25 Necカシオモバイルコミュニケーションズ株式会社 Terminal device and program
CN103631467A (en) * 2012-08-20 2014-03-12 华为终端有限公司 Method for regulating display content and mobile terminal
CN104572732A (en) * 2013-10-22 2015-04-29 腾讯科技(深圳)有限公司 Method and device for inquiring user identification and method and device for acquiring user identification
CN106373197B (en) * 2016-09-06 2020-08-07 广州视源电子科技股份有限公司 Augmented reality method and augmented reality device
CN106897817A (en) * 2017-01-22 2017-06-27 朗坤智慧科技股份有限公司 A kind of construction speed Visualized management system and method based on augmented reality

Also Published As

Publication number Publication date
CN108551519A (en) 2018-09-18

Similar Documents

Publication Publication Date Title
CN108551519B (en) Information processing method, device, storage medium and system
CN108496150B (en) Screen capture and reading method and terminal
CN109472179A (en) Two-dimensional code identification method, terminal and computer readable storage medium
CN110555171B (en) Information processing method, device, storage medium and system
CN109445894A (en) A kind of screenshot method and electronic equipment
CN108628985B (en) Photo album processing method and mobile terminal
US20190227764A1 (en) Multi-screen interaction method and system in augmented reality scene
CN110795007B (en) Method and device for acquiring screenshot information
CN110674662A (en) Scanning method and terminal equipment
CN109495616B (en) Photographing method and terminal equipment
CN109582475A (en) A kind of sharing method and terminal
CN108132752A (en) A kind of method for editing text and mobile terminal
CN108718389B (en) Shooting mode selection method and mobile terminal
CN109495638B (en) Information display method and terminal
CN108898040A (en) A kind of recognition methods and mobile terminal
CN107273024B (en) A kind of method and apparatus realized using data processing
CN109753202A (en) A kind of screenshotss method and mobile terminal
CN108123999A (en) A kind of information push method and mobile terminal
CN109166164B (en) Expression picture generation method and terminal
CN110418004A (en) Screenshot processing method, terminal and computer readable storage medium
CN108121583B (en) Screen capturing method and related product
CN107967086B (en) Icon arrangement method and device for mobile terminal and mobile terminal
CN110719361B (en) Information transmission method, mobile terminal and storage medium
CN108109188A (en) A kind of image processing method and mobile terminal
CN108255374A (en) The deployment method and terminal device of a kind of file

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant