CN113556423B - Information processing method, device, system, storage medium and electronic equipment - Google Patents


Info

Publication number
CN113556423B
Authority
CN
China
Prior art keywords
information
task
target
intelligent glasses
application program
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110800979.0A
Other languages
Chinese (zh)
Other versions
CN113556423A (en)
Inventor
林鼎豪
陈碧莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110800979.0A
Publication of CN113556423A
Priority to PCT/CN2022/088483 (WO2023284355A1)
Application granted
Publication of CN113556423B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces with means for local support of applications that increase the functionality
    • H04M 1/72409: User interfaces with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72412: User interfaces with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
    • H04M 1/7243: User interfaces with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M 1/72439: User interfaces with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04M 1/72448: User interfaces with means for adapting the functionality of the device according to specific conditions
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G02B 2027/0178: Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an information processing method, an information processing apparatus, an information processing system, a computer-readable storage medium, and an electronic device, and relates to the field of computer control technology. In the information processing method, an information sending device sends target information to smart glasses in response to an operation of touching or approaching the smart glasses, the smart glasses forward the target information to an information responding device, and the information responding device responds to the target information. The present disclosure can reduce the operating cost of multi-device scenarios.

Description

Information processing method, device, system, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer control technologies, and in particular, to an information processing method, an information processing apparatus, an information processing system, a computer-readable storage medium, and an electronic device.
Background
In scenarios where multiple devices perform operations, each user needs to operate his or her own device to implement the corresponding function. For example, if user A and user B need to travel to the same location and both require mobile phone navigation, each of them has to configure navigation on his or her own mobile phone. Requiring every user to perform such operations and configuration is inconvenient and incurs a high operation cost.
Disclosure of Invention
The present disclosure provides an information processing method, an information processing apparatus, an information processing system, a computer-readable storage medium, and an electronic device, thereby overcoming, at least to some extent, the problem of high user operation cost in a multi-device scenario.
According to a first aspect of the present disclosure, there is provided an information processing method applied to an information sending device, including: in response to an operation of touching or approaching smart glasses, sending target information to the smart glasses, so that the smart glasses forward the target information to an information responding device and the information responding device responds to the target information.
According to a second aspect of the present disclosure, there is provided an information processing method applied to an information responding device, including: receiving target information, where the target information is sent to smart glasses by an information sending device in response to an operation of touching or approaching the smart glasses and is forwarded to the information responding device by the smart glasses; and responding to the target information.
According to a third aspect of the present disclosure, there is provided an information processing apparatus applied to an information sending device, the information processing apparatus being configured to: in response to an operation of touching or approaching smart glasses, send target information to the smart glasses, so that the smart glasses forward the target information to an information responding device and the information responding device responds to the target information.
According to a fourth aspect of the present disclosure, there is provided an information processing apparatus applied to an information responding device, including: an information receiving module configured to receive target information, where the target information is sent to smart glasses by an information sending device in response to an operation of touching or approaching the smart glasses and is forwarded to the information responding device by the smart glasses; and an information response module configured to respond to the target information.
According to a fifth aspect of the present disclosure, there is provided an information processing system including: an information sending device configured to send target information to smart glasses in response to an operation of touching or approaching the smart glasses; the smart glasses, configured to forward the target information to an information responding device; and the information responding device, configured to receive the target information and respond to the target information.
According to a sixth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information processing method described above.
According to a seventh aspect of the present disclosure, there is provided an electronic device comprising a processor; a memory for storing one or more programs, which when executed by the processor, cause the processor to implement the information processing method described above.
In the technical solutions provided in some embodiments of the present disclosure, in a multi-device operation scenario, target information sent by the information sending device can be transmitted to the information responding device through the smart glasses, and the information responding device can respond to the target information. With this interaction, the information responding device can execute the corresponding function without the user manually operating it, which reduces the operation cost of the information responding device. In addition, because no manual operation on the information responding device is required, the possibility of misoperation is avoided, the time of manual operation is saved, and the user experience is greatly improved. Furthermore, the information sending device sends the target information to the smart glasses in response to the operation of touching or approaching the smart glasses, and the smart glasses then send the target information to the information responding device, which makes the operation more engaging for the user of the information sending device. Moreover, given the rapid development of smart glasses, the solution of the present disclosure has broad application prospects.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary architecture of an information handling system of an embodiment of the present disclosure;
FIG. 2 illustrates a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure;
fig. 3 schematically shows a flowchart of an information processing method applied to an information transmitting apparatus according to an exemplary embodiment of the present disclosure;
fig. 4 schematically shows a flowchart of an information processing method applied to an information responding apparatus according to an exemplary embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of types of target content according to embodiments of the present disclosure;
FIG. 6 is a schematic diagram illustrating a process for determining target content according to an embodiment of the present disclosure;
FIG. 7 illustrates a block diagram of smart glasses according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram illustrating an interaction process of an information processing scheme of an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of an information processing scheme, taking a navigation scenario as an example;
FIG. 10 shows a schematic diagram of an information processing scheme, taking a motion scene as an example;
fig. 11 schematically shows a block diagram of an information processing apparatus applied to an information response device according to an exemplary embodiment of the present disclosure;
fig. 12 schematically shows a block diagram of an information processing apparatus applied to an information response device according to another exemplary embodiment of the present disclosure;
fig. 13 schematically shows a block diagram of an information processing apparatus applied to an information responding device according to still another exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, some steps may be combined or partially combined, and thus the actual execution order may be changed according to the actual situation. In addition, all of the following terms "first" and "second" are used for distinguishing purposes only and should not be construed as limiting the present disclosure.
FIG. 1 shows a schematic diagram of an exemplary architecture of an information handling system of an embodiment of the present disclosure. As shown in fig. 1, the information processing system of the embodiment of the present disclosure may include an information transmitting apparatus 1100, smart glasses 1200, and an information responding apparatus 1300.
The information transmitting apparatus 1100 may transmit the target information to the smart glasses 1200 in response to an operation of touching or approaching the smart glasses 1200. The smart glasses 1200 may forward the target information to the information responding apparatus 1300. The information responding apparatus 1300 can respond to the target information.
The device types of the information transmitting device 1100 and the information responding device 1300 may include, but are not limited to, a smart phone, a tablet, a smart watch, and the like. The information transmission apparatus 1100 and the information response apparatus 1300 may be the same type of apparatus, for example, both being smart phones. Alternatively, the information sending device 1100 and the information responding device 1300 may be of different device types, for example, the information sending device 1100 is a smartphone, and the information responding device 1300 is a tablet computer. The present disclosure is not so limited.
The information sending device 1100 transmits the target information to the smart glasses 1200 through a first communication mode in response to an operation of touching or approaching the smart glasses 1200. The smart glasses 1200 may forward the target information to the information responding device 1300 through a second communication mode. Generally, the second communication mode has a longer transmission distance than the first communication mode. For example, the first communication mode may be NFC or UWB, and the second communication mode may be Bluetooth or WiFi.
In some implementation scenarios, the target information sent by the information sending device 1100 may include information such as image, audio, text, and the like, and when the information responding device 1300 receives the target information, a corresponding player may be started to play the image, audio, text, and the like.
In other implementations, the target information sent by the information sending device 1100 may comprise information generated by an application program. When the information responding device 1300 acquires the target information and has that application program installed, it can automatically start the application program; the user of the information responding device 1300 therefore does not need to manually find and launch the application program, which improves the convenience of operation.
In a scenario where the target information includes feature information of a task of the application, the information responding apparatus 1300 may also determine the task of the application based on the feature information and execute the task.
In addition, the information response apparatus 1300 may also generate target content, which is content that the smart glasses 1200 can play, based on the feature information. Thus, the information responding apparatus 1300 may transmit the target content to the smart glasses 1200, and the target content is played by the smart glasses 1200. The target content includes, but is not limited to, images, audio, and the like.
The smart glasses 1200 may include a content receiving unit, a light emitting unit, and an image display unit to show image content included in the target content.
Specifically, the content receiving unit may be configured to receive the image content transmitted by the information responding apparatus 1300, for example, based on the second communication method. The light emitting unit may be used to play the image content. And the image display unit may be used to display the image content played by the light emitting unit.
It is understood that in some embodiments, the light emitting unit may include a light engine equipped on the smart glasses 1200, and the image display unit may include lenses of the smart glasses 1200.
For another example, in a case where the target content includes audio content, the smart glasses 1200 may further include an audio playing unit for playing the audio content.
For another example, in the case where the target content includes both image content and audio content, the smart glasses 1200 may control them to be played simultaneously or separately.
In addition, the target content transmitted by the information responding apparatus 1300 may be content converted based on task information of a currently running task. In this case, the smart glasses 1200 may further include a task control unit.
Specifically, the task control unit may be configured to generate a task control instruction in response to a task control operation of a user, and send the task control instruction to the information response apparatus 1300. The information responding apparatus 1300 may control the task currently running in response to the task control instruction, for example, pause the task, start the task, terminate the task, and the like.
It should be noted that in the present embodiment, the information sending device 1100 may be held by one user, and the smart glasses 1200 and the information responding device 1300 may be held by another user. For example, user A may hold the information sending device 1100, while user B wears the smart glasses 1200 and holds the information responding device 1300. Thus, simply by touching or bringing the information sending device 1100 close to user B's smart glasses 1200, user A can trigger the corresponding function on user B's information responding device 1300, and the corresponding content can also be played on the smart glasses 1200.
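To make the division of roles concrete, the following Kotlin sketch models the three devices of fig. 1 as plain classes under stated assumptions: the ShortRangeLink and LongRangeLink interfaces, the TargetInfo and TargetContent fields, and all other names are illustrative placeholders rather than APIs defined by the disclosure.

```kotlin
// A minimal sketch of the three roles in Fig. 1; all types are illustrative assumptions.

data class TargetInfo(val appId: String, val featureInfo: Map<String, String>)
data class TargetContent(val imagePng: ByteArray?, val audioPcm: ByteArray?)

interface ShortRangeLink {          // e.g. NFC or UWB (first communication mode)
    fun send(info: TargetInfo)
}

interface LongRangeLink {           // e.g. Bluetooth or WiFi (second communication mode)
    fun forward(info: TargetInfo)
    fun push(content: TargetContent)
}

class InformationSendingDevice(private val toGlasses: ShortRangeLink) {
    // Called when the touch/approach operation on the smart glasses is detected.
    fun onTouchGlasses(info: TargetInfo) = toGlasses.send(info)
}

class SmartGlasses(private val toResponder: LongRangeLink) {
    // The glasses only relay the target information; they do not interpret it.
    fun onTargetInfo(info: TargetInfo) = toResponder.forward(info)
    fun play(content: TargetContent) { /* hand off to light engine / speaker */ }
}

class InformationRespondingDevice {
    fun onTargetInfo(info: TargetInfo): TargetContent {
        // Start the application identified by appId, run the task described by
        // featureInfo, and convert its task information into playable content.
        return TargetContent(imagePng = null, audioPcm = null)
    }
}
```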
FIG. 2 shows a schematic diagram of an electronic device suitable for use in implementing exemplary embodiments of the present disclosure. The information transmission apparatus and/or the information response apparatus of the exemplary embodiments of the present disclosure may be configured as in fig. 2. It should be noted that the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the application scope of the embodiment of the present disclosure.
The electronic device of the present disclosure includes at least a processor and a memory for storing one or more programs, which when executed by the processor, cause the processor to implement the information processing method of the exemplary embodiments of the present disclosure.
Specifically, as shown in fig. 2, the electronic device 200 may include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
It is to be understood that the illustrated structure of the embodiments of the present disclosure does not constitute a specific limitation to the electronic device 200. In other embodiments of the present disclosure, electronic device 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural Network Processor (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors. Additionally, a memory may be provided in processor 210 for storing instructions and data.
The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device 200.
The wireless communication module 260 may provide solutions for wireless communication applied to the electronic device 200, including Wireless Local Area Network (WLAN), such as Wireless Fidelity (Wi-Fi) networks, Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Ultra Wide Band (UWB), Infrared (IR), and the like.
The electronic device 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, coupled to a display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The electronic device 200 may implement a shooting function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. In some embodiments, the electronic device 200 may include 1 or N camera modules 291, where N is a positive integer greater than 1, and if the electronic device 200 includes N cameras, one of the N cameras is the main camera.
Internal memory 221 may be used to store computer-executable program code, including instructions. The internal memory 221 may include a program storage area and a data storage area. The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 200.
The present disclosure also provides a computer-readable storage medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable storage medium may transmit, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The computer-readable storage medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software, or may be implemented by hardware, and the described units may also be disposed in a processor. Wherein the names of the elements do not in some way constitute a limitation on the elements themselves.
Fig. 3 schematically shows a flowchart of an information processing method applied to an information transmitting apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 3, the information processing method may include the steps of:
and S32, determining target information.
According to some embodiments of the present disclosure, the target information includes an instruction to open an application. It may be considered that, after the device acquires the target information, the application may be automatically started based on the instruction to open the application. For example, the target information includes an instruction to open a navigation application, and the device may automatically start the navigation application after acquiring the target information.
According to further embodiments of the present disclosure, the target information includes task information of a task currently running by the application. After the device acquires the target information, the application program can be started, and the task of the application program can be started based on the task information. Still taking the navigation application as an example, in case the target information contains destination information, the device may automatically initiate a navigation task to the target location.
First, the information transmission apparatus may determine task information of a task in which the application is currently running. Specifically, the application program may be an application program in a pre-built white list, and when the application program runs, the information sending device may monitor a current running state of the application program in real time, and obtain task information of a currently running task. The determined task may also be a task in a pre-constructed white list, which is not limited by the present disclosure.
In addition, the information transmission apparatus may start the task in response to a task setting operation of the user on the interface of the application program. Taking the navigation application as an example, the task setting operation may be an operation of setting a destination and triggering start of navigation.
Next, the information transmitting apparatus may extract feature information from the task information and generate target information based on the feature information. Specifically, the feature information includes at least one feature item and feature data of the feature item. Taking the navigation task as an example, the task information may include current time, destination information, information of the entire navigation route, information of the remaining time of navigation, etc., and the feature information may be only the destination information or the navigation route information. After the characteristic information is obtained, target information can be generated by combining information such as a starting instruction of the application program.
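As an illustration of how the target information might be assembled in step S32, the following Kotlin sketch extracts whitelisted feature items from the task information of a navigation task; the field names (appId, destination) and the whitelist contents are assumptions made for the example, not requirements of the disclosure.

```kotlin
// A sketch of S32, assuming a whitelisted navigation task; field names are illustrative.

data class TaskInfo(
    val appId: String,
    val taskId: String,
    val fields: Map<String, String>   // e.g. current time, destination, full route, remaining time
)

data class FeatureItem(val name: String, val data: String)

data class TargetInfo(
    val openAppInstruction: String,   // instruction to open the application
    val features: List<FeatureItem>   // feature information extracted from the task information
)

// Only whitelisted feature items are extracted from the full task information.
fun buildTargetInfo(task: TaskInfo, whitelist: Set<String> = setOf("destination")): TargetInfo {
    val features = task.fields
        .filterKeys { it in whitelist }
        .map { (name, data) -> FeatureItem(name, data) }
    return TargetInfo(openAppInstruction = task.appId, features = features)
}
```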
S34: in response to an operation of touching or approaching the smart glasses, sending the target information to the smart glasses, so that the smart glasses forward the target information to the information responding device and the information responding device responds to the target information.
After the target information is determined, the information sending equipment can respond to the operation of touching or approaching the intelligent glasses and send the target information to the intelligent glasses.
For example, the information sending device may determine the smart glasses in response to a touch operation of the user between the information sending device and the smart glasses. Specifically, through an NFC one-touch (tap) interaction, the information sending device can acquire the device information of the smart glasses and identify the smart glasses from that device information.
In the present disclosure, the result of the information sending device determining the smart glasses may be used as a trigger condition for sending the target information, that is, when the smart glasses are determined, the target information is sent to the smart glasses.
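A hedged Android sketch of this NFC one-touch trigger is given below: the sending device enables NFC reader mode, reads device information from an NDEF tag assumed to be exposed by the glasses, and treats that as the trigger condition for sending the target information. Whether the glasses present an NDEF tag, the record layout, and the sendTargetInfoTo helper are assumptions for illustration only.

```kotlin
import android.app.Activity
import android.nfc.NfcAdapter
import android.nfc.Tag
import android.nfc.tech.Ndef

// Sketch: discover the glasses via NFC reader mode and use that as the send trigger.
fun startGlassesDiscovery(activity: Activity, sendTargetInfoTo: (deviceInfo: String) -> Unit) {
    val adapter = NfcAdapter.getDefaultAdapter(activity) ?: return   // NFC not available
    val callback = NfcAdapter.ReaderCallback { tag: Tag ->
        val ndef = Ndef.get(tag)                                      // null if the tag is not NDEF
        if (ndef != null) {
            ndef.use {
                it.connect()
                // Assumption: the first NDEF record carries the glasses' device information.
                val deviceInfo = it.ndefMessage?.records?.firstOrNull()?.payload?.decodeToString()
                // Determining the glasses is the trigger condition for sending the target information.
                if (deviceInfo != null) sendTargetInfoTo(deviceInfo)
            }
        }
    }
    adapter.enableReaderMode(
        activity, callback,
        NfcAdapter.FLAG_READER_NFC_A or NfcAdapter.FLAG_READER_NFC_B,
        null
    )
}
```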
The smart glasses may then forward the target information to the information-responding device for response by the information-responding device to the target information.
In some embodiments of the present disclosure, when the communication mode between the information sending device and the smart glasses is recorded as the first communication mode, the smart glasses may further forward the target information to the information responding device through the first communication mode.
In other embodiments of the present disclosure, the smart glasses may further forward the target information to the information response device through the second communication manner.
Specifically, the first communication mode and the second communication mode are different communication modes. For example, the first communication mode may be NFC or UWB, and the second communication mode may be Bluetooth or WiFi. In some embodiments, the transmission distance of the first communication mode is smaller than that of the second communication mode.
It should be noted that the smart glasses and the information responding device may be connected in advance, or the connection may be established after the smart glasses receive the target information, which is not limited in this disclosure.
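The relay step on the glasses can be pictured as below; the FirstModeReceiver and SecondModeSender interfaces are hypothetical placeholders standing in for whatever NFC/UWB and Bluetooth/WiFi stacks the glasses actually use.

```kotlin
// A minimal sketch of the relay on the smart glasses: target information arriving over
// the first (short-range) mode is forwarded unchanged over the second (longer-range) mode.

interface FirstModeReceiver {                  // e.g. NFC or UWB
    fun onReceive(handler: (payload: ByteArray) -> Unit)
}

interface SecondModeSender {                   // e.g. Bluetooth or WiFi
    fun isConnected(): Boolean
    fun connect()                              // connection may also be established in advance
    fun send(payload: ByteArray)
}

class GlassesRelay(
    private val shortRange: FirstModeReceiver,
    private val longRange: SecondModeSender
) {
    fun start() = shortRange.onReceive { payload ->
        // Connect lazily if the glasses were not paired with the responding device in advance.
        if (!longRange.isConnected()) longRange.connect()
        longRange.send(payload)                // forward the target information unchanged
    }
}
```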
Fig. 4 schematically shows a flowchart of an information processing method applied to an information responding apparatus of an exemplary embodiment of the present disclosure. Referring to fig. 4, the information processing method may include the steps of:
S42: receiving target information; the target information is sent to the smart glasses by the information sending device in response to an operation of touching or approaching the smart glasses, and is forwarded to the information responding device by the smart glasses.
Referring to the descriptions of step S32 and step S34, the process of receiving the target information by the information response device will not be described again here.
S44: responding to the target information.
According to some embodiments of the present disclosure, the target information includes an instruction to open an application. In this case, the information responding apparatus may start an application corresponding to the target information. For example, the information responding device may automatically launch the navigation application upon receiving the target information.
According to further embodiments of the present disclosure, the target information includes characteristic information of a task of the application. In this case, the information responding apparatus may extract the feature information from the target information, determine the task of the application based on the feature information, and execute the task.
Still taking navigation as an example, the characteristic information may be destination information input by the user. After the information response device acquires the destination information and starts the navigation application program, the information response device can determine a navigation task going to the destination based on the destination information and automatically execute the navigation task.
In addition, the information response device of the present disclosure may also transmit the relevant information of the task to the smart glasses, whereby the user may view the task and/or control the task through the smart glasses.
First, the information responding device can generate target content based on the feature information. The target content is content to be played on the smart glasses, and may be media content including, but not limited to, images and audio.
Referring to fig. 5, the types of target content may include pictures, videos, text, symbols, audio, and the like. Pictures may include static images and dynamic images; a video may be understood as a series of consecutive images; text may be rendered in an image-based manner and may also include numbers; symbols may be symbols provided by the computer system or symbols drawn by the user.
Referring to fig. 6, the feature information includes at least one feature item and feature data of the feature item. Taking the navigation task as an example, the task information of one task may include the current time, destination information, information of the entire navigation route, the remaining navigation time, and the like. The generated target content may only need information about the current travel route and not the other items. In this case, the current travel route can be used as the feature item, and its specific data, for example the name of the current road and the remaining distance before the next turn, can be used as the feature data.
The information response device may determine a content configuration style corresponding to the feature item. Specifically, a mapping relationship between the feature item and the content configuration style may be pre-constructed, and the content configuration style corresponding to the feature item may be determined according to the pre-constructed mapping relationship. The content configuration style includes, but is not limited to, font size, font color, image size, image color, image style, content rendering position, content rendering transparency, and the like.
After the feature data of the feature items and the corresponding content configuration styles are determined, the feature data of the feature items and the content configuration styles corresponding to the feature items are combined to generate the target content.
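The following sketch shows one way such a combination could look, assuming a pre-built map from feature item names to content configuration styles; the style fields and the example entries are illustrative only.

```kotlin
// A sketch of generating target content from feature items, assuming a simple
// pre-built mapping from feature item to content configuration style.

data class FeatureItem(val name: String, val data: String)

data class ContentStyle(
    val fontSizeSp: Int,
    val fontColor: String,
    val position: String,      // e.g. "top-left" on the lens
    val transparency: Float
)

data class RenderedElement(val text: String, val style: ContentStyle)

// Pre-built mapping between feature items and content configuration styles.
val styleMap: Map<String, ContentStyle> = mapOf(
    "currentRoad" to ContentStyle(18, "#FFFFFF", "top-left", 0.8f),
    "nextTurn" to ContentStyle(24, "#00FF00", "center", 0.9f)
)

fun buildTargetContent(features: List<FeatureItem>): List<RenderedElement> =
    features.mapNotNull { item ->
        styleMap[item.name]?.let { style -> RenderedElement(item.data, style) }
    }
```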
Next, the information responding device may transmit the target content to the smart glasses so that the smart glasses may play the target content.
While the information responding device executes the task, it can update the target content in real time using the newly generated feature data so that the target content stays consistent with the progress of the task; in the navigation example, the feature data of the current travel route generally keeps changing.
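One possible shape of this real-time update, assuming a simple periodic refresh pushed over the second communication mode, is sketched below; the one-second default interval and the callback parameters are illustrative assumptions.

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.ScheduledFuture
import java.util.concurrent.TimeUnit

// Re-read the task's feature data on a fixed interval, rebuild the target content,
// and push it to the glasses; cancelling the returned future stops the updates.
fun <F, C> startContentUpdates(
    readFeatures: () -> F,          // current feature data of the running task
    buildContent: (F) -> C,         // combine feature data with configuration styles
    pushToGlasses: (C) -> Unit,     // second communication mode, e.g. Bluetooth
    periodMs: Long = 1_000L         // illustrative refresh interval
): ScheduledFuture<*> =
    Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(
        { pushToGlasses(buildContent(readFeatures())) },
        0, periodMs, TimeUnit.MILLISECONDS
    )
```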
Fig. 7 shows a block diagram of smart glasses according to an embodiment of the present disclosure.
Referring to fig. 7, taking image content as an example of the target content, after receiving the target content, the content receiving unit 71 can convert it into a signal that can be played by the light emitting unit 72 and forward the signal to the light emitting unit 72, which then transmits it to the image display unit 73, so that the user wearing the smart glasses can view the target content.
In addition, the content receiving unit 71 may also perform processes of filtering, denoising, and the like of the target content. The content receiving unit 71 may also be understood as a data processing unit of smart glasses. The light emitting unit 72 includes a light engine. The image display unit 73 includes lenses on smart glasses, and all or part of the lenses may serve as an interface for displaying target content.
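The division of labor among the three units can be sketched as follows; all classes here are hypothetical software stand-ins for what are, on real glasses, hardware blocks (data processing unit, light engine, lens).

```kotlin
// A hypothetical decomposition of the display path in Fig. 7; types are illustrative only.

class DriveSignal(val frame: ByteArray)

class ContentReceivingUnit(private val lightEngine: LightEmittingUnit) {
    fun onTargetContent(imageFrame: ByteArray) {
        val cleaned = denoise(imageFrame)                    // filtering / denoising step
        lightEngine.play(DriveSignal(cleaned))
    }
    private fun denoise(frame: ByteArray): ByteArray = frame // placeholder for real processing
}

class LightEmittingUnit(private val display: ImageDisplayUnit) {
    fun play(signal: DriveSignal) = display.show(signal)     // project onto the lens
}

class ImageDisplayUnit {
    fun show(signal: DriveSignal) { /* all or part of the lens renders the frame */ }
}
```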
Fig. 7 illustrates the structure of the target content playing only on one side, however, in other embodiments, both lenses of the smart glasses may display the target content, or different portions of the target content, which is not limited by the present disclosure.
Although not shown in fig. 7, the smart glasses of the present disclosure may further include an audio playing unit for playing audio content that may be contained in the target content.
In addition, the smart glasses may further include a task control unit configured to generate a task control instruction in response to a task control operation of the user and send the task control instruction to the information responding device, so as to control the task.
Specifically, the task control unit may be disposed on a temple of the smart glasses and may include a touch sensing module, so that a corresponding task control instruction can be generated in response to a sliding operation, a clicking operation, or the like of the user. Alternatively, one or more physical keys may be arranged on the temples, so that the corresponding task control instruction can be generated in response to the user pressing the keys.
In addition, the mapping relationship between the user's operations and the task controls may be configured in advance on the information responding device, for example which operation corresponds to pausing the task and which operation corresponds to starting the task; the present disclosure does not limit this.
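A small sketch of such a pre-configured mapping is given below; the specific operations and their assignments to task controls are examples only and would be configurable in practice.

```kotlin
// A sketch of the temple-mounted task control unit: user operations are mapped to
// task control instructions via a pre-configured table, then sent to the responding device.

enum class UserOperation { SLIDE_FORWARD, SLIDE_BACKWARD, SINGLE_TAP, LONG_PRESS }
enum class TaskControl { START_TASK, PAUSE_TASK, RESUME_TASK, TERMINATE_TASK }

// Mapping configured in advance (example assignment only).
val controlMap: Map<UserOperation, TaskControl> = mapOf(
    UserOperation.SINGLE_TAP to TaskControl.PAUSE_TASK,
    UserOperation.SLIDE_FORWARD to TaskControl.RESUME_TASK,
    UserOperation.LONG_PRESS to TaskControl.TERMINATE_TASK
)

class TaskControlUnit(private val sendToResponder: (TaskControl) -> Unit) {
    fun onUserOperation(op: UserOperation) {
        controlMap[op]?.let(sendToResponder)   // unmapped operations are ignored
    }
}
```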
The information responding device can respond to the task control instruction sent by the smart glasses and control the task accordingly.
Besides controlling tasks through operations on the smart glasses, considering scenarios such as sports and fitness in which manual operation may be inconvenient for the user, a voice-based task control scheme may also be configured.
First, the information responding apparatus can acquire voice information. The voice information may be directly acquired by the information response device, or acquired by the smart glasses and sent to the information response device.
Next, the information responding apparatus may recognize the voice information, and determine a keyword related to task control. Specifically, the keywords may be configured in advance, and the disclosure does not limit this process.
Then, the information responding apparatus can control the task based on the keyword.
For example, when the user says "stop navigating", the navigation task of the information responding device is terminated. It can be understood that once the task is terminated, the target content on the smart glasses also disappears.
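The keyword-matching step can be sketched as follows, assuming the speech has already been converted to text by a recognizer; the keyword list shown is illustrative.

```kotlin
// A sketch of the voice control path: the recognized utterance is scanned for
// pre-configured keywords and mapped to a task control action.

enum class TaskControl { START_TASK, PAUSE_TASK, TERMINATE_TASK }

val keywordMap: Map<String, TaskControl> = mapOf(
    "start navigation" to TaskControl.START_TASK,
    "pause navigation" to TaskControl.PAUSE_TASK,
    "stop navigating" to TaskControl.TERMINATE_TASK
)

fun controlFromUtterance(recognizedText: String): TaskControl? =
    keywordMap.entries.firstOrNull { (keyword, _) ->
        recognizedText.contains(keyword, ignoreCase = true)
    }?.value

// Example: controlFromUtterance("please stop navigating") == TaskControl.TERMINATE_TASK
```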
An interactive process of the information processing scheme of the embodiment of the present disclosure will be explained with reference to fig. 8.
In step S802, the information transmitting apparatus can determine target information.
In step S804, in response to the user's operation, the information sending device and the smart glasses perform a touch (tap) operation.
In step S806, in response to the touch operation in step S804, the information sending device may transmit the target information to the smart glasses through a communication method such as NFC or UWB.
In step S808, the smart glasses may transmit the target information to the information response device through a communication means such as bluetooth or WiFi.
In step S810, the information responding apparatus starts an application corresponding to the target information and executes a corresponding task.
In step S812, the information responding apparatus may determine target content based on the executed task.
In step S814, the information response device may transmit the target content to the smart glasses.
In step S816, the smart glasses may play the target content.
In addition, the user can also control the task through operations on the smart glasses.
The following describes an information processing scheme of the present disclosure by taking a navigation scenario as an example.
Referring to fig. 9, after the user sets a destination on the cell phone 91 and initiates navigation, navigation information on the interface of the cell phone 91 may be presented.
The user may perform an NFC-based touch (tap) operation between the mobile phone 91 and the smart glasses 92, and the mobile phone 91 then transmits target information including the navigation information to the smart glasses 92.
The smart glasses 92 may forward the target information to the cell phone 93 through bluetooth or WiFi.
After the cell phone 93 receives the target information, it may start a navigation application and perform the same navigation task as the cell phone 91. As can be seen from the figure, the navigation interfaces of the cell phone 91 and the cell phone 93 can be the same.
In addition, the mobile phone 93 may transmit the target content to the smart glasses 92 by bluetooth or WiFi. Thus, the target content can be displayed on the lenses of the smart glasses 92.
In one aspect, although FIG. 9 illustrates only one lens for presentation of information, the target content may be presented on both lenses, or different portions of the target content may be presented separately.
On the other hand, as can be seen from fig. 9, the target content presented on the smart glasses 92 may not be all of the navigation information. The interface of the mobile phone 93 shows that the navigation information also includes at least the remaining navigation time, the next road, and other items, which the mobile phone 93 selectively removes when generating the target content. This takes into account that the glasses lens is smaller than the phone interface and that the wearer of the smart glasses also needs to see the real road, so in some strategies of the present disclosure not all of the navigation information is presented on the lens. In addition, the specific target content that is presented can be customized to meet the personalized requirements of different users.
As shown in fig. 9, information on the current road and the direction and distance of the next road may be displayed on the smart glasses 92. It is understood that as the user travels, the data of the target content may change, that is, the information displayed on the smart glasses 92 may also change.
It should be noted that when the mobile phone 91 causes the mobile phone 93, by means of the smart glasses 92, to perform the same navigation task as the mobile phone 91, the user holding the mobile phone 91 and the user holding the mobile phone 93 can each travel to the destination on their own; that is, during the specific navigation process to the destination, the mobile phone 91 and the mobile phone 93 are independent of each other and each executes its own navigation task.
The following describes the content processing scheme of the present disclosure by taking a motion scene as an example.
Referring to fig. 10, after the user makes a jogging setting on the cell phone 101, motion information on the cell phone 101 interface may be presented.
The user may perform an NFC-based touch (tap) operation between the mobile phone 101 and the smart glasses 102, and the mobile phone 101 then transmits target information including the motion information to the smart glasses 102.
The smart glasses 102 may forward the target information to the cell phone 103 by means of bluetooth or WiFi.
After the mobile phone 103 receives the target information, it can launch the corresponding fitness application and perform the same jogging task as the mobile phone 101. As can be seen from the figure, the motion interfaces of the mobile phone 101 and the mobile phone 103 may be the same.
In addition, the mobile phone 103 may transmit the target content to the smart glasses 102 through bluetooth or WiFi. Thus, the target content can be displayed on the lenses of the smart glasses 102.
Similarly, in one aspect, the target content may also be presented on both lenses, or different portions of the target content may be presented separately.
On the other hand, the target content presented on the smart glasses 102 may not be all of the motion information, but only the currently set jogging state (shown as the image of a running person) and the number of kilometers completed. The specific target content that is presented may, however, be customized by the user, which is not limited by this disclosure.
In addition, as the user continues to jog, the data of the target content may change, that is, the information displayed on the smart glasses 102 may also change. Referring to fig. 10, the displayed content may change, at least in terms of the number of kilometers completed.
In summary, in a multi-device operation scenario of the present disclosure, the target information sent by the information sending device can be transmitted to the information responding device through the smart glasses, and the information responding device can respond to the target information. With this interaction, the information responding device can execute the corresponding function without the user manually operating it, which reduces the operation cost of the information responding device. Because no manual operation on the information responding device is required, the possibility of misoperation is avoided, the time of manual operation is saved, and the user experience is greatly improved. Furthermore, the information sending device sends the target information to the smart glasses in response to the operation of touching or approaching the smart glasses, and the smart glasses then send the target information to the information responding device, which makes the operation more engaging for the user of the information sending device. In addition, given the rapid development of smart glasses, the solution of the present disclosure has broad application prospects.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, an information processing apparatus applied to an information sending device is also provided in the present exemplary embodiment. The information processing apparatus is configured to: in response to an operation of touching or approaching smart glasses, send target information to the smart glasses, so that the smart glasses forward the target information to an information responding device and the information responding device responds to the target information.
According to an exemplary embodiment of the present disclosure, the information processing apparatus is configured to transmit the target information to the smart glasses through a first communication mode. The smart glasses forward the target information to the information responding device through a second communication mode.
According to an exemplary embodiment of the present disclosure, the information processing apparatus is further configured to determine task information of a task on which the application is currently running; feature information is extracted from the task information, and target information is generated based on the feature information.
According to an exemplary embodiment of the present disclosure, the information processing apparatus is further configured to start the task in response to a task setting operation by the user on the interface of the application program.
Further, an information processing apparatus applied to an information response device is also provided in the present exemplary embodiment.
Fig. 11 schematically shows a block diagram of an information processing apparatus applied to an information responding device of an exemplary embodiment of the present disclosure. Referring to fig. 11, the information processing apparatus 11 applied to the information response device according to an exemplary embodiment of the present disclosure may include an information receiving module 111 and an information response module 113.
Specifically, the information receiving module 111 may be configured to receive target information, where the target information is sent to the smart glasses by the information sending device in response to an operation of touching or approaching the smart glasses, and is forwarded to the information responding device by the smart glasses; the information response module 113 may be configured to respond to the target information.
According to an exemplary embodiment of the present disclosure, the information transmitting apparatus transmits the target information to the smart glasses through the first communication manner in response to an operation of touching or approaching the smart glasses. In this case, the information receiving module 111 may be configured to receive the target information forwarded by the smart glasses through the second communication means.
According to an example embodiment of the present disclosure, the information response module 113 may be configured to start an application corresponding to the target information.
According to an exemplary embodiment of the present disclosure, the target information contains feature information of a task of the application program. In this case, the information response module 113 may be further configured to extract the feature information from the target information, determine the task of the application program based on the feature information, and execute the task.
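A minimal sketch of the two modules on the information responding device follows. How the application and task are actually looked up is left abstract, and the callback signatures are assumptions for illustration only; the disclosure requires only that the corresponding application is started and the task identified by the feature information is executed.

```kotlin
// Receiver-side sketch of the information receiving module and information response module.

class InformationReceivingModule(private val onTarget: (Map<String, String>) -> Unit) {
    // Called when the smart glasses forward target information over the second communication mode.
    fun onForwardedTargetInfo(features: Map<String, String>) = onTarget(features)
}

class InformationResponseModule(
    private val startApplication: (appId: String) -> Unit,
    private val executeTask: (taskId: String, features: Map<String, String>) -> Unit
) {
    fun respond(appId: String, taskId: String, features: Map<String, String>) {
        startApplication(appId)            // start the application corresponding to the target information
        executeTask(taskId, features)      // determine the task from the feature information and execute it
    }
}
```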
According to an exemplary embodiment of the present disclosure, referring to fig. 12, the information processing apparatus 12 may further include a content transmission module 121, compared to the information processing apparatus 11.
Specifically, the content transmission module 121 may be configured to generate target content based on the feature information, and send the target content to the intelligent glasses so that the intelligent glasses play the target content.
According to an exemplary embodiment of the present disclosure, the feature information includes at least one feature item and feature data of the feature item. In this case, when generating the target content, the content transmission module 121 may be configured to determine a content configuration style corresponding to each feature item, and generate the target content by combining the feature data of the feature item with the content configuration style corresponding to that feature item.
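By way of example, the sketch below renders target content from feature items under an assumed style table. The particular feature items, labels, and templates are illustrative assumptions and not part of the disclosure; only the pairing of each feature item with a configured style follows the description above.

```kotlin
// Sketch of target-content generation from feature items and their configured styles.

data class ContentStyle(val label: String, val template: String)

// Assumed mapping from feature item to its content configuration style.
val styleTable = mapOf(
    "songName" to ContentStyle("Song", "%s"),
    "singer" to ContentStyle("Artist", "by %s"),
    "progress" to ContentStyle("Progress", "%s played")
)

// Combine the feature data of each feature item with the style configured for that item.
fun generateTargetContent(features: Map<String, String>): String =
    features.entries
        .mapNotNull { (item, data) -> styleTable[item]?.let { "${it.label}: ${it.template.format(data)}" } }
        .joinToString("\n")
```

For instance, feature data for songName, singer, and progress would each be rendered on its own line with the label and template configured for that item.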
According to an exemplary embodiment of the present disclosure, referring to fig. 13, the information processing apparatus 13 may further include a task control module 131, compared to the information processing apparatus 12.
Specifically, the task control module 131 may be configured to update the target content in real time with the feature data generated during task execution, so that the target content stays consistent with the execution progress of the task.
According to an exemplary embodiment of the present disclosure, the task control module 131 may be further configured to control the task in response to a task control instruction sent by the smart glasses; wherein the task control instruction is generated based on a control operation of the user on the smart glasses.
According to an exemplary embodiment of the present disclosure, the task control module 131 may be further configured to acquire voice information, recognize the voice information to determine keywords related to task control, and control the task based on the keywords.
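The sketch below ties these three control paths together. The instruction strings and the keyword list are assumptions, and speech recognition itself is treated as an external input; only the overall control flow mirrors the description above.

```kotlin
// Task-control sketch: refresh the target content as the task progresses, act on control
// instructions forwarded by the glasses, and act on keywords recognised in voice input.

class TaskControlModule(
    private val pushContentToGlasses: (String) -> Unit,
    private val applyToTask: (String) -> Unit
) {
    // Update the target content with the feature data generated while the task runs.
    fun onTaskProgress(renderedContent: String) = pushContentToGlasses(renderedContent)

    // Task control instruction produced by a user control operation on the smart glasses.
    fun onControlInstruction(instruction: String) = applyToTask(instruction)

    // Voice path: recognition is out of scope here; only the keyword matching is sketched.
    fun onRecognizedVoiceText(text: String) {
        val keywords = listOf("pause", "play", "next", "previous")   // assumed task-control keywords
        keywords.firstOrNull { text.contains(it, ignoreCase = true) }?.let(applyToTask)
    }
}
```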
Since each functional module of the information processing apparatus according to the embodiment of the present disclosure is the same as that in the embodiment of the method described above, it is not described herein again.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (14)

1. An information processing method applied to an information sending device, comprising:
determining a task interface and task information of a task currently run by an application program, extracting characteristic information from the task information, generating target information based on the characteristic information, and, in response to an operation of the information sending device touching or approaching intelligent glasses, sending the target information to the intelligent glasses through a first communication mode, so that the intelligent glasses forward the target information to an information response device through a second communication mode and the information response device executes the task of the application program, wherein the transmission distance of the first communication mode is smaller than that of the second communication mode;
wherein, when the information response device executes the task of the application program, the information response device displays the task interface of the task currently run by the application program on the information sending device.
2. The information processing method according to claim 1, characterized by further comprising:
starting the task in response to a task setting operation of a user on an interface of the application program.
3. An information processing method applied to an information response device, comprising:
receiving target information, wherein the target information is generated based on characteristic information in task information of a task currently run by an application program of an information sending device, is sent by the information sending device to intelligent glasses through a first communication mode in response to an operation of the information sending device touching or approaching the intelligent glasses, and is forwarded by the intelligent glasses to the information response device through a second communication mode, and the transmission distance of the first communication mode is smaller than that of the second communication mode;
extracting the characteristic information from the target information, determining the task of the application program based on the characteristic information, and executing the task;
wherein, when the information response device executes the task of the application program, the information response device displays a task interface of the task currently run by the application program on the information sending device.
4. The information processing method according to claim 3, characterized by further comprising:
generating target content based on the characteristic information;
and sending the target content to the intelligent glasses so that the intelligent glasses can play the target content.
5. The information processing method according to claim 4, wherein the characteristic information contains at least one characteristic item and characteristic data of the characteristic item; wherein generating the target content based on the characteristic information comprises:
determining a content configuration style corresponding to the characteristic item;
and generating the target content by combining the characteristic data of the characteristic item and the content configuration style corresponding to the characteristic item.
6. The information processing method according to claim 5, characterized by further comprising:
and updating the target content in real time by using the generated characteristic data in the task execution process so as to enable the target content to be consistent with the task execution process.
7. The information processing method according to claim 6, characterized by further comprising:
responding to a task control instruction sent by the intelligent glasses, and controlling the task;
wherein the task control instruction is generated based on a control operation of a user for the smart glasses.
8. The information processing method according to claim 6, characterized by further comprising:
acquiring voice information;
recognizing the voice information and determining keywords related to task control;
controlling the task based on the keywords.
9. An information processing apparatus applied to an information sending device, characterized in that the information processing apparatus is configured to: determine a task interface and task information of a task currently run by an application program, extract characteristic information from the task information, generate target information based on the characteristic information, and, in response to an operation of the information sending device touching or approaching intelligent glasses, send the target information to the intelligent glasses through a first communication mode, so that the intelligent glasses forward the target information to an information response device through a second communication mode and the information response device executes the task of the application program, wherein the transmission distance of the first communication mode is smaller than that of the second communication mode;
wherein, when the information response device executes the task of the application program, the information response device displays the task interface of the task currently run by the application program on the information sending device.
10. An information processing apparatus applied to an information response device, comprising:
the information receiving module is used for receiving target information, wherein the target information is generated based on characteristic information in task information of a task currently run by an application program of an information sending device, is sent by the information sending device to intelligent glasses through a first communication mode in response to an operation of the information sending device touching or approaching the intelligent glasses, and is forwarded by the intelligent glasses to the information response device through a second communication mode, and the transmission distance of the first communication mode is smaller than that of the second communication mode;
the information response module is used for extracting the characteristic information from the target information, determining a task of the application program based on the characteristic information and executing the task;
wherein, when the information response device executes the task of the application program, the information response device displays a task interface of the task currently run by the application program on the information sending device.
11. An information processing system, comprising:
the information sending device is used for determining a task interface and task information of a task currently run by an application program, extracting characteristic information from the task information, generating target information based on the characteristic information, and, in response to an operation of the information sending device touching or approaching the intelligent glasses, sending the target information to the intelligent glasses through a first communication mode;
the intelligent glasses are used for forwarding the target information to an information response device through a second communication mode, and the transmission distance of the first communication mode is smaller than that of the second communication mode;
the information response device is used for receiving the target information, extracting the characteristic information from the target information, determining the task of the application program based on the characteristic information and executing the task;
wherein, when the information response device executes the task of the application program, the information response device displays the task interface of the task currently run by the application program on the information sending device.
12. The information processing system of claim 11, wherein the information response device is further configured to generate target content based on the characteristic information, and send the target content to the intelligent glasses;
the intelligent glasses are also used for playing the target content.
13. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the information processing method according to any one of claims 1 to 8.
14. An electronic device, comprising:
a processor;
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the information processing method of any one of claims 1 to 8.

Priority Applications (2)

CN202110800979.0A (priority date 2021-07-15, filing date 2021-07-15): Information processing method, device, system, storage medium and electronic equipment
PCT/CN2022/088483 (priority date 2021-07-15, filing date 2022-04-22): Information processing method, apparatus, and system, storage medium, and electronic device

Publications (2)

CN113556423A (published 2021-10-26)
CN113556423B (published 2023-04-07)

Family

ID=78103259

Also Published As

WO2023284355A1 (published 2023-01-19)
WO2023284355A9 (published 2023-03-16)
CN113556423A (published 2021-10-26)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant