CN113515192A - Information processing method and apparatus for a wearable device, and wearable device

Info

Publication number: CN113515192A
Application number: CN202110529006.8A
Authority: CN (China)
Prior art keywords: electronic device, virtual operation interface
Other languages: Chinese (zh)
Inventor: 周梁
Current assignee: Shining Reality Wuxi Technology Co Ltd
Original assignee: Shining Reality Wuxi Technology Co Ltd
Application filed by: Shining Reality Wuxi Technology Co Ltd
Legal status: Pending (Google has not performed a legal analysis; the listed status, dates, and assignees are assumptions, not legal conclusions, and may be inaccurate.)

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 — Interaction using icons
    • G06F 3/0484 — Interaction for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 — Drag-and-drop
    • G06F 3/0487 — Interaction using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/16 — Sound input; sound output
    • G06F 3/167 — Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an information processing method and apparatus for a wearable device, and a wearable device. The method includes: in response to determining that the wearable device has established a communication connection with at least two electronic devices, displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in space, where the target position is determined based on the physical position of the electronic device in the space; and, for an electronic device of the at least two electronic devices, in response to receiving a user operation on the virtual operation interface of the electronic device, generating a control instruction for controlling the electronic device so that the electronic device interacts with at least one other electronic device of the at least two electronic devices. The method enables the user to quickly and accurately determine the positions of the electronic devices in space and to control interaction among the electronic devices through the wearable device, which improves the efficiency with which the user operates the electronic devices and makes the operation intuitive and visible.

Description

Information processing method and apparatus for a wearable device, and wearable device
Technical Field
The present application relates to the field of wearable device technologies, and in particular, to an information processing method and apparatus for a wearable device, and a wearable device.
Background
With the development of technology, more and more electronic devices appear in people's daily life and work and have become an indispensable part of it. As a result, a single user typically owns multiple terminal electronic devices.
In the related art, a user who owns multiple terminal electronic devices operates each device directly in order to achieve interaction between the different devices. For example, a user may operate a personal computer to send a file to a mobile phone, and then view the file on the mobile phone.
Disclosure of Invention
The embodiment of the application provides an information processing method and device for wearable equipment and the wearable equipment.
In a first aspect, an embodiment of the present application provides an information processing method for a wearable device, the method including: in response to determining that the wearable device has established a communication connection with at least two electronic devices, displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in space, where the target position is determined based on the physical position of the electronic device in the space; and, for an electronic device of the at least two electronic devices, in response to receiving a user operation on the virtual operation interface of the electronic device, generating a control instruction for controlling the electronic device so that the electronic device interacts with at least one other electronic device of the at least two electronic devices.
In a second aspect, an embodiment of the present application provides an information processing apparatus for a wearable device, including: a display module configured to display, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in space in response to determining that the wearable device has established a communication connection with the at least two electronic devices, where the target position is determined based on the physical position of the electronic device in the space; and a processing module configured to, for an electronic device of the at least two electronic devices, in response to receiving a user operation on the virtual operation interface of the electronic device, generate a control instruction for controlling the electronic device so that the electronic device interacts with at least one other electronic device of the at least two electronic devices.
In a third aspect, embodiments of the present application provide a wearable device, which includes a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the steps of the above method.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the above method.
In a fifth aspect, the present application provides a computer program product including a computer program which, when executed by a processor, implements the steps of the above method.
According to the information processing method for the wearable device provided by the embodiments of the application, the wearable device can display the virtual operation interfaces of the electronic devices, and the positions at which the virtual operation interfaces are displayed in space correspond to the physical positions of the electronic devices, so that a user can quickly and accurately determine the positions of the electronic devices in space from the positional relationships of the virtual operation interfaces displayed by the wearable device. Moreover, data interaction among multiple different electronic devices can be achieved through operations on the operation interfaces, which simplifies the operation actions and improves the efficiency with which the user operates the electronic devices.
Drawings
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flowchart of an information processing method for a wearable device according to an embodiment of the present application;
FIG. 3 is a flowchart of one embodiment of step S10 of FIG. 2 according to an embodiment of the present application;
FIG. 4 is a flowchart of one embodiment of step S120 in FIG. 3 according to an embodiment of the present application;
FIG. 5 is a flowchart of one embodiment of step S20 of FIG. 2 according to an embodiment of the present application;
FIG. 6 is a flowchart of one embodiment of step S230 in FIG. 5 according to an embodiment of the present application;
FIG. 7 is a diagram illustrating an application scenario in which a target object moves between two virtual operation interfaces in the present embodiment;
FIG. 8 is a flowchart illustrating another embodiment of step S20 in FIG. 2 according to an embodiment of the present application;
FIG. 9 is a flowchart illustrating another embodiment of step S10 in FIG. 2 according to an embodiment of the present application;
FIG. 10 is a block diagram of the structure of an information processing apparatus for a wearable device according to an embodiment of the present application.
Detailed Description
Various aspects and features of the present application are described herein with reference to the drawings.
It will be understood that various modifications may be made to the embodiments of the present application. Accordingly, the foregoing description should not be construed as limiting, but merely as exemplifications of embodiments. Those skilled in the art will envision other modifications within the scope and spirit of the application.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the application and, together with a general description of the application given above and the detailed description of the embodiments given below, serve to explain the principles of the application.
These and other characteristics of the present application will become apparent from the following description of preferred forms of embodiment, given as non-limiting examples, with reference to the attached drawings.
It is also to be understood that although the present application has been described with reference to some specific examples, those skilled in the art are able to ascertain many other equivalents to the practice of the present application.
The above and other aspects, features and advantages of the present application will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present application are described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the application, which can be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail, to avoid obscuring the application in unnecessary or redundant detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present application in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," etc., which may each refer to one or more of the same or different embodiments in accordance with the application.
Fig. 1 shows an exemplary system architecture to which an embodiment of an information processing method for a wearable device or an information processing apparatus for a wearable device of the present application may be applied.
As shown in fig. 1, the system architecture may include electronic devices 101, 102, 103, a wearable device 104. The electronic devices 101, 102, 103 and the wearable device 104 may be connected by a network 105, and different ones of the electronic devices 101, 102, 103 may also be connected by a network. Here, the network connection may include connections in a variety of ways, such as wire, wireless communication links, or fiber optic cables, to name a few. The electronic devices 101, 102, 103 and the wearable device 104 may interact with the network 105 to send or receive data information or the like.
The user may interact with the wearable device 104 over a network using the electronic devices 101, 102, 103 to receive or transmit data, etc. The electronic devices 101, 102, 103 may have installed thereon various communication client applications, such as a web browser application, a shopping-like application, a search-like application, an instant messaging tool, a mailbox client, social platform software, and the like. And the electronic devices 101, 102, 103 may also store various local files, images, etc.
The electronic devices 101, 102, 103 may be various electronic devices usable by users, including but not limited to smartphones, tablet computers, e-book readers, MP3 players (MPEG Audio Layer III), MP4 players (MPEG Audio Layer IV), laptop computers, desktop computers, and the like. The electronic devices 101, 102, and 103 may also be software; in that case they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The wearable device 104 may be an electronic device having the function of displaying a virtual image (e.g., a virtual operation interface) in space, including but not limited to AR smart glasses, VR smart glasses, and the like. In general, the wearable device 104 may include a microdisplay, an optical system that processes the images shown by the microdisplay, and the like. The optical system in the wearable device 104 may process the image displayed by the microdisplay and direct the processed image to the human eye, creating for the user the visual effect of seeing the image in space. An image that the user thus sees displayed in space is therefore referred to as a virtual image.
The wearable device 104 may also provide various services, for example supporting virtual display of the operation interfaces of the electronic devices 101, 102, and 103. Here, the wearable device 104 may itself have a computing function and directly provide these services. Alternatively, the wearable device 104 may obtain its computing function through a terminal device such as a mobile phone, in which case the wearable device and the terminal device together act as the wearable device 104 as a whole to provide the services. Thus, when it is determined that the wearable device has established a communication connection with at least two electronic devices, the wearable device 104 may analyze and process data such as the physical positions of the electronic devices in space, and send the processing result (e.g., the result of interaction between different electronic devices driven by a control instruction) to the head-mounted electronic device for display.
It should be noted that the information processing method for the wearable device provided in the embodiment of the present application is generally executed by the wearable device 104, and accordingly, the information processing apparatus for the wearable device is generally disposed in the wearable device 104. Here, the wearable device 104 may be the wearable device itself, or the wearable device may be an entirety of the wearable device and other electronic devices that provide computing functionality for the wearable device, without being limited thereto.
It should be understood that the number of electronic devices 101, 102, 103 and wearable device 104 in fig. 1 is merely illustrative. There may be any number of electronic devices 101, 102, 103 and wearable devices 104, as implementation needs dictate. For example, two wearable devices 104 may be included in fig. 1, and each wearable device 104 may display a virtual operation interface of each electronic device when each wearable device 104 establishes a communication connection with each electronic device, and a user may implement an operation on the electronic device through any one of the wearable devices 104.
Fig. 2 is a flowchart of an information processing method for a wearable device according to an embodiment of the present application. This embodiment provides an information processing method for a wearable device, which can be applied to the wearable device and similar electronic devices. The method includes the following steps:
in the case that it is determined that the wearable device establishes communication connection with at least two electronic devices, for any one of the electronic devices, a virtual operation interface of the electronic device may be displayed at a target position in the space. Wherein the target location may be used to characterize a physical location of the electronic device in space. Specifically, wearable devices such as smart glasses and smart helmets can establish communication connection with other electronic devices through their own communication devices, such as wired and/or wireless connection with mobile phones, computers, and the like, and perform data interaction with the connected electronic devices. In an instance in which it is determined that a communication connection is established with multiple electronic devices, the wearable device may display a virtual operation interface corresponding to each electronic device in the space at a target location that characterizes a physical location of the electronic device in the space. For example, if a first electronic device is at a first target location in the space, the wearable device may display a first virtual operation interface corresponding to the first electronic device at the first target location, and if a second electronic device is at a second target location in the space, the wearable device may display a second virtual operation interface corresponding to the second electronic device at the second target location. The plurality of virtual operation interfaces at different target positions may represent a positional relationship between the corresponding plurality of electronic devices, and the above-mentioned relative positional relationship between the first virtual operation interface and the second virtual operation interface may correspond to a relative positional relationship between the first electronic device and the second electronic device. In addition, each virtual operation interface displayed by the wearable device in the embodiment can be operated by the user, so that the corresponding electronic device can be controlled through the virtual operation interface.
Further, for any one of the at least two electronic devices, upon receiving a user operation on the virtual operation interface of the electronic device, the wearable device may generate a control instruction for controlling the electronic device, so that the electronic device can interact with at least one other electronic device of the at least two. Specifically, when the user operates an electronic device through the virtual operation interface displayed by the wearable device, a corresponding control instruction can be generated, and this control instruction can operate the electronic device so that interaction takes place between the electronic devices. For example, the user can control one electronic device to send a file to another electronic device, or to multiple electronic devices, through a virtual operation interface displayed by the wearable device, thereby achieving data interaction with other electronic devices through the wearable device and improving the efficiency of operations across multiple electronic devices.
The information processing method of the present embodiment will be described in more detail below with reference to the drawings. As shown in fig. 2, the information processing method for a wearable device according to the embodiment of the present application includes the following steps:
S10: in response to determining that the wearable device has established a communication connection with at least two electronic devices, displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in the space.
The wearable device may include smart glasses (e.g., AR glasses), a smart helmet, and other devices having a display function. The wearable device can be in communication connection with other electronic devices in a wireless or wired mode, for example, the smart glasses can be in communication connection with a mobile phone, a computer and the like.
In this embodiment, the execution body of the information processing method for the wearable device (e.g., the wearable device in fig. 1) may perform data interaction with one or more of the at least two electronic devices when it is determined that the wearable device has established a communication connection with the at least two electronic devices. For any one of the at least two electronic devices, the execution body may obtain data of the virtual operation interface from the electronic device and generate the virtual operation interface of the electronic device, so that the wearable device may display the generated virtual operation interface at a target position in the space. It should be noted that, for each electronic device, the target position at which its virtual operation interface is displayed may be determined based on the physical position of the electronic device in space. As an example, the at least two electronic devices may include a first electronic device and a second electronic device, where the first electronic device is located at physical position 1 of a space and the second electronic device at physical position 2 of the space; the wearable device may then display the virtual operation interface corresponding to the first electronic device at physical position 1, and the virtual operation interface corresponding to the second electronic device at physical position 2.
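As a non-limiting illustration only, the display logic of step S10 might be sketched in Python as follows; the `ElectronicDevice` type, the `fetch_interface_data` helper, and the `render` callback are hypothetical placeholders rather than part of this disclosure, and the simplest possible mapping (target position = physical position) is assumed.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class ElectronicDevice:
    device_id: str
    physical_position: Tuple[float, float, float]  # (x, y, z) in the wearable's frame

def fetch_interface_data(device: ElectronicDevice) -> Dict:
    # Stub: in practice the interface data would be pulled from the device
    # over the established communication connection.
    return {"title": device.device_id, "icons": []}

def display_virtual_interfaces(devices, render):
    """Step S10 sketch: once at least two devices are connected, show each
    device's virtual operation interface at a target position tied to the
    device's physical position in space."""
    if len(devices) < 2:
        return  # the method applies once at least two connections exist
    for device in devices:
        ui = fetch_interface_data(device)
        target_position = device.physical_position  # target = physical position
        render(device.device_id, ui, target_position)

# Usage with a trivial renderer:
phone = ElectronicDevice("phone", (0.2, 0.0, 0.5))
laptop = ElectronicDevice("laptop", (0.0, -0.1, 1.1))
display_virtual_interfaces([phone, laptop], render=lambda *args: print(args))
```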
According to the scheme disclosed in this embodiment, on the one hand, the physical position of the electronic device corresponding to a virtual operation interface can be determined from the virtual operation interface displayed by the wearable device, so that a user can quickly locate the corresponding electronic device from its interface; on the other hand, each electronic device can have its own virtual operation interface in the wearable device, and the relative positional relationships of the virtual operation interfaces correspond to the relative positional relationships between the electronic devices, which makes it convenient for the user to operate the corresponding electronic device through its interface. For example, if the relative positional relationship between the first and second electronic devices is "one in front of the other" from the user's point of view, then the first virtual operation interface of the first electronic device and the second virtual operation interface of the second electronic device are also displayed "one in front of the other" in the wearable device, so that the user can determine the relative positions of the electronic devices in the space from the relative positions of their virtual operation interfaces.
It is to be understood that the virtual operation interface may be an interactive interface through which the user operates the electronic device at the wearable device; by operating the virtual operation interface displayed at the wearable device, the user can control the electronic device corresponding to that interface. The specific display mode of a virtual operation interface may be preset in the wearable device; for example, it may be set according to the identity or type of the electronic device.
S20: for an electronic device of the at least two electronic devices, in response to receiving a user operation on the virtual operation interface of the electronic device, generating a control instruction for controlling the electronic device so that the electronic device interacts with at least one other electronic device of the at least two electronic devices.
In this embodiment, the user may operate the virtual operation interface displayed at the target position in the space in various ways, for example by inputting voice, text, and/or gestures. The wearable device can respond by generating a control instruction for controlling the electronic device corresponding to the interface, and the user's control of that device is then realized through the control instruction.
As an example, for the virtual operation interface of the first electronic device, the execution body may display the interface at target position 1 in the space. The user may, through a gesture, perform an operation such as double-clicking an APP icon on the virtual operation interface displayed in the space, and by recognizing the user's gesture the execution body can generate a control instruction for the first electronic device, such as an instruction to open the APP.
In one aspect, the control instruction may be an instruction generated by the wearable device for individually controlling an electronic device, including instructions that control the operation of the electronic device itself. For example, the generated control instruction may be one for controlling the first electronic device, such as an instruction controlling the first electronic device to connect to a network access server.
On the other hand, the control instruction may also be an instruction that controls the electronic device to interact with at least one other electronic device. For example, the first electronic device has a first virtual operation interface displayed in the space and the second electronic device has a second virtual operation interface; after the user performs, on the first virtual operation interface, an operation such as sending a file to the second electronic device, the wearable device may generate a corresponding control instruction based on the user's operation, thereby implementing interaction such as file transfer between the first and second electronic devices. Of course, if more electronic devices are connected to the wearable device, the wearable device can make them interact as well by responding to the user's operations on a single virtual operation interface.
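Purely as a sketch of the interaction just described, and assuming a hypothetical `ControlInstruction` structure (nothing here reflects an actual protocol of the disclosure), the generation of a file-transfer control instruction could look like:

```python
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    source_device: str   # device whose virtual interface was operated
    target_device: str   # device it should interact with
    action: str          # e.g. "send_file"
    payload: str         # e.g. a file identifier

def on_interface_operation(source_id: str, target_id: str, file_id: str) -> ControlInstruction:
    """Step S20 sketch: turn a user operation on device A's virtual interface
    into a control instruction that makes device A send a file to device B."""
    return ControlInstruction(source_id, target_id, "send_file", file_id)

# Usage:
instr = on_interface_operation("first_device", "second_device", "report.pdf")
```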
Therefore, on the one hand, the interaction among the plurality of electronic devices may be mediated by the wearable device: the different electronic devices each connect to the wearable device and interact through it, thereby realizing interaction among different electronic devices. On the other hand, after responding to the received user operation on a virtual operation interface, the wearable device generates a control instruction that can control the related electronic devices individually, so that the associated electronic devices can establish communication without the wearable device acting as an intermediary, realizing direct interaction between them.
The information processing method for the wearable device provided by the above embodiments of the application makes the virtual operation interfaces displayed by the wearable device correspond to the physical positions of the electronic devices, so that a user can quickly and accurately determine the relative positions of the electronic devices in space from the relative positional relationships of the multiple virtual operation interfaces. With the scheme disclosed in this embodiment, data interaction among multiple different electronic devices can be achieved through the user's operations on the virtual operation interfaces; the user does not need to operate and check the different electronic devices separately, as the same effect is achieved solely through the display of the wearable device, which simplifies the operation actions and improves the efficiency with which the user operates the electronic devices.
In some optional embodiments of the present application, the "displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in space" in step S10 may be implemented as follows, as shown in fig. 3, including the following steps:
S110: for an electronic device of the at least two electronic devices, performing image acquisition on the electronic device.
In this embodiment, the wearable device may capture images of the current usage scene through an image acquisition apparatus. The captured images may include the at least two electronic devices communicatively connected to the wearable device; the images can therefore reflect the physical position information of the electronic devices in the current usage scene, as well as the relative positional relationships between them.
For example, a camera may be installed on the wearable device, and after the user puts the wearable device on, the camera can be used to capture images of the at least two electronic devices. Of course, the wearable device may also use another external device to capture images of the electronic devices; the external device may then send the captured images to the wearable device.
Accurate information about the current usage scene can be obtained by capturing images of the electronic devices, providing a data basis for determining the exact target position at which each virtual operation interface is displayed.
S120: analyzing the acquired images and determining physical position information of the electronic device in the space.
In this embodiment, the acquired images may be analyzed using one or more preset image analysis algorithms, so as to determine the physical position information of the electronic devices in the space. For example, each electronic device may be identified in the image through an image recognition algorithm, and the depth information of each electronic device may be calculated from multiple images, so that the physical positions of the different electronic devices can be determined.
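For illustration, once a recognition algorithm (not specified by the disclosure) yields a device's pixel location and a depth estimate, the standard pinhole back-projection gives its position in the camera frame; the sketch below assumes hypothetical intrinsic parameters fx, fy, cx, cy.

```python
def pixel_to_position(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with an estimated depth Z into 3D camera
    coordinates using the standard pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Usage, assuming a (hypothetical) recognizer found a device at pixel
# (800, 400) with an estimated depth of 1.2 m:
position = pixel_to_position(800, 400, 1.2, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)
```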
S130: sorting the electronic devices based on their physical position information.
In this embodiment, the physical position information may take various forms. For example, the physical position of an electronic device may be expressed relative to a reference object in the image, relative to the shooting position of the image, or directly as the physical position of the electronic device in a world coordinate system. The execution body may sort the electronic devices by their physical position information. For example, the object closest to the shooting position may be selected as a reference object, and the electronic devices may be sorted from near to far using the physical position information expressed relative to that reference object. For another example, the physical position information may be physical coordinates in a world coordinate system; in this case, the execution body may sort the electronic devices according to the distances from their physical coordinates to the wearable device.
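A minimal sketch of the world-coordinate variant of this sorting, assuming hypothetical device identifiers and placing the wearable device at the origin:

```python
import math

def sort_devices_by_distance(positions, wearable=(0.0, 0.0, 0.0)):
    """Step S130 sketch: order devices from nearest to farthest from the
    wearable device. `positions` maps device id -> (x, y, z); equal
    distances keep insertion order because sorted() is stable."""
    return sorted(positions.items(), key=lambda item: math.dist(item[1], wearable))

# Usage:
order = sort_devices_by_distance({"phone": (0.2, 0.0, 0.5), "laptop": (0.0, -0.1, 1.1)})
# -> [("phone", ...), ("laptop", ...)]
```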
S140: determining a corresponding target position for each electronic device based on the sorting result, so that when the virtual operation interfaces are displayed at their corresponding target positions, they are arranged and displayed layer by layer.
In this embodiment, based on the sorting result for the electronic devices in the space obtained in step S130, the execution body may determine the target position corresponding to each electronic device in various ways, and may then display the virtual operation interface corresponding to each electronic device at the determined target position. It can be understood that the target position represents, to a certain extent, the physical position of the electronic device; the target positions corresponding to the electronic devices in the space are usually different, and the distances between the electronic devices and the wearable device are also different (the special case of equal distances is not excluded; electronic devices at equal distances can be ordered arbitrarily, e.g. at random, so as to produce a sorting result), so that the different electronic devices have a near-to-far positional relationship with the wearable device. For example, if the first electronic device is relatively close to the wearable device and the second electronic device relatively far away, the two have a hierarchical relationship with respect to the wearable device; when the wearable device displays the first and second virtual operation interfaces at their respective target positions (here, the respective physical positions of the first and second electronic devices), the two interfaces can be arranged and displayed layer by layer.
For another example, the execution body determines, from the sorting result, the electronic device closest to the wearable device and takes it as the first electronic device; the target position of the first electronic device is then determined directly as its physical position in the space. Taking that target position as the starting position, the target positions of the remaining electronic devices are determined in turn according to the sorting result. Thus, if the distance between each electronic device and the wearable device increases through the sorting result, the distance between each corresponding target position and the wearable device also increases, and the distances between adjacent target positions may be equal, so that the virtual operation interfaces of the electronic devices are arranged and displayed layer by layer. It is to be understood that the execution body may also arrange the virtual operation interfaces hierarchically in other ways, which is not limited here.
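Under the assumptions of this example (equal spacing between adjacent layers, a hypothetical starting depth), the assignment of layered target positions could be sketched as:

```python
def layered_target_positions(sorted_ids, start=0.5, spacing=0.3):
    """Step S140 sketch: place the nearest device's interface `start` metres
    in front of the wearer and each subsequent interface one layer
    (`spacing` metres) further away, so the interfaces display in depth
    order with equal distances between adjacent layers."""
    return {device_id: (0.0, 0.0, start + i * spacing)
            for i, device_id in enumerate(sorted_ids)}

# Usage, with the near-to-far ordering produced in step S130:
targets = layered_target_positions(["phone", "laptop"])
```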
As can be seen, in the scheme provided by this embodiment, an image of the current usage scene of the electronic devices is first obtained, so that the physical position of each electronic device can be accurately determined; the electronic devices are then sorted by the determined physical positions, so that the wearable device can order the virtual operation interfaces according to the sorting result of the devices, and the interfaces present a spatial hierarchy and are arranged in a more orderly fashion.
In addition, when displaying the virtual operation interfaces of the electronic devices, the wearable device can display them three-dimensionally, so that the electronic devices and their corresponding virtual operation interfaces exhibit spatial layering for the user, while the three-dimensional display effect is enhanced and the sense of depth increased.
In some optional implementations, the "analyzing the acquired image and determining the physical location information of the electronic device in space" in step S120 may be implemented as follows, as shown in fig. 4, including the following steps:
S1210: analyzing the acquired images to obtain scene information of the electronic devices;
S1220: determining relative distance information and orientation information between the electronic device and the wearable device in the space based on the scene information;
S1230: determining physical position information of the electronic device in the space based on the determined relative distance information and orientation information.
In this implementation, the current usage scene is captured when images of the electronic devices are acquired, and the corresponding scene information can be obtained when the acquired images are analyzed. For example, images may be acquired separately for each electronic device, or a single acquisition may cover multiple electronic devices. The scene information may include information about all electronic devices to be displayed on the wearable device in the usage scene, such as their total number, the number enabled, and their size and/or posture, as well as spatial information about the usage scene (such as the overall size of the physical space in the image) and/or environmental information (such as lighting conditions).
Based on the scene information, the execution body can accurately calculate the relative distance information, orientation information, and the like between each electronic device and the wearable device in the space, without large calculation deviations. The relative distance information represents the distance between an electronic device and the wearable device in the space, and the orientation information represents the direction of the electronic device relative to the wearable device, such as straight ahead, to the left, to the right, or behind; specific numerical values can of course also be used to improve precision. After obtaining the relative distance information and orientation information of an electronic device, the execution body can accurately calculate the physical position of the electronic device in the space from them.
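As an illustration only, if the orientation information is expressed numerically as azimuth and elevation angles (an assumption; the disclosure leaves the representation open), steps S1220-S1230 reduce to a spherical-to-Cartesian conversion:

```python
import math

def relative_to_position(distance, azimuth_deg, elevation_deg=0.0):
    """Steps S1220-S1230 sketch: convert relative distance and orientation
    (azimuth/elevation from the wearable device) into Cartesian coordinates.
    Azimuth 0 is straight ahead; positive azimuth is to the right."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)   # right
    y = distance * math.sin(el)                  # up
    z = distance * math.cos(el) * math.cos(az)   # forward
    return (x, y, z)

# Usage: a device 1.5 m away, 30 degrees to the right, at eye level:
pos = relative_to_position(1.5, 30.0)
```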
In this implementation, when determining the physical position of an electronic device in the space, the usage scene is analyzed first, and the relative distance information and orientation information between the electronic device and the wearable device are obtained accurately from the actual usage scene, so that the hierarchical relationship of each electronic device relative to the wearable device can be reflected accurately.
It will be appreciated that the wearable device may determine the physical position of an electronic device in a number of ways, and is not limited to the above. For example, on the one hand, the wearable device may determine the physical position of each electronic device in the space from the acquired images; on the other hand, while acquiring the operation-interface data sent by an electronic device, the wearable device may directly receive position information sent by that device, which may be position information obtained through the device's own positioning, so that the wearable device can determine the device's physical position in the space based on it.
In some optional embodiments of the present application, the "generating a control instruction for controlling the electronic device in response to receiving an operation of the virtual operation interface of the electronic device by the user, so that the electronic device interacts with at least another electronic device of the at least two electronic devices" in step S20 may be implemented as follows, as shown in fig. 5, including:
S210: for the virtual operation interface of the electronic device, generating a first operation instruction based on the user's operation on a target object in the virtual operation interface.
In this embodiment, the user may operate the virtual operation interface displayed by the wearable device in various ways, for example through gestures, voice, and the like. After receiving the user's operation, the execution body may generate a corresponding first operation instruction. As an example, if the user operates the virtual operation interface by voice, the generated first operation instruction may be a voice instruction. The execution body may send the first operation instruction to the electronic device corresponding to the virtual operation interface. The first operation instruction may include at least one of: a gesture instruction, a voice instruction, and a text instruction.
S220: generating a corresponding control instruction based on the first operation instruction.
The control instruction may be an instruction for controlling the electronic device. The execution body may generate the corresponding control instruction according to the first operation instruction; thus, when the first operation instruction changes, the control instruction changes accordingly. As an example, if the user double-clicks a first APP displayed in the virtual operation interface of the first electronic device, the correspondingly generated first operation instruction may be a double-click gesture instruction, from which an opening instruction for opening the first APP in the first electronic device, or the like, can be generated.
Different first operation instructions correspond to different control instructions. For example, among gesture instructions, a drag corresponds to a copy instruction among the control instructions, meaning that a copy operation is to be performed on an object displayed in the first electronic device, while a discard gesture corresponds to a deletion instruction, meaning that a displayed object such as a file is to be deleted from the first electronic device. Voice instructions, text instructions, and the like likewise vary to represent different control instructions: if the user says "copy", the corresponding control instruction is a copy operation; if the user says "delete", the corresponding control instruction is a delete operation; and so on, which is not repeated here.
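A table-driven sketch of this correspondence, using only the example pairings named above (the mapping itself is hypothetical, not a fixed scheme of the disclosure):

```python
# Hypothetical mapping from recognized first operation instructions to
# control instructions, mirroring the examples above (drag -> copy,
# discard -> delete, "copy"/"delete" by voice).
OPERATION_TO_CONTROL = {
    ("gesture", "drag"): "copy",
    ("gesture", "discard"): "delete",
    ("gesture", "double_click"): "open",
    ("voice", "copy"): "copy",
    ("voice", "delete"): "delete",
}

def to_control_instruction(kind: str, value: str) -> str:
    try:
        return OPERATION_TO_CONTROL[(kind, value)]
    except KeyError:
        raise ValueError(f"unsupported operation instruction: {kind}/{value}")

# Usage: to_control_instruction("voice", "copy") -> "copy"
```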
S230: executing the control instruction to control the electronic device to interact with at least one other electronic device of the at least two electronic devices.
The control instruction can not only control an electronic device itself but also control the interaction between any electronic device and other electronic devices. Therefore, by executing the control instruction, the execution body can control interaction among different electronic devices. For example, on the virtual operation interface of a first electronic device, a user issues a gesture instruction to drag a target object to a second electronic device; the execution body may generate a corresponding copy instruction (or cut instruction) based on the gesture instruction, so that the target object is copied (or cut) from the first electronic device to the second electronic device.
For another example, a user inputs a text instruction on the virtual operation interface of the first electronic device, where the text instruction may instruct the first electronic device to send target data to the second electronic device; the target data may be data acquired by the first electronic device from an external server. After acquiring the text instruction, the execution body generates a corresponding control instruction (e.g., a data-sending control instruction), so as to control the first electronic device to download the target data from the server and send it to the second electronic device, so that the second electronic device can receive the target data.
Therefore, the user can form operation instructions in convenient ways such as gestures, voice, and text on the virtual operation interface of an electronic device displayed by the wearable device, and interaction between two electronic devices is achieved rapidly. This not only facilitates user operation but also effectively improves the efficiency of data interaction among different electronic devices.
In some optional implementations of this embodiment, the step S230 of "executing the control instruction to control the electronic device to interact with at least another electronic device of the at least two electronic devices" may be implemented as follows, as shown in fig. 6, including:
S2310: executing the control instruction, where the control instruction may include movement trajectory information of the target object.
S2320: controlling the target object to move, along the trajectory indicated by the movement trajectory information, from the virtual operation interface of the electronic device to the virtual operation interface of at least one other electronic device interacting with the electronic device.
In this implementation, the target object may be an object displayed in the virtual operation interface corresponding to an electronic device; for example, it may be a file displayed in the virtual operation interface corresponding to the first electronic device. When the user, through a gesture or similar operation instruction, moves the target object from the virtual operation interface of one electronic device to that of another, the wearable device may generate a corresponding control instruction based on the gesture, and the control instruction may include the movement trajectory information of the target object. Therefore, when the execution body executes the control instruction, the target object can be moved from one electronic device to the other along the movement trajectory indicated by the movement trajectory information, so that the target object is then displayed in the virtual operation interface of the other electronic device. When executing the control instruction, an engine such as Unity may render images of the target object moving along the indicated trajectory, and the wearable device may be controlled to display these images, so that the user sees the target object follow the movement trajectory while operating on it.
As an example, for a first electronic device and a second electronic device, a wearable device such as the smart glasses shown in fig. 7 may display a first virtual operation interface 701 and a second virtual operation interface 702 at respective target positions in a space 700. Fig. 7 is a schematic diagram of an application scenario in this embodiment in which a target object moves between two virtual operation interfaces. For the first electronic device, in response to receiving a user operation that moves a target object A displayed on the first virtual operation interface 701 toward the second virtual operation interface 702 in the direction of the dotted arrow, a control instruction for controlling the first electronic device may be generated; the wearable device, executing the control instruction, may control the target object A to move from the first virtual operation interface 701 to the second virtual operation interface 702 along the direction of the dotted arrow, as shown in fig. 7.
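A minimal sketch of the interpolation behind such a rendered movement, assuming a straight-line trajectory between hypothetical start and end coordinates (the disclosure itself leaves the trajectory form open):

```python
def interpolate_trajectory(start, end, steps=30):
    """Step S2320 sketch: generate intermediate positions so the target
    object can be rendered moving from one virtual interface to the other.
    A straight-line path is assumed; recorded gesture waypoints could be
    substituted segment by segment."""
    return [tuple(s + (e - s) * i / steps for s, e in zip(start, end))
            for i in range(steps + 1)]

# Usage: frames for target object A moving between interfaces 701 and 702:
path = interpolate_trajectory((0.0, 0.0, 0.5), (0.4, 0.0, 0.8))
```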
In this implementation, the target object moves from the virtual operation interface of one electronic device to that of another along the movement trajectory indicated by the movement trajectory information, so that the wearable device can visually present the movement of the target object to the user, making the operation more intuitive and improving the user experience.
It can be understood that, during the process of moving along the movement trajectory from the virtual operation interface of one electronic device to that of another, the target object can be presented as a dynamic image. For example, the target object may be shown along the movement trajectory as an animation or a dynamic icon, so that its movement from one electronic device to another is conveyed visually.
In some optional embodiments of the application, for any one of the at least two electronic devices, the execution body may generate a second operation instruction upon receiving a user operation on the virtual operation interface of the electronic device, and the displayed position and/or size of the virtual operation interface of the electronic device in the space can be updated based on the second operation instruction.
In this embodiment, the position and/or size at which the virtual operation interface of each electronic device is displayed in the space may be adjusted. When the user performs an operation to adjust the position and/or size of a virtual operation interface, a second operation instruction may be generated. After the wearable device obtains the second operation instruction, it may generate a corresponding control instruction based on it. For example, if the user makes a movement or rotation gesture with respect to the first virtual operation interface, the wearable device generates a corresponding control instruction based on that gesture and changes the position at which the first virtual operation interface is displayed in the space accordingly. The user can likewise enlarge or reduce the displayed size of the first virtual operation interface, which gives greater flexibility in operating the interface, extends the ways in which the user can manipulate the operation interface, and improves the display effect of the wearable device.
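For illustration, such an update can be modelled as a small transform applied to the interface's display state; the `InterfaceTransform` type below is a hypothetical stand-in, not a structure defined by the disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class InterfaceTransform:
    position: Tuple[float, float, float]  # where the interface is displayed
    scale: float                          # 1.0 = original size

def apply_second_operation(t: InterfaceTransform,
                           move=(0.0, 0.0, 0.0),
                           scale_factor=1.0) -> InterfaceTransform:
    """Update the displayed position and/or size of a virtual operation
    interface in response to a second operation instruction."""
    new_position = tuple(p + d for p, d in zip(t.position, move))
    return InterfaceTransform(new_position, t.scale * scale_factor)
```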
In some optional embodiments of the present application, when operating the virtual operation interface of at least one electronic device displayed in the space, the user may act on the virtual operation interface itself, for example by dragging, compressing, or stretching it, so the interface can be operated flexibly. In this embodiment, the user may restore the virtual operation interface of each electronic device to its corresponding target position by issuing a restore instruction for the virtual operation interfaces. Specifically, upon receiving the restore instruction, the execution subject may control the virtual operation interface of each electronic device to return to its original position. Here, the original position may be the target position the execution subject determined for each electronic device, for example the position at which the interfaces were originally displayed in sequence by hierarchy, which is a target position in the space associated with the physical position of the electronic device. With this restore operation, when the display positions of the virtual operation interfaces have become disordered, the user can return them to their original display positions and thereby determine the physical position of each electronic device in the space from where its interface is displayed. It will be appreciated that the restore instruction may also restore the initial size of each virtual operation interface.
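A hedged sketch of the restore step, assuming the original target pose of each interface was recorded at layout time (the dictionary layout is invented for illustration):

```python
def restore_interfaces(current: dict, initial: dict) -> dict:
    """On a restore instruction, snap every interface back to the target pose
    and size recorded when it was first displayed."""
    for device_id, pose in initial.items():
        current[device_id] = dict(pose)   # overwrite position and scale
    return current

layout = {"phone": {"x": 0.4, "y": 0.1, "scale": 1.8}}   # after user edits
initial = {"phone": {"x": 0.0, "y": 0.0, "scale": 1.0}}  # original targets
print(restore_interfaces(layout, initial))
```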
In some optional embodiments of the present application, the step 20 of "generating a control instruction for controlling the electronic device so that the electronic device interacts with at least another electronic device of the at least two electronic devices" may be implemented through the following steps, as shown in fig. 8:
S240, generating a movement instruction for a target object in the virtual operation interface of the electronic device.
In this embodiment, the control instruction may include a movement instruction for moving an object displayed in the virtual operation interface. Specifically, the execution subject may generate a movement instruction for moving a target object displayed in the virtual operation interface of the electronic device. The movement instruction may include start position information and end position information: the position indicated by the start position information is located on the virtual operation interface of the electronic device where the target object initially resides, and the position indicated by the end position information is located on the virtual operation interface of another electronic device interacting with it. The target object can be any object displayed in the virtual operation interface, for example a file, data, an application program, a picture, or a video. When the user performs an operation such as dragging on the target object, the execution subject may generate the movement instruction for it.
S250, executing the movement instruction, and acquiring related information of the target object from the electronic device based on the start position information.
In this embodiment, the execution subject may execute the movement instruction in various ways, so that the target object may be deleted, copied, and so on. For example, the execution subject may send the movement instruction to the electronic device where the target object is located, so that the device can respond, locate the target object based on the start position information, and then delete or copy it.
Then, the execution subject may send the target object and/or its related information to the electronic device interacting with the one where the target object initially resides. Alternatively, the electronic device where the target object initially resides may itself send the target object and/or related information to the electronic device with which it interacts; no unique limitation is imposed here. The related information of the target object may include its name, icon, type, and the like.
S260, storing/installing the target object in the electronic device interacting with the electronic device based on the acquired information, and displaying an icon of the target object at the position indicated by the end position information.
In this embodiment, the execution subject may control the interacting electronic device to store/install the target object, for example by sending it the movement instruction, and to display the stored/installed target object at the position indicated by the end position information.
It is understood that the target object may be an image, a file, or the like stored locally on the first electronic device. When executing the movement instruction, the execution subject may copy or cut the target object from the first electronic device based on the start position information, and the second electronic device interacting with the first may store the image, file, or other object locally and display it at the position indicated by the end position information. The target object may also be an installed object such as an application program. In that case, when executing the movement instruction, the execution subject may determine information such as the application's name from the first electronic device based on the start position information, so that the second electronic device can install the application according to the determined information and display the installed application's icon at the position indicated by the end position information. After determining such information from the first electronic device, the execution subject may or may not delete the application from the first electronic device; this is not specifically limited. Here, the first electronic device is the electronic device where the target object initially resides, and the second electronic device is the electronic device interacting with it.
It can also be understood that, when a target object moves from the virtual operation interface of one electronic device to that of another electronic device interacting with it, the wearable device may display the moving process from the start position indicated by the start position information to the end position indicated by the end position information. Alternatively, the wearable device may omit the moving process and only show the target object being selected at the start position and then appearing at the end position.
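An end-to-end sketch of steps S240 through S260 under stated assumptions: the `Device` objects below are in-memory stand-ins for the real electronic devices, and the copy/cut flag and the dictionary keyed by position are illustrative choices, not the patent's data model.

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    objects: dict = field(default_factory=dict)   # position -> object info

def generate_move_instruction(start_pos, end_pos):           # S240
    return {"start": start_pos, "end": end_pos}

def execute_move(instr, source: Device, dest: Device, copy: bool = True):
    info = source.objects[instr["start"]]                    # S250: look up by start position
    if not copy:
        del source.objects[instr["start"]]                   # cut rather than copy
    dest.objects[instr["end"]] = info                        # S260: store/install
    print(f"icon '{info['name']}' shown at {instr['end']} on {dest.name}")

phone = Device("phone", {(0, 0): {"name": "photo.jpg", "type": "image"}})
laptop = Device("laptop")
execute_move(generate_move_instruction((0, 0), (2, 1)), phone, laptop)
```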
In some optional embodiments of the present application, the step 10 of displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in the space may be implemented through the following steps, as shown in fig. 9:
S150, for an electronic device of the at least two electronic devices, acquiring information to be displayed of the electronic device, and generating a virtual operation interface of the electronic device according to a preset rule;
S160, displaying the generated virtual operation interface at the target position corresponding to the electronic device.
In this embodiment, the information to be displayed may include information associated with a target object in the electronic device, such as the target object's name, storage location, and data amount. For any one of the at least two electronic devices, the execution subject may acquire the information to be displayed in that device's virtual operation interface, and then generate the operation interface according to a preset rule based on this information. The preset rule is a predetermined way of generating the operation interface, covering its appearance, specific content, and so on. For example, the information to be displayed may be grouped by attribute, such as dividing the applications to be displayed into office, entertainment, finance, and other groups and showing these groups in the virtual operation interface. Finally, the execution subject may display the generated virtual operation interface at the corresponding target position.
In this embodiment, the virtual operation interfaces of the electronic devices may all be generated with the same rule, so that the interfaces of different electronic devices look substantially the same. For example, if the virtual operation interfaces of a mobile phone and a computer are generated by the same rule, their appearances will be essentially identical. This effectively improves the efficiency with which the wearable device processes virtual operation interfaces, increases its running speed, avoids latency perceptible to the user, and at the same time keeps the displayed picture neat.
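A minimal sketch of one such shared preset rule — grouping the items to be displayed by an "attribute" field — so that every device's interface is generated the same way. The item fields and group names are illustrative assumptions:

```python
from collections import defaultdict

def generate_interface(items: list) -> dict:
    """Apply one shared preset rule: group item names by their attribute."""
    groups = defaultdict(list)
    for item in items:
        groups[item.get("attribute", "other")].append(item["name"])
    return dict(groups)

apps = [{"name": "Sheets", "attribute": "office"},
        {"name": "Chess",  "attribute": "entertainment"},
        {"name": "Ledger", "attribute": "finance"}]
# The same rule applied to a phone's and a computer's items yields
# interfaces with the same grouped layout.
print(generate_interface(apps))
```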
In some optional embodiments of the present application, the step 10 of "displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target position in space" may further be implemented by: acquiring information to be displayed and type information of the electronic equipment; determining a first identifier of the electronic equipment based on the acquired type information; generating a virtual operation interface with the determined first identifier according to a preset rule; and displaying the generated virtual operation interface at a target position corresponding to the electronic equipment. In this embodiment, the type information may include information such as a type, a model, and an external shape of the electronic device, and the execution subject may obtain the type information of the electronic device by analyzing the acquired image of the electronic device, or may obtain the type information of the electronic device by using the interaction data with the electronic device.
Then, the execution subject may determine the first identifier of the electronic device based on the acquired type information. The first identifier is used to identify the electronic device, and in this embodiment it may be associated with the device's virtual operation interface by being placed on that interface. For example, the first identifier may be a thumbnail of the electronic device or a piece of text introducing it. The wearable device may then display the generated virtual operation interface, including the first identifier, at the target position. By checking the first identifier, the user can readily learn the type, model, and other information of the electronic device at the physical position corresponding to the target position.
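A sketch of deriving a first identifier from type information and attaching it to the generated interface; the thumbnail lookup table and asset names are invented, since the patent leaves the identifier's form open (thumbnail or introductory text):

```python
THUMBNAILS = {"phone": "phone.png", "laptop": "laptop.png"}   # assumed asset names

def build_identified_interface(device_type: str, model: str, content: dict) -> dict:
    """Attach a first identifier (thumbnail plus caption) derived from the
    device's type information to the generated interface."""
    first_identifier = {
        "thumbnail": THUMBNAILS.get(device_type, "generic.png"),
        "caption": f"{device_type} ({model})",
    }
    return {"identifier": first_identifier, "content": content}

print(build_identified_interface("phone", "X-200", {"apps": ["Sheets"]}))
```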
An embodiment of the present application further provides an information processing apparatus for a wearable device. As shown in fig. 10, the apparatus includes a display module and a processing module. The display module is configured to display, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target location in the space in response to determining that the wearable device establishes a communication connection with the at least two electronic devices, wherein the target location is determined based on a physical location of the electronic device in the space. The processing module is configured to, for an electronic device of the at least two electronic devices, in response to receiving an operation of a user on a virtual operation interface of the electronic device, generate a control instruction for controlling the electronic device so that the electronic device interacts with at least another electronic device of the at least two electronic devices.
In some optional embodiments of the present application, the information processing apparatus further comprises an acquisition module configured to: for an electronic device of the at least two electronic devices, acquire an image of the electronic device; analyze the acquired image and determine physical position information of the electronic device in the space; sort the electronic devices based on their physical position information; and determine a corresponding target position for each electronic device based on the sorting result, so that when the virtual operation interfaces are displayed at their corresponding target positions they are arranged and displayed in sequence by hierarchy.
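A sketch of the layout step under stated assumptions: devices are sorted by their estimated distance from the wearer, and stacked target positions are assigned in that order so the interfaces display in sequence by hierarchy. The layer-spacing constant is an assumption:

```python
LAYER_SPACING = 0.25   # metres between successive interface layers (assumed)

def assign_target_positions(distances: dict) -> dict:
    """distances maps device id -> estimated distance from the wearable device;
    nearer devices get the front-most layers."""
    ordered = sorted(distances, key=distances.get)
    return {dev: (i + 1) * LAYER_SPACING for i, dev in enumerate(ordered)}

print(assign_target_positions({"tv": 3.0, "phone": 0.5, "laptop": 1.2}))
# {'phone': 0.25, 'laptop': 0.5, 'tv': 0.75}
```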
In some optional embodiments of the present application, the acquisition module is further configured to: analyze the acquired image to obtain scene information of the electronic device; determine relative distance information and orientation information between the electronic device and the wearable device in the space based on the scene information; and determine the physical position information of the electronic device in the space based on the determined relative distance information and orientation information.
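A hedged geometric sketch of the last step: recovering a device's position in space from the relative distance and bearing estimated from the scene information, combined with the wearable device's own pose. The 2-D simplification and angle conventions are assumptions:

```python
import math

def device_position(wearer_xy, wearer_heading_deg, distance, bearing_deg):
    """Bearing is measured relative to the wearer's heading; returns a world
    (x, y) position for the sighted device."""
    angle = math.radians(wearer_heading_deg + bearing_deg)
    return (wearer_xy[0] + distance * math.cos(angle),
            wearer_xy[1] + distance * math.sin(angle))

# Wearer at the origin facing +y (90 deg); device 1.5 m away, 30 deg to the right.
print(device_position((0.0, 0.0), 90.0, 1.5, -30.0))
```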
In some optional embodiments of the present application, the processing module is further configured to: for the virtual operation interface of the electronic device, generate a first operation instruction based on the user's operation on a target object in the virtual operation interface, wherein the first operation instruction includes at least one of the following: a gesture instruction, a voice instruction, and a text instruction; generate a corresponding control instruction based on the first operation instruction; and execute the control instruction to control the electronic device to interact with at least another electronic device of the at least two electronic devices.
In some optional embodiments of the present application, the processing module is further configured to: execute the control instruction, wherein the control instruction includes movement track information of the target object; and control the target object to move, along the track indicated by the movement track information, from the virtual operation interface of the electronic device to the virtual operation interface of at least another electronic device interacting with it.
In some optional embodiments of the present application, the processing module is further configured to: for an electronic device of the at least two electronic devices, generate a second operation instruction in response to receiving an operation of the user on the virtual operation interface of the electronic device; and update the size and/or displayed position of the virtual operation interface of the electronic device in the space based on the second operation instruction.
In some optional embodiments of the present application, the information processing apparatus further comprises a restoration module configured to restore the virtual operation interface of each electronic device to its corresponding target position in response to receiving a restore instruction for the virtual operation interfaces.
In some optional embodiments of the present application, the control instruction comprises a movement instruction for moving an object displayed in the virtual operation interface, and the processing module is further configured to: generate a movement instruction for a target object in the virtual operation interface of the electronic device, wherein the movement instruction includes start position information and end position information, the position indicated by the start position information being located in the virtual operation interface of the electronic device and the position indicated by the end position information being located in the virtual operation interface of another electronic device interacting with it; execute the movement instruction and acquire related information of the target object from the electronic device based on the start position information; and, based on the acquired information, store/install the target object in the electronic device interacting with the electronic device and display an icon of the target object at the position indicated by the end position information.
In some optional embodiments of the present application, the display module is further configured to: for an electronic device of the at least two electronic devices, acquire information to be displayed of the electronic device and generate a virtual operation interface of the electronic device according to a preset rule; and display the generated virtual operation interface at the target position corresponding to the electronic device.
In some optional embodiments of the present application, the display module is further configured to: for an electronic device of the at least two electronic devices, acquire type information of the electronic device; determine a first identifier of the electronic device based on the acquired type information; generate a virtual operation interface carrying the determined first identifier according to the preset rule; and display the generated virtual operation interface at the target position corresponding to the electronic device.
The modules recited in the apparatus shown in fig. 10 correspond to the respective steps of the method described in the flowcharts above. Accordingly, the operations and features described for the method apply equally to the apparatus and the modules it includes, and are not repeated here.
The embodiment of the application also provides a wearable device, which comprises a memory, a processor and a computer program stored on the memory, wherein the processor executes the computer program to realize the steps of the information processing method.
Embodiments of the present application also provide a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the information processing method as above.
Embodiments of the present application also provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the steps of the above information processing method are implemented.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (14)

1. An information processing method for a wearable device, comprising:
in response to determining that the wearable device establishes a communication connection with at least two electronic devices, displaying, for an electronic device of the at least two electronic devices, a virtual operating interface of the electronic device at a target location in space, wherein the target location is determined based on a physical location of the electronic device in space;
for an electronic device of the at least two electronic devices, in response to receiving an operation of a user on a virtual operation interface of the electronic device, generating a control instruction for controlling the electronic device, so that the electronic device interacts with at least another electronic device of the at least two electronic devices.
2. The method of claim 1, wherein the displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target location in space comprises:
for the electronic device of the at least two electronic devices, acquiring an image of the electronic device;

analyzing the acquired image and determining physical position information of the electronic device in the space;

sorting the electronic devices based on the physical position information of each electronic device;

and determining a corresponding target position for each electronic device based on the sorting result, so that the virtual operation interfaces of the electronic devices are arranged and displayed in sequence by hierarchy when displayed at the corresponding target positions.
3. The method of claim 2, wherein analyzing the acquired image to determine the physical location information of the electronic device in space comprises:
analyzing the acquired image to obtain scene information of the electronic device;

determining relative distance information and orientation information between the electronic device and the wearable device in the space based on the scene information;
based on the determined relative distance information and orientation information, physical location information of the electronic device in space is determined.
4. The method of claim 1, wherein the generating, in response to receiving an operation of a virtual operation interface of the electronic device by a user, a control instruction for controlling the electronic device to interact with at least another electronic device of the at least two electronic devices comprises:
for the virtual operation interface of the electronic device, generating a first operation instruction based on the user's operation on a target object in the virtual operation interface, wherein the first operation instruction comprises at least one of the following: a gesture instruction, a voice instruction, and a text instruction;

generating a corresponding control instruction based on the first operation instruction;

and executing the control instruction to control the electronic device to interact with at least another electronic device of the at least two electronic devices.
5. The method of claim 4, wherein the executing the control instructions to control the electronic device to interact with at least another electronic device of the at least two electronic devices comprises:
executing the control instruction, wherein the control instruction comprises the movement track information of the target object;
and controlling the target object to move, along the track indicated by the movement track information, from the virtual operation interface of the electronic device to the virtual operation interface of at least another electronic device interacting with the electronic device.
6. The method of claim 1, wherein the method further comprises:
for the electronic device of the at least two electronic devices, generating a second operation instruction in response to receiving an operation of a user on a virtual operation interface of the electronic device;

and updating the size and/or the displayed position of the virtual operation interface of the electronic device in the space based on the second operation instruction.
7. The method of claim 1 or 6, wherein the method further comprises:
and in response to receiving a restore instruction for the virtual operation interfaces, restoring the virtual operation interface of each electronic device to its corresponding target position.
8. The method of claim 1, wherein the control instruction comprises a move instruction for moving an object displayed in the virtual operation interface;
the generating a control instruction for controlling the electronic device to interact with at least another electronic device of the at least two electronic devices includes:
generating a movement instruction for a target object in the virtual operation interface of the electronic device, wherein the movement instruction comprises start position information and end position information, the position indicated by the start position information is located in the virtual operation interface of the electronic device, and the position indicated by the end position information is located in a virtual operation interface of another electronic device interacting with the electronic device;

executing the movement instruction, and acquiring related information of the target object from the electronic device based on the start position information;

and storing/installing the target object in the electronic device interacting with the electronic device based on the acquired information, and displaying an icon of the target object at the position indicated by the end position information.
9. The method of claim 1, wherein the displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target location in space comprises:
for the electronic device of the at least two electronic devices, acquiring information to be displayed of the electronic device, and generating a virtual operation interface of the electronic device according to a preset rule;

and displaying the generated virtual operation interface at a target position corresponding to the electronic device.
10. The method of claim 9, wherein the displaying, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target location in space comprises:
the type information of the electronic equipment is obtained aiming at the electronic equipment in the at least two pieces of electronic equipment;
determining a first identifier of the electronic equipment based on the acquired type information;
generating a virtual operation interface with the determined first identifier according to the preset rule;
and displaying the generated virtual operation interface at a target position corresponding to the electronic equipment.
11. An information processing apparatus for a wearable device, comprising:
a display module configured to display, for an electronic device of the at least two electronic devices, a virtual operation interface of the electronic device at a target location in a space in response to determining that the wearable device establishes a communication connection with the at least two electronic devices, wherein the target location is determined based on a physical location of the electronic device in the space;
a processing module configured to, for an electronic device of the at least two electronic devices, generate a control instruction for controlling the electronic device in response to receiving an operation of a user on a virtual operation interface of the electronic device, so that the electronic device interacts with at least another electronic device of the at least two electronic devices.
12. A wearable device comprising a memory, a processor and a computer program stored on the memory, the processor executing the computer program to implement the steps of the method of any of claims 1-10.
13. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 10.
14. A computer program product comprising a computer program which, when executed by a processor, carries out the steps of the method of any one of claims 1 to 10.