CN107133028B - Information processing method and electronic equipment - Google Patents
- Publication number
- CN107133028B (application CN201710202059.2A)
- Authority
- CN
- China
- Prior art keywords
- electronic device
- data
- user
- target
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an information processing method and an electronic device. The method includes: establishing a connection between a first electronic device and a second electronic device, where the second electronic device has a display unit capable of displaying a two-dimensional graphical interface; obtaining, by the first electronic device, target graphics data based on the connection, where the target graphics data includes all or part of the graphics data of the graphical interface; and displaying a target object through a lens module by using the target graphics data, where the target object is displayed in three-dimensional space.
Description
Technical Field
The present invention relates to information processing technologies, and in particular, to an information processing method and an electronic device.
Background
Computers are widely used in fields such as the home, the office, and medical care. A user can not only work through a computer but also use it for entertainment. The screen, an important output device of a computer, presents every displayed picture to the user through a two-dimensional interface. Even when the computer displays a stereoscopic picture, the depth data of the picture (the dimension perpendicular to the screen) is flattened during processing, so the picture actually displayed is still two-dimensional.
In many scenarios, for example when performing three-dimensional (3D) modeling, a user needs to view a more vivid stereoscopic image. If the three-dimensional data is presented through a two-dimensional interface, the user cannot intuitively perceive the real shape of the stereoscopic model, which obviously inconveniences the user.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present invention provide an information processing method and an electronic device.
The information processing method provided by the embodiments of the present invention includes:
establishing a connection between a first electronic device and a second electronic device, where the second electronic device has a display unit capable of displaying a two-dimensional graphical interface;
obtaining, by the first electronic device, target graphics data based on the connection, where the target graphics data includes all or part of the graphics data of the graphical interface; and
displaying a target object through a lens module by using the target graphics data, where the target object is displayed in three-dimensional space.
In an embodiment of the present invention, the target graphics data includes first-type data and second-type data, where a model having a first shape can be displayed in three-dimensional space based on the first-type data, the model corresponding to the graphical interface; and a menu key having a second shape can be displayed in three-dimensional space based on the second-type data.
In the embodiment of the present invention, the method further includes:
detecting a first input operation of a user on the model or the menu key;
sending operation information corresponding to the first input operation to the second electronic device;
receiving updated target graphics data sent by the second electronic device; and
adjusting display parameters of the model or the menu key based on the updated target graphics data.
In the embodiment of the present invention, the method further includes:
receiving updated target graphics data sent by the second electronic device, wherein the updated target graphics data is generated by the second electronic device based on a second operation of a user;
adjusting display parameters of the model or menu key based on the updated target graphical data.
In the embodiment of the present invention, the method further includes:
displaying a three-dimensional bounding box while displaying the target graphics data through the lens module, where the three-dimensional bounding box is used for indicating to the user the area range in which the target graphics data can be displayed.
The electronic device provided by the embodiment of the invention is a first electronic device, and comprises:
the communication interface is used for establishing connection with second electronic equipment, wherein the second electronic equipment is provided with a display unit, and the display unit can display a two-dimensional graphical interface; obtaining target graphical data based on the connection, the target graphical data including all or part of graphical data of the graphical interface;
and the lens module is used for displaying the target object by utilizing the target graphic data, wherein the target object is displayed in a three-dimensional space.
In an embodiment of the present invention, the target graphics data includes: a first type of data and a second type of data, wherein,
the lens module is specifically used for: displaying a model having a first shape in a three-dimensional space based on the first type of data, the model corresponding to the graphical interface; and displaying a menu key with a second shape in a three-dimensional space based on the second type of data.
In an embodiment of the present invention, the electronic device further includes:
the input device is used for detecting a first input operation of a user for the model or the menu key;
the communication interface is further configured to send operation information corresponding to the first input operation to the second electronic device, and to receive updated target graphics data sent by the second electronic device;
the lens module is further used for: adjusting display parameters of the model or menu key based on the updated target graphical data.
In an embodiment of the present invention, the communication interface is further configured to receive updated target graphics data sent by the second electronic device, where the updated target graphics data is generated by the second electronic device based on a second operation of a user;
the lens module is further used for: adjusting display parameters of the model or menu key based on the updated target graphical data.
In an embodiment of the present invention, the lens module is further configured to display a three-dimensional bounding box while displaying the target graphics data, where the three-dimensional bounding box is used for indicating to the user the area range in which the target graphics data can be displayed.
In the technical solutions of the embodiments of the present invention, a first electronic device establishes a connection with a second electronic device, where the second electronic device has a display unit capable of displaying a two-dimensional graphical interface; the first electronic device obtains target graphics data based on the connection, where the target graphics data includes all or part of the graphics data of the graphical interface; and a target object is displayed through a lens module by using the target graphics data, the target object being displayed in three-dimensional space. With these technical solutions, graphics data originally displayed on the two-dimensional graphical interface of the second electronic device can be displayed in three-dimensional space through the lens module of a head-mounted device (that is, the first electronic device), so that the user can intuitively browse a 3D picture. This enhances the user's immersion in the picture and greatly improves the user experience.
Drawings
FIG. 1 is a first flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a connection according to an embodiment of the present invention;
FIG. 3 is a 3D display of a target object according to an embodiment of the invention;
FIG. 4 is a diagram of a 3D preview window according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a three-dimensional bounding box according to an embodiment of the present invention;
FIG. 6 is a second flowchart illustrating an information processing method according to an embodiment of the present invention;
FIG. 7 is a third flowchart illustrating an information processing method according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
So that the features and aspects of the embodiments of the present invention can be understood in detail, a more particular description of the embodiments, briefly summarized above, is given below with reference to the embodiments illustrated in the appended drawings.
Fig. 1 is a first schematic flow chart of an information processing method according to an embodiment of the present invention, and as shown in fig. 1, the information processing method includes the following steps:
step 101: the first electronic equipment is connected with second electronic equipment, wherein the second electronic equipment is provided with a display unit, and the display unit can display a two-dimensional graphical interface.
In an embodiment of the present invention, the first electronic device is a head-mounted device, for example, smart glasses or a smart helmet. The first electronic device can present a virtual display screen through its lens module, and this virtual display screen, acting as an extension screen of the second electronic device, can give a 3D preview of the graphics data in the second electronic device.
In the embodiment of the present invention, the first electronic device establishes a connection with the second electronic device. The connection may be wired or wireless, and is preferably wireless. The two devices may establish an indirect connection through a relay in the internet, or a direct connection with each other, for example a wireless local area network direct connection, a Bluetooth connection, or a Device-to-Device (D2D) connection.
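For illustration only, the direct-then-relay connection choice described above can be sketched as follows; the function name, the TCP transport, and the addresses are assumptions of this sketch, not limitations of the embodiment:

```python
import socket

def connect_to_second_device(direct_addr, relay_addr, timeout=3.0):
    """Try a direct connection (e.g. over the wireless LAN) first; if that
    fails, fall back to an indirect connection through an internet relay."""
    last_error = None
    for addr in (direct_addr, relay_addr):
        try:
            return socket.create_connection(addr, timeout=timeout)
        except OSError as err:
            last_error = err
    raise ConnectionError("no route to the second electronic device") from last_error
```

In the embodiment, the same fallback order could equally cover a Bluetooth or D2D link; only the transport would change.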
FIG. 2 is a schematic diagram of a connection between the first electronic device and the second electronic device. In fig. 2, the smart glasses are the first electronic device, a Personal Computer (PC) is the second electronic device, and the two are connected through an adapter. A hybrid display management application runs in the first electronic device and/or the second electronic device and manages the display of the physical display screen and the virtual display screen simultaneously.
The second electronic device has a display unit, and in this embodiment, the second electronic device may be various electronic devices such as a notebook computer, a desktop computer, or a projector. The display unit corresponds to a specific physical display screen. The physical display screen can be a liquid crystal display screen, an electronic ink display screen or a projection display screen and the like.
In the embodiment of the invention, the display unit in the second electronic device displays a two-dimensional graphical interface. Here, the light emitted from the display unit is incident on the user's left and right eyes, and the light for the left eye is identical in phase to the light for the right eye; therefore, the picture viewed by the user is two-dimensional.
Step 102: the first electronic device obtains target graphical data based on the connection, wherein the target graphical data comprises all graphical data or part of graphical data of the graphical interface.
In an embodiment of the present invention, the target graphics data includes first-type data and second-type data, where a model having a first shape can be displayed in three-dimensional space based on the first-type data, the model corresponding to the graphical interface; and a menu key having a second shape can be displayed in three-dimensional space based on the second-type data.
In the above scheme, the first type of data is all or part of the graphical data of the graphical interface.
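As a non-limiting illustration, the first-type/second-type split described above can be pictured as a small container whose two fields drive two renderers; all names in this sketch are invented:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TargetGraphicsData:
    # First-type data: all or part of the graphics data of the graphical
    # interface, used to display a model having a first shape.
    model_data: Optional[bytes]
    # Second-type data: used to display a menu key having a second shape.
    menu_data: bytes

def display(data: TargetGraphicsData,
            render_model: Callable[[bytes], None],
            render_menu: Callable[[bytes], None]) -> None:
    # The menu key can be shown even before any model data has been obtained.
    if data.model_data is not None:
        render_model(data.model_data)
    render_menu(data.menu_data)
```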
As shown in fig. 3, when the first-type data is part of the graphics data of the graphical interface, four pictures are displayed on the second electronic device (PC), of which the picture at the lower right corner is transmitted to the first electronic device for 3D display. Of course, the first-type data may also be all the graphics data of the graphical interface; in that case the second electronic device transmits all the graphics data of the graphical interface to the first electronic device for 3D display. The user can flexibly select which graphics data of the graphical interface need 3D display.
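Transmitting only the selected picture, as in fig. 3, amounts to cutting one rectangular region out of the interface's frame before sending it. A minimal sketch follows; the row-list frame representation and the function name are assumptions of the sketch:

```python
def crop_region(frame, rect):
    """Return the (x, y, width, height) sub-region of a frame given as a
    list of pixel rows, so only the selected picture is transmitted."""
    x, y, w, h = rect
    return [row[x:x + w] for row in frame[y:y + h]]
```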
Step 103: and displaying a target object through the lens module by using the target graphic data, wherein the target object is displayed in a three-dimensional space.
In the embodiment of the present invention, since the target graphics data includes the first-type data and the second-type data, the target object displayed by the lens module includes a model having a first shape and a menu key having a second shape. As shown in fig. 3, the model of the first shape is a cubic model, and the menu key of the second shape is a rectangular menu key. Here, when the first electronic device has not acquired target graphics data from the second electronic device side, only the menu key may be displayed through the lens module.
In an embodiment of the present invention, the first electronic device has a lens module, and the light corresponding to the target graphic data is projected to the eyes of the user through the lens module, so that the user views the target object, and the target object is displayed in a three-dimensional space.
In the embodiment of the invention, the processor controls the light source to emit corresponding light rays based on the target graphics data, and the light rays are refracted by the lens module and projected into the user's eyes, forming an image of the virtual target object on the user's retina.
In the embodiment of the present invention, displaying the target object in three-dimensional space means that the target object is displayed stereoscopically. To achieve this stereoscopic effect, the light the lens module projects to the user's left eye and the light it projects to the right eye differ in phase, giving the user a three-dimensional viewing experience.
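The left-eye/right-eye difference that produces the stereoscopic effect can be illustrated as a horizontal disparity that shrinks with distance. The pinhole-style formula and the parameter values below are textbook assumptions for the sketch, not values from the embodiment:

```python
def stereo_positions(x, depth_m, ipd_m=0.064, focal_px=800.0):
    """Horizontal screen positions of one scene point for the left and right
    eyes.  Disparity = focal * interpupillary distance / depth, so nearer
    points are shifted further apart and appear to pop out of the screen."""
    disparity = focal_px * ipd_m / depth_m
    return x - disparity / 2.0, x + disparity / 2.0
```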
In the embodiment of the invention, the first electronic device has an orientation tracking device. When the first electronic device is worn on the user's head, the orientation of the head can be detected in real time, for example when the user raises, lowers, or turns the head left or right. Since the first electronic device knows the orientation of the user's head at all times, it can display the virtual target object within the user's field of view.
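A simple way to decide whether the tracked head orientation still covers the virtual object is to compare yaw angles against half the field of view; the 90-degree default below is an assumed value, not one from the embodiment:

```python
def in_field_of_view(head_yaw_deg, object_yaw_deg, fov_deg=90.0):
    """True when the direction of the virtual object lies within the user's
    horizontal field of view, with wrap-around at 360 degrees handled."""
    diff = (object_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```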
In one embodiment, as shown in fig. 4, in order to better display the target object, a 3D preview window is further displayed around the target object, the 3D model is displayed inside the 3D preview window, and the menu key is located at the bottom of the 3D preview window. The user can quickly locate the 3D model through the 3D preview window.
Here, the 3D preview window has three display modes, which are a first display mode, a second display mode, and a third display mode, respectively; wherein:
When the 3D preview window is in the first display mode, it moves adaptively with the user's head no matter which direction the head faces, ensuring that the window stays within the user's field of view.
When the 3D preview window is in the second display mode, the 3D preview window always remains in a relative positional relationship with the second electronic device regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to, e.g., the 3D preview window is always located at a particular position to the left of the second electronic device.
When the 3D preview window is in the third display mode, the absolute position of the 3D preview window remains unchanged regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to.
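The three display modes above can be summarized as three anchoring rules for the window's position. The sketch below uses illustrative offsets (one metre in front of the head, half a metre to the left of the device); all constants and names are assumptions:

```python
HEAD_LOCKED, DEVICE_LOCKED, WORLD_LOCKED = 1, 2, 3   # the three display modes

def preview_window_position(mode, head_pos, device_pos, fixed_pos):
    """Where to place the 3D preview window for the current display mode."""
    if mode == HEAD_LOCKED:
        # First mode: follow the head so the window stays in the field of view.
        return (head_pos[0], head_pos[1], head_pos[2] - 1.0)
    if mode == DEVICE_LOCKED:
        # Second mode: keep a fixed position relative to the second device,
        # e.g. always half a metre to its left.
        return (device_pos[0] - 0.5, device_pos[1], device_pos[2])
    # Third mode: the absolute position never changes.
    return fixed_pos
```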
Of course, the user may fix the position of the 3D preview window to reduce misoperations caused by being unable to position it accurately. When the user's visual range deviates from the 3D preview window, the first electronic device sends a prompt reminding the user of the deviation. Here, the prompt may be given by sound, image, vibration, or the like.
The user can change the orientation of the 3D preview window through an operation, and can likewise change the orientation of the 3D model inside it. Of course, the user may also hide or show the 3D preview window by an operation. Here, the operation may be, but is not limited to, a gesture operation, a mouse operation from the second electronic device, or an operation from a remote control device.
In the embodiment of the invention, the shape of the 3D preview window may be a cube, a cylinder, or the like, and the user can customize the shape as required.
In one embodiment, to help the user manipulate virtual objects in three-dimensional space, a three-dimensional bounding box is displayed while the target graphics data are displayed through the lens module, where the three-dimensional bounding box is used for indicating to the user the area range in which the target graphics data can be displayed. As shown in fig. 5, the three-dimensional bounding box is rendered as a stereoscopic mesh, and all virtual objects are rendered within it.
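The bounding box's role of confining virtual objects can be sketched as a per-axis clamp of each object's position; the representation of positions as (x, y, z) tuples is an assumption of the sketch:

```python
def clamp_to_bounding_box(position, box_min, box_max):
    """Confine a virtual object's (x, y, z) position to the area range
    indicated by the three-dimensional bounding box."""
    return tuple(min(max(p, lo), hi)
                 for p, lo, hi in zip(position, box_min, box_max))
```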
Fig. 6 is a second schematic flowchart of an information processing method according to an embodiment of the present invention, and as shown in fig. 6, the information processing method includes the following steps:
step 601: the first electronic equipment is connected with second electronic equipment, wherein the second electronic equipment is provided with a display unit, and the display unit can display a two-dimensional graphical interface.
The implementation details of step 601 are the same as those of step 101 described above, covering the head-mounted first electronic device, the manner of establishing the connection (fig. 2), and the two-dimensional display unit of the second electronic device, and are not repeated here.
Step 602: the first electronic device obtains target graphical data based on the connection, wherein the target graphical data comprises all graphical data or part of graphical data of the graphical interface.
The target graphics data obtained in step 602 are composed as described above for step 102: first-type data for the model and second-type data for the menu key, with the first-type data being all or part of the graphics data of the graphical interface (fig. 3). The details are not repeated here.
Step 603: and displaying a target object through the lens module by using the target graphic data, wherein the target object is displayed in a three-dimensional space.
Step 603 is performed as described above for step 103: the lens module displays the model having the first shape and the menu key having the second shape, optionally inside the 3D preview window (fig. 4) with its three display modes, together with the three-dimensional bounding box (fig. 5). The details are not repeated here.
Step 604: detecting a first input operation of a user for the model or the menu key; sending operation information corresponding to the first input operation to the second electronic equipment; receiving updated target graphic data sent by the second electronic device; adjusting display parameters of the model or menu key based on the updated target graphical data.
In the embodiment of the present invention, the first input operation may be, but is not limited to, the following operations: gesture operation, voice control operation, and the like.
Taking a gesture operation as an example, the first electronic device has an image capture device through which the user's gestures can be captured, and the user can change the display angle of the model by gesture, for example turning the model 30 degrees to the left or bringing its bottom to the front. To keep the virtual display picture consistent with the user's operation, the operation information corresponding to the first input operation is sent to the second electronic device, the second electronic device updates the target graphics data, and the first electronic device then updates the displayed effect in real time.
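The update round trip described above, in which the operation is forwarded, the data are recomputed remotely, and the display is refreshed locally, can be sketched with a stand-in for the second device; both function names and the parameter dictionary are invented for the sketch:

```python
def handle_first_input(op_info, send_to_second_device, display_params):
    """Forward the operation information to the second electronic device,
    then adjust local display parameters from the updated target graphics
    data it returns, so the virtual picture tracks the user's operation."""
    updated = send_to_second_device(op_info)
    display_params.update(updated)
    return display_params
```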
In the embodiment of the invention, the first electronic device serves as a 3D extension screen of the second electronic device and can be dynamically updated in real time according to the graphics data on the second electronic device. When a 3D model is presented through the first electronic device, the user can intuitively perceive the size of the model; if there are multiple models, their relative sizes can easily be compared, so the user's operations on the models can be more accurate.
In the embodiment of the invention, the first electronic device is responsible for receiving the graphic data of the second electronic device and realizing 3D display, and the second electronic device is responsible for a specific data processing process, so that the number of chips of the first electronic device is less, and the structure is lighter and thinner.
Fig. 7 is a third schematic flowchart of an information processing method according to an embodiment of the present invention, and as shown in fig. 7, the information processing method includes the following steps:
Step 701: the first electronic device establishes a connection with a second electronic device, wherein the second electronic device has a display unit, and the display unit can display a two-dimensional graphical interface.
In an embodiment of the present invention, the first electronic device is a head-mounted device, for example smart glasses or a smart helmet. The first electronic device can present a virtual display screen through its lens module, and the virtual display screen, acting as an expansion screen of the second electronic device, can provide a 3D preview of graphic data in the second electronic device.
In the embodiment of the present invention, the first electronic device establishes a connection with the second electronic device, and the connection may be a wired connection or a wireless connection, and is preferably a wireless connection. The first electronic device and the second electronic device may establish an indirect connection through a relay in the internet, or may establish a direct connection between the first electronic device and the second electronic device. For example, a wireless local area network direct connection or a bluetooth connection or a D2D connection may be established between the first electronic device and the second electronic device.
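The connection-selection logic described above can be sketched as follows. The preference order in this Python snippet (wireless direct links first, then wired, then an internet relay) is an assumption inferred from the description, not a requirement of the disclosure.

```python
def choose_connection(available):
    """Pick a link type between the first and second electronic devices.

    Per the embodiment, a wireless connection (WLAN direct, Bluetooth,
    or D2D) is preferred, and a direct connection is preferred over an
    indirect connection through an internet relay.
    The exact preference order here is an illustrative assumption.
    """
    order = ["wlan-direct", "bluetooth", "d2d", "wired", "internet-relay"]
    for kind in order:
        if kind in available:
            return kind
    raise RuntimeError("no usable link between the devices")
```

For example, if only a wired link and an internet relay are available, the sketch falls back to the wired connection.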
FIG. 2 is a schematic diagram of a connection between a first electronic device and a second electronic device; in fig. 2, the smart glasses are a first electronic device, the PC is a second electronic device, and the first electronic device and the second electronic device are connected to each other through an adapter. And a hybrid display management application is run in the first electronic device and/or the second electronic device and is used for simultaneously managing the display of the entity display screen and the virtual display screen.
The second electronic device has a display unit, and in this embodiment, the second electronic device may be various electronic devices such as a notebook computer, a desktop computer, or a projector. The display unit corresponds to a specific physical display screen. The physical display screen can be a liquid crystal display screen, an electronic ink display screen or a projection display screen and the like.
In the embodiment of the invention, the display unit in the second electronic device displays a two-dimensional graphical interface. Here, the light emitted from the display unit enters the user's left and right eyes, and the light for the left eye and the light for the right eye are in phase; therefore, the picture viewed by the user is two-dimensional.
Step 702: the first electronic device obtains target graphical data based on the connection, wherein the target graphical data comprises all graphical data or part of graphical data of the graphical interface.
In an embodiment of the present invention, the target graphic data includes a first type of data and a second type of data, wherein a model having a first shape can be displayed in a three-dimensional space based on the first type of data, the model corresponding to the graphical interface; and a menu key having a second shape can be displayed in a three-dimensional space based on the second type of data.
In the above scheme, the first type of data is all or part of the graphical data of the graphical interface.
As shown in fig. 3, the first type of data is part of the graphical data of the graphical interface, and four frames are displayed on the second electronic device (PC), wherein the frame at the lower right corner is transmitted to the first electronic device for 3D display. Of course, the first type of data may also be all the graphic data of the graphical interface, and at this time, the second electronic device transmits all the graphic data of the graphical interface to the first electronic device for 3D display. The user can flexibly select the graphic data needing 3D display from all the graphic data of the graphic interface.
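The user's flexible selection of which frames to transmit can be sketched as below. The frame identifiers and the `select_frames` helper are illustrative assumptions; the disclosure only specifies that all or part of the graphical interface's data may be sent.

```python
def select_frames(interface_frames, selected_ids=None):
    """Return the graphic data to transmit to the first device for 3D display.

    interface_frames: mapping of frame id -> graphic data for the 2D interface.
    selected_ids: frames the user picked; None means send all graphic data.
    """
    if selected_ids is None:  # all graphic data of the graphical interface
        return dict(interface_frames)
    return {fid: interface_frames[fid] for fid in selected_ids}

# Four frames displayed on the second electronic device (PC), as in fig. 3:
frames = {"top-left": b"...", "top-right": b"...",
          "bottom-left": b"...", "bottom-right": b"..."}

# Only the frame at the lower right corner is transmitted for 3D display:
payload = select_frames(frames, selected_ids=["bottom-right"])
```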
Step 703: and displaying a target object through the lens module by using the target graphic data, wherein the target object is displayed in a three-dimensional space.
In the embodiment of the present invention, since the target graphic data includes the first type of data and the second type of data, the target object displayed by the lens module includes: a model having a first shape and a menu key having a second shape. As shown in fig. 3, the first-shaped model is a cubic model, and the second-shaped menu key is a rectangular menu key. Here, when the first electronic device does not acquire the target graphic data from the second electronic device side, only the menu key may be displayed by the lens module.
In an embodiment of the present invention, the first electronic device has a lens module, and the light corresponding to the target graphic data is projected to the eyes of the user through the lens module, so that the user views the target object, and the target object is displayed in a three-dimensional space.
In the embodiment of the invention, the processor controls the light source to emit corresponding light based on the target graphic data, and the light is refracted by the lens module and projected into the user's eyes, forming the virtual target object on the user's retina.
In the embodiment of the present invention, displaying the target object in the three-dimensional space means that the target object is displayed in a three-dimensional manner. To achieve the three-dimensional display effect, the light projected by the lens module to the user's left eye and the light projected to the user's right eye differ in phase, giving the user a three-dimensional viewing experience.
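The left/right-eye difference described above can be illustrated with a standard screen-space disparity calculation. This Python sketch is a simplified geometric model (the disclosure speaks of phase differences and does not give formulas), so the function names and the disparity formula are assumptions for illustration only.

```python
def eye_disparity(ipd, screen_depth, object_depth):
    """Horizontal screen-space disparity between the two eyes' images.

    ipd: interpupillary distance; screen_depth: distance to the virtual
    display plane; object_depth: intended perceived distance of the object.
    Objects at the virtual screen depth have zero disparity (the two eyes'
    images coincide and the picture looks two-dimensional, as with the
    second device's display unit); non-zero disparity yields depth.
    """
    return ipd * (object_depth - screen_depth) / object_depth

def per_eye_offsets(ipd, screen_depth, object_depth):
    """Split the disparity into a (left-eye, right-eye) image shift pair."""
    d = eye_disparity(ipd, screen_depth, object_depth)
    return (-d / 2.0, +d / 2.0)
```

With a 64 mm IPD and a virtual screen at 2 m, an object rendered to appear at 4 m gets a 32 mm disparity; an object at exactly 2 m gets none.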
In the embodiment of the invention, the first electronic device is provided with an orientation tracking device. When the first electronic device is worn on the user's head, the orientation of the head can be detected in real time through the orientation tracking device, for example when the user raises, lowers, or turns the head left or right. Since the first electronic device knows the orientation of the user's head at all times, it can display the virtual target object within the user's field of view.
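A minimal sketch of that field-of-view check follows, reducing head orientation to a single yaw angle. The 90-degree default FOV and the helper names are illustrative assumptions, not values from the disclosure.

```python
def in_field_of_view(head_yaw_deg, target_yaw_deg, fov_deg=90.0):
    """True if the target direction lies within the user's horizontal FOV.

    The wraparound-safe difference keeps the comparison valid when the
    two angles straddle the 0/360 boundary.
    """
    diff = (target_yaw_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0

def reposition_in_view(head_yaw_deg):
    """Head-locked placement: put the virtual target object straight ahead."""
    return head_yaw_deg
```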
In one embodiment, as shown in fig. 4, in order to better display the target object, a 3D preview window is further displayed around the target object, the 3D model is displayed inside the 3D preview window, and the menu key is located at the bottom of the 3D preview window. The user can quickly locate the 3D model through the 3D preview window.
Here, the 3D preview window has three display modes, which are a first display mode, a second display mode, and a third display mode, respectively; wherein:
when the 3D preview window is in the first display mode, no matter which direction the head of the user faces, the 3D preview window always moves adaptively along with the head movement of the user, and the 3D preview window is ensured to be positioned in the visual field of the user.
When the 3D preview window is in the second display mode, the 3D preview window always remains in a relative positional relationship with the second electronic device regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to, e.g., the 3D preview window is always located at a particular position to the left of the second electronic device.
When the 3D preview window is in the third display mode, the absolute position of the 3D preview window remains unchanged regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to.
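The three display modes above can be sketched as a single position-resolution function. The mode names, coordinate convention, and default offsets in this Python snippet are illustrative assumptions.

```python
def preview_window_position(mode, head_pos, device_pos, world_pos,
                            head_offset=(0.0, 0.0, -1.0),
                            device_offset=(-0.5, 0.0, 0.0)):
    """Resolve the 3D preview window's position for the three display modes.

    head-locked      -- first mode: follows the user's head so the window
                        always stays in the field of view.
    device-relative  -- second mode: fixed offset from the second electronic
                        device (e.g. a position to its left).
    world-fixed      -- third mode: absolute position remains unchanged.
    """
    if mode == "head-locked":
        return tuple(h + o for h, o in zip(head_pos, head_offset))
    if mode == "device-relative":
        return tuple(d + o for d, o in zip(device_pos, device_offset))
    if mode == "world-fixed":
        return world_pos
    raise ValueError(f"unknown display mode: {mode}")
```

For instance, in the second mode the window tracks the PC even as the PC is moved, while in the third mode it ignores both head and device motion.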
Of course, the user may fix the position of the 3D preview window to reduce misoperation caused by the user being unable to accurately locate it. When the user's visual range deviates from the 3D preview window, the first electronic device sends a prompt informing the user of the deviation. Here, the prompt may be realized by sound, image, vibration, or the like.
The user can change the orientation of the 3D preview window through operation, and can also change the orientation of the 3D model in the 3D preview window. Of course, the user may also hide the 3D preview window or display the 3D preview window by operation. Here, the operation may be realized by, but not limited to, a gesture operation, a mouse operation from the second electronic device, an operation from the remote control device.
In the embodiment of the invention, the shape of the 3D preview window can be a cube, a cylinder and the like, and a user can self-define the shape of the 3D preview window according to requirements.
In one embodiment, to help the user manipulate a virtual object in three-dimensional space, a three-dimensional bounding box is displayed together with the target graphic data through the lens module, where the three-dimensional bounding box indicates the region within which the target graphic data can be displayed to the user. As shown in fig. 5, the three-dimensional bounding box is rendered as a stereoscopic mesh, and all virtual objects are rendered within it.
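Keeping all virtual objects inside the bounding box reduces, in practice, to a per-axis clamp. This Python sketch is a minimal illustration; the axis-aligned-box assumption and the function name are not specified by the disclosure.

```python
def clamp_to_bounding_box(point, box_min, box_max):
    """Constrain a virtual object's position to the three-dimensional
    bounding box that marks where target graphic data may be displayed.

    point, box_min, box_max: (x, y, z) tuples; assumes an axis-aligned box.
    """
    return tuple(min(max(p, lo), hi)
                 for p, lo, hi in zip(point, box_min, box_max))
```

A renderer would apply this to each object before drawing, so that a drag gesture cannot pull the model outside the stereoscopic mesh shown in fig. 5.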
Step 704: receiving updated target graphic data sent by the second electronic device, wherein the updated target graphic data is generated by the second electronic device based on a second input operation of a user; and adjusting display parameters of the model or the menu key based on the updated target graphic data.
In the embodiment of the present invention, the second input operation may be, but is not limited to, the following operations: mouse operation of the second electronic device, keyboard operation of the second electronic device, and the like.
Taking the mouse operation as an example, the second electronic device is connected to the mouse, and the user operates the mouse to trigger the second input operation, such as turning the model 30 degrees to the left, presenting the bottom of the model in front, and so on. In order to make the virtual display screen consistent with the operation of the user, the second electronic device updates the target graphic data in real time based on the second input operation, and then sends the target graphic data updated in real time to the first electronic device for display.
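On the second-device side, this update-and-push flow can be sketched as below. The message format, the state dictionary, and the `send` transport hook are illustrative assumptions; the disclosure only requires that updated target graphic data be sent to the first electronic device in real time.

```python
import json

def handle_second_input(model_state, op, send):
    """Run on the second electronic device: apply the user's mouse or
    keyboard operation, regenerate the target graphic data, and push the
    update to the first electronic device via the `send` transport hook."""
    if op["op"] == "rotate":
        model_state["yaw_deg"] = (model_state["yaw_deg"] + op["degrees"]) % 360
    # Package the regenerated state as updated target graphic data:
    updated = {"type": "target-graphic-data", "model": dict(model_state)}
    send(json.dumps(updated).encode("utf-8"))
    return model_state
```

For example, a mouse drag mapped to "turn the model 30 degrees to the left" would call this handler with `{"op": "rotate", "degrees": -30}`, and the first device would redraw from the pushed data.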
In the embodiment of the invention, the first electronic device is used as the 3D expansion screen of the second electronic device, and can be dynamically updated in real time according to the graphic data on the second electronic device. When the 3D model is presented through the first electronic device, the user can intuitively know the size of the model, and if a plurality of models exist, the relative sizes among the models can be easily compared, so that the operation of the user on the models can be more accurate.
In the embodiment of the invention, the first electronic device is responsible for receiving the graphic data of the second electronic device and realizing 3D display, and the second electronic device is responsible for a specific data processing process, so that the number of chips of the first electronic device is less, and the structure is lighter and thinner.
Fig. 8 is a schematic structural composition diagram of an electronic device according to an embodiment of the present invention, where the electronic device is a first electronic device, and as shown in fig. 8, the electronic device includes:
a communication interface 81, configured to establish a connection with a second electronic device, where the second electronic device has a display unit, and the display unit is capable of displaying a two-dimensional graphical interface; obtaining target graphical data based on the connection, the target graphical data including all or part of graphical data of the graphical interface;
and the lens module 82 is used for displaying the target object by using the target graphic data, wherein the target object is displayed in a three-dimensional space.
In an embodiment of the present invention, the target graphics data includes: a first type of data and a second type of data, wherein,
the lens module 82 is specifically configured to: displaying a model having a first shape in a three-dimensional space based on the first type of data, the model corresponding to the graphical interface; and displaying a menu key with a second shape in a three-dimensional space based on the second type of data.
In an embodiment of the present invention, the electronic device further includes:
an input device 83 for detecting a first input operation of the user for the model or menu key;
the communication interface 81 is further configured to send operation information corresponding to the first input operation to the second electronic device, and to receive updated target graphic data sent by the second electronic device;
the lens module 82 is further configured to: adjusting display parameters of the model or menu key based on the updated target graphical data.
In this embodiment of the present invention, the communication interface 81 is further configured to receive updated target graphics data sent by the second electronic device, where the updated target graphics data is generated by the second electronic device based on a second operation of the user;
the lens module 82 is further configured to: adjusting display parameters of the model or menu key based on the updated target graphical data.
In this embodiment of the present invention, the lens module 82 is further configured to: display the target graphic data and a three-dimensional bounding box at the same time, wherein the three-dimensional bounding box indicates the region within which the target graphic data can be displayed to the user.
In this embodiment of the present invention, the lens module 82 is further configured to: display a 3D preview window around the target object, display the 3D model inside the 3D preview window, and position the menu key at the bottom of the 3D preview window. The user can quickly locate the 3D model through the 3D preview window.
Here, the 3D preview window has three display modes, which are a first display mode, a second display mode, and a third display mode, respectively; wherein:
when the 3D preview window is in the first display mode, no matter which direction the head of the user faces, the 3D preview window always moves adaptively along with the head movement of the user, and the 3D preview window is ensured to be positioned in the visual field of the user.
When the 3D preview window is in the second display mode, the 3D preview window always remains in a relative positional relationship with the second electronic device regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to, e.g., the 3D preview window is always located at a particular position to the left of the second electronic device.
When the 3D preview window is in the third display mode, the absolute position of the 3D preview window remains unchanged regardless of which orientation the user's head is oriented and regardless of which orientation the second electronic device is moved to.
Of course, the user may fix the position of the 3D preview window to reduce misoperation caused by the user being unable to accurately locate it. When the user's visual range deviates from the 3D preview window, the first electronic device sends a prompt informing the user of the deviation. Here, the prompt may be realized by sound, image, vibration, or the like.
The user can change the orientation of the 3D preview window through operation, and can also change the orientation of the 3D model in the 3D preview window. Of course, the user may also hide the 3D preview window or display the 3D preview window by operation. Here, the operation may be realized by, but not limited to, a gesture operation, a mouse operation from the second electronic device, an operation from the remote control device.
In the embodiment of the invention, the shape of the 3D preview window can be a cube, a cylinder and the like, and a user can self-define the shape of the 3D preview window according to requirements.
Those skilled in the art will understand that the implementation functions of each unit in the electronic device shown in fig. 8 can be understood by referring to the related description of the information processing method.
The technical schemes described in the embodiments of the present invention can be combined arbitrarily without conflict.
In the embodiments provided in the present invention, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one second processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.
Claims (8)
1. An information processing method, characterized in that the method comprises:
establishing a connection between a first electronic device and a second electronic device, wherein the second electronic device has a display unit, and the display unit can display a two-dimensional graphical interface;
the first electronic device acquires target graphic data based on the connection, the target graphic data comprising a first type of data and a second type of data, the first type of data comprising all or part of the graphic data of the graphical interface, wherein a model having a first shape can be displayed in a three-dimensional space based on the first type of data, the model corresponding to the graphical interface; and a menu key having a second shape can be displayed in a three-dimensional space based on the second type of data;
displaying a target object through a lens module by using the target graphic data, wherein the target object is displayed in a three-dimensional space; and displaying a three-dimensional preview window around the target object, displaying the model in the three-dimensional preview window, and positioning the menu key at the bottom of the three-dimensional preview window.
2. The information processing method according to claim 1, characterized by further comprising:
detecting a first input operation of a user for the model or the menu key;
sending operation information corresponding to the first input operation to the second electronic equipment;
receiving updated target graphic data sent by the second electronic device;
adjusting display parameters of the model or menu key based on the updated target graphical data.
3. The information processing method according to claim 1, characterized by further comprising:
receiving updated target graphics data sent by the second electronic device, wherein the updated target graphics data is generated by the second electronic device based on a second operation of a user;
adjusting display parameters of the model or menu key based on the updated target graphical data.
4. The information processing method according to claim 1, characterized by further comprising:
and displaying a three-dimensional bounding box while displaying the target graphic data through the lens module, wherein the three-dimensional bounding box indicates the region within which the target graphic data can be displayed to the user.
5. An electronic device, wherein the electronic device is a first electronic device, comprising:
the communication interface is used for establishing connection with second electronic equipment, wherein the second electronic equipment is provided with a display unit, and the display unit can display a two-dimensional graphical interface; obtaining target graphic data based on the connection, wherein the target graphic data comprises a first type of data and a second type of data, and the first type of data comprises all graphic data or part of graphic data of the graphical interface;
the lens module is used for displaying a target object by utilizing the target graphic data, wherein the target object is displayed in a three-dimensional space, and a three-dimensional preview window is displayed around the target object;
the lens module is specifically used for: displaying a model with a first shape in a three-dimensional space based on the first type of data, wherein the model corresponds to the graphical interface, and the model is displayed inside the three-dimensional preview window; displaying a menu key with a second shape in a three-dimensional space based on the second type of data; and the menu key is positioned at the bottom of the three-dimensional preview window.
6. The electronic device of claim 5, further comprising:
the input device is used for detecting a first input operation of a user for the model or the menu key;
the communication interface is further configured to send operation information corresponding to the first input operation to the second electronic device, and to receive updated target graphic data sent by the second electronic device;
the lens module is further used for: adjusting display parameters of the model or menu keys based on the updated target graphical data.
7. The electronic device of claim 5,
the communication interface is further used for receiving updated target graphic data sent by the second electronic device, wherein the updated target graphic data is generated by the second electronic device based on a second operation of a user;
the lens module is further used for: adjusting display parameters of the model or menu key based on the updated target graphical data.
8. The electronic device of claim 5, wherein the lens module is further configured to: display the target graphic data and a three-dimensional bounding box at the same time, wherein the three-dimensional bounding box indicates the region within which the target graphic data can be displayed to the user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710202059.2A CN107133028B (en) | 2017-03-30 | 2017-03-30 | Information processing method and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107133028A CN107133028A (en) | 2017-09-05 |
CN107133028B true CN107133028B (en) | 2021-07-16 |
Family
ID=59714942
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710202059.2A Active CN107133028B (en) | 2017-03-30 | 2017-03-30 | Information processing method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107133028B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109471603A (en) * | 2017-09-07 | 2019-03-15 | 华为终端(东莞)有限公司 | A kind of interface display method and device |
CN109996348B (en) * | 2017-12-29 | 2022-07-05 | 中兴通讯股份有限公司 | Method, system and storage medium for interaction between intelligent glasses and intelligent equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102419481A (en) * | 2010-09-28 | 2012-04-18 | 财团法人工业技术研究院 | Dual-mode image display device and image luminance adjusting method |
CN103713387A (en) * | 2012-09-29 | 2014-04-09 | 联想(北京)有限公司 | Electronic device and acquisition method |
CN104270623A (en) * | 2014-09-28 | 2015-01-07 | 联想(北京)有限公司 | Display method and electronic device |
CN105488840A (en) * | 2015-11-26 | 2016-04-13 | 联想(北京)有限公司 | Information processing method and electronic equipment |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101577795A (en) * | 2009-06-17 | 2009-11-11 | 深圳华为通信技术有限公司 | Method and device for realizing real-time viewing of panoramic picture |
US20110193857A1 (en) * | 2010-02-05 | 2011-08-11 | Vasily Filippov | Methods and apparatus for rendering a collection of widgets on a mobile device display |
US9026946B2 (en) * | 2011-08-08 | 2015-05-05 | Blackberry Limited | Method and apparatus for displaying an image |
CN104272349B (en) * | 2012-06-20 | 2018-03-02 | 皇家飞利浦有限公司 | Multiphase machine equipment tracks |
CN103019595B (en) * | 2012-12-05 | 2016-03-16 | 北京百度网讯科技有限公司 | Terminal device and method for switching theme thereof |
JP6367560B2 (en) * | 2014-01-20 | 2018-08-01 | ローランドディー.ジー.株式会社 | 3D modeling apparatus and 3D modeling method |
CN103927171B (en) * | 2014-04-14 | 2017-02-15 | 广州市久邦数码科技有限公司 | Method and system for implementing multi-screen preview on stereoscopic desktops |
CN104391453A (en) * | 2014-10-22 | 2015-03-04 | 北京恒泰实达科技股份有限公司 | Visual control room signal control method and system |
CN105677189B (en) * | 2016-02-19 | 2020-02-18 | 腾讯科技(深圳)有限公司 | Method and device for controlling application |
2017-03-30: CN201710202059.2A filed; granted as CN107133028B (status: Active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||