CN116016892A - Image display method and device of intelligent glasses, electronic equipment and storage medium - Google Patents

Image display method and device of intelligent glasses, electronic equipment and storage medium

Info

Publication number
CN116016892A
Authority
CN
China
Prior art keywords
image
glasses
display screen
display
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211530724.8A
Other languages
Chinese (zh)
Inventor
杨凯
李斌选
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shixiang Technology Co Ltd
Original Assignee
Guangzhou Shixiang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shixiang Technology Co Ltd filed Critical Guangzhou Shixiang Technology Co Ltd
Priority to CN202211530724.8A priority Critical patent/CN116016892A/en
Publication of CN116016892A publication Critical patent/CN116016892A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to an image display method and device of intelligent glasses, an electronic device and a storage medium, wherein the method comprises the following steps: acquiring two-dimensional image data of the main control end, the two-dimensional image data being used for indicating a two-dimensional interface of the main control end; rendering the two-dimensional image data to obtain a first image and a second image which are identical; and sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end, thereby improving the image display quality.

Description

Image display method and device of intelligent glasses, electronic equipment and storage medium
Technical Field
The present disclosure relates to the technical field of smart glasses, and in particular, to an image display method and apparatus for smart glasses, an electronic device, and a storage medium.
Background
Smart glasses are devices that can take photos, record video, and conduct video calls with friends through voice or gesture control, and include VR (Virtual Reality) glasses, AR (Augmented Reality) glasses and MR (Mixed Reality) glasses.
At present, when smart glasses display a two-dimensional picture, the images shown in the left and right screen areas of the display are inconsistent, so the human eyes cannot fuse them into a single image, which reduces the image display quality.
Disclosure of Invention
Accordingly, an object of the present application is to provide an image display method, apparatus, electronic device, and storage medium for smart glasses, which can improve image display quality.
According to a first aspect of embodiments of the present application, there is provided an image display method of smart glasses, including the steps of:
acquiring two-dimensional image data of the main control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end;
rendering the two-dimensional image data to obtain a first image and a second image which are identical;
and sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end.
According to a second aspect of embodiments of the present application, there is provided an image display device of smart glasses, including:
the image data acquisition module is used for acquiring the two-dimensional image data of the main control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end;
the image obtaining module is used for rendering the two-dimensional image data to obtain a first image and a second image which are identical;
and the image display module is used for sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end.
According to a third aspect of embodiments of the present application, there is provided an electronic device, including: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the image display method of the smart glasses as claimed in any one of the above.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program,
the computer program, when executed by a processor, implements the image display method of the smart glasses as described in any one of the above.
According to the embodiment of the application, the two-dimensional image data of the main control end are obtained; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end; rendering the two-dimensional image data to obtain a first image and a second image which are identical; and sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end, thereby improving the image display quality.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
For a better understanding and implementation, the present invention is described in detail below with reference to the drawings.
Drawings
Fig. 1 is a flowchart of an image display method of smart glasses according to an embodiment of the present application;
fig. 2 is a flowchart of step S20 in the image display method of the smart glasses according to an embodiment of the present application;
fig. 3 is a flowchart of step S30 in an image display method of smart glasses according to an embodiment of the present application;
fig. 4 is a flowchart of an image display method of smart glasses according to another embodiment of the present application;
fig. 5 is a flowchart of step S50 in an image display method of smart glasses according to another embodiment of the present application;
fig. 6 is a block diagram of an image display device of smart glasses according to an embodiment of the present application;
fig. 7 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort fall within the scope of the present application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims. In the description of this application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and are not necessarily used to describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of the terms in this application will be understood by those of ordinary skill in the art as the case may be.
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
For better understanding of the technical solutions of the present application, some technical background on smart glasses will be briefly described herein. The smart glasses comprise a main control end and a glasses end; the main control end comprises a main control chip, which is used for processing signals transmitted by the glasses end and controlling the glasses end. The glasses end comprises a display screen and a rear camera: the display screen is used for receiving and displaying image information transmitted by the main control chip, and the rear camera is used for receiving shooting control signals transmitted by the main control chip. Specifically, the rear camera is arranged on the outer side of the display screen, can shoot the surrounding environment, and transmits the captured images to the main control chip, so that the user can interact with the surrounding environment when wearing the smart glasses.
In the embodiment of the application, the main control end further comprises an image display control chip, and the image display control chip can execute the image display method of the intelligent glasses provided in the embodiment of the application. Optionally, the image display control chip may be externally arranged on the intelligent glasses, and is connected with the intelligent glasses in a wired or wireless manner, so as to execute the image display method of the intelligent glasses provided in the embodiment of the application.
Example 1
Fig. 1 is a flowchart illustrating an image display method of smart glasses according to an embodiment of the present application. The image display method of the intelligent glasses provided by the embodiment of the application comprises the following steps:
s10: acquiring two-dimensional image data of a main control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end.
The two-dimensional image data may be main interface data of the main control terminal, or application interface data of an application program of the main control terminal. The two-dimensional interface can be a main interface of the main control end or an application interface of the application program. For example, the main interface may be a boot interface of the master, the application may be a game APP, and the application interface data may be a game screen.
In the embodiment of the application, after the two-dimensional image is displayed on the main control end, two-dimensional image data corresponding to the two-dimensional image are acquired.
S20: and rendering the two-dimensional image data to obtain a first image and a second image which are identical.
In the embodiment of the application, the two-dimensional image data can be rendered through a rendering engine to obtain the identical first image and second image. The rendering engine may be an OpenGL engine, a Unity 3D engine, an Unreal engine, or the like.
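The core of step S20, duplicating one rendered 2D frame into two identical per-eye images, can be sketched as follows. This is an illustrative sketch only: the patent leaves the rendering engine open (OpenGL, Unity 3D, etc.), and the function and variable names here are invented for the example.

```python
def render_to_stereo(frame):
    """Duplicate one rendered frame into identical left/right images.

    `frame` is a row-major list of pixel rows standing in for the
    rendering engine's output; all names here are illustrative.
    """
    first_image = [row[:] for row in frame]   # left-eye copy
    second_image = [row[:] for row in frame]  # right-eye copy
    return first_image, second_image

frame = [[0, 1, 2], [3, 4, 5]]  # a tiny 2x3 grayscale "interface"
left, right = render_to_stereo(frame)
assert left == right == frame
assert left is not right  # independent buffers for the two screens
```

Because both eyes receive the same image, the viewer fuses the left and right pictures into one, which is the quality improvement the method targets.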
S30: and sending the first image and the second image to the glasses end so that the first image is displayed on the first display screen of the glasses end and the second image is displayed on the second display screen of the glasses end.
In the embodiment of the present application, the first image may be a left eye image, and the second image may be a right eye image. The glasses end comprises a first display screen and a second display screen, wherein the first display screen can be a display screen on the left side of the glasses end and is used for displaying a left eye image. The second display screen may be a display screen on the right side of the glasses end for displaying the right-eye image.
By applying the embodiment of the application, the two-dimensional image data of the main control end are obtained; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end; rendering the two-dimensional image data to obtain a first image and a second image which are identical; and sending the first image and the second image to the glasses end so that the first image is displayed on the first display screen of the glasses end and the second image is displayed on the second display screen of the glasses end, thereby improving the image display quality.
In an alternative embodiment, referring to fig. 2, step S20 of rendering two-dimensional image data to obtain identical first and second images includes steps S21 to S22, specifically as follows:
s21: creating a virtual camera and textures, and rendering the two-dimensional image data through the virtual camera to obtain a rendered image.
In embodiments of the present application, textures may be created from predefined texture width information and height information. The texture width information may be the width of the glasses-side display screen, and the texture height information may be the height of the glasses-side display screen. The virtual camera may be created according to preset rendering attributes including sample rate, depth value, color, transparency, etc.
S22: and filling the rendered image into the texture, and copying the filled texture to obtain a first image and a second image which are identical.
In this embodiment of the present application, the texture is used to receive the rendered image produced by the virtual camera; the specific way of copying the filled texture belongs to the prior art and is not described herein again.
By creating a virtual camera and textures, rendering the two-dimensional image data, rendering performance can be improved, and a high-quality first image and a high-quality second image can be obtained.
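Steps S21 to S22 can be sketched as below: a texture sized to the glasses-side display screen receives the virtual camera's render, and copying the filled texture yields the two identical images. The `Texture` class and both function names are invented for illustration; a real implementation would use the rendering engine's own texture objects.

```python
from dataclasses import dataclass

@dataclass
class Texture:
    width: int
    height: int
    pixels: list  # filled later by the virtual camera's render pass

def create_texture(screen_width, screen_height):
    # Texture dimensions follow the glasses-side display screen (S21).
    return Texture(screen_width, screen_height,
                   [[0] * screen_width for _ in range(screen_height)])

def fill_and_copy(texture, rendered_image):
    # Fill the texture with the rendered image, then copy it twice (S22).
    texture.pixels = [row[:] for row in rendered_image]
    return ([row[:] for row in texture.pixels],
            [row[:] for row in texture.pixels])

tex = create_texture(3, 2)  # a 3-wide, 2-high display screen
img1, img2 = fill_and_copy(tex, [[1, 2, 3], [4, 5, 6]])
assert img1 == img2 == [[1, 2, 3], [4, 5, 6]]
```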
In an alternative embodiment, referring to fig. 3, step S30 of transmitting the first image and the second image to the glasses end so that the first image is displayed on the first display screen of the glasses end and the second image is displayed on the second display screen of the glasses end includes steps S31 to S32, specifically as follows:
s31: and acquiring a first display area identifier of the first display screen, a second display area identifier of the second display screen, a first image identifier and a second image identifier.
The first display area may be all display areas of the first display screen, or may be part of display areas of the first display screen. Specifically, the partial region is a circular region, an elliptical region, a square region, or a rectangular region at the geometric center of the first display screen. The first display area identification is used to uniquely mark the first display area.
The second display area may be the entire display area of the second display screen, or may be a partial display area of the second display screen. Specifically, the partial region is a circular region, an elliptical region, a square region, or a rectangular region at the geometric center of the second display screen. The second display area identification is used to uniquely mark the second display area.
S32: according to the mapping relation between the preset display area identification and the image identification, a first image is sent to a first display screen, and a second image is sent to a second display screen, so that the first image is displayed on the first display screen of the glasses end, and the second image is displayed on the second display screen of the glasses end.
One display area identifier corresponds to a unique image identifier in a preset mapping relation between the display area identifier and the image identifier.
In the embodiment of the application, according to the first image identifier of the first image, a corresponding first display area identifier can be obtained, so that the first image is sent to the first display area of the first display screen to be displayed. And according to the second image identification of the second image, a corresponding second display area identification can be obtained, so that the second image is sent to a second display area of a second display screen for display.
Through the mapping relation between the display area identification and the image identification, the image can be accurately displayed in the corresponding display area.
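The one-to-one mapping of steps S31 to S32 can be sketched as a dictionary lookup. The identifier strings and the `route_images` helper are invented for this example; the patent only requires that each display area identifier correspond to a unique image identifier.

```python
# One display-area identifier maps to exactly one image identifier.
AREA_TO_IMAGE = {
    "left_display_area": "first_image",
    "right_display_area": "second_image",
}

def route_images(images):
    """Return {area_id: image} so each screen shows its own image."""
    image_to_area = {img: area for area, img in AREA_TO_IMAGE.items()}
    return {image_to_area[name]: img for name, img in images.items()}

routed = route_images({"first_image": b"L", "second_image": b"R"})
assert routed == {"left_display_area": b"L", "right_display_area": b"R"}
```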
In an alternative embodiment, referring to fig. 4, the image display method of the smart glasses includes steps S40 to S50, specifically as follows:
s40: acquiring key data of a peripheral handle;
s50: and updating the two-dimensional image data of the main control terminal according to the key data.
In the embodiment of the application, the peripheral handle responds to the triggering operation of the user on each function key of the peripheral handle, key data are generated, the key data can be instruction information corresponding to the function keys, and the triggering operation can be a single-click operation, a double-click operation, a sliding operation or the like. For example, in response to a click operation of the function key a by the user, instruction information corresponding to the function key a is generated.
And acquiring key data from the peripheral handle, and updating the two-dimensional image data of the main control terminal according to the key data. Specifically, the key data may be sent to the master control end, and the master control end performs corresponding processing on the two-dimensional image data according to the key data, so as to update the two-dimensional image data. For example, if the key data indicates a page turning operation, the main control terminal obtains and displays the next two-dimensional image.
Through obtaining the key data of the peripheral handle, the interactive operation of the peripheral handle and the main control terminal can be realized.
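The page-turning example for steps S40 to S50 can be sketched as follows. The instruction strings and the page list are illustrative; the patent leaves the exact encoding of key data open.

```python
def update_interface(pages, current_index, key_data):
    """Apply handle key data to the master-side 2D interface (S50)."""
    if key_data == "page_forward":
        current_index = min(current_index + 1, len(pages) - 1)
    elif key_data == "page_back":
        current_index = max(current_index - 1, 0)
    # The returned page stands in for the updated 2D image data.
    return current_index, pages[current_index]

pages = ["home", "settings", "game"]
idx, shown = update_interface(pages, 0, "page_forward")
assert (idx, shown) == (1, "settings")
```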
In an alternative embodiment, referring to fig. 5, the two-dimensional image data includes a virtual application interface and a virtual object, and step S50 updates the two-dimensional image data of the master according to the key data, including steps S51 to S52, specifically as follows:
s51: and obtaining screen coordinates corresponding to the key data according to the key data and the preset mapping relation between the key data and the screen coordinates.
In this embodiment of the present application, the screen coordinates refer to the coordinate position of the display screen of the main control end, where a function control exists. In particular, the functionality control may be a game icon. Each key data in the preset mapping relation between the key data and the screen coordinates corresponds to one screen coordinate in the display screen of the main control terminal.
S52: and controlling the virtual object to execute corresponding actions according to the key data and the screen coordinates, and updating the two-dimensional image data of the main control terminal according to the actions.
In the embodiment of the present application, the virtual object refers to a virtual character in the virtual application interface. And determining screen coordinates according to the key data, so that the function control at the screen coordinates can be triggered to control the virtual character in the virtual application picture to execute corresponding actions. For example, the virtual character is controlled to move forward, to move right, or to release skills, etc. And continuously acquiring the two-dimensional image of the corresponding frame in the process of executing the corresponding action by the virtual character in the virtual application picture so as to update the two-dimensional image data of the main control terminal.
Through the mapping relation between the key data and the screen coordinates, the interactive operation between the peripheral handle and the main control terminal can be automatically and quickly realized.
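Steps S51 to S52 chain two lookups: key data → screen coordinate of a function control → action of the virtual object. The coordinates and action names below are invented for illustration; only the two-stage mapping itself comes from the patent.

```python
# Each key maps to the screen coordinate of one function control;
# coordinates and action names here are hypothetical.
KEY_TO_COORD = {"A": (120, 480), "B": (360, 480)}
COORD_TO_ACTION = {(120, 480): "move_forward", (360, 480): "release_skill"}

def handle_key(key):
    """Resolve a handle key press to the virtual object's action."""
    coord = KEY_TO_COORD[key]      # S51: key data -> screen coordinate
    return COORD_TO_ACTION[coord]  # S52: trigger the control there

assert handle_key("A") == "move_forward"
assert handle_key("B") == "release_skill"
```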
Example 2
The following are device embodiments of the present application that may be used to perform the method of embodiment 1 of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method in embodiment 1 of the present application.
Fig. 6 is a schematic structural diagram of an image display device of the smart glasses according to an embodiment of the present application. The image display device 6 of the smart glasses provided in the embodiment of the present application includes:
an image data acquisition module 61, configured to acquire two-dimensional image data of a master control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end;
an image obtaining module 62, configured to render two-dimensional image data to obtain a first image and a second image that are identical;
the image display module 63 is configured to send the first image and the second image to the glasses end, so that the first image is displayed on the first display screen of the glasses end, and the second image is displayed on the second display screen of the glasses end.
Optionally, the image obtaining module includes:
the image rendering unit is used for creating a virtual camera and textures, and rendering the two-dimensional image data through the virtual camera to obtain a rendered image;
and the texture filling unit is used for filling the rendering image into textures, and copying the filled textures to obtain a first image and a second image which are identical.
Optionally, the image display module includes:
the area identification acquisition unit is used for acquiring a first display area identification of the first display screen, a second display area identification of the second display screen, a first image identification and a second image identification;
the image sending unit is used for sending the first image to the first display screen and sending the second image to the second display screen according to the preset mapping relation between the display area identifier and the image identifier, so that the first image is displayed on the first display screen of the glasses end, and the second image is displayed on the second display screen of the glasses end.
Optionally, the image display device of the smart glasses includes:
the key data acquisition module is used for acquiring key data of the peripheral handle;
and the image data updating module is used for updating the two-dimensional image data of the main control terminal according to the key data.
Optionally, the image data updating module includes:
the screen coordinate obtaining unit is used for obtaining screen coordinates corresponding to the key data according to the key data and the preset mapping relation between the key data and the screen coordinates;
and the action control unit is used for controlling the virtual object to execute corresponding actions according to the key data and the screen coordinates, and updating the two-dimensional image data of the main control terminal according to the actions.
By applying the embodiment of the application, the two-dimensional image data of the main control end are obtained; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end; rendering the two-dimensional image data to obtain a first image and a second image which are identical; and sending the first image and the second image to the glasses end so that the first image is displayed on the first display screen of the glasses end and the second image is displayed on the second display screen of the glasses end, thereby improving the image display quality.
Example 3
The following are device embodiments of the present application that may be used to perform the method of embodiment 1 of the present application. For details not disclosed in the apparatus embodiments of the present application, please refer to the method in embodiment 1 of the present application.
Referring to fig. 7, the present application further provides an electronic device 300, which may be specifically a computer, a mobile phone, a tablet computer, an interactive tablet, and the like, in an exemplary embodiment of the present application, the electronic device 300 is an interactive tablet, and the interactive tablet may include: at least one processor 301, at least one memory 302, at least one display, at least one network interface 303, a user interface 304, and at least one communication bus 305.
The user interface 304 is mainly used for providing an input interface for a user, and acquiring data input by the user. Optionally, the user interface may also include a standard wired interface, a wireless interface.
The network interface 303 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Wherein a communication bus 305 is used to enable connected communications between these components.
Wherein the processor 301 may include one or more processing cores. The processor uses various interfaces and lines to connect the various parts of the overall electronic device, and performs the various functions of the electronic device and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory and invoking the data stored in the memory. Alternatively, the processor may be implemented in hardware in at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, etc. The CPU mainly handles the operating system, the user interface, application programs and the like; the GPU is used for rendering and drawing the content to be displayed by the display layer; the modem is used to handle wireless communications. It will be appreciated that the modem may also not be integrated into the processor and may instead be implemented by a separate chip.
The Memory 302 may include a random access Memory (Random Access Memory, RAM) or a Read-Only Memory (Read-Only Memory). Optionally, the memory includes a non-transitory computer readable medium (non-transitory computer-readable storage medium). The memory may be used to store instructions, programs, code sets, or instruction sets. The memory may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described respective method embodiments, etc.; the storage data area may store data or the like referred to in the above respective method embodiments. The memory may optionally also be at least one storage device located remotely from the aforementioned processor. The memory as a computer storage medium may include an operating system, a network communication module, a user interface module, and an operating application program.
The processor may be configured to call the application program of the image display method of the smart glasses stored in the memory, and specifically execute the method steps of the foregoing embodiment 1; the specific execution process may refer to the description in embodiment 1 and is not repeated here.
Example 4
The present application further provides a computer-readable storage medium on which a computer program is stored; the instructions therein are adapted to be loaded by a processor and execute the method steps of embodiment 1 above, and the specific execution process may refer to the description in embodiment 1, which is not repeated herein. The storage medium may be provided in an electronic device such as a personal computer, a notebook computer, a smart phone, or a tablet computer.
For the device embodiments, reference is made to the description of the method embodiments for the relevant points, since they essentially correspond to the method embodiments. The above-described apparatus embodiments are merely illustrative, in which components illustrated as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit it. Those skilled in the art may make various modifications and changes to the present application. Any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.

Claims (10)

1. An image display method for smart glasses, wherein the smart glasses comprise a main control end and a glasses end; the method comprises the following steps:
acquiring two-dimensional image data of the main control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end;
rendering the two-dimensional image data to obtain a first image and a second image which are identical;
and sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end.
2. The image display method of the smart glasses according to claim 1, wherein:
the step of rendering the two-dimensional image data to obtain a first image and a second image which are identical, includes:
creating a virtual camera and textures, and rendering the two-dimensional image data through the virtual camera to obtain a rendered image;
and filling the rendering image into the texture, and copying the filled texture to obtain a first image and a second image which are identical.
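The render-once, copy-once pipeline of claims 1-2 can be pictured as: a virtual camera samples the 2D interface into a texture, and the filled texture is duplicated so that both display screens receive identical frames. A minimal Python sketch under stated assumptions, with the virtual-camera render pass replaced by a hypothetical identity sampler and frames represented as pixel-row lists (the function names and data layout are illustrative, not from the patent):

```python
import copy

def render_with_virtual_camera(two_d_image_data):
    # Hypothetical stand-in for the virtual-camera render pass:
    # it simply samples the 2D interface data into a fresh buffer.
    return [row[:] for row in two_d_image_data]

def obtain_identical_images(two_d_image_data):
    """Render the 2D interface once, fill the result into a texture,
    then copy the filled texture to obtain two identical images."""
    texture = render_with_virtual_camera(two_d_image_data)  # fill the texture
    first_image = texture
    second_image = copy.deepcopy(texture)  # independent copy for the second screen
    return first_image, second_image

frame = [[10, 20], [30, 40]]  # toy 2x2 "interface" frame
left, right = obtain_identical_images(frame)
```

Rendering once and duplicating the texture, rather than rendering twice, is what lets a flat 2D interface drive both screens of a binocular display at the cost of a single render pass.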
3. The image display method of the smart glasses according to claim 1, wherein:
the step of sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end includes:
acquiring a first display area identifier of the first display screen, a second display area identifier of the second display screen, a first image identifier and a second image identifier;
according to a preset mapping relation between display area identifiers and image identifiers, sending the first image to the first display screen and the second image to the second display screen, so that the first image is displayed on the first display screen of the glasses end and the second image is displayed on the second display screen of the glasses end.
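The routing step of claim 3 amounts to a lookup: each display-area identifier is preset to an image identifier, and every frame is dispatched by consulting that mapping. A hedged Python sketch; the identifier strings and the dictionary shape of the mapping are invented for illustration:

```python
def dispatch_to_screens(first_image, second_image, area_to_image):
    """Route each image to the display screen whose area identifier
    maps to that image's identifier (claim 3, sketched)."""
    images = {"image_1": first_image, "image_2": second_image}
    # One frame per screen, routed purely by the preset mapping.
    return {area: images[img_id] for area, img_id in area_to_image.items()}

# Hypothetical preset mapping: the left screen shows image_1, the right image_2.
mapping = {"screen_left": "image_1", "screen_right": "image_2"}
screens = dispatch_to_screens("frame-A", "frame-A", mapping)
```

Because the association is data rather than code, swapping which screen shows which image only requires changing the preset mapping, not the dispatch logic.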
4. The image display method of smart glasses according to claim 1, further comprising:
acquiring key data of a peripheral handle;
and updating the two-dimensional image data of the main control end according to the key data.
5. The image display method of smart glasses according to claim 4, wherein:
the two-dimensional image data comprises a virtual application interface and a virtual object;
the step of updating the two-dimensional image data of the main control end according to the key data comprises the following steps:
obtaining screen coordinates corresponding to the key data according to the key data and a preset mapping relation between key data and screen coordinates;
and controlling the virtual object to execute a corresponding action according to the key data and the screen coordinates, and updating the two-dimensional image data of the main control end according to the action.
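The update loop of claims 4-5 can be sketched as: look up the handle key in a preset key-to-screen-coordinate mapping, move the virtual object accordingly, and treat the new state as the updated 2D image data. In this Python sketch, the key names, coordinate deltas, and state layout are all assumptions made for illustration:

```python
# Hypothetical preset mapping from handle keys to screen-coordinate deltas.
KEY_TO_SCREEN = {"KEY_LEFT": (-10, 0), "KEY_RIGHT": (10, 0)}

def apply_key(state, key_data):
    """Move the virtual object by the key's mapped screen delta and
    return the updated interface state (claims 4-5, sketched)."""
    dx, dy = KEY_TO_SCREEN[key_data]
    x, y = state["virtual_object"]
    # Return a new state dict rather than mutating in place, so the
    # previous frame's 2D image data stays intact until re-rendered.
    return dict(state, virtual_object=(x + dx, y + dy))

state = {"virtual_object": (100, 50)}
state = apply_key(state, "KEY_RIGHT")
```

After each such update, re-running the render-and-duplicate step from claim 2 on the new state would refresh both display screens with the moved object.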
6. An image display device for smart glasses, comprising:
the image data acquisition module is used for acquiring the two-dimensional image data of the main control end; the two-dimensional image data are used for indicating a two-dimensional interface of the main control end;
the image obtaining module is used for rendering the two-dimensional image data to obtain a first image and a second image which are identical;
and the image display module is used for sending the first image and the second image to the glasses end so that the first image is displayed on a first display screen of the glasses end and the second image is displayed on a second display screen of the glasses end.
7. The image display device of smart glasses according to claim 6, wherein:
the image acquisition module includes:
an image rendering unit for creating a virtual camera and textures, and rendering the two-dimensional image data by the virtual camera to obtain a rendered image;
and the texture filling unit is used for filling the rendering image into the texture, and copying the filled texture to obtain a first image and a second image which are identical.
8. The image display device of smart glasses according to claim 7, wherein:
the image display module includes:
the area identifier obtaining unit is used for obtaining a first display area identifier of the first display screen, a second display area identifier of the second display screen, a first image identifier and a second image identifier;
the image sending unit is used for sending the first image to the first display screen and sending the second image to the second display screen according to a preset mapping relation between the display area identifier and the image identifier, so that the first image is displayed on the first display screen of the glasses end, and the second image is displayed on the second display screen of the glasses end.
9. An electronic device, comprising: a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 5.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 5.
CN202211530724.8A 2022-12-01 2022-12-01 Image display method and device of intelligent glasses, electronic equipment and storage medium Pending CN116016892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211530724.8A CN116016892A (en) 2022-12-01 2022-12-01 Image display method and device of intelligent glasses, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116016892A 2023-04-25

Family

ID=86018194




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination