CN111464781A - Image display method, image display device, storage medium, and electronic apparatus - Google Patents

Image display method, image display device, storage medium, and electronic apparatus Download PDF

Info

Publication number
CN111464781A
CN111464781A (Application CN202010214797.0A)
Authority
CN
China
Prior art keywords
augmented reality
camera
image
reality glasses
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010214797.0A
Other languages
Chinese (zh)
Inventor
杜鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010214797.0A priority Critical patent/CN111464781A/en
Publication of CN111464781A publication Critical patent/CN111464781A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The disclosure provides an image display method, an image display device, a storage medium, and an electronic device, and relates to the technical field of virtual reality and augmented reality. The method is applied to augmented reality glasses that include a camera, and includes the following steps: acquiring an image of a real scene captured by the camera; and displaying the image in a display area of the augmented reality glasses; wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses. The method and the device enable the user to observe the current environment from multiple viewing angles, providing a good use experience.

Description

Image display method, image display device, storage medium, and electronic apparatus
Technical Field
The present disclosure relates to the field of virtual reality and augmented reality technologies, and in particular, to an image display method, an image display apparatus, a computer-readable storage medium, and an electronic device.
Background
In daily life, people often need to observe the environment in multiple directions, such as the front, the left side, or the right side. For example, when riding, skiing, or running, people need to keep track of the surrounding environment while watching the road ahead. However, since the viewing angle of the human eye is limited, the environment at other viewing angles often goes unnoticed when looking ahead. Therefore, it is usually necessary to turn the head, shift the line of sight, or confirm the environmental information at other viewing angles with auxiliary tools such as a rearview mirror. Even so, the user may remain unaware of obstacles in the blind area of the viewing angle, which can lead to dangerous situations and serious safety hazards.
Therefore, how to enable the user to effectively acquire the surrounding environment information is a problem to be solved urgently in the prior art.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure provides an image display method, an image display apparatus, a computer-readable storage medium, and an electronic device, thereby alleviating, at least to a certain extent, the problem that a user cannot timely and accurately acquire surrounding environment information in some scenarios.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to a first aspect of the present disclosure, there is provided an image display method applied to augmented reality glasses, the augmented reality glasses including a camera, the method including: acquiring an image about a real scene acquired by the camera; displaying the image in a display area of the augmented reality glasses; wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses.
According to a second aspect of the present disclosure, there is provided an image display device applied to augmented reality glasses, the augmented reality glasses including a camera, the device including: an image acquisition module for acquiring images about a real scene acquired by the camera; an image display module to display the image in a display area of the augmented reality glasses; wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described image display method.
According to a fourth aspect of the present disclosure, there is provided an augmented reality device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the image display method described above via execution of the executable instructions.
The technical solution of the present disclosure has the following beneficial effects:
according to the image display method, the image display device, the computer-readable storage medium, and the electronic device, an image of the real scene captured by the camera is acquired and displayed in the display area of the augmented reality glasses, where the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses. On the one hand, because the camera arranged on the augmented reality device faces a direction different from that of the lenses, the user can view the environment ahead and, at the same time, confirm the environment within other viewing-angle ranges without turning the head or shifting the line of sight; this avoids the safety hazards caused by viewing-angle blind zones and improves the user's safety. On the other hand, this exemplary embodiment combines the augmented reality glasses with a camera, so that the user can obtain images of the surrounding environment simply by wearing the glasses, without any other auxiliary tools; the structure is simple, the operation is straightforward, and the user experience is better.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 shows a schematic diagram of an electronic device of the present exemplary embodiment;
FIG. 2 shows a schematic diagram of augmented reality glasses in this exemplary embodiment;
fig. 3 shows a flowchart of an image display method of the present exemplary embodiment;
FIG. 4 shows a schematic view of another augmented reality glasses of the present exemplary embodiment;
fig. 5 shows a sub-flowchart of an image display method of the present exemplary embodiment;
fig. 6 shows a block diagram of a structure of an image display device of the present exemplary embodiment.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
An exemplary embodiment of the present disclosure provides an augmented reality device. The following describes, by way of example, the internal unit structure of the augmented reality glasses 100 shown in fig. 1 and 2. Those skilled in the art will appreciate that the augmented reality glasses 100 may include more or fewer components than shown, or combine certain components, or split certain components, or adopt a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the various components is shown schematically and does not constitute a structural limitation on the augmented reality glasses 100. In other embodiments, the augmented reality glasses 100 may also adopt an interfacing manner different from that in fig. 1, or a combination of multiple interfacing manners.
As shown in fig. 1, the augmented reality glasses 100 may specifically include a storage unit 110, a processing unit 120, and a display unit 130.
The storage unit 110 is used for storing executable instructions, and may store, for example, operating system code, program code, and data generated during program execution, such as image data captured by the image capturing unit 140 and user data in apps. Referring to fig. 2, the storage unit 110 may be disposed in the frame body between the two lenses, or at other positions. The storage unit 110 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
The processing unit 120 may include a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), an Application Processor (AP), a modem processor, an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural Network Processor (NPU), and the like. The different processors may be implemented as separate units or may be integrated in one processing unit. Referring to fig. 2, the processing unit 120 may be disposed in the frame body between the two lenses, or at other positions. The processing unit 120 may execute the executable instructions stored in the storage unit 110 to run the corresponding programs.
In an alternative embodiment, as shown in fig. 1, the display unit 130 may include a micro display (Display) 1301 and a lens (Lens) 1302. The micro display 1301 provides the display content, and may be a self-luminous active device such as a light-emitting diode panel, or a liquid crystal display illuminated by an external light source; the lens 1302 allows the human eye to view the real scene, so that the real scene image and the virtual image are superimposed.
In an alternative embodiment, the augmented reality glasses 100 may further include an image capturing unit 140, which is composed of a lens, a photosensitive element, and the like. Referring to fig. 2, it may be located between the two lenses; when the user wears the augmented reality glasses 100, the image capturing unit 140 faces the front of the user and can capture still images or videos of the scene ahead. For example, when the user performs a gesture operation in front, the image capturing unit 140 can capture a gesture image of the user. Further, as shown in fig. 1, the image capturing unit 140 may include a depth camera 1401, which may be, for example, a TOF (Time of Flight) camera or a binocular camera, and can detect the depth information (i.e., the axial distance from the augmented reality glasses 100) of each part or each object in the scene image, so as to obtain richer image information; for example, after a gesture image is captured, accurate gesture recognition can be performed according to the depth information of the gesture.
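As a toy illustration of how the depth information from a depth camera such as 1401 might assist gesture recognition, the sketch below keeps only pixels whose depth falls inside an assumed hand-distance band in front of the glasses; the function name, the thresholds, and the use of NumPy are assumptions for illustration, not the algorithm of this disclosure.

    import numpy as np

    def hand_mask_from_depth(depth_m, near=0.2, far=0.8):
        # Candidate hand pixels: depth (in metres) within a near band in
        # front of the glasses; a gesture classifier could run on this mask.
        return (depth_m > near) & (depth_m < far)

    depth = np.random.uniform(0.1, 3.0, size=(480, 640)).astype(np.float32)  # fake TOF frame
    mask = hand_mask_from_depth(depth)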
In an alternative embodiment, the augmented reality glasses 100 may further include an audio unit 150 for converting a digital audio signal into an analog audio signal for output, converting an analog audio input into a digital audio signal, and encoding and decoding the audio signal. In some embodiments, the audio unit 150 may be disposed in the processing unit 120, or some functional modules of the audio unit 150 may be disposed in the processing unit 120. As shown in fig. 1, the audio unit 150 may generally include a microphone 1501 and an earphone 1502, the microphone 1501 may be disposed at the bottom of the augmented reality glasses 100 near the mouth of the user, and the earphone 1502 may be disposed at the middle rear end of the temple of the augmented reality glasses 100 near the ear of the user.
In an alternative embodiment, the augmented reality glasses 100 may further include a communication unit 160, which may provide wireless communication solutions including Wireless Local Area Network (WLAN) (such as a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like, so that the augmented reality glasses 100 can connect to the Internet or establish connections with other devices.
In an alternative embodiment, the augmented reality glasses 100 may further include a sensor unit 170, which is composed of different types of sensors for implementing different functions. For example, the pressure sensor 1701 may detect the strength of the user's pressing operation; the infrared sensor 1702 may detect the wearing state of the augmented reality glasses 100 by transmitting and receiving infrared signals; the Hall sensor 1703 may determine the bending state of the augmented reality glasses 100 by detecting whether a Hall effect is generated; and so on.
In an optional embodiment, the augmented reality glasses 100 may further include a USB (Universal Serial Bus) interface 180, which conforms to the USB standard specification and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 180 may be used to connect a charger to charge the augmented reality glasses 100, to connect an earphone to play audio through the earphone, or to connect other electronic devices, such as a computer and its peripherals.
In an alternative embodiment, the augmented reality glasses 100 may further include a charging management unit 190 for receiving a charging input from a charger to charge the battery 1901. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management unit 190 may receive a charging input of a wired charger through the USB interface 180. In some wireless charging embodiments, the charging management unit 190 may receive a wireless charging input through a wireless charging coil of the augmented reality glasses 100. The charging management unit 190 may also provide power to the device while charging the battery 1901.
First, the present exemplary embodiment provides an image display method, which may be applied to augmented reality glasses, where in order to facilitate wearing, the augmented reality glasses may include a glasses frame, a nose pad, and other physical structural components required for wearing, and in addition, the augmented reality glasses may be provided with a camera for acquiring an environment image around the augmented reality glasses. As shown in fig. 2, the augmented reality glasses 210 may be worn on the head of the user, and the augmented reality glasses 210 may be provided with a camera 220, for example, as shown in the figure, the camera 220 is provided on the side of the augmented reality glasses 210. The application scenario of the method of the embodiment may be as follows: when the user wears the augmented reality glasses to perform sports such as running, riding or skiing, the image of the surrounding environment can be displayed on the display interface of the augmented reality glasses, so that the user can better know the surrounding environment and the like.
Fig. 3 shows a flowchart of an image display method in the present exemplary embodiment, which may include the following steps S310 to S320:
step S310, acquiring an image about a real scene acquired by a camera;
step S320 of displaying an image in a display area of the augmented reality glasses;
wherein, the orientation of the camera is different from the orientation of the lens of the augmented reality glasses.
In practical applications, when looking ahead, the user may also need to view the environment to the left, to the right, or in other directions, especially in high-speed sports scenarios such as running, cycling, or skiing. If the user turns the head to observe the environment in other directions, obstacles in front are easily overlooked, which causes danger. Based on this, the present exemplary embodiment provides a camera on the augmented reality glasses, so that the environmental information in each direction can be known from the environment images captured by the camera. When the user wears the augmented reality glasses, the real environment image of the real environment where the user is located can be obtained through the camera on the augmented reality glasses and displayed in the display area of the augmented reality glasses. In application, the present exemplary embodiment may determine, according to user requirements, whether to display an image of the real scene, and which region of the real scene to display. For example, if the user does not need images of the surrounding environment, the camera may be left off; if the user needs images of the surrounding environment, the camera can be started by a specific gesture or by a specific operation on a physical key of the augmented reality glasses; or, if the augmented reality glasses are provided with a plurality of cameras, which camera is started to acquire images of the real scene can be determined according to user requirements.
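A minimal sketch of this acquire-and-display flow is given below, with hypothetical callables standing in for the camera driver, the display renderer, and the gesture or key trigger; none of these names come from the disclosure.

    import numpy as np

    def run_side_view(capture_frame, render_to_display, should_show):
        # One pass of steps S310-S320: if the user has requested the side view
        # (e.g. via a gesture or a physical key), grab a frame from the camera
        # and draw it in the display area of the glasses.
        if not should_show():
            return None
        frame = capture_frame()        # step S310: image of the real scene
        render_to_display(frame)       # step S320: show it in the display area
        return frame

    # Dummy wiring for illustration only:
    dummy_capture = lambda: np.zeros((480, 640, 3), dtype=np.uint8)
    run_side_view(dummy_capture, render_to_display=lambda img: None, should_show=lambda: True)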
Typically, after wearing the augmented reality glasses, the user can see through the lenses the environment within the viewing angle that the lenses face, for example, the area in front of the user's eyes. Since the camera needs to capture environment images at viewing angles different from the one the lenses face, the present exemplary embodiment sets the orientation of the camera to a direction different from that of the lenses of the augmented reality glasses. Specifically, the orientation of the camera may be set according to the structure of the augmented reality glasses and user requirements. For example, when the augmented reality device has a glasses structure, cameras may be arranged on the outer sides of the two temples; when the augmented reality device has a helmet structure, cameras may be arranged around the helmet body, for example at the left side, the right side, the rear side, or any other position. It should be noted that, depending on the application scenario, the augmented reality glasses may be provided with one or more cameras, for example, one camera on each of the left side, the right side, and the rear side of a virtual reality helmet.
In an exemplary embodiment, according to different requirements, the augmented reality glasses may be provided with cameras of multiple types. For example, when high environmental detail is required, a high-resolution camera may be used; when lower detail suffices, a low-resolution camera may be used. In addition, any one or more of a wide-angle camera, a telephoto camera, or a macro camera may be employed.
The present exemplary embodiment can determine which kind of camera to adopt by considering the specific application scenario. For example, in a relatively open environment where a wide field of view or rich content needs to be captured and a large viewing range is required, a wide-angle camera can be adopted; when the user moves fast and needs to know the more distant environment in time, obtaining a magnified image, a telephoto camera can be adopted; in a narrow space where the close-range environment needs to be observed carefully, a macro camera can be adopted; and so on. It should be noted that, based on actual application requirements, several of the above different types of cameras may also be combined, so that the augmented reality glasses can be applied to more application scenarios, which is not specifically limited by the present disclosure.
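The scenario-to-camera mapping described above could be expressed, purely as an illustrative sketch, with a rule like the following; the input signals, thresholds, and labels are assumptions rather than anything specified in this disclosure.

    def pick_camera_type(scene_openness, user_speed_mps, space_is_narrow):
        # Illustrative selection rule following the examples above.
        if space_is_narrow:
            return "macro"       # careful observation of the close-range environment
        if user_speed_mps > 5.0:
            return "telephoto"   # fast movement: see the distant environment, magnified
        if scene_openness > 0.7:
            return "wide_angle"  # open environment: large viewing range
        return "standard"

    print(pick_camera_type(scene_openness=0.9, user_speed_mps=2.0, space_is_narrow=False))  # wide_angle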
In an exemplary embodiment, the step S320 may include:
displaying the image in a preset sub-region of the display area;
wherein the preset sub-region is smaller than the lens area of the augmented reality glasses.
In practical applications, a user wearing augmented reality glasses provided with a camera can see two kinds of images: one is the real environment seen directly through the lenses of the augmented reality glasses; the other is the image captured and processed by the camera and then displayed in the display area, that is, the captured image of the real environment in this exemplary embodiment. Since the display area is generally located within the lens area, displaying the captured image of the real environment at an excessively large size may interfere with the user's direct view of the real environment through the lenses. Based on this, in the present exemplary embodiment, a preset sub-region may be defined in the display area, and the captured image of the real environment is displayed in the preset sub-region, where the preset sub-region is smaller than the lens area of the augmented reality glasses. In addition, according to actual needs, transparency processing can be performed on the captured image so that the image presented in the display area has a certain transparency; the captured image can then be appropriately enlarged without greatly affecting what the user sees through the lenses, providing a good use experience.
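The following is a minimal sketch of displaying the captured image in a preset sub-region with partial transparency, using plain NumPy alpha blending; the buffer shapes, region coordinates, and blending factor are assumptions, and a real implementation would render through the glasses' display pipeline instead.

    import numpy as np

    def composite_into_subregion(display_rgb, camera_rgb, top_left, size, alpha=0.6):
        # Blend the captured real-scene image into a preset sub-region of the
        # display buffer; the sub-region is smaller than the lens area, and
        # alpha < 1 keeps the see-through view visible.
        y, x = top_left
        h, w = size
        patch = camera_rgb[:h, :w].astype(np.float32)          # placeholder for a proper resize
        region = display_rgb[y:y+h, x:x+w].astype(np.float32)
        display_rgb[y:y+h, x:x+w] = (alpha * patch + (1 - alpha) * region).astype(np.uint8)
        return display_rgb

    display = np.zeros((720, 1280, 3), dtype=np.uint8)       # whole display area
    side_view = np.full((480, 640, 3), 128, dtype=np.uint8)  # image from the side camera
    composite_into_subregion(display, side_view, top_left=(40, 880), size=(240, 320))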
In summary, in the present exemplary embodiment, an image of the real scene captured by the camera is acquired and displayed in the display area of the augmented reality glasses, where the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses. On the one hand, because the camera arranged on the augmented reality device faces a direction different from that of the lenses, the user can view the environment ahead and, at the same time, confirm the environment within other viewing-angle ranges without turning the head or shifting the line of sight; this avoids the safety hazards caused by viewing-angle blind zones and improves the user's safety. On the other hand, this exemplary embodiment combines the augmented reality glasses with a camera, so that the user can obtain images of the surrounding environment simply by wearing the glasses, without any other auxiliary tools; the structure is simple, the operation is straightforward, and the user experience is better.
In an exemplary embodiment, the augmented reality glasses include a first camera and a second camera, the first camera is disposed on a first side of the augmented reality glasses and faces the first side, and the second camera is disposed on a second side of the augmented reality glasses and faces the second side.
In practical applications, the augmented reality glasses may generally include two lens bodies (i.e., two lenses). As shown in fig. 4, the augmented reality glasses include a first lens body 410 and a second lens body 420, corresponding to the right eye and the left eye of the user, respectively; through the first lens body 410 and the second lens body 420, the user can see the environment within a certain viewing-angle region directly ahead (a region 411 corresponding to the first lens body 410 and a region 421 corresponding to the second lens body 420). In the present exemplary embodiment, the first side may refer to the side of the first lens body 410; for example, a first camera is disposed on the outer side 430 of the right temple of the augmented reality glasses, facing the first side, for acquiring an environment image of the right region 431. The second side may refer to the side of the second lens body 420; for example, a second camera is disposed on the outer side 440 of the left temple of the augmented reality glasses, facing the second side, for acquiring an environment image of the left region 441. It should be noted that the above is only an illustrative example, and the first lens body may also be the one corresponding to the left eye, which is not limited in the present disclosure. In addition, although the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses, the environment image acquired by a camera may partially coincide with the environment viewed by the user through the lenses; for example, the region 411 may overlap with both the region 421 and the region 431, and the region 421 may overlap with both the region 411 and the region 441. To ensure that the images acquired by the cameras and the view seen by the human eyes through the lenses together cover the surrounding environment completely with little overlap, the present exemplary embodiment may further determine the mounting position of each camera precisely from the viewing-angle range of the human eyes and the imaging range of the camera. For example, if the binocular horizontal field angle of the human eyes is 120°, the field angle of each of the left and right cameras should be not less than 60°; the relation can be expressed as: camera field angle ≥ 180° − binocular horizontal field angle. The position and field angle of the camera can then be determined on this basis.
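A worked restatement of the field-angle relation above (assuming the remaining angle is split symmetrically between the left and right cameras):

    def min_side_camera_fov(binocular_horizontal_fov_deg):
        # Each side camera's field angle should be at least
        # 180 degrees minus the binocular horizontal field angle.
        return 180.0 - binocular_horizontal_fov_deg

    print(min_side_camera_fov(120.0))   # -> 60.0, matching the 120-degree example above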
In an exemplary embodiment, the step S310 may include:
acquiring a first image captured by the first camera and a second image captured by the second camera.
For example, as shown in fig. 4, if the camera disposed at 430 is taken as the first camera and the camera disposed at 440 as the second camera, the image captured of the region 431 is the first image, and the image captured of the region 441 is the second image.
In an exemplary embodiment, the step S320 may include the following steps:
displaying the first image in a first preset sub-region of the display area, and displaying the second image in a second preset sub-region of the display area;
wherein the first preset sub-region and the second preset sub-region are smaller than the lens area of the augmented reality glasses.
The first image and the second image acquired by the cameras on the two sides can be displayed in different sub-regions of the display interface of the augmented reality device, that is, the first image is displayed in the first preset sub-region and the second image in the second preset sub-region. The positions of the first and second preset sub-regions can be set by the system or customized by the user according to the positions and orientations of the cameras. For example, when the first camera is arranged on the right side of the augmented reality glasses and the second camera on the left side, the acquired first image is a right-side real environment image and the second image a left-side real environment image; in the display area, the first preset sub-region can then be placed to the right of the second preset sub-region, so that the user can easily judge the direction.
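A small sketch of this side-to-sub-region mapping follows; the region coordinates are illustrative assumptions, and each (image, region) pair could then be drawn with a compositing routine like the one sketched earlier.

    # First/second preset sub-regions placed so that the right camera's image
    # appears on the right and the left camera's image on the left.
    SUBREGIONS = {
        "right": {"top_left": (40, 880), "size": (240, 320)},  # first preset sub-region
        "left":  {"top_left": (40, 80),  "size": (240, 320)},  # second preset sub-region
    }

    def place_images(first_image, second_image, first_camera_side="right"):
        second_side = "left" if first_camera_side == "right" else "right"
        return [
            (first_image,  SUBREGIONS[first_camera_side]),
            (second_image, SUBREGIONS[second_side]),
        ]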
In an exemplary embodiment, as shown in fig. 5, the step S310 may include the following steps:
step S510, acquiring continuous multi-frame images collected by a camera;
the corresponding step S320 may include:
step S520, displaying the continuous multi-frame images in real time in the display area of the augmented reality glasses.
Considering that the present exemplary embodiment is particularly applicable to quickly and timely understanding the surrounding environment in fast sports scenarios such as running, cycling, and skiing, the camera may capture consecutive multi-frame images of the real environment and display them in the display area of the augmented reality glasses, realizing real-time display of the surroundings. In other words, the camera in the augmented reality device can present the surrounding environment in the form of a video stream. In particular, the frame rate can be customized according to user requirements or the display requirements of the video stream, for example set to no lower than 24 frames per second.
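A simple real-time loop for steps S510-S520 might look like the sketch below; the 24 fps floor follows the text above, while the loop structure, timing method, and parameter names are assumptions.

    import time

    def stream_side_view(capture_frame, render_to_display, fps=24, duration_s=5.0):
        # Display consecutive frames in real time at (at least) the given rate.
        frame_interval = 1.0 / fps
        t_end = time.time() + duration_s
        while time.time() < t_end:
            t0 = time.time()
            render_to_display(capture_frame())   # steps S510 + S520 for one frame
            time.sleep(max(0.0, frame_interval - (time.time() - t0)))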
Exemplary embodiments of the present disclosure also provide an image display device, applied to augmented reality glasses that include a camera. As shown in fig. 6, the image display device 600 may include: an image acquisition module 610 for acquiring images of the real scene captured by the camera; and an image display module 620 for displaying the images in the display area of the augmented reality glasses; wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses.
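Purely as a structural sketch of the image display device 600 of fig. 6, the two modules could be organized as below; the camera.capture() and display_area.show() interfaces are assumptions standing in for the actual hardware drivers.

    class ImageAcquisitionModule:
        # Acquires images of the real scene from the camera (module 610).
        def __init__(self, camera):
            self.camera = camera

        def acquire(self):
            return self.camera.capture()

    class ImageDisplayModule:
        # Displays an image in the display area of the glasses (module 620).
        def __init__(self, display_area):
            self.display_area = display_area

        def display(self, image):
            self.display_area.show(image)

    class ImageDisplayDevice:
        def __init__(self, camera, display_area):
            self.image_acquisition_module = ImageAcquisitionModule(camera)
            self.image_display_module = ImageDisplayModule(display_area)

        def run_once(self):
            self.image_display_module.display(self.image_acquisition_module.acquire())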
In an exemplary embodiment, the image display module includes: a first image display unit for displaying the image in a preset sub-region of the display area; wherein the preset sub-region is smaller than the lens area of the augmented reality glasses.
In an exemplary embodiment, the augmented reality glasses include a first camera and a second camera, the first camera is disposed on a first side of the augmented reality glasses and faces the first side, and the second camera is disposed on a second side of the augmented reality glasses and faces the second side.
In an exemplary embodiment, the image acquisition module includes: and the image acquisition unit is used for acquiring a first image acquired by the first camera and a second image acquired by the second camera.
In an exemplary embodiment, the image display module includes: a second image display unit for displaying the first image in a first preset sub-region of the display area and displaying the second image in a second preset sub-region of the display area; wherein the first preset sub-region and the second preset sub-region are smaller than the lens area of the augmented reality glasses.
In an exemplary embodiment, the image acquisition module includes: the frame image acquisition unit is used for acquiring continuous multi-frame images collected by the camera; the image display module includes: and the frame image display unit is used for displaying continuous multi-frame images in real time in the display area of the augmented reality glasses.
In an exemplary embodiment, the camera includes any one or more of a wide angle camera, a tele camera, or a macro camera.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 or fig. 5 may be performed.
In the present exemplary embodiment, a program product for implementing the above-described method according to an exemplary embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, as well as conventional procedural programming languages, such as the "C" language or similar programming languages.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image display method is applied to augmented reality glasses, and is characterized in that the augmented reality glasses comprise a camera, and the method comprises the following steps:
acquiring an image about a real scene acquired by the camera;
displaying the image in a display area of the augmented reality glasses;
wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses.
2. The method of claim 1, wherein the displaying the image in the display area of the augmented reality glasses comprises:
displaying the image in a preset sub-region of the display area;
wherein the preset sub-region is smaller than a lens area of the augmented reality glasses.
3. The method of claim 1, wherein the augmented reality glasses comprise a first camera and a second camera, the first camera is disposed on a first side of the augmented reality glasses and faces the first side, and the second camera is disposed on a second side of the augmented reality glasses and faces the second side.
4. The method of claim 3, wherein the acquiring images of the real scene captured by the camera comprises:
acquiring a first image captured by the first camera and a second image captured by the second camera.
5. The method of claim 4, wherein the displaying the image in the display area of the augmented reality glasses comprises:
displaying the first image in a first preset sub-region of the display area, and displaying the second image in a second preset sub-region of the display area;
wherein the first preset sub-region and the second preset sub-region are smaller than the lens area of the augmented reality glasses.
6. The method of claim 1, wherein the acquiring images of the real scene captured by the camera comprises:
acquiring continuous multi-frame images collected by the camera;
the displaying the image in a display area of the augmented reality glasses includes:
displaying the continuous multi-frame images in real time in a display area of the augmented reality glasses.
7. The method of claim 1, wherein the camera comprises any one or more of a wide angle camera, a tele camera, or a macro camera.
8. An image display device applied to augmented reality glasses, the augmented reality glasses comprising a camera, the device comprising:
an image acquisition module for acquiring images about a real scene acquired by the camera;
an image display module to display the image in a display area of the augmented reality glasses;
wherein the orientation of the camera is different from the orientation of the lenses of the augmented reality glasses.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. An augmented reality device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.
CN202010214797.0A 2020-03-24 2020-03-24 Image display method, image display device, storage medium, and electronic apparatus Pending CN111464781A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010214797.0A CN111464781A (en) 2020-03-24 2020-03-24 Image display method, image display device, storage medium, and electronic apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010214797.0A CN111464781A (en) 2020-03-24 2020-03-24 Image display method, image display device, storage medium, and electronic apparatus

Publications (1)

Publication Number Publication Date
CN111464781A (en) 2020-07-28

Family

ID=71680872

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010214797.0A Pending CN111464781A (en) 2020-03-24 2020-03-24 Image display method, image display device, storage medium, and electronic apparatus

Country Status (1)

Country Link
CN (1) CN111464781A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113724398A (en) * 2021-09-01 2021-11-30 北京百度网讯科技有限公司 Augmented reality method, apparatus, device and storage medium
CN115955547A (en) * 2022-12-30 2023-04-11 上海梵企光电科技有限公司 Method and system for adjusting camera of XR glasses

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120206322A1 (en) * 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor input triggered user action capture device control of ar eyepiece facility
CN103957383A (en) * 2014-04-22 2014-07-30 中国科学院深圳先进技术研究院 Back vision system based on smart glasses
CN105187707A (en) * 2015-06-11 2015-12-23 广东欧珀移动通信有限公司 Shooting method based on rotary camera and intelligent glasses, and system
CN106843456A (en) * 2016-08-16 2017-06-13 深圳超多维光电子有限公司 A kind of display methods, device and virtual reality device followed the trail of based on attitude
CN107065183A (en) * 2017-03-21 2017-08-18 广东光阵光电科技有限公司 A kind of driving at night enhancing visibility method and Portable glasses formula device
CN108427193A (en) * 2017-02-14 2018-08-21 深圳梦境视觉智能科技有限公司 Augmented reality display system
CN208297844U (en) * 2018-06-29 2018-12-28 广州市和佳电子科技有限公司 A kind of AR glasses
CN110109256A (en) * 2019-06-24 2019-08-09 京东方科技集团股份有限公司 Glasses and its control method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120206322A1 (en) * 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor input triggered user action capture device control of ar eyepiece facility
CN103957383A (en) * 2014-04-22 2014-07-30 中国科学院深圳先进技术研究院 Back vision system based on smart glasses
CN105187707A (en) * 2015-06-11 2015-12-23 广东欧珀移动通信有限公司 Shooting method based on rotary camera and intelligent glasses, and system
CN108366206A (en) * 2015-06-11 2018-08-03 广东欧珀移动通信有限公司 A kind of image pickup method and system based on rotating camera and intelligent glasses
CN106843456A (en) * 2016-08-16 2017-06-13 深圳超多维光电子有限公司 A kind of display methods, device and virtual reality device followed the trail of based on attitude
CN108427193A (en) * 2017-02-14 2018-08-21 深圳梦境视觉智能科技有限公司 Augmented reality display system
CN107065183A (en) * 2017-03-21 2017-08-18 广东光阵光电科技有限公司 A kind of driving at night enhancing visibility method and Portable glasses formula device
CN208297844U (en) * 2018-06-29 2018-12-28 广州市和佳电子科技有限公司 A kind of AR glasses
CN110109256A (en) * 2019-06-24 2019-08-09 京东方科技集团股份有限公司 Glasses and its control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
上海浦江教育出版社: "《中国物流科技发展报告(2012-2013)》", 30 November 2013, 上海浦江教育出版社 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113724398A (en) * 2021-09-01 2021-11-30 北京百度网讯科技有限公司 Augmented reality method, apparatus, device and storage medium
CN115955547A (en) * 2022-12-30 2023-04-11 上海梵企光电科技有限公司 Method and system for adjusting camera of XR glasses
CN115955547B (en) * 2022-12-30 2023-06-30 上海梵企光电科技有限公司 Camera adjustment method and system for XR glasses

Similar Documents

Publication Publication Date Title
CN109582141B (en) Method for controlling display screen according to eyeball focus and head-mounted electronic equipment
AU2020250124B2 (en) Image processing method and head mounted display device
US11024083B2 (en) Server, user terminal device, and control method therefor
WO2016157677A1 (en) Information processing device, information processing method, and program
US20150331486A1 (en) Image processing device, image processing method and program
WO2017118309A1 (en) Closed wearable panoramic image-capturing and processing system, and operation method therefor
KR102404790B1 (en) Method and apparatus for changing focus of camera
CN110998666B (en) Information processing device, information processing method, and program
KR20200051591A (en) Information processing apparatus, information processing method, and program
CN111710050A (en) Image processing method and device for virtual reality equipment
CN111291338B (en) User identification method, user identification device, storage medium and head-mounted device
US11327317B2 (en) Information processing apparatus and information processing method
CN111464781A (en) Image display method, image display device, storage medium, and electronic apparatus
CN111479148B (en) Wearable device, glasses terminal, processing terminal, data interaction method and medium
JP6259544B2 (en) Method and apparatus for displaying framing information
JP2023502552A (en) WEARABLE DEVICE, INTELLIGENT GUIDE METHOD AND APPARATUS, GUIDE SYSTEM, STORAGE MEDIUM
JP7400721B2 (en) Information processing device, information processing method and program
CN104239877B (en) The method and image capture device of image procossing
CN213876195U (en) Glasses frame and intelligent navigation glasses
CN111736692B (en) Display method, display device, storage medium and head-mounted device
CN107958478B (en) Rendering method of object in virtual reality scene and virtual reality head-mounted equipment
CN113686350A (en) Road information display method and device and intelligent wearable equipment
WO2019113935A1 (en) Closed wearable panoramic image capturing and processing system and operating method therefor
KR20150039352A (en) Electronic device and control method thereof
US20170155892A1 (en) Wearable stereoscopic camera system for 3d virtual reality imaging and networked area learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200728

RJ01 Rejection of invention patent application after publication