CN109831662B - Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium - Google Patents


Info

Publication number
CN109831662B
CN109831662B (application CN201910223256.1A)
Authority
CN
China
Prior art keywords
display
glasses
screen
interface
virtual
Prior art date
Legal status
Active
Application number
CN201910223256.1A
Other languages
Chinese (zh)
Other versions
CN109831662A (en)
Inventor
张复尧
Current Assignee
Yutou Technology Hangzhou Co Ltd
Original Assignee
Yutou Technology Hangzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Yutou Technology Hangzhou Co Ltd filed Critical Yutou Technology Hangzhou Co Ltd
Priority to CN201910223256.1A priority Critical patent/CN109831662B/en
Publication of CN109831662A publication Critical patent/CN109831662A/en
Priority to PCT/CN2020/079222 priority patent/WO2020192451A1/en
Application granted granted Critical
Publication of CN109831662B publication Critical patent/CN109831662B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/4104 Peripherals receiving signals from specially adapted client devices
    • H04N21/4122 Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0141 Head-up displays characterised by optical features characterised by the informative content of the display

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a real-time picture projection method, device, controller and medium for an AR (augmented reality) glasses screen. The method comprises: detecting whether the AR glasses are connected to an external extended display; if so, acquiring a camera picture through a camera arranged on the AR glasses and acquiring a virtual UI from the main screen of the AR glasses; performing picture synthesis based on the camera picture and the virtual UI, generating a synthesized result picture and sending it to the extended display; and displaying differentiated content on the extended display and the main screen of the AR glasses. Because the AR glasses screen and the extended display show differentiated content, the glasses main screen never has to display the camera picture, and no additional server is required to realize the projection. Development cost is thereby reduced, rapid deployment becomes possible, and user experience is improved.

Description

Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium
Technical Field
The invention relates to the technical field of augmented reality, and in particular to a real-time picture projection method and device for an AR (augmented reality) glasses screen, a controller and a medium.
Background
In recent years, augmented reality (AR) technology has developed rapidly and AR products have gradually appeared, but most products remain oriented toward professional or high-end users, and many people have difficulty understanding or ever encountering them. AR glasses can therefore be used to project the content presented on their screen — the real-time picture of the AR glasses screen — onto an extended display, so as to publicize AR technology and AR glasses products. Here, the extended display is an external display connected to the AR glasses.
The content presented on an AR glasses screen superimposes a virtual user interface (UI) in the glasses screen onto the real field-of-view content of the wearer. Because the AR glasses screen is usually transparent or semi-transparent, the screen is only responsible for displaying the virtual UI content, while the real field-of-view scene serves as the background of the virtual UI. Consequently, the final result picture projected by the AR glasses is a composite of the instantaneous scene picture captured by the AR glasses camera and the virtual UI presented on the glasses screen, where the scene picture captured by the camera coincides with the wearer's real field-of-view content and is referred to below simply as the camera picture.
The prior art is generally realized by adopting the following two technical schemes:
(1) Screen mirroring: the camera picture of the android AR glasses (AR glasses running the Android system) is imported into the glasses main display, shown there together with the virtual UI content, and the resulting picture is output to the extended display through screen mirroring. However, projecting the real-time picture through a pure screen mirror forces the glasses wearer to see the camera feed on the glasses screen whenever the screen picture is being projected, and also prevents the virtual UI of the AR glasses from being accurately registered against the real scene, resulting in poor user experience.
(2) Background-server synthesis: the AR glasses simultaneously send two video streams to a background video-stream server, one carrying the camera picture of the AR glasses and the other a mirror of the virtual UI on the glasses screen. The back-end server combines the two streams into a single picture and sends it to the extended display. This avoids the degraded wearer experience caused by pure screen mirroring, but the synthesis requires a back-end video-stream server and introduces delay, and displaying the result requires an additional terminal with an independent computing unit to decode the stream for the extended display. The scheme therefore depends on intermediate equipment, adds an extra burden to the development and deployment of the supporting software, and is costly, making it unsuitable for lightweight, localized rapid deployment. Moreover, most video-stream services rely on public-network and wireless-network support, while support for small local area networks, single-point Wi-Fi Direct connections, and even wired transmission is extremely limited.
Disclosure of Invention
The invention aims to provide a real-time picture projection method, device, controller and medium for an AR (augmented reality) glasses screen.
In order to solve the above technical problem, according to a first embodiment of the present invention, there is provided a real-time picture projection method for an AR glasses screen, including:
detecting whether the AR glasses are connected with an external extended display;
if yes, acquiring a camera picture through a camera arranged on the AR glasses, and acquiring a virtual UI through a main screen of the AR glasses;
performing picture synthesis based on the camera picture and the virtual UI, generating a synthesis result picture and sending the synthesis result picture to the extended display;
and displaying differentiated content on the extended display and the main screen of the AR glasses.
Further, detecting whether the AR glasses are connected with an external extended display includes:
detecting, through a first interface, whether the glasses are connected with an external extended display, the first interface comprising a display manager interface and a media router interface.
Further, the picture composition based on the camera picture and the virtual UI includes:
defining a second interface subclass, wherein the second interface comprises a display interface;
in the initial establishing period of the life cycle of the second interface, associating a layer for the second interface;
presenting the camera picture on the bottommost layer of the second interface layer;
and synchronously presenting the virtual UI in the second interface layer, and performing image superposition on the camera image and the virtual UI to generate the synthesized result image.
Further, synchronously presenting the virtual UI in the second interface layer includes:
obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a local virtual UI drawing cache;
generating a bitmap based on the drawing cache;
and newly establishing a view expansion subclass as a drawing board, declaring that the background color drawn by the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap into the second interface layer, and overlapping the bitmap on the camera picture.
Further, when the drawing cache to be acquired is a local virtual UI drawing cache,
constructing an extended subclass of a view or frame layout as a view container;
monitoring a callback event, and acquiring a corresponding local virtual UI drawing cache when a view is changed and placing the local virtual UI drawing cache in the view container;
and generating a bitmap based on the local virtual UI drawing cache in the view container, and drawing the bitmap on the drawing board.
Further, when the screen resolution of the AR glasses and the resolution of the extended display are not identical, the method further includes:
setting the height and width of the view container to be consistent with the display width and display height of the extended display;
acquiring the actual display height and the actual display width of the view container in the AR glasses screen;
acquiring the display height and the display width of the extended display in the drawing board;
dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple;
and scaling the drawing board in an equal ratio according to the height multiple and the width multiple.
Further, displaying differentiated content on the extended display and the main screen of the AR glasses includes:
binding the defined second interface subclass with the extended display detected through the first interface;
displaying the synthesis result picture in the extended display, and simultaneously, displaying the virtual UI on a main screen of the AR glasses.
Further, the AR glasses are communicatively connected with the extended display through a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network or the Internet.
According to a second embodiment of the present invention, there is provided a real-time picture projecting device of an AR glasses screen, including:
the display detection module is configured to detect whether the AR glasses are connected with an external extended display;
the image acquisition module is configured to acquire a camera picture through a camera arranged on the AR glasses when the AR glasses are connected with the external extended display, and acquire the virtual UI through a main screen of the AR glasses;
the picture synthesis module is configured to synthesize pictures based on the camera picture and the virtual UI, generate a synthesis result picture and send the synthesis result picture to the extended display;
a differentiated display module configured to display differentiated content on the extended display and a main screen of the AR glasses.
Further, the display detection module is specifically configured to:
detect, through a first interface, whether the AR glasses are connected with an external extended display, the first interface comprising a display manager interface and a media router interface.
Further, the picture composition module includes:
the interface definition submodule is configured to define a second interface subclass, and the second interface comprises a display interface;
the layer association submodule is configured to associate a layer with the second interface in the initial creation period of the second interface life cycle;
the first picture presenting sub-module is configured to present the camera picture on the bottommost layer of the second interface layer;
and the second picture presenting sub-module is configured to synchronously present the virtual UI in the second interface layer, and the camera picture and the virtual UI are subjected to picture superposition to generate the synthesized result picture.
Further, the second screen presenting sub-module includes:
a drawing cache obtaining unit configured to obtain a drawing cache of the virtual UI, where the drawing cache is a full-screen virtual UI drawing cache or a local virtual UI drawing cache;
a bitmap generation unit configured to generate a bitmap based on the drawing cache;
and the layer overlapping unit is configured to newly establish a view expansion subclass as a drawing board, declare that the background color drawn by the canvas is transparent, draw the bitmap in the drawing board, add the bitmap into the second interface layer, and overlap the bitmap on the camera picture.
Further, when the drawing cache is a local virtual UI drawing cache, the second screen presenting sub-module further includes:
a container construction subunit configured to construct an extended subclass of a view or frame layout as a view container;
the container storage subunit is configured to monitor a callback event, acquire a corresponding local virtual UI drawing cache when the view is changed, and place the local virtual UI drawing cache in the view container;
and the drawing subunit is configured to generate a bitmap based on the local virtual UI drawing cache in the view container, and draw the bitmap on the drawing board.
Further, when the screen resolution of the AR glasses and the resolution of the extended display are not identical, the apparatus further includes:
a parameter setting unit configured to set a height and a width of the view container to coincide with a display width and a display height of the extended display;
a first parameter acquiring unit configured to acquire an actual display height and an actual display width of the view container in the AR glasses screen;
the second parameter acquisition unit is configured to acquire the display height and the display width of the extended display in the drawing board;
a height multiple determining unit configured to divide the display height of the extended display by the actually displayed height of the view container in the AR glasses screen to obtain a height multiple;
a width multiple determining unit configured to divide the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple;
and the geometric scaling unit is configured to perform geometric scaling on the drawing board according to the height multiple and the width multiple.
Further, the differentiation display module includes:
the binding unit is configured to bind the defined second interface subclass and the extended display detected through the first interface;
a difference display unit configured to display the synthesis result screen in the extended display while displaying the virtual UI on a main screen of the AR glasses.
Further, the AR glasses are communicatively connected with the extended display through a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network or the Internet.
According to a third embodiment of the invention, a controller is provided, comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the steps of the method.
According to a fourth embodiment of the invention, a computer-readable storage medium is provided for storing a computer program which, when executed by a computer or processor, implements the steps of the method.
Compared with the prior art, the invention has obvious advantages and beneficial effects. By means of the above technical scheme, the real-time picture projection method, device, controller and medium for an AR glasses screen achieve considerable technical progress and practicability, have wide industrial utilization value, and provide at least the following advantages:
according to the display method, the display of the content is differentiated through the AR glasses screen and the extended display, the dependence on screen mirror image is not required, the camera picture does not need to be displayed on the glasses main screen, and the watching experience of a glasses user is not reduced; the invention also does not need to additionally arrange a server and the like to realize projection, reduces the development cost, can realize rapid deployment and improves the user experience.
The foregoing is only an overview of the technical solutions of the present invention. To make the technical means of the present invention clearer, so that it can be implemented according to the description, and to make the above and other objects, features and advantages more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic view of a real-time image projection scene of an AR glasses screen according to an embodiment of the present invention;
FIG. 2 is a flowchart of a real-time image projection method based on an android AR glasses screen according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an extended screen according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating a virtual UI synchronously presented on the second android native interface layer according to an embodiment of the present invention;
fig. 5 is a schematic view of a real-time image projection apparatus of an AR glasses screen according to an embodiment of the present invention.
[Description of reference numerals]
1: display detection module; 2: image acquisition module;
3: picture synthesis module; 4: differentiated display module;
10: real-time picture projection device of the AR glasses screen
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve the predetermined objects, the following gives a detailed description of the embodiments and effects of a real-time picture projection method, device, controller and medium for an AR glasses screen according to the present invention, with reference to the accompanying drawings and preferred embodiments. For convenience of description, the embodiments are described based on the Android platform; those skilled in the art will understand that they may also be implemented on non-Android platforms (e.g., iOS, Windows, etc.).
One or more embodiments of the present invention provide a real-time image projection method for an AR glasses screen based on android, where content presented in the screen is projected to an extended display by using the AR glasses based on android, and an application scene is shown in fig. 1.
The method of the embodiment of the invention specifically comprises the following steps as shown in fig. 2:
and step S1, detecting whether the android AR glasses are connected with the external expansion display.
The Android system provides native interfaces for detecting whether a mobile device is connected to an extended display: once a device accesses an extended display, the related extended-display information can be obtained through a display manager (DisplayManager) or a media router (MediaRouter). Thus, as an example, step S1 includes:
detecting, through a first android native interface, whether the android AR glasses are connected with an external extended display, where the first android native interface comprises the DisplayManager interface and the MediaRouter interface (the first android native interface is an embodiment of the first interface in the claims). The DisplayManager interface is used to manage multiple displays and their associated attributes, while the MediaRouter interface allows applications to control the routing of media channels and streams from the current device to external speakers and destination devices.
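As an illustration, the detection of step S1 can be sketched as follows. This is a minimal sketch, not the patent's actual implementation: the class and method names are hypothetical, the Android calls appear only in comments, and `DISPLAY_CATEGORY_PRESENTATION` is the DisplayManager category that already excludes the device's own built-in screen.

```java
// Hypothetical helper for step S1 (names are illustrative, not from the patent).
public class ExternalDisplayDetector {
    // On Android the count would come from the DisplayManager native interface:
    //   DisplayManager dm = (DisplayManager)
    //       context.getSystemService(Context.DISPLAY_SERVICE);
    //   Display[] ext = dm.getDisplays(DisplayManager.DISPLAY_CATEGORY_PRESENTATION);
    // DISPLAY_CATEGORY_PRESENTATION excludes the glasses' own main screen, so a
    // non-empty result means an external extended display is connected.
    public static boolean hasExternalDisplay(int presentationDisplayCount) {
        return presentationDisplayCount > 0;
    }
}
```

The same check could alternatively be driven by MediaRouter route-change callbacks; the decision rule stays the same.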
The android AR glasses can be communicatively connected with an external extended display in the following ways:
wired connection: e.g., a High-Definition Multimedia Interface (HDMI) cable or a Universal Serial Bus (USB) cable;
wireless Wi-Fi (Wireless Fidelity);
Wi-Fi Direct: for example, the Miracast standard is a wireless display standard based on Wi-Fi Direct; 3C (Computer, Communications, Consumer Electronics) devices supporting the standard can share video pictures wirelessly — a mobile phone can play videos or photos directly on a television or other device through Miracast without any cables or wireless hotspots;
Google Cast, a Google service used to cast the screen of an application supporting Google Cast (such as YouTube) onto an Android TV;
wireless Bluetooth;
Global System for Mobile Communications (GSM);
Code Division Multiple Access (CDMA);
a local area network;
the Internet.
Since the android AR glasses and the external extended display support multiple communication connection modes, development difficulty is low and deployment is rapid.
Step S2: if yes, acquiring a camera picture through the camera arranged on the AR glasses, and acquiring the virtual UI through the main screen of the AR glasses.
The field camera picture collected by the AR glasses camera coincides with the real field-of-view content of the wearer, and the main screen of the AR glasses supplies the virtual UI, i.e. the virtual user interface in the glasses screen. As an example, the camera picture can be captured through the android Camera interface (Camera1 or Camera2), which manages and operates the related functions and attributes of the camera.
Step S3: performing picture synthesis based on the camera picture and the virtual UI, generating a synthesized result picture, and sending it to the extended display, as shown in fig. 3.
As an example, display content may be built for an extended display through an android Presentation. A Presentation is a special dialog whose purpose is to display content on a secondary display; it is associated with a target display at creation time and configures its context and resource configuration according to the metrics of that display. In step S3, the picture synthesis based on the camera picture and the virtual UI specifically includes the following steps:
step S31, defining a second android native interface subclass, wherein the second android native interface comprises a Presentation interface, and the second android native interface is an embodiment of a second interface in the claims;
step S32, associating an image layer for the second native interface in the initial creation period of the life cycle of the second android native interface;
step S33, presenting the camera picture on the bottommost layer of the second android native interface layer, specifically by calling an interface of the camera in an application and declaring an operation authority of the camera;
and step S34, synchronously presenting the virtual UI in the second android native interface layer, and performing picture superposition on the camera picture and the virtual UI to generate the synthesized result picture.
After the camera picture of the AR glasses is introduced into the extended display, the virtual UI in the AR glasses screen must be synchronously and dynamically displayed on the extended display, so that the content the wearer sees through the AR glasses is truly presented on the extended display for others to watch. Dynamically synchronizing the virtual UI of the AR glasses screen onto the extended display in real time amounts to continuously obtaining a drawing cache of the UI layer of the AR glasses main screen and redrawing it on the extended display screen through the Canvas tool. Canvas is an android native interface that allows a developer to render custom graphics onto a canvas, or to modify existing views and customize their appearance. As an example, synchronously presenting the virtual UI in the second android native interface layer in step S34, as shown in fig. 4, includes:
step S341, obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full screen virtual UI drawing cache or a local virtual UI drawing cache. The method has the advantages that local obtaining of the drawing cache of the virtual UI in the screen can be considered according to development requirements, or the full-screen virtual UI drawing cache in the screen is obtained, an android provides an interface for obtaining the drawing cache for the view, and a developer only needs to obtain the view cache for a specific view or a root view statement according to the development requirements and generate a corresponding bitmap.
Step S342, generating a bitmap based on the drawing cache;
and S343, newly building a view expansion subclass as a drawing board, declaring that the background color drawn by the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap into the second android native interface layer, and overlapping the bitmap on the camera picture.
To reduce the computational burden on the device caused by obtaining the drawing cache without interruption, the drawing-cache acquisition may be performed only when a view changes — that is, a local virtual UI drawing cache is obtained. When the drawing cache to be obtained is a local virtual UI drawing cache, step S34 further includes:
step S3411, constructing a view or an extended subclass of the framework layout as a view container, and declaring and obtaining a drawing cache of the view container in the view container.
Step S3412, monitoring a callback event and, when the view changes, obtaining the corresponding local virtual UI drawing cache and placing it in the view container.
It should be noted that, in step S341, the corresponding full-screen virtual UI drawing cache may also be obtained when the view is changed by monitoring the callback event.
Step S3413, generating a bitmap based on the local virtual UI drawing cache in the view container, and drawing the bitmap on the drawing board. Through steps S3411-S3413, all the virtual UI drawing-cache content that needs to be synchronized to the extended display is placed in the view container, and the view container converts and draws the virtual UI it contains onto the corresponding drawing board. Virtual UI content outside the view container is not drawn on the extended display. In this way, the developer can decide which virtual UIs of the AR glasses are drawn on the extended display and which are not, further reducing the computational burden caused by obtaining the drawing cache without interruption.
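The "refresh only on view change" optimization of steps S3411-S3413 can be modeled with a simple dirty flag: a change callback (the analogue of the OnPreDrawListener / OnScrollChangedListener callbacks described below) marks the cache stale, and the cache and bitmap are rebuilt only when needed. This is a simplified plain-Java model with hypothetical names, not the patent's code.

```java
// Toy model of listener-gated drawing-cache refresh (names are illustrative).
public class UiCacheSync {
    private boolean dirty = true;   // first frame must always be drawn
    private int rebuildCount = 0;

    // Called by the view-change listeners (OnPreDraw / OnScrollChanged analogue).
    public void onViewChanged() { dirty = true; }

    // Called once per output frame; returns true if the cache was rebuilt.
    public boolean syncToPresentation() {
        if (!dirty) return false;   // unchanged view: reuse the previous bitmap
        rebuildCount++;             // rebuild drawing cache and bitmap
        dirty = false;
        return true;
    }

    public int rebuilds() { return rebuildCount; }
}
```

With this gating, an unchanged UI costs nothing per frame, which is exactly the device-resource saving the patent attributes to the two listener callbacks.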
The drawing cache of the view container is declared within the view container, and callback events are monitored through the draw-listener (OnPreDrawListener) and scroll-listener (OnScrollChangedListener) interfaces so that the bitmap of the view container is drawn on the drawing board in the Presentation. OnPreDrawListener defines a callback invoked when the view tree is about to be drawn, and OnScrollChangedListener defines a callback invoked when something in the view tree has been scrolled. Implementing these two callbacks ensures that the drawing cache is fetched only when the view actually changes, saving device resources. A bitmap is generated from the obtained drawing cache, and a listener is defined on it to monitor the generation of a new bitmap, triggering a callback whenever a new bitmap is generated. The drawing board declares an implementation of this new-bitmap listener and, in its callback, draws the newly generated bitmap onto the current drawing board, with the canvas background declared transparent. The drawing board is then added to the layer bound to the Presentation and superimposed on the camera preview picture, so that the extended display simultaneously presents a superimposed picture comprising the camera picture of the AR glasses and the UI picture of the AR glasses main screen.
Step S4, displaying differentiated content on the extended display and the main screen of the AR glasses.
As an example, the step S4 includes:
step S41, binding the defined second android native interface subclass and the extended display detected through the first android native interface;
step S42, displaying the synthesized result picture in the extended display, and simultaneously displaying the virtual UI on the main screen of the AR glasses, thereby implementing differentiated content display between the extended display and the main screen of the AR glasses.
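The differentiated output of steps S41-S42 can be modeled with a small pure-Java sketch. The functions below are hypothetical stand-ins: on Android the extended display would be driven by a Presentation bound to the detected Display, while the glasses' home screen keeps rendering only the virtual UI:

```java
// Models the core idea: the glasses' home screen shows only the virtual UI,
// while the extended display shows the camera picture composited with that UI.
public class DifferentiatedDisplayDemo {
    static String glassesHomeScreen(String virtualUi) {
        return virtualUi; // no camera picture on the glasses, preserving the AR experience
    }

    static String extendedDisplay(String cameraFrame, String virtualUi) {
        return cameraFrame + "+" + virtualUi; // the synthesized result picture
    }

    public static void main(String[] args) {
        String ui = "virtualUI";
        String cam = "cameraFrame";
        System.out.println(glassesHomeScreen(ui));    // virtualUI
        System.out.println(extendedDisplay(cam, ui)); // cameraFrame+virtualUI
    }
}
```

The two outputs differ by design: only the extended display carries the camera picture, which is what distinguishes this scheme from plain screen mirroring.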
It should be noted that the user may use an extended display whose resolution does not match that of the AR glasses as the output of the projected picture. If the UI is simply rendered at the resolution of the extended display, it may appear distorted, so the relevant sizes should be set according to the resolution of the glasses or of the extended display, or scaled proportionally as required, to achieve a better output effect. As an example, when the screen resolution of the AR glasses and the resolution of the extended display are not consistent, the method further comprises:
step S51, setting the height and width of the view container to be consistent with the display width and display height of the extended display;
step S52, acquiring the actual display height and the actual display width of the view container in the AR glasses screen. As an example, the actual height and width of the view container in the AR glasses, returned during the onMeasure() measurement callback of the view lifecycle, may be stored in a class called display size assistant: the class defines an integer height and an integer width, with two getter methods and two setter methods to read and write them, and the setters are called to store the view container's actual display height and width in the AR glasses, respectively.
Step S53, obtaining the display height and the display width of the extended display in the drawing board. As an example, they may be obtained through the display metrics (DisplayMetrics) interface in the drawing board, where DisplayMetrics is an android native interface, a structure describing general display information such as size, density, and font scaling.
Step S54, dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
step S55, dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple;
step S56, scaling the drawing board in equal proportion according to the height multiple and the width multiple, specifically through the horizontal scaling (ScaleX) and vertical scaling (ScaleY) interfaces of the drawing board, so that the bitmap transferred from the AR glasses is redrawn in the extended display at the same proportions, achieving a good viewing experience on the extended display.
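Steps S51-S56 boil down to storing the measured size and computing two scale factors for the board. A pure-Java sketch follows; the class and method names are illustrative (not from the patent's source code), the example numbers are hypothetical, and on Android the resulting factors would feed the board's setScaleX/setScaleY:

```java
// Record-style helper described in step S52: integer height/width with getters/setters.
class DisplaySizeAssistant {
    private int height, width;
    int getHeight() { return height; }
    int getWidth()  { return width; }
    void setHeight(int h) { height = h; }
    void setWidth(int w)  { width = w; }
}

public class BoardScalingDemo {
    // Step S54: height multiple = extended display height / actual view height.
    static float heightMultiple(int displayHeight, DisplaySizeAssistant size) {
        return (float) displayHeight / size.getHeight();
    }

    // Step S55: width multiple, assuming display width over actual view width.
    static float widthMultiple(int displayWidth, DisplaySizeAssistant size) {
        return (float) displayWidth / size.getWidth();
    }

    public static void main(String[] args) {
        DisplaySizeAssistant size = new DisplaySizeAssistant();
        size.setHeight(720);  // hypothetical measured size from onMeasure()
        size.setWidth(1280);

        float scaleY = heightMultiple(1080, size); // hypothetical 1920x1080 extended display
        float scaleX = widthMultiple(1920, size);
        System.out.println(scaleX + " " + scaleY); // 1.5 1.5
    }
}
```

With both source and target at a 16:9 aspect ratio the two multiples coincide, so the bitmap fills the extended display without distortion.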
The embodiment of the invention can also, based on the android platform carried on the android AR glasses, synchronize the two-dimensional virtual UI picture of the android home screen into the extended display according to the android UI framework, so that a developer can likewise synchronize a three-dimensional virtual UI picture into the extended display through the framework: for example, a GLSurfaceView (an android native view component that can render complex three-dimensional image objects) is added to each of the AR glasses and the extended display, the two GLSurfaceView instances render the same image object simultaneously, and their rendering is kept synchronized, thereby further enhancing the effect of the scheme.
The method provided by the embodiment of the invention does not depend on android screen mirroring, but displays differentiated content on the extended display and the main screen of the AR glasses, so that the main screen of the glasses need not display the camera browsing picture and the wearer's experience is not degraded. In addition, the embodiment of the invention does not depend on any background multimedia streaming service: it can support multimedia devices using related protocols over Wi-Fi direct, such as a smart television or set-top box supporting Google Cast or Miracast, and also supports wired connection, for example projecting the AR glasses picture into an extended display over a high-definition multimedia interface cable using the OTG protocol. USB On-The-Go, commonly abbreviated USB OTG, is a supplement to the USB 2.0 specification. It allows a USB device, such as a player or handset, to change from a USB peripheral into a USB host so as to connect and communicate with other USB devices; under normal conditions, these OTG-capable USB devices still act as USB peripherals toward a USB host such as a desktop or portable computer. The embodiment of the invention also supports projection of a multimedia streaming service from the cloud, and can project pictures to a designated display device in a local area network through a back-end multimedia streaming service in the local area network, so it has a wide application range.
According to a second embodiment of the present invention, a real-time image projection device 10 based on an android AR glasses screen is provided, as shown in fig. 5, the device includes a display detection module 1, an image acquisition module 2, an image synthesis module 3, and a differentiation display module 4, wherein the display detection module 1 is configured to detect whether the android AR glasses are connected to an external extended display; the image acquisition module 2 is configured to acquire a camera picture through a camera arranged on the AR glasses when the AR glasses are connected with the external extended display, and acquire a virtual UI through a main screen of the AR glasses; the picture synthesis module 3 is configured to perform picture synthesis based on the camera picture and the virtual UI, generate a synthesis result picture and send the synthesis result picture to the extended display; a differentiated display module 4 configured to perform differentiated content display on the extended display and the main screen of the AR glasses.
The android AR glasses can be communicatively connected with an external extended display in the following ways:
wired: such as a High-Definition Multimedia Interface (HDMI) cable, a Universal Serial Bus (USB) cable, etc.;
wireless Wi-Fi (Wireless Fidelity), also called a wireless hotspot in Chinese;
Wi-Fi Direct (Wi-Fi Direct);
the Miracast standard, a wireless display standard based on Wi-Fi Direct; 3C (Computer, Communications, Consumer Electronics) devices supporting the standard can share video pictures wirelessly; for example, a mobile phone can play videos or photos directly on a television or other device through Miracast without any connecting cable or wireless hotspot;
Google Cast, a Google service used to cast the screen of an application supporting Google Cast, such as YouTube (a video website), onto an android television;
wireless Bluetooth;
Global System for Mobile Communications (GSM);
Code Division Multiple Access (CDMA);
a local area network;
the Internet.
The android AR glasses and the external extended display support multiple communication connection modes, so development difficulty is low and deployment is rapid.
As an example, the display detection module 1 is specifically configured to: detect, through a first android native interface, whether the android AR glasses are connected with an external extended display, the first android native interface comprising a display manager interface and a media router interface.
As an example, display content may be constructed for the extended display through an android Presentation, where the picture synthesis module 3 includes an interface definition sub-module, a layer association sub-module, a first picture presenting sub-module, and a second picture presenting sub-module. The interface definition sub-module is configured to define a second android native interface subclass, the second android native interface comprising a Presentation interface; the layer association sub-module is configured to associate a layer with the second android native interface in the initial creation period of the life cycle of the second android native interface; the first picture presenting sub-module is configured to present the camera picture on the bottommost layer of the second android native interface layer; and the second picture presenting sub-module is configured to synchronously present the virtual UI in the second android native interface layer, where the camera picture and the virtual UI are superposed to generate the synthesized result picture.
After the AR glasses camera picture is introduced into the extended display, the virtual UI in the AR glasses screen needs to be synchronously and dynamically displayed in the extended display, so that the content the glasses wearer views through the AR glasses is truly presented in the extended display for others to watch. Real-time dynamic synchronization of the virtual UI of the AR glasses screen in the extended display consists of continuously acquiring a drawing cache from the UI layer of the main screen of the AR glasses and redrawing it in the screen of the extended display through a Canvas tool, where Canvas is an android native interface that allows a developer to render customized graphics on the canvas or to modify existing views and customize their appearance. As an example, the second picture presenting sub-module includes a drawing cache obtaining unit, a bitmap generating unit, and a layer overlaying unit. The drawing cache obtaining unit is configured to obtain a drawing cache of the virtual UI, which is a full-screen virtual UI drawing cache or a local virtual UI drawing cache: according to development requirements, the drawing cache may be obtained locally for part of the virtual UI in the screen, or the full-screen virtual UI drawing cache may be obtained. Android provides an interface for obtaining a view's drawing cache; the developer only needs to declare view cache acquisition for a specific view or the root view according to development requirements and generate the corresponding bitmap.
The bitmap generation unit is configured to generate a bitmap based on the drawing cache; and the layer superposition unit is configured to declare that the background color drawn by the canvas is transparent, draw the bitmap in the drawing board, add the bitmap into the second android native interface layer, and superpose the bitmap on the camera picture.
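The transparent-background overlay performed by the layer overlaying unit can be sketched with raw ARGB pixels. This is a simplified model of what happens when the canvas background is declared transparent (full alpha blending omitted), not the patent's actual code:

```java
public class TransparentOverlayDemo {
    // Where the UI bitmap is fully transparent (alpha == 0), the camera pixel
    // shows through; elsewhere the UI pixel wins. Simplified: no partial blending.
    static int[] overlay(int[] cameraLayer, int[] uiLayer) {
        int[] out = new int[cameraLayer.length];
        for (int i = 0; i < cameraLayer.length; i++) {
            int alpha = uiLayer[i] >>> 24; // top byte of an ARGB pixel
            out[i] = (alpha == 0) ? cameraLayer[i] : uiLayer[i];
        }
        return out;
    }

    public static void main(String[] args) {
        int camGray = 0xFF808080; // opaque camera pixel
        int uiRed   = 0xFFFF0000; // opaque UI pixel
        int uiClear = 0x00000000; // transparent UI background

        int[] merged = overlay(new int[]{camGray, camGray}, new int[]{uiRed, uiClear});
        System.out.println(merged[0] == uiRed);   // true: UI drawn over camera
        System.out.println(merged[1] == camGray); // true: camera shows through
    }
}
```

Declaring the board background transparent is what keeps the camera picture visible everywhere the virtual UI draws nothing.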
In order to reduce the operational burden placed on the device by uninterrupted drawing cache acquisition, the drawing cache may be continuously acquired only when the view changes, that is, a local virtual UI drawing cache is acquired. When the drawing cache to be acquired is the local virtual UI drawing cache, the second picture presenting sub-module further includes a container construction subunit, a container storage subunit, and a drawing subunit, where the container construction subunit is configured to construct an extended subclass of a view or frame layout as a view container; the container storage subunit is configured to monitor callback events and, when the view changes, obtain the corresponding local virtual UI drawing cache and place it in the view container; and the drawing subunit is configured to generate a bitmap based on the local virtual UI drawing cache in the view container and draw the bitmap on the drawing board.
When the drawing cache is a local virtual UI drawing cache, the drawing cache obtaining unit places all virtual UI drawing cache contents that need to be synchronized to the extended display into the view container, and the view container converts and draws the virtual UIs it contains on the corresponding drawing board. Virtual UI content outside the view container is not drawn in the extended display. In this way, the developer can decide which virtual UIs of the AR glasses are drawn in the extended display and which are not, thereby reducing the operational burden that continuously fetching the drawing cache would otherwise place on the device.
The drawing cache of the view container is obtained through a declaration in the view container, and the bitmap of the view container is drawn on the drawing board in the Presentation through callback events monitored by a pre-draw listener (OnPreDrawListener) interface and a scroll listener (OnScrollChangedListener) interface. OnPreDrawListener defines the callback invoked when the view tree is about to be drawn, and OnScrollChangedListener defines the callback invoked when content in the view tree is scrolled. Implementing these two listener callbacks ensures that the drawing cache is fetched only when the view actually changes, thereby saving device resources. A bitmap is generated from the obtained drawing cache, and a listener is defined for the bitmap to monitor the generation of a new bitmap; a callback event is triggered whenever a new bitmap is generated. The new-bitmap listener is declared in the drawing board, and its callback draws the newly generated bitmap on the current drawing board. The background color of the canvas is declared transparent, and the drawing board is added to the binding layer of the Presentation and superposed on the camera browsing picture, so that a superposed picture comprising both the AR glasses camera picture and the AR glasses home-screen UI picture can be presented in the extended display at the same time.
As an example, the differential display module 4 includes a binding unit and a differential display unit, where the binding unit is configured to bind the defined second android native interface subclass and the extended display detected through the first android native interface; the difference display unit is configured to display the synthesis result picture in the extended display while displaying the virtual UI on a main screen of the AR glasses, thereby enabling differentiated content display of the extended display and the main screen of the AR glasses.
It should be noted that the user may use an extended display whose resolution does not match that of the AR glasses as the output of the projected picture. If the UI is simply rendered at the resolution of the extended display, it may appear distorted, so the relevant sizes should be set according to the resolution of the glasses or of the extended display, or scaled proportionally as required, to achieve a better output effect. As an example, when the screen resolution of the AR glasses and the resolution of the extended display are not consistent, the apparatus further includes a parameter setting unit, a first parameter acquiring unit, a second parameter acquiring unit, a height multiple determining unit, a width multiple determining module, and an equal-ratio scaling unit, wherein the parameter setting unit is configured to create a view container whose height and width are set to be consistent with the display width and display height of the extended display; the first parameter acquiring unit is configured to acquire the actual display height and the actual display width of the view container in the AR glasses screen; the second parameter acquiring unit is configured to acquire the display height and the display width of the extended display in the drawing board; the height multiple determining unit is configured to divide the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple; the width multiple determining module is configured to divide the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple; and the equal-ratio scaling unit is configured to scale the drawing board in equal proportion according to the height multiple and the width multiple.
The embodiment of the invention also provides a controller, which comprises a memory and a processor, wherein the memory stores a computer program, and the program can realize the steps of the real-time image projection method based on the android AR glasses screen when being executed by the processor.
An embodiment of the present invention further provides a computer-readable storage medium for storing a computer program, where the computer program, when executed by a computer or a processor, implements the steps of the real-time image projection method based on the android AR glasses screen.
According to the embodiment of the invention, through differentiated content display on the AR glasses screen and the extended display, no dependence on android screen mirroring is required, the main screen of the glasses does not need to display the camera picture, and the viewing experience of the glasses user is not reduced; the invention also requires no additional server or the like to realize projection, which reduces development cost, enables rapid deployment, and improves user experience.
Although embodiments of the present invention are described in terms of an android system, those skilled in the art will appreciate that one or more embodiments of the present invention can be implemented on any operating system that provides the same or similar functionality as the interfaces described above.
Further, although the embodiments of the present invention are described based on AR glasses, the present invention is not limited thereto. Those skilled in the art will understand that projection for products such as VR glasses and AR helmets, which superpose a virtual user interface on a real field of view, is likewise suitable for the technical solution of the present invention.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (16)

1. A real-time picture projection method of an AR glasses screen is characterized by comprising the following steps:
detecting whether the AR glasses are connected with an external expansion display or not;
if yes, acquiring a camera picture through a camera arranged on the AR glasses, and acquiring a virtual UI through a main screen of the AR glasses;
defining a second interface subclass, wherein the second interface comprises a display interface;
in the initial establishing period of the life cycle of the second interface, associating a layer for the second interface;
presenting the camera picture on the bottommost layer of the second interface layer;
synchronously presenting the virtual UI in the second interface layer, and performing image superposition on the camera image and the virtual UI to generate a synthesized result image and send the synthesized result image to the extended display;
and displaying differentiated content on the extended display and the main screen of the AR glasses.
2. The real-time picture projection method of the AR glasses screen according to claim 1,
wherein the detecting whether the AR glasses are connected with an external extended display includes:
detecting, through a first interface, whether the AR glasses are connected with the external extended display, the first interface comprising a display manager interface and a media router interface.
3. The real-time picture projection method of the AR glasses screen according to claim 1,
the synchronously presenting the virtual UI in the second interface layer includes:
obtaining a drawing cache of the virtual UI, wherein the drawing cache is a full-screen virtual UI drawing cache or a local virtual UI drawing cache;
generating a bitmap based on the drawing cache;
and newly establishing a view expansion subclass as a drawing board, declaring that the background color drawn by the canvas is transparent, drawing the bitmap in the drawing board, adding the bitmap into the second interface layer, and overlapping the bitmap on the camera picture.
4. The real-time picture projection method of the AR glasses screen according to claim 3,
when the drawing cache to be acquired is the local virtual UI drawing cache,
constructing an extended subclass of a view or frame layout as a view container;
monitoring a callback event, and acquiring a corresponding local virtual UI drawing cache when a view is changed and placing the local virtual UI drawing cache in the view container;
and generating a bitmap based on the local virtual UI drawing cache in the view container, and drawing the bitmap on the drawing board.
5. The real-time picture projection method of the AR glasses screen according to claim 4,
when the screen resolution of the AR glasses and the resolution of the extended display are not consistent, the method further comprises:
setting the height and width of the view container to be consistent with the display width and display height of the extended display;
acquiring the actual display height and the actual display width of the view container in the AR glasses screen;
acquiring the display height and the display width of the extended display in the drawing board;
dividing the display height of the extended display by the actual display height of the view container in the AR glasses screen to obtain a height multiple;
dividing the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple;
and scaling the drawing board in an equal ratio according to the height multiple and the width multiple.
6. The real-time picture projection method of the AR glasses screen according to claim 2,
the displaying differentiated content of the main screen of the AR glasses and the extended display, comprising:
binding the defined second interface subclass with the extended display detected through the first interface;
displaying the synthesis result picture in the extended display, and simultaneously, displaying the virtual UI on a main screen of the AR glasses.
7. The real-time picture projection method of the AR glasses screen of any one of claims 1 to 6,
the AR glasses are communicatively connected with the extended display through a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
8. A real-time picture projection device of an AR glasses screen, comprising:
the display detection module is configured to detect whether the AR glasses are connected with the external expansion display;
the image acquisition module is configured to acquire a camera picture through a camera arranged on the AR glasses when the AR glasses are connected with the external extended display, and acquire the virtual UI through a main screen of the AR glasses;
a picture composition module comprising:
the interface definition submodule is configured to define a second interface subclass, and the second interface comprises a display interface;
the layer association submodule is configured to associate a layer with the second interface in an initial creation period of a life cycle of the second interface;
the first picture presenting sub-module is configured to present the camera picture on the bottommost layer of the second interface layer;
the second picture presenting sub-module is configured to synchronously present the virtual UI in the second interface layer, and the camera picture and the virtual UI are subjected to picture superposition to generate a synthesized result picture and send the synthesized result picture to the extended display;
a differentiated display module configured to display differentiated content on the extended display and a main screen of the AR glasses.
9. The real-time picture projection apparatus of the AR glasses screen of claim 8,
the display detection module is specifically configured to:
detect, through a first interface, whether the AR glasses are connected with an external extended display, the first interface comprising a display manager interface and a media router interface.
10. The real-time picture projection apparatus of the AR glasses screen of claim 8,
the second picture presentation sub-module includes:
a drawing cache obtaining unit configured to obtain a drawing cache of the virtual UI, where the drawing cache is a full-screen virtual UI drawing cache or a local virtual UI drawing cache;
a bitmap generation unit configured to generate a bitmap based on the drawing cache;
and the layer overlapping unit is configured to newly establish a view expansion subclass as a drawing board, declare that the background color drawn by the canvas is transparent, draw the bitmap in the drawing board, add the bitmap into the second interface layer, and overlap the bitmap on the camera picture.
11. The real-time picture projection apparatus of the AR glasses screen of claim 10,
when the drawing cache to be acquired is a local virtual UI drawing cache, the second screen presenting sub-module further includes:
a container construction subunit configured to construct an extended subclass of a view or frame layout as a view container;
the container storage subunit is configured to monitor a callback event, acquire a corresponding local virtual UI drawing cache when the view is changed, and place the local virtual UI drawing cache in the view container;
and the drawing subunit is configured to generate a bitmap based on the local virtual UI drawing cache in the view container, and draw the bitmap on the drawing board.
12. The real-time picture projection apparatus of the AR glasses screen of claim 11,
when the screen resolution of the AR glasses and the resolution of the extended display are not consistent, the apparatus further comprises:
a parameter setting unit configured to set a height and a width of the view container to coincide with a display width and a display height of the extended display;
a first parameter acquiring unit configured to acquire an actual display height and an actual display width of the view container in the AR glasses screen;
the second parameter acquisition unit is configured to acquire the display height and the display width of the extended display in the drawing board;
a height multiple determining unit configured to divide the display height of the extended display by the actually displayed height of the view container in the AR glasses screen to obtain a height multiple;
a width multiple determining module configured to divide the display width of the extended display by the actual display width of the view container in the AR glasses screen to obtain a width multiple;
and the geometric scaling unit is configured to perform geometric scaling on the drawing board according to the height multiple and the width multiple.
13. The real-time picture projection apparatus of the AR glasses screen of claim 9,
the differentiation display module includes:
the binding unit is configured to bind the defined second interface subclass and the extended display detected through the first interface;
a difference display unit configured to display the synthesis result screen in the extended display while displaying the virtual UI on a main screen of the AR glasses.
14. The real-time picture projection apparatus of the AR glasses screen of any one of claims 8 to 13,
the AR glasses are communicatively connected with the extended display through a wired connection, wireless Wi-Fi, Wi-Fi Direct, Google Cast, wireless Bluetooth, GSM, CDMA, a local area network, or the Internet.
15. A controller comprising a memory and a processor, characterized in that the memory stores a computer program which, when executed by the processor, is capable of carrying out the steps of the method of any one of claims 1 to 7.
16. A computer-readable storage medium for storing a computer program, characterized in that the program realizes the steps of the method according to any one of claims 1 to 7 when executed by a computer or processor.
CN201910223256.1A 2019-03-22 2019-03-22 Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium Active CN109831662B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910223256.1A CN109831662B (en) 2019-03-22 2019-03-22 Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium
PCT/CN2020/079222 WO2020192451A1 (en) 2019-03-22 2020-03-13 Real-time picture projection method and apparatus of ar glasses screen, controller and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910223256.1A CN109831662B (en) 2019-03-22 2019-03-22 Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium

Publications (2)

Publication Number Publication Date
CN109831662A CN109831662A (en) 2019-05-31
CN109831662B true CN109831662B (en) 2021-10-08

Family

ID=66871004

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910223256.1A Active CN109831662B (en) 2019-03-22 2019-03-22 Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium

Country Status (2)

Country Link
CN (1) CN109831662B (en)
WO (1) WO2020192451A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020248233A1 (en) * 2019-06-14 2020-12-17 深圳市大疆创新科技有限公司 Data processing method, system, mobile platform, terminal and storage medium
CN110866979A (en) * 2019-11-14 2020-03-06 联想(北京)有限公司 Data processing method, device, computing equipment and medium
CN111246261A (en) * 2020-02-28 2020-06-05 北京视博云信息技术有限公司 Content delivery method, device and system
CN112243219A (en) * 2020-10-15 2021-01-19 北京字节跳动网络技术有限公司 Display device, terminal control method and device, terminal and storage medium
CN114020231B (en) * 2021-11-11 2023-12-26 京东方科技集团股份有限公司 User interface display method and device
CN114363489B (en) * 2021-12-29 2022-11-15 珠海惠中智能技术有限公司 Augmented reality system with camera and eye display device direct coupling
CN114900530B (en) * 2022-04-22 2023-05-05 冠捷显示科技(厦门)有限公司 Display equipment and meta space virtual-actual switching and integrating system and method thereof
CN116074487A (en) * 2023-01-31 2023-05-05 杭州易现先进科技有限公司 Screen projection method, device and storage medium of AR (augmented reality) glasses

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103558691A (en) * 2013-11-01 2014-02-05 王洪亮 3D intelligent device and 3D image display method thereof
CN105959666A (en) * 2016-06-30 2016-09-21 乐视控股(北京)有限公司 Method and device for sharing 3D image in virtual reality system
CN106201259A (en) * 2016-06-30 2016-12-07 乐视控股(北京)有限公司 Method and apparatus for sharing a panoramic image in a virtual reality system
CN106790553A (en) * 2016-12-24 2017-05-31 珠海市魅族科技有限公司 Interface sharing method and device for a virtual reality device
CN107272224A (en) * 2017-08-03 2017-10-20 苏州医视医疗科技有限公司 Intelligent glasses with bidirectional adjustment function
CN207651021U (en) * 2018-04-11 2018-07-24 成都普望智能科技有限公司 Intelligent training, assessment and examination system based on AR/VR technologies
CN108421240A (en) * 2018-03-31 2018-08-21 成都云门金兰科技有限公司 AR-based court bullet-screen (danmaku) comment system
CN109496293A (en) * 2018-10-12 2019-03-19 北京小米移动软件有限公司 Extended content display method, device, system and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10001645B2 (en) * 2014-01-17 2018-06-19 Sony Interactive Entertainment America Llc Using a second screen as a private tracking heads-up display
US10168788B2 (en) * 2016-12-20 2019-01-01 Getgo, Inc. Augmented reality user interface
US20180275766A1 (en) * 2017-03-27 2018-09-27 MindMaze Holdiing SA System, method and apparatus for providing a user interface
CN106997618A (en) * 2017-04-14 2017-08-01 陈柳华 Method for merging virtual reality with a real scene
KR102431712B1 (en) * 2017-09-04 2022-08-12 삼성전자 주식회사 Electronic apparatus, method for controlling thereof and computer program product thereof
CN108924538B (en) * 2018-05-30 2021-02-26 太若科技(北京)有限公司 Screen expanding method of AR device


Also Published As

Publication number Publication date
CN109831662A (en) 2019-05-31
WO2020192451A1 (en) 2020-10-01

Similar Documents

Publication Publication Date Title
CN109831662B (en) Real-time picture projection method and device of AR (augmented reality) glasses screen, controller and medium
US11895426B2 (en) Method and apparatus for capturing video, electronic device and computer-readable storage medium
WO2021204296A1 (en) Remote display method for three-dimensional model, first terminal, electronic device and storage medium
CN106708452B (en) Information sharing method and terminal
CN110569013B (en) Image display method and device based on display screen
WO2016150281A1 (en) Method, mobile terminal and system for displaying preview video file
EP3024223B1 (en) Videoconference terminal, secondary-stream data accessing method, and computer storage medium
CN112035195B (en) Application interface display method and device, electronic equipment and storage medium
CN113778360B (en) Screen projection method and electronic equipment
JPWO2015072194A1 (en) Display control apparatus, display control method, and program
CN111796826B (en) Bullet screen drawing method, device, equipment and storage medium
CN113645476A (en) Picture processing method and device, electronic equipment and storage medium
CN111352560B (en) Screen splitting method and device, electronic equipment and computer readable storage medium
CN114374853B (en) Content display method, device, computer equipment and storage medium
WO2022052742A1 (en) Multi-terminal screen combination method, apparatus and device, and computer storage medium
CN107995538B (en) Video annotation method and system
CN111221444A (en) Split screen special effect processing method and device, electronic equipment and storage medium
CN112202958B (en) Screenshot method and device and electronic equipment
CN109241304B (en) Picture processing method, device and equipment
CN116847147A (en) Special effect video determining method and device, electronic equipment and storage medium
CN111311477B (en) Image editing method and device and corresponding storage medium
CN111367598B (en) Method and device for processing action instruction, electronic equipment and computer readable storage medium
CN113114955B (en) Video generation method and device and electronic equipment
CN113587812B (en) Display equipment, measuring method and device
WO2023134537A1 (en) Split-screen special effect prop generating method and apparatus, device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant