CN109308742A - Method and apparatus for running a 2D application in a 3D scene of virtual reality - Google Patents
Method and apparatus for running a 2D application in a 3D scene of virtual reality
- Publication number
- CN109308742A (application number CN201810904316.1A)
- Authority
- CN
- China
- Prior art keywords
- application
- texture
- scene
- virtual reality
- virtual screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Abstract
The present invention provides a method and apparatus for running a 2D application in the 3D scene of virtual reality. A first texture and a second texture are created in the 3D application of the virtual-reality device; a virtual screen is created based on the first texture, and a display panel used for display in the 3D scene is created based on the second texture. The content of the first texture is copied into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture. The layer generated when the 2D application runs is mapped into the virtual screen, so that the 2D application runs in the 3D scene. Compared with the prior art, the present invention greatly improves the experience of using traditional 2D applications on an all-in-one VR headset: the user can operate a 2D application on a huge screen in the virtual-reality 3D scene as if operating a mobile phone, the user's head position is tracked, and the user sees the picture of the 2D application after anti-distortion, improving the user experience.
Description
Technical field
The present invention relates to the technical field of virtual reality (VR), and in particular to a technique for running a 2D (two-dimensional) application in a 3D (three-dimensional) scene of virtual reality.
Background art
Currently, 2D applications are usually not run in a 3D scene in virtual reality. Instead, at the SurfaceFlinger (layer-composition service) layer, the layer of the 2D application is simply copied into two parts and scaled onto the left and right halves of the screen. With this approach, the picture of the 2D application that the user sees has not been anti-distorted and is not head-pose tracked; moreover, this existing split-screen approach makes the picture smaller and blurrier.
Therefore, how to run a 2D application in the 3D scene of virtual reality, so that the user gets a more realistic experience, has become an urgent problem for those skilled in the art.
Summary of the invention
The object of the present invention is to provide a method and apparatus for running a 2D application in the 3D scene of virtual reality.
According to one aspect of the present invention, a method for running a 2D application in the 3D scene of virtual reality is provided, the method comprising the following steps:
creating a first texture and a second texture in the 3D application of virtual reality, creating a virtual screen based on the first texture, and creating, based on the second texture, a display panel used for display in the 3D scene;
copying the content of the first texture into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture;
mapping the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene.
Preferably, the method further comprises:
intercepting system messages of the controller (handle) of the virtual-reality device and sending them to the 3D application, the 3D application interacting, using the controller ray, with the display panel showing the 2D application to obtain an interaction point;
converting the interaction point into touch information for the virtual screen and sending it to the 2D application, so as to operate the 2D application.
Preferably, the method further comprises:
specially handling the life cycle of the 3D application, so that after the 3D application is running, starting the 2D application does not cause the 3D application to enter a paused or stopped state.
More preferably, the method further comprises:
when exiting the 3D application, first closing all 2D applications running in the 3D scene, and then exiting the 3D application.
According to another aspect of the present invention, an apparatus for running a 2D application in the 3D scene of virtual reality is also provided, wherein the running apparatus comprises:
a creating device, for creating a first texture and a second texture in the 3D application of virtual reality, creating a virtual screen based on the first texture, and creating, based on the second texture, a display panel used for display in the 3D scene;
a copying device, for copying the content of the first texture into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture;
a mapping device, for mapping the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene.
Preferably, the running apparatus further comprises:
an intercepting device, for intercepting system messages of the controller of the virtual-reality device and sending them to the 3D application, the 3D application interacting, using the controller ray, with the display panel showing the 2D application to obtain an interaction point;
a conversion device, for converting the interaction point into touch information for the virtual screen and sending it to the 2D application, so as to operate the 2D application.
Preferably, the running apparatus further comprises:
a processing device, for specially handling the life cycle of the 3D application, so that after the 3D application is running, starting the 2D application does not cause the 3D application to enter a paused or stopped state.
More preferably, the running apparatus further comprises:
an exiting device, for, when exiting the 3D application, first closing all 2D applications running in the 3D scene and then exiting the 3D application.
According to a further aspect of the present invention, a computer-readable storage medium is also provided, the computer-readable storage medium storing computer code which, when executed, performs any of the methods described above.
According to a further aspect of the present invention, a computer program product is also provided which, when executed by a computer device, performs any of the methods described above.
According to a further aspect of the present invention, a computer device is also provided, the computer device comprising:
one or more processors;
a memory, for storing one or more computer programs;
wherein, when the one or more computer programs are executed by the one or more processors, the one or more processors implement any of the methods described above.
Compared with the prior art, the present invention creates a first texture and a second texture in the 3D application of virtual reality, creates a virtual screen based on the first texture, creates, based on the second texture, a display panel used for display in the 3D scene, copies the content of the first texture into the second texture in real time so that the display panel shows in real time the content of the virtual screen corresponding to the first texture, and maps the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene. This greatly improves the experience of using traditional 2D applications on an all-in-one VR headset: the user can operate a 2D application on a huge screen in the virtual-reality 3D scene as if operating a mobile phone, the user's head position can also be tracked, and the user sees the picture of the 2D application after anti-distortion, improving the user experience.
Brief description of the drawings
Other features, objects and advantages of the present invention will become more apparent by reading the following detailed description of non-restrictive embodiments made with reference to the accompanying drawings:
Fig. 1 shows a flow chart of a method for running a 2D application in the 3D scene of virtual reality according to one aspect of the present invention;
Fig. 2 shows a schematic diagram of running a 2D application in the 3D scene of virtual reality according to a preferred embodiment of the present invention;
Fig. 3 shows an effect diagram of running a 2D application in the 3D scene of virtual reality according to another preferred embodiment of the present invention;
Fig. 4 shows a schematic diagram of an apparatus for running a 2D application in the 3D scene of virtual reality according to a further aspect of the present invention.
The same or similar reference numerals in the drawings denote the same or similar components.
Detailed description of the embodiments
Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flow charts. Although a flow chart describes the operations as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations can be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, and the like.
The term "computer equipment" as used in this context, also referred to as a "computer", refers to an intelligent electronic device that can execute predetermined processes such as numerical calculation and/or logic calculation by running predetermined programs or instructions. It may comprise a processor and a memory, the processor executing program instructions prestored in the memory to carry out the predetermined process; alternatively, the predetermined process may be carried out by hardware such as an ASIC, FPGA or DSP, or by a combination of the two. Computer equipment includes, but is not limited to, servers, personal computers, laptops, tablet computers, smart phones, and the like.
The computer equipment includes user equipment and network equipment. The user equipment includes, but is not limited to, computers, smart phones, PDAs, etc.; the network equipment includes, but is not limited to, a single network server, a server group composed of multiple network servers, or a cloud composed of a large number of computers or network servers based on cloud computing (Cloud Computing), where cloud computing is a kind of distributed computing: a super virtual computer composed of a group of loosely coupled computers. The computer equipment may run alone to realize the present invention, or may access a network and realize the present invention through interaction with other computer equipment in the network. The network in which the computer equipment is located includes, but is not limited to, the Internet, wide area networks, metropolitan area networks, local area networks, VPN networks, and the like.
It should be noted that the user equipment, network equipment and networks mentioned above are only examples; other existing or future computer equipment or networks, if applicable to the present invention, should also be included within the scope of protection of the present invention and are incorporated herein by reference.
The methods discussed below (some of which are illustrated by flow charts) may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine-readable or computer-readable medium (for example, a storage medium). One or more processors may perform the necessary tasks.
The specific structural and functional details disclosed herein are merely representative and are for the purpose of describing exemplary embodiments of the present invention. The present invention may, however, be embodied in many alternative forms and should not be construed as limited only to the embodiments set forth herein.
It should be understood that although the terms "first", "second", etc. may be used herein to describe various units, these units should not be limited by these terms. These terms are used only to distinguish one unit from another. For example, without departing from the scope of the exemplary embodiments, a first unit could be termed a second unit, and similarly a second unit could be termed a first unit. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
It should be understood that when a unit is referred to as being "connected" or "coupled" to another unit, it can be directly connected or coupled to the other unit, or intervening units may be present. In contrast, when a unit is referred to as being "directly connected" or "directly coupled" to another unit, there are no intervening units. Other words used to describe the relationship between units should be interpreted in a like fashion (e.g. "between" versus "directly between", "adjacent" versus "directly adjacent", etc.).
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the exemplary embodiments. As used herein, the singular forms "a" and "an" are intended to include the plural as well, unless the context clearly indicates otherwise. It should also be understood that the terms "includes" and/or "comprising" as used herein specify the presence of the stated features, integers, steps, operations, units and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, units, components and/or combinations thereof.
It should further be mentioned that, in some alternative implementations, the functions/actions noted may occur in an order different from that indicated in the figures. For example, two figures shown in succession may in fact be executed substantially concurrently, or may sometimes be executed in the reverse order, depending on the functions/actions involved.
The present invention is described in further detail below with reference to the accompanying drawings.
Fig. 1 shows a flow chart of a method for running a 2D application in the 3D scene of virtual reality according to one aspect of the present invention.
The method comprises steps S101, S102 and S103.
In step S101, the running apparatus 1 creates a first texture and a second texture in the 3D application of virtual reality, creates a virtual screen based on the first texture, and creates, based on the second texture, a display panel used for display in the 3D scene.
Specifically, in step S101, the running apparatus 1 creates a first texture Tex1 and a second texture Tex2 in the 3D application of virtual reality; then creates a surface (layer) based on the first texture Tex1, and creates a virtual screen (VirtualDisplay) based on that surface; and creates, based on the second texture Tex2, the display panel used for display in the 3D scene.
Here, the surface is a handle to a raw image buffer (raw buffer).
Those skilled in the art will understand that the above operations of creating textures and creating a virtual screen may be carried out with reference to existing creation methods in virtual reality.
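As a minimal illustrative sketch (not the patent's implementation), the relationships set up in step S101 can be modeled in plain Python. The names `Texture`, `Surface`, `VirtualScreen` and `DisplayPanel` are hypothetical stand-ins for the platform objects (e.g. a GPU texture, its backing surface, and Android's `VirtualDisplay`):

```python
class Texture:
    """Hypothetical stand-in for a GPU texture: a width x height pixel buffer."""
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = [0] * (width * height)

class Surface:
    """A handle to the raw image buffer backing a texture (the 'surface' above)."""
    def __init__(self, texture):
        self.texture = texture

class VirtualScreen:
    """Virtual screen created from the surface; the 2D app's layer renders into it."""
    def __init__(self, surface):
        self.surface = surface
    def render(self, pixels):
        # Drawing into the virtual screen fills the first texture.
        self.surface.texture.pixels = list(pixels)

class DisplayPanel:
    """Panel shown inside the 3D scene; it samples the second texture."""
    def __init__(self, texture):
        self.texture = texture

# Step S101: Tex1 backs the virtual screen, Tex2 backs the in-scene panel.
tex1, tex2 = Texture(4, 2), Texture(4, 2)
virtual_screen = VirtualScreen(Surface(tex1))
panel = DisplayPanel(tex2)
```

The point of the two-texture split is that the virtual screen and the in-scene panel can be owned by different subsystems, with step S102's copy as the only link between them.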
In step S102, the running apparatus 1 copies the content of the first texture into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture.
Specifically, in step S102, the running apparatus 1 copies the content of the first texture Tex1 into the second texture Tex2 in real time, for example using the FrameBuffer (frame buffer) facility of OpenGL ES to copy the content shown in the first texture Tex1 into the second texture Tex2 in real time. In this way, the display panel can show in real time the content of the virtual screen corresponding to the first texture; that is, in the 3D scene the user sees the picture in the virtual screen in real time.
Here, the FrameBuffer mechanism imitates the function of a graphics card and abstracts away the graphics-card hardware: video memory can be operated directly by reading and writing the FrameBuffer. The user can regard the FrameBuffer as an image of display memory; after mapping it into the process address space, it can be read and written directly, and write operations are reflected on the screen immediately. This operation is abstract and unified; the user need not care about the location of physical video memory, paging mechanisms and other specific details, which are handled by the FrameBuffer device driver.
Here, OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL 3D graphics API, designed for embedded devices such as mobile phones, PDAs and game consoles. OpenGL (Open Graphics Library) is a specification defining a cross-language, cross-platform programming interface for three-dimensional (and also two-dimensional) images. OpenGL is a professional graphics program interface: a powerful and convenient low-level graphics library.
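The per-frame copy of step S102 can be sketched under the simplifying assumption that a texture is just a pixel list; in the real system this would be a GPU-side blit through an OpenGL ES framebuffer object rather than a CPU loop:

```python
def copy_texture(src, dst):
    """Model of the per-frame Tex1 -> Tex2 copy (a GPU blit in the real system)."""
    assert len(src) == len(dst), "textures must have the same size"
    dst[:] = src  # in-place copy, so any object holding dst sees the new content

# Tex1 holds what the 2D application drew; Tex2 is what the panel samples.
tex1 = [10, 20, 30, 40]   # content rendered into the virtual screen
tex2 = [0, 0, 0, 0]       # panel texture, stale until the copy runs

for _frame in range(3):   # every frame: copy first, then the panel shows Tex2
    copy_texture(tex1, tex2)
```

Because the copy runs every frame, the panel in the 3D scene always shows the current state of the virtual screen, which is what "real time" means in the steps above.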
In step S103, the running apparatus 1 maps the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene.
Specifically, a 2D application generates a layer at runtime; normally this layer would be merged by SurfaceFlinger and displayed on the main screen. Here, in step S103, the running apparatus 1 maps the layer generated when the 2D application runs into the virtual screen, for example by modifying the relevant code in the SurfaceFlinger module so that the layer generated when the 2D application runs is displayed in the aforementioned virtual screen, thereby running the 2D application in the 3D scene.
Here, SurfaceFlinger is an independent service. It manages the surfaces of the application side and composites all surfaces; it is a layer between the graphics library and the applications. After each application completes its various graphics operations on its own surface, it requests SurfaceFlinger to display the result on the screen, and SurfaceFlinger superimposes all the surfaces and reflects the result into the framebuffer.
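The rerouting of step S103 can be modeled as a change in the compositor's layer-to-display assignment. `Compositor` below is a hypothetical sketch of SurfaceFlinger's role, not its real API:

```python
class Compositor:
    """Minimal model of SurfaceFlinger: assigns each app layer to a display."""
    def __init__(self):
        self.assignment = {}                 # layer name -> display name

    def submit(self, layer, display="main"):
        self.assignment[layer] = display

    def composite(self, display):
        # Superimpose all layers routed to the given display (the framebuffer step).
        return [layer for layer, d in self.assignment.items() if d == display]

flinger = Compositor()
flinger.submit("3d_app_layer", "main")
# Without the modification, the 2D layer would also go to the main display;
# the change described above routes it to the virtual screen instead.
flinger.submit("2d_app_layer", "virtual_screen")

main_layers = flinger.composite("main")               # only the 3D app
virtual_layers = flinger.composite("virtual_screen")  # only the 2D app
```

The design point is that nothing in the 2D application changes: it still draws its layer as usual, and only the compositor's routing decision differs.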
At this point the 2D application is displayed in the foreground, the 3D application is displayed in the background, the 2D application runs on a panel within the 3D application scene, and the scene of the 3D application can be switched at will. Moreover, as the user's posture changes, the running apparatus 1 can track the user's head position in real time and change the 3D scene in real time, making the user experience more realistic. Likewise, after the above processing, what the user sees is the picture of the 2D application after anti-distortion, and the user can operate the 2D application on a huge screen in the virtual-reality 3D scene as if operating a mobile phone, greatly improving the experience of using traditional 2D applications on an all-in-one VR headset.
Fig. 2 shows a schematic diagram of running a 2D application in the 3D scene of virtual reality according to the foregoing method.
The running apparatus 1 creates a first texture and a second texture in the 3D application of virtual reality, creates a virtual screen based on the first texture, and creates, based on the second texture, a display panel used for display in the 3D scene; copies the content of the first texture into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture; and maps the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene.
Fig. 3 shows an effect diagram of running a 2D application in the 3D scene of virtual reality according to the foregoing method. The 2D application runs on a display panel in the 3D application scene. The scene of the 3D application can be switched at will.
Preferably, the method may be applied, for example, in the Android system. Those skilled in the art will understand that other existing or future systems, if applicable to the present invention, should also be included within the scope of protection of the present invention and are incorporated herein by reference.
Preferably, the method further comprises steps S104 (not shown) and S105 (not shown). In step S104, the running apparatus 1 intercepts system messages of the controller of the virtual-reality device and sends them to the 3D application; the 3D application interacts, using the controller ray, with the display panel showing the 2D application to obtain an interaction point. In step S105, the running apparatus 1 converts the interaction point into touch information for the virtual screen and sends it to the 2D application, so as to operate the 2D application.
Specifically, when a 2D application is running in the 3D scene, the 3D application has lost focus; therefore the system will not actively deliver system messages such as key presses and controller state to the 3D application, and a module is needed to intercept the corresponding system messages and send them to the 3D application. Thus, in step S104, the running apparatus 1, for example through a preset interception module, intercepts the system messages of the controller of the virtual-reality device and sends them to the 3D application; the 3D application interacts, using the controller ray, with the display panel showing the 2D application to obtain an interaction point. Here, the 2D application can be regarded as the presentation of a particular object in the 3D scene, and the system can judge whether the ray emitted by the controller interacts with the 2D application or with the 3D application.
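Deciding whether the controller ray hits the 2D application's panel reduces to a ray-plane intersection followed by a bounds check. The following is a generic sketch (an axis-aligned panel is assumed for brevity), not code from the patent:

```python
def ray_hits_panel(origin, direction, panel_z, panel_min, panel_max):
    """Intersect a controller ray with a panel lying in the plane z = panel_z.

    origin, direction: 3D tuples; panel_min / panel_max: (x, y) panel corners.
    Returns the (x, y) interaction point on the panel, or None on a miss.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:                      # ray parallel to the panel plane
        return None
    t = (panel_z - oz) / dz
    if t <= 0:                       # panel is behind the controller
        return None
    x, y = ox + t * dx, oy + t * dy
    if panel_min[0] <= x <= panel_max[0] and panel_min[1] <= y <= panel_max[1]:
        return (x, y)                # interaction point on the 2D panel
    return None                      # ray interacts with the 3D scene instead

# Controller at the origin pointing straight ahead hits a 2 x 2 panel 2 m away.
hit = ray_hits_panel((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0, (-1.0, -1.0), (1.0, 1.0))
```

A `None` result corresponds to the case where the system decides the ray interacts with the 3D application rather than the 2D application.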
Then, in step S105, the running apparatus 1 converts the interaction point into a touch message for the virtual screen and sends it to the 2D application, so as to operate the 2D application. The operation includes, but is not limited to, clicking, double-clicking, right-clicking, sliding, touching, and zooming in or out.
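The conversion in step S105 can be sketched as mapping the interaction point from panel coordinates to virtual-screen pixel coordinates. The event dictionary shape below is purely illustrative:

```python
def to_touch_event(hit, panel_min, panel_max, screen_w, screen_h, action="click"):
    """Convert a panel-space interaction point into a virtual-screen touch event."""
    u = (hit[0] - panel_min[0]) / (panel_max[0] - panel_min[0])  # 0..1 across panel
    v = (hit[1] - panel_min[1]) / (panel_max[1] - panel_min[1])  # 0..1 up the panel
    return {
        "x": int(u * (screen_w - 1)),          # virtual-screen pixel coordinates
        "y": int((1.0 - v) * (screen_h - 1)),  # panel y grows up, screen y grows down
        "action": action,                      # click / double-click / slide / zoom ...
    }

# An interaction point at the panel centre maps to the centre of a 1920x1080 screen.
event = to_touch_event((0.0, 0.0), (-1.0, -1.0), (1.0, 1.0), 1920, 1080)
```

The vertical flip reflects the common convention that 3D panel coordinates grow upward while screen touch coordinates grow downward; the actual convention depends on the platform.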
Those skilled in the art will understand that the above-mentioned operations on the 2D application are only examples; other existing or future operations on the 2D application, if applicable to the present invention, should also be included within the scope of protection of the present invention and are incorporated herein by reference.
Preferably, the method further comprises step S106 (not shown).
In step S106, the running apparatus 1 specially handles the life cycle of the 3D application, so that after the 3D application is running, starting the 2D application does not cause the 3D application to enter a paused or stopped state.
Specifically, by default only one application can hold focus; for example, in the Android system only one application holds focus by default. Therefore, when the 2D application is started, the 3D application loses focus, and an application that loses focus normally stops drawing and cannot render the scene. Therefore, in step S106, the running apparatus 1 specially handles the life cycle of the 3D application, for example by adjusting the relevant code in the framework, so that after the 3D application is running, starting the 2D application does not cause the 3D application to enter the paused or stopped state. In this way, the 3D scene can continue to render while the state of the 2D application is shown and interacted with in real time. Correspondingly, the life cycle of the 2D application needs no special handling and uses the default settings.
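The effect of this special life-cycle handling can be modeled as a tiny state machine: by default an app is paused when it loses focus, while the patched 3D app opts out. The class and flag names are invented for illustration:

```python
class App:
    """Minimal life-cycle model: RESUMED apps render, PAUSED apps stop drawing."""
    def __init__(self, name, keep_resumed_on_focus_loss=False):
        self.name = name
        self.state = "RESUMED"
        self.keep_resumed = keep_resumed_on_focus_loss

    def on_focus_lost(self):
        # Default framework behaviour pauses the app; the patched 3D app opts out.
        if not self.keep_resumed:
            self.state = "PAUSED"

def launch_2d_app(app_3d):
    app_2d = App("2d_app")     # the 2D app keeps the default life cycle
    app_3d.on_focus_lost()     # starting the 2D app steals focus from the 3D app
    return app_2d

app_3d = App("3d_app", keep_resumed_on_focus_loss=True)   # the special handling
app_2d = launch_2d_app(app_3d)
# app_3d.state stays "RESUMED", so the 3D scene keeps rendering.
```

Without the flag, `on_focus_lost` would leave the 3D app in `PAUSED` and scene rendering would stop, which is exactly the failure mode the paragraph above describes.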
Preferably, the method further comprises step S107 (not shown).
In step S107, when exiting the 3D application, the running apparatus 1 first closes all 2D applications running in the 3D scene, and then exits the 3D application. Otherwise, the next time a 2D application is run in the 3D scene, the 2D application will not know where to be projected, and there will be no virtual screen for it to be projected onto.
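The exit ordering of step S107 can be sketched as a teardown routine that records the shutdown sequence; the structure is illustrative only:

```python
def exit_3d_app(running_2d_apps, log):
    """Close every 2D app projected into the scene before the 3D app itself."""
    for app in list(running_2d_apps):    # first: close all 2D apps, releasing
        log.append(f"close {app}")       # the virtual screen they project onto
        running_2d_apps.remove(app)
    log.append("exit 3d_app")            # only then exit the 3D application

log = []
exit_3d_app(["browser_2d", "video_2d"], log)
# log == ["close browser_2d", "close video_2d", "exit 3d_app"]
```

Enforcing this order in one teardown routine prevents the dangling state described above, where a 2D app outlives the virtual screen it was projected onto.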
Fig. 4 shows a schematic diagram of an apparatus for running a 2D application in the 3D scene of virtual reality according to a further aspect of the present invention.
The running apparatus 1 comprises a creating device 401, a copying device 402 and a mapping device 403.
The creating device 401 creates a first texture and a second texture in the 3D application of virtual reality, creates a virtual screen based on the first texture, and creates, based on the second texture, a display panel used for display in the 3D scene.
Specifically, the creating device 401 creates a first texture Tex1 and a second texture Tex2 in the 3D application of virtual reality; then creates a surface (layer) based on the first texture Tex1, and creates a virtual screen (VirtualDisplay) based on that surface; and creates, based on the second texture Tex2, the display panel used for display in the 3D scene.
Here, the surface is a handle to a raw image buffer (raw buffer).
Those skilled in the art will understand that the above operations of creating textures and creating a virtual screen may be carried out with reference to existing creation methods in virtual reality.
The copying device 402 copies the content of the first texture into the second texture in real time, so that the display panel shows in real time the content of the virtual screen corresponding to the first texture.
Specifically, the copying device 402 copies the content of the first texture Tex1 into the second texture Tex2 in real time, for example using the FrameBuffer (frame buffer) facility of OpenGL ES to copy the content shown in the first texture Tex1 into the second texture Tex2 in real time. In this way, the display panel can show in real time the content of the virtual screen corresponding to the first texture; that is, in the 3D scene the user sees the picture in the virtual screen in real time.
Here, the FrameBuffer mechanism imitates the function of a graphics card and abstracts away the graphics-card hardware: video memory can be operated directly by reading and writing the FrameBuffer. The user can regard the FrameBuffer as an image of display memory; after mapping it into the process address space, it can be read and written directly, and write operations are reflected on the screen immediately. This operation is abstract and unified; the user need not care about the location of physical video memory, paging mechanisms and other specific details, which are handled by the FrameBuffer device driver.
Here, OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL 3D graphics API, designed for embedded devices such as mobile phones, PDAs and game consoles. OpenGL (Open Graphics Library) is a specification defining a cross-language, cross-platform programming interface for three-dimensional (and also two-dimensional) images. OpenGL is a professional graphics program interface: a powerful and convenient low-level graphics library.
The mapping device 403 maps the layer generated when the 2D application runs into the virtual screen, so as to run the 2D application in the 3D scene.
Specifically, a 2D application generates a layer at runtime; normally this layer would be merged by SurfaceFlinger and displayed on the main screen. Here, the mapping device 403 maps the layer generated when the 2D application runs into the virtual screen, for example by modifying the relevant code in the SurfaceFlinger module so that the layer generated when the 2D application runs is displayed in the aforementioned virtual screen, thereby running the 2D application in the 3D scene.
Here, SurfaceFlinger is an independent service responsible for managing the surfaces of the application side and compositing all surfaces; it is a layer between the graphics library and applications. After each application completes its various drawing operations on its own surface, it requests SurfaceFlinger to show the result on screen; SurfaceFlinger then composites all surfaces together and writes the result to the framebuffer.
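The compositing role described above can be sketched as a back-to-front merge of per-application surfaces into the framebuffer. This is a toy painter's-algorithm model, not SurfaceFlinger's actual code; the surface contents and transparency convention are illustrative assumptions.

```python
# Toy model of surface compositing: each application owns a surface; the
# compositor stacks all surfaces back-to-front and writes the result into
# the framebuffer. Transparent pixels (None) let lower layers show through.

def composite(surfaces, width, height):
    framebuffer = [[0] * width for _ in range(height)]
    for surface in surfaces:                  # back-to-front order
        for y in range(height):
            for x in range(width):
                pixel = surface[y][x]
                if pixel is not None:         # None = transparent
                    framebuffer[y][x] = pixel
    return framebuffer

wallpaper = [[1, 1], [1, 1]]                  # opaque background surface
app = [[None, 7], [None, None]]               # app surface, mostly transparent

fb = composite([wallpaper, app], 2, 2)        # fb now holds the merged result
```

Redirecting a 2D application's layer to the virtual screen amounts to changing where, in this merge step, that one surface's pixels end up.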
At this point, the 2D application is displayed in the foreground and the 3D application in the background: the 2D application runs on a panel inside the 3D application's scene, and the scene of the 3D application can be switched arbitrarily. Moreover, as the user's posture changes, the running device 1 can track the user's head position in real time and update the 3D scene accordingly, making the user experience more realistic. Likewise, after the above processing, the picture of the 2D application seen by the user has already been anti-distorted, and the user can operate the 2D application on a huge virtual-reality screen in the 3D scene just as if operating a mobile phone, which greatly improves the experience of using traditional 2D applications on a VR all-in-one headset.
Preferably, the method can be applied, for example, in the Android system. Those skilled in the art should understand that other systems, whether existing or appearing in the future, are equally applicable to the present invention where suitable; they should also be included within the scope of the present invention and are hereby incorporated by reference.
Preferably, the running device further includes a capture device 404 (not shown) and a conversion device 405 (not shown). The capture device 404 intercepts the system messages of the handle of the virtual reality device and sends them to the 3D application; the 3D application uses the handle ray to interact with the display panel showing the 2D application, obtaining an interaction point. The conversion device 405 converts the interaction point into touch information for the virtual screen and sends it to the 2D application, so as to operate the 2D application.
Specifically, when a 2D application is running in the 3D scene, the 3D application has lost focus; the system therefore no longer actively delivers system information such as key-press messages and handle state to the 3D application, and a module is required to intercept the corresponding system messages and send them to the 3D application. Accordingly, the capture device 404, for example via a preset interception module, intercepts the system messages of the handle of the virtual reality device and sends them to the 3D application; the 3D application uses the handle ray to interact with the display panel showing the 2D application and obtains the interaction point. Here, the 2D application can be regarded as the presentation of a specific object in the 3D scene, so the system can determine whether the ray emitted by the handle interacts with the 2D application or with the 3D application.
Then, the conversion device 405 converts the interaction point into touch information for the virtual screen and sends it to the 2D application, so as to operate the 2D application; the operations include, but are not limited to, click, double-click, right-click, slide, and touch-based zoom in or out.
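The conversion from a handle-ray hit to a touch event on the virtual screen can be sketched as a ray-plane intersection followed by a linear mapping to pixel coordinates. The panel geometry (an axis-aligned rectangle at a fixed depth) and the mapping are illustrative assumptions; the patent does not fix these details.

```python
# Sketch: intersect the handle ray with the plane of the display panel, then
# map the hit point to virtual-screen pixel coordinates. The panel placement
# and linear mapping below are assumptions for illustration only.

def ray_plane_hit(origin, direction, plane_z):
    """Intersect ray origin + t*direction with the plane z = plane_z."""
    if direction[2] == 0:
        return None                       # ray parallel to the panel
    t = (plane_z - origin[2]) / direction[2]
    if t < 0:
        return None                       # panel is behind the controller
    return (origin[0] + t * direction[0], origin[1] + t * direction[1])

def to_touch(hit, panel_min, panel_size, screen_px):
    """Map a hit point on the panel to virtual-screen pixel coordinates."""
    u = (hit[0] - panel_min[0]) / panel_size[0]   # normalized 0..1 across panel
    v = (hit[1] - panel_min[1]) / panel_size[1]
    return (round(u * screen_px[0]), round(v * screen_px[1]))

# Controller at the origin, pointing slightly up-right at a panel 2 m away:
hit = ray_plane_hit(origin=(0.0, 0.0, 0.0),
                    direction=(0.5, 0.25, 1.0), plane_z=2.0)
touch = to_touch(hit, panel_min=(-1.0, -1.0),
                 panel_size=(2.0, 2.0), screen_px=(1920, 1080))
```

The resulting pixel coordinate is what would be injected into the 2D application as a touch event.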
Those skilled in the art should understand that the above operations on the 2D application are only examples; other operations on the 2D application, whether existing or appearing in the future, are also applicable to the present invention where suitable, should likewise be included within the protection scope of the present invention, and are hereby incorporated by reference.
Preferably, the running device further includes a processing unit 406 (not shown), which applies special treatment to the life cycle of the 3D application, so that after the 3D application starts running, starting the 2D application does not cause the 3D application to enter a paused or stopped state.
Specifically, by system default only one application can obtain focus; for example, in the Android system, by default only one application holds focus. Therefore, when the 2D application is started, the 3D application loses focus, and an application that has lost focus would normally stop drawing and could not continue scene rendering. Hence, the processing unit 406 applies special treatment to the life cycle of the 3D application, for example by adjusting the relevant code in the framework, so that after the 3D application starts running, starting the 2D application does not cause the 3D application to enter the pause or stop state. In this way, the 3D scene continues to render, and the state of the 2D application can be seen and interacted with in real time. Correspondingly, the life cycle of the 2D application requires no special treatment and can keep the default settings.
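The effect of this special life-cycle treatment can be modeled with a toy state machine. This is an illustration only; the real change would live in the Android framework's activity-lifecycle code, and the whitelist flag below is an assumed mechanism, not the patent's.

```python
# Toy model: by default, an app that loses focus transitions to "paused" and
# stops drawing. The "special treatment" keeps a designated 3D app in the
# "running" state even when the 2D app takes focus, so the scene keeps rendering.

class App:
    def __init__(self, name: str, keep_running_without_focus: bool = False):
        self.name = name
        self.state = "running"
        self.keep_running = keep_running_without_focus  # the special treatment

    def on_focus_lost(self) -> None:
        if not self.keep_running:     # default lifecycle behaviour
            self.state = "paused"     # paused apps stop drawing

app_3d = App("vr_launcher", keep_running_without_focus=True)  # special-cased
app_default = App("background_2d_app")                         # default rules

app_3d.on_focus_lost()       # starting the 2D app steals focus from the 3D app
app_default.on_focus_lost()  # a non-special app pauses as usual
```

Under this model the 3D application survives the focus loss and keeps rendering, while ordinary applications still follow the default pause behaviour.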
Preferably, the running device further includes an exit device 407 (not shown).
When exiting the 3D application, the exit device 407 first closes all 2D applications running in the 3D scene, and then exits the 3D application. Otherwise, the next time a 2D application is run in the 3D scene, the 2D application would not know where to project, and it might even be projected with no virtual screen available.
The present invention also provides a computer-readable storage medium storing computer code; when the computer code is executed, the method according to any of the preceding items is performed.
The present invention also provides a computer program product; when the computer program product is executed by a computer device, the method according to any of the preceding items is performed.
The present invention also provides a computer device, the computer device comprising:
one or more processors;
a memory for storing one or more computer programs;
wherein, when the one or more computer programs are executed by the one or more processors, the one or more processors implement the method according to any of the preceding items.
It should be noted that the present invention may be implemented in software and/or in a combination of software and hardware; for example, each device of the present invention may be realized using an application-specific integrated circuit (ASIC) or any other similar hardware device. In one embodiment, the software program of the present invention may be executed by a processor to implement the steps or functions described above. Likewise, the software program of the present invention (including related data structures) may be stored in a computer-readable recording medium, for example a RAM memory, a magnetic or optical drive, a floppy disk or similar devices. In addition, some steps or functions of the present invention may be implemented in hardware, for example, as circuitry that cooperates with a processor to execute each step or function.
It is obvious to those skilled in the art that the invention is not limited to the details of the above exemplary embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes of the invention. Therefore, from whichever point of view, the present embodiments are to be considered as illustrative and not restrictive; the scope of the invention is defined by the appended claims rather than by the above description, and all changes that fall within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims should not be construed as limiting the claims involved. Furthermore, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. A plurality of units or devices stated in the system claims may also be implemented by a single unit or device through software or hardware. Words such as "first" and "second" are used to denote names and do not denote any particular order.
Claims (11)
1. A method for running a 2D application in a 3D scene of virtual reality, wherein the method comprises:
creating a first texture and a second texture in a 3D application of virtual reality, creating a virtual screen based on the first texture, and creating a display panel for display in the 3D scene based on the second texture;
copying the display content of the first texture into the second texture in real time, so as to show, in real time on the display panel, the display content in the virtual screen corresponding to the first texture;
mapping the layer generated when the 2D application runs onto the virtual screen, so as to run the 2D application in the 3D scene.
2. The method according to claim 1, wherein the method further comprises:
intercepting system messages of a handle of a virtual reality device and sending them to the 3D application, the 3D application interacting, by means of a handle ray, with the display panel showing the 2D application to obtain an interaction point;
converting the interaction point into touch information for the virtual screen and sending it to the 2D application, so as to operate the 2D application.
3. The method according to claim 1 or 2, wherein the method further comprises:
applying special treatment to the life cycle of the 3D application, so that after the 3D application runs, when the 2D application is started, the 3D application does not enter a paused or stopped state.
4. The method according to claim 3, wherein the method further comprises:
when exiting the 3D application, first closing all 2D applications running in the 3D scene, and then exiting the 3D application.
5. A running device for running a 2D application in a 3D scene of virtual reality, wherein the running device comprises:
a creating device, for creating a first texture and a second texture in a 3D application of virtual reality, creating a virtual screen based on the first texture, and creating a display panel for display in the 3D scene based on the second texture;
a copy device, for copying the display content of the first texture into the second texture in real time, so as to show, in real time on the display panel, the display content in the virtual screen corresponding to the first texture;
a mapping device, for mapping the layer generated when the 2D application runs onto the virtual screen, so as to run the 2D application in the 3D scene.
6. The running device according to claim 5, wherein the running device further comprises:
a capture device, for intercepting system messages of a handle of a virtual reality device and sending them to the 3D application, the 3D application interacting, by means of a handle ray, with the display panel showing the 2D application to obtain an interaction point;
a conversion device, for converting the interaction point into touch information for the virtual screen and sending it to the 2D application, so as to operate the 2D application.
7. The running device according to claim 5 or 6, wherein the running device further comprises:
a processing unit, for applying special treatment to the life cycle of the 3D application, so that after the 3D application runs, when the 2D application is started, the 3D application does not enter a paused or stopped state.
8. The running device according to claim 7, wherein the running device further comprises:
an exit device, for, when exiting the 3D application, first closing all 2D applications running in the 3D scene, and then exiting the 3D application.
9. A computer-readable storage medium storing computer code, wherein when the computer code is executed, the method according to any one of claims 1 to 4 is performed.
10. A computer program product, wherein when the computer program product is executed by a computer device, the method according to any one of claims 1 to 4 is performed.
11. A computer device, the computer device comprising:
one or more processors;
a memory for storing one or more computer programs;
wherein, when the one or more computer programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810904316.1A CN109308742A (en) | 2018-08-09 | 2018-08-09 | A kind of method and apparatus running 2D application in the 3D scene of virtual reality |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109308742A true CN109308742A (en) | 2019-02-05 |
Family
ID=65225918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810904316.1A Withdrawn CN109308742A (en) | 2018-08-09 | 2018-08-09 | A kind of method and apparatus running 2D application in the 3D scene of virtual reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109308742A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150116310A1 (en) * | 2013-10-28 | 2015-04-30 | Vmware, Inc. | Method and system to virtualize graphic processing services |
CN105426191A (en) * | 2015-11-23 | 2016-03-23 | 深圳创维-Rgb电子有限公司 | User interface display processing method and device |
CN105447898A (en) * | 2015-12-31 | 2016-03-30 | 北京小鸟看看科技有限公司 | Method and device for displaying 2D application interface in virtual real device |
CN105528207A (en) * | 2015-12-03 | 2016-04-27 | 北京小鸟看看科技有限公司 | Virtual reality system, and method and apparatus for displaying Android application images therein |
CN106200956A (en) * | 2016-07-07 | 2016-12-07 | 北京时代拓灵科技有限公司 | A kind of field of virtual reality multimedia presents and mutual method |
CN107277483A (en) * | 2017-05-11 | 2017-10-20 | 深圳市冠旭电子股份有限公司 | A kind of virtual reality display methods, device and virtual reality glasses |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111459266A (en) * | 2020-03-02 | 2020-07-28 | 重庆爱奇艺智能科技有限公司 | Method and device for operating 2D application in virtual reality 3D scene |
CN112200901A (en) * | 2020-10-30 | 2021-01-08 | 南京爱奇艺智能科技有限公司 | Three-dimensional display method and device of target application and virtual reality equipment |
CN112785530A (en) * | 2021-02-05 | 2021-05-11 | 广东九联科技股份有限公司 | Image rendering method, device and equipment for virtual reality and VR equipment |
CN112785530B (en) * | 2021-02-05 | 2024-05-24 | 广东九联科技股份有限公司 | Image rendering method, device and equipment for virtual reality and VR equipment |
CN113342220A (en) * | 2021-05-11 | 2021-09-03 | 杭州灵伴科技有限公司 | Window rendering method, head-mounted display kit, and computer-readable medium |
CN113342220B (en) * | 2021-05-11 | 2023-09-12 | 杭州灵伴科技有限公司 | Window rendering method, head-mounted display suite and computer-readable medium |
WO2024066750A1 (en) * | 2022-09-29 | 2024-04-04 | 歌尔股份有限公司 | Display control method and apparatus, augmented reality head-mounted device, and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109308742A (en) | A kind of method and apparatus running 2D application in the 3D scene of virtual reality | |
EP3008701B1 (en) | Using compute shaders as front end for vertex shaders | |
EP2469474B1 (en) | Creation of a playable scene with an authoring system | |
US20100289804A1 (en) | System, mechanism, and apparatus for a customizable and extensible distributed rendering api | |
KR20210151114A (en) | Hybrid rendering | |
US8938093B2 (en) | Addition of immersive interaction capabilities to otherwise unmodified 3D graphics applications | |
US10628995B2 (en) | Anti-aliasing of graphical elements defined based on functions | |
US10134170B2 (en) | Stereoscopic rendering using vertix shader instancing | |
US20150339038A1 (en) | System and method for capturing occluded graphical user interfaces | |
JP2016529593A (en) | Interleaved tiled rendering of 3D scenes | |
JP2012190428A (en) | Stereoscopic image visual effect processing method | |
US20130127849A1 (en) | Common Rendering Framework and Common Event Model for Video, 2D, and 3D Content | |
US11042955B2 (en) | Manipulating display content of a graphical user interface | |
CN111459266A (en) | Method and device for operating 2D application in virtual reality 3D scene | |
Eisemann et al. | Visibility sampling on gpu and applications | |
Mortensen et al. | Real-time global illumination for vr applications | |
Bues et al. | VD1: a technical approach to a hybrid 2D and 3D desktop environment | |
TW202141429A (en) | Rendering using shadow information | |
Buhr et al. | Real-time aspects of VR systems | |
CN115715464A (en) | Method and apparatus for occlusion handling techniques | |
Karanjai | Optimizing Web Virtual Reality | |
US20160267642A1 (en) | Projecting a Virtual Copy of a Remote Object | |
Thelen | Advanced Visualization and Interaction Techniques for Large High-Resolution Displays | |
Nilsson | Hardware Supported Frame Correction in Touch Screen Systems-For a Guaranteed Low Processing Latency | |
Schulze-Döbold | Interactive volume rendering in virtual environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
WW01 | Invention patent application withdrawn after publication | Application publication date: 20190205 |