CN109584148A - Method and apparatus for processing a two-dimensional interface in a VR device - Google Patents

Method and apparatus for processing a two-dimensional interface in a VR device

Info

Publication number
CN109584148A
Authority
CN
China
Prior art keywords
interface
equipment
dimensional
intersection
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811426537.9A
Other languages
Chinese (zh)
Inventor
王建峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing IQIYI Intelligent Technology Co Ltd
Original Assignee
Chongqing IQIYI Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing IQIYI Intelligent Technology Co Ltd filed Critical Chongqing IQIYI Intelligent Technology Co Ltd
Priority to CN201811426537.9A priority Critical patent/CN109584148A/en
Publication of CN109584148A publication Critical patent/CN109584148A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/08 Projecting images onto non-planar surfaces, e.g. geodetic screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The object of the present invention is to provide a method and apparatus for processing a two-dimensional interface in a VR device. The method according to the present invention comprises the following steps: obtaining a two-dimensional interface to be presented from another device; determining the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle; and determining, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane. The invention has the following advantages: it achieves a fast conversion from a two-dimensional plane to a three-dimensional plane, and it enables user devices such as mobile phones to respond to operations that the user performs with VR devices such as the handle, thereby meeting the need to interact, within the VR device, with a two-dimensional interface originating from another device and improving the user experience.

Description

Method and apparatus for processing a two-dimensional interface in a VR device
Technical field
The present invention relates to the field of virtual reality (Virtual Reality, VR), and in particular to a method and apparatus for processing a two-dimensional interface in a VR device.
Background art
In the prior art, some three-dimensional VR applications can provide users with a highly immersive experience. However, for two-dimensional applications on devices such as mobile phones, the interfaces of those applications are two-dimensional, and prior-art solutions generally cannot let the user interact with such a two-dimensional interface through a VR device. Therefore, based on the prior art, the interface of a two-dimensional application on a user device such as a mobile phone cannot satisfy the user's need to interact with that interface through VR devices such as a handle, and the user experience remains to be improved.
Summary of the invention
The object of the present invention is to provide a method and apparatus for processing a two-dimensional interface in a VR device.
According to one aspect of the present invention, there is provided a method for processing a two-dimensional interface in a VR device, wherein the VR device comprises a VR handle, a VR control device and a VR screen, and the method comprises the following steps:
a. obtaining the two-dimensional interface to be presented from another device;
b. determining the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle;
c. determining, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane.
According to another aspect of the present invention, there is also provided an interface processing apparatus for processing a two-dimensional interface in a VR device, wherein the interface processing apparatus comprises:
an interface acquisition device, for obtaining the two-dimensional interface to be presented from another device;
an intersection determining device, for determining the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle;
a plane conversion device, for determining, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane.
According to another aspect of the present invention, there is also provided a VR device, wherein the VR device comprises the interface processing apparatus according to the present invention.
Compared with the prior art, the present invention has the following advantages: by converting the two-dimensional interface from another device into a three-dimensional plane with depth information presented in the VR device, a fast conversion from a two-dimensional plane to a three-dimensional plane is achieved; moreover, the solution of the present invention enables user devices such as mobile phones to respond to operations that the user performs with VR devices such as the handle, meeting the need to interact in the VR device with a two-dimensional interface from another device and improving the user experience.
Brief description of the drawings
Other features, objects and advantages of the present invention will become more apparent by reading the following detailed description of non-limiting embodiments made with reference to the accompanying drawings:
Fig. 1 shows a flow chart of a method for processing a two-dimensional interface in a VR device according to the present invention;
Fig. 2 shows a schematic structural diagram of an interface processing apparatus for processing a two-dimensional interface in a VR device according to the present invention.
In the drawings, the same or similar reference numerals denote the same or similar components.
Detailed description of the embodiments
The present invention is further described in detail below with reference to the accompanying drawings.
Fig. 1 shows a flow chart of a method for processing a two-dimensional interface in a VR device according to the present invention.
The method according to the present invention is implemented by an interface processing apparatus included in the VR (Virtual Reality) device. The VR device comprises a VR handle, a VR control device and a VR screen.
Preferably, the VR device according to the present invention can interact with a user device through a network.
The user device includes, but is not limited to, any electronic product capable of human-computer interaction with the user through a keyboard, a mouse, a remote controller, a touch pad or a voice control device, for example a personal computer, a tablet computer, a smart phone, a PDA, a game console or an IPTV. The network in which the user device and the network device are located includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network and a VPN network.
It should be noted that the user device, the network device and the network are merely examples; other existing or future user devices, network devices and networks, if applicable to the present invention, shall also fall within the protection scope of the present invention and are incorporated herein by reference.
Referring to Fig. 1, in step S1, the interface processing apparatus obtains the two-dimensional interface to be presented from another device.
The two-dimensional interface includes various types of two-dimensional interfaces, for example the interface of an application on a mobile phone or tablet computer, or the user interface of an operating system.
The ways in which the interface processing apparatus obtains the two-dimensional interface to be presented from another device include:
1) receiving the two-dimensional interface sent by the other device and taking it as the two-dimensional interface to be presented;
2) obtaining a pre-stored two-dimensional interface originating from the other device and taking it as the two-dimensional interface to be presented.
Preferably, the interface processing apparatus crops the obtained two-dimensional interface from the other device, so that a part of that two-dimensional interface serves as the two-dimensional interface to be presented.
Preferably, the interface processing apparatus presents the two-dimensional interface from the other device in a partial region of the VR screen.
For example, the VR screen is divided into a left part and a right part, and the interface processing apparatus presents the two-dimensional interface from the other device in the left part.
Continuing with Fig. 1, in step S2, the interface processing apparatus determines the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle.
The ways in which the interface processing apparatus determines the coordinates of the intersection point between the presented two-dimensional interface and the VR handle ray include:
1) If the three-dimensional plane corresponding to the two-dimensional interface to be presented is fixed, the interface processing apparatus solves the plane equation of the three-dimensional plane and the ray equation of the VR handle ray, and then determines, according to the plane equation and the ray equation, the coordinates of the intersection point between the three-dimensional plane and the VR handle ray.
For example, assume that the position of the presented two-dimensional interface is fixed, the normal vector of the three-dimensional plane corresponding to the two-dimensional interface is denoted n(a, b, c), and the coordinates of the center point of the plane are P(0, 0, -D), where D is the distance between the center point of the three-dimensional plane and the origin of the space coordinate system. The plane equation of the three-dimensional plane is then expressed as:
a(x - 0) + b(y - 0) + c(z + D) = 0 (1)
Solving this equation yields the plane PlaneX. The direction vector of the handle ray is denoted t(l, m, n) and its start point is Q(0, 0, 0), so the ray equation is expressed as:
(x - 0)/l = (y - 0)/m = (z - 0)/n (2)
Solving this ray equation yields the ray RayX. Then, by solving:
IntersectPoint(x, y, z) = intersect(PlaneX, RayX) (3)
the coordinates of the intersection point between the three-dimensional plane and the ray are obtained, where IntersectPoint(x, y, z) denotes the coordinates of the intersection point and intersect is the function that solves for the intersection point.
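By way of illustration only (this sketch is not part of the original patent disclosure), the fixed-plane case of equations (1) to (3) amounts to a standard parametric ray-plane intersection. The following Python sketch uses hypothetical names such as intersect_ray_plane, and the numeric values are placeholders:

    import numpy as np

    def intersect_ray_plane(ray_origin, ray_dir, plane_normal, plane_point):
        """Return the intersection of a ray with a plane, or None if there is none.

        Mirrors equations (1)-(3): the plane is n . (X - P) = 0, the ray is
        X = Q + s * t, and solving for s gives IntersectPoint.
        """
        n = np.asarray(plane_normal, dtype=float)
        t = np.asarray(ray_dir, dtype=float)
        denom = float(np.dot(n, t))
        if abs(denom) < 1e-9:      # ray parallel to the plane: no intersection
            return None
        s = float(np.dot(n, np.asarray(plane_point, dtype=float) - np.asarray(ray_origin, dtype=float))) / denom
        if s < 0:                  # the plane lies behind the handle: ignore
            return None
        return np.asarray(ray_origin, dtype=float) + s * t

    # Fixed plane of the example: normal n(a, b, c), center point P(0, 0, -D) with D = 2
    intersect_point = intersect_ray_plane(
        ray_origin=(0.0, 0.0, 0.0),    # Q(0, 0, 0), start point of the handle ray
        ray_dir=(0.1, -0.05, -1.0),    # t(l, m, n), taken from the handle pose
        plane_normal=(0.0, 0.0, 1.0),  # n(a, b, c)
        plane_point=(0.0, 0.0, -2.0),  # P(0, 0, -D)
    )
    print(intersect_point)             # IntersectPoint(x, y, z)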
2) If the three-dimensional plane corresponding to the two-dimensional interface to be presented is moving, the initial plane corresponds to the initial position of the three-dimensional plane and the current plane corresponds to the current position of the three-dimensional plane. The interface processing apparatus solves the plane equation of the current plane and the ray equation of the VR handle ray, then determines, according to the plane equation and the ray equation of the current plane, the coordinates of the intersection point between the current plane and the VR handle ray, and then determines, according to the coordinates of the intersection point and the motion data of the VR control device, the corresponding point of that intersection point in the initial plane.
Preferably, the interface processing apparatus takes the position, in the virtual scene, of the eyes of the user wearing the VR control device as the origin of the space coordinate system.
Specifically, the interface processing apparatus may take the position of the left eye or the right eye in the virtual scene as the origin of the space coordinate system. Preferably, based on a predetermined interpupillary distance (for example 6 cm), the interface processing apparatus takes the position in the virtual scene of the midpoint between the left and right eyes as the origin of the space coordinate system.
For example, assume that the presented two-dimensional interface always stays directly in front of the eyes of the user wearing the VR control device, the normal vector of the three-dimensional plane corresponding to the two-dimensional interface is denoted n(a, b, c), and the coordinates of the center point of the plane are P(aD, bD, cD), where D is the distance between the center point of the three-dimensional plane and the origin of the space coordinate system. The plane equation of the current plane is then expressed as:
a(x - aD) + b(y - bD) + c(z - cD) = 0 (4)
Solving this equation yields the current plane PlaneX. The direction vector of the handle ray is denoted t(l, m, n) and its start point is Q(0, 0, 0), so the ray equation is expressed as the above equation (2); solving that ray equation yields the ray RayX. Then, solving the above equation (3) yields the coordinates IntersectPoint(x, y, z) of the intersection point between the current plane and the VR handle ray.
Then, by solving:
origin_intersect_point(x, y, z) = IntersectPoint(x, y, z) * Rotate(-Headf_Rotate_Matrix(hx, hy, hz, hw)) (5)
the coordinates of the point in the initial plane corresponding to the intersection point are obtained, where Headf_Rotate_Matrix(hx, hy, hz, hw) denotes the quaternion representation of the motion of the head control device, and origin_intersect_point(x, y, z) denotes the coordinates of the point in the initial plane corresponding to IntersectPoint(x, y, z).
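For illustration only (not part of the original disclosure), equation (5) can be sketched by applying the inverse of the head rotation quaternion to the intersection point. The helper names and the (x, y, z, w) quaternion convention below are assumptions:

    import numpy as np

    def quat_conjugate(q):
        """Conjugate of a quaternion (x, y, z, w); equal to the inverse for unit quaternions."""
        x, y, z, w = q
        return np.array([-x, -y, -z, w])

    def quat_mul(a, b):
        """Hamilton product of two quaternions given as (x, y, z, w)."""
        ax, ay, az, aw = a
        bx, by, bz, bw = b
        return np.array([
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw,
            aw * bw - ax * bx - ay * by - az * bz,
        ])

    def quat_rotate(q, v):
        """Rotate vector v by unit quaternion q using q * v * q^-1."""
        qv = np.array([v[0], v[1], v[2], 0.0])
        return quat_mul(quat_mul(q, qv), quat_conjugate(q))[:3]

    # Equation (5): undo the head rotation to map the intersection point back to the initial plane.
    head_quat = np.array([0.0, 0.3826834, 0.0, 0.9238795])  # (hx, hy, hz, hw), assumed 45 degree yaw
    intersect_point = np.array([0.2, -0.1, -2.0])            # IntersectPoint(x, y, z) on the current plane
    origin_intersect_point = quat_rotate(quat_conjugate(head_quat), intersect_point)
    print(origin_intersect_point)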
It should be noted that the above examples are merely illustrative of the technical solution of the present invention and are not a limitation of it. It should be understood by those skilled in the art that any implementation that determines the coordinates of the intersection point between the presented two-dimensional interface and the VR handle ray shall be included within the scope of the present invention.
Continuing with Fig. 1, in step S3, the interface processing apparatus determines, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into the corresponding three-dimensional plane.
For example, taking the interface of a two-dimensional game application on a mobile phone as the two-dimensional interface to be presented, the interface processing apparatus obtains, based on the above equation (5), the coordinates of the intersection point between the two-dimensional interface and the VR handle ray, and obtains, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane, such that the interface of the game application presented on the VR screen has depth information.
Preferably, for any point in the converted three-dimensional plane, the corresponding two-dimensional coordinates of that point in the two-dimensional interface are obtained by removing the depth information from its three-dimensional coordinates.
For example, if the coordinates of the intersection point obtained by the above equation (3) are (X, Y, Z), removing the depth information Z yields the corresponding two-dimensional coordinates (X, Y) in the two-dimensional interface.
According to a preferred embodiment of the present invention, when the user performs an operation with the VR handle, the method comprises step S4 (not shown), step S5 (not shown) and step S6 (not shown).
In step S4, the interface processing apparatus determines, according to the motion-sensing data corresponding to the operation, the location point to be processed in the three-dimensional plane.
The motion-sensing data includes various data representing the motion state of the VR device, for example quaternion information of the VR control device or of the VR handle.
In step S5, the interface processing apparatus obtains, according to the location point to be processed, the corresponding two-dimensional coordinates of that location point in the two-dimensional interface.
Specifically, the interface processing apparatus converts the three-dimensional coordinates corresponding to the location point to be processed into corresponding two-dimensional coordinates, thereby obtaining the corresponding two-dimensional coordinates of that location point in the two-dimensional interface.
In step S6, the interface processing apparatus sends the two-dimensional coordinates of the location point to be processed and the operation-related information corresponding to the operation to the other device, so that the corresponding operation is executed on the other device.
The operation-related information includes various information related to the operation performed by the user through the VR device, for example the operation name, the operation duration, and the like.
For example, the two-dimensional interface to be presented is the playback interface of a mobile phone music player, and the interface processing apparatus converts this two-dimensional playback interface into a corresponding three-dimensional plane and presents it on the VR screen by performing the operations of steps S1 to S3. The user performs, with the VR handle, an operation of clicking the play button in the interface; according to the VR handle motion-sensing data corresponding to the operation, the interface processing apparatus takes the point in the three-dimensional plane corresponding to the region of the play button as the location point to be processed, converts the three-dimensional coordinates corresponding to that location point into corresponding two-dimensional coordinates to obtain the corresponding two-dimensional coordinates of the location point in the two-dimensional interface, and then sends the two-dimensional coordinates of the location point to be processed and the operation-related information "click" corresponding to the operation to the mobile phone, so that the mobile phone determines therefrom that the point to be processed corresponds to the play button in its interface and that the operation to be executed is "play", whereby the music player on the mobile phone responds to the user's operation performed through the VR handle and plays the music (a minimal sketch of this flow is given below).
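By way of illustration only (not part of the original disclosure), steps S4 to S6 can be sketched end to end as follows. The helper names, the send_to_phone callback and the payload fields are hypothetical, and dropping the Z coordinate mirrors the simplified depth-removal mapping described above:

    import json
    import numpy as np

    def handle_vr_operation(handle_origin, handle_dir, plane_normal, plane_point, send_to_phone):
        """Sketch of steps S4-S6 under assumed interfaces."""
        # S4: location point to be processed = intersection of the handle ray with the 3D plane
        n = np.asarray(plane_normal, dtype=float)
        d = np.asarray(handle_dir, dtype=float)
        denom = float(np.dot(n, d))
        if abs(denom) < 1e-9:
            return None                                   # ray misses the plane
        s = float(np.dot(n, np.asarray(plane_point, dtype=float) - np.asarray(handle_origin, dtype=float))) / denom
        point_3d = np.asarray(handle_origin, dtype=float) + s * d
        # S5: remove the depth information Z to obtain the 2D coordinates (X, Y) in the interface
        x, y, _z = point_3d
        # S6: send the 2D coordinates and the operation-related information to the other device
        payload = {"x": float(x), "y": float(y), "action": "click"}
        send_to_phone(json.dumps(payload))
        return payload

    # Usage sketch: the user clicks the play button of the music player interface.
    handle_vr_operation(
        handle_origin=(0.0, 0.0, 0.0),
        handle_dir=(0.05, -0.02, -1.0),   # derived from the VR handle motion-sensing data
        plane_normal=(0.0, 0.0, 1.0),
        plane_point=(0.0, 0.0, -2.0),
        send_to_phone=print,              # stand-in for the real transport to the mobile phone
    )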
Preferably, the interface processing apparatus determines, based on a pre-stored correspondence between operations of the VR device and operations of the other device, the operation on the other device that corresponds to the operation performed by the user through the VR device, and uses it as the operation-related information.
For example, each type of operation of the VR handle is mapped to a touch operation on the smart phone, such as a tap or a swipe, and this correspondence is stored in advance, so that when the user performs a certain operation through the VR device, the corresponding touch operation on the smart phone is determined based on the correspondence (a minimal sketch of such a mapping is given below).
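For illustration only, such a pre-stored correspondence could be as simple as a lookup table; the operation names below are hypothetical examples rather than values taken from the patent:

    # Hypothetical pre-stored correspondence between VR handle operations and phone touch operations.
    VR_TO_PHONE_OPERATION = {
        "trigger_click": "tap",        # clicking the handle trigger -> tap on the phone screen
        "trigger_hold": "long_press",  # holding the trigger -> long press
        "touchpad_swipe": "swipe",     # swiping on the handle touchpad -> swipe on the phone screen
    }

    def operation_related_info(vr_operation: str) -> str:
        """Look up the phone-side operation corresponding to a VR handle operation."""
        return VR_TO_PHONE_OPERATION.get(vr_operation, "tap")

    print(operation_related_info("trigger_click"))  # -> "tap"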
According to the method of the present invention, by converting the two-dimensional interface from another device into a three-dimensional plane with depth information presented in the VR device, a fast conversion from a two-dimensional plane to a three-dimensional plane is achieved; moreover, the solution according to the present invention enables user devices such as mobile phones to respond to operations that the user performs with VR devices such as the handle, meeting the need to interact in the VR device with a two-dimensional interface from another device and improving the user experience.
Fig. 2 shows a schematic structural diagram of an interface processing apparatus for processing a two-dimensional interface in a VR device according to the present invention. The interface processing apparatus according to the present invention comprises an interface acquisition device 1, an intersection determining device 2 and a plane conversion device 3.
Referring to Fig. 2, the interface acquisition device 1 obtains the two-dimensional interface to be presented from another device.
The two-dimensional interface includes various types of two-dimensional interfaces, for example the interface of an application on a mobile phone or tablet computer, or the user interface of an operating system.
The ways in which the interface acquisition device 1 obtains the two-dimensional interface to be presented from another device include:
1) receiving the two-dimensional interface sent by the other device and taking it as the two-dimensional interface to be presented;
2) obtaining a pre-stored two-dimensional interface originating from the other device and taking it as the two-dimensional interface to be presented.
Preferably, the interface processing apparatus crops the obtained two-dimensional interface from the other device, so that a part of that two-dimensional interface serves as the two-dimensional interface to be presented.
Preferably, the interface processing apparatus comprises a partial presentation device. The partial presentation device presents the two-dimensional interface from the other device in a partial region of the VR screen.
For example, the VR screen is divided into a left part and a right part, and the partial presentation device presents the two-dimensional interface from the other device in the left part.
Continuing with Fig. 2, the intersection determining device 2 determines the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle.
The ways in which the intersection determining device 2 determines the coordinates of the intersection point between the presented two-dimensional interface and the VR handle ray include:
1) If the three-dimensional plane corresponding to the two-dimensional interface to be presented is fixed, the intersection determining device 2 solves the plane equation of the three-dimensional plane and the ray equation of the VR handle ray, and then determines, according to the plane equation and the ray equation, the coordinates of the intersection point between the three-dimensional plane and the VR handle ray.
2) If the three-dimensional plane corresponding to the two-dimensional interface to be presented is moving, the initial plane corresponds to the initial position of the three-dimensional plane and the current plane corresponds to the current position of the three-dimensional plane. The intersection determining device 2 solves the plane equation of the current plane and the ray equation of the VR handle ray, then determines, according to the plane equation and the ray equation of the current plane, the coordinates of the intersection point between the current plane and the VR handle ray, and then determines, according to the coordinates of the intersection point and the motion data of the VR control device, the corresponding point of that intersection point in the initial plane.
Preferably, the intersection determining device 2 takes the position, in the virtual scene, of the eyes of the user wearing the VR control device as the origin of the space coordinate system.
Specifically, the intersection determining device 2 may take the position of the left eye or the right eye in the virtual scene as the origin of the space coordinate system. Preferably, based on a predetermined interpupillary distance (for example 6 cm), the intersection determining device 2 takes the position in the virtual scene of the midpoint between the left and right eyes as the origin of the space coordinate system.
Continuing with Fig. 2, the plane conversion device 3 determines, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into the corresponding three-dimensional plane.
For example, taking the interface of a two-dimensional game application on a mobile phone as the two-dimensional interface to be presented, the intersection determining device 2 obtains, based on the above equation (5), the coordinates of the intersection point between the two-dimensional interface and the VR handle ray, and the plane conversion device 3 obtains, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane, such that the interface of the game application presented on the VR screen has depth information.
Preferably, for any point in the converted three-dimensional plane, the interface processing apparatus obtains the corresponding two-dimensional coordinates of that point in the two-dimensional interface by removing the depth information from its three-dimensional coordinates.
For example, if the coordinates of the intersection point obtained by the above equation (3) are (X, Y, Z), removing the depth information Z yields the corresponding two-dimensional coordinates (X, Y) in the two-dimensional interface.
According to a preferred embodiment of the present invention, when the user performs an operation with the VR handle, the interface processing apparatus comprises a first determining device, a first acquisition device and a sending device.
The first determining device determines, according to the motion-sensing data corresponding to the operation, the location point to be processed in the three-dimensional plane.
The motion-sensing data includes various data representing the motion state of the VR device, for example quaternion information of the VR control device or of the VR handle.
The first acquisition device obtains, according to the location point to be processed, the corresponding two-dimensional coordinates of that location point in the two-dimensional interface.
Specifically, the first acquisition device converts the three-dimensional coordinates corresponding to the location point to be processed into corresponding two-dimensional coordinates, thereby obtaining the corresponding two-dimensional coordinates of that location point in the two-dimensional interface.
The sending device sends the two-dimensional coordinates of the location point to be processed and the operation-related information corresponding to the operation to the other device, so that the corresponding operation is executed on the other device.
The operation-related information includes various information related to the operation performed by the user through the VR device, for example the operation name, the operation duration, and the like.
For example, the two-dimensional interface to be presented is the playback interface of a mobile phone music player, and the interface processing apparatus converts this two-dimensional playback interface into a corresponding three-dimensional plane and presents it on the VR screen through the operations of the interface acquisition device 1 to the plane conversion device 3. The user performs, with the VR handle, an operation of clicking the play button in the interface; according to the VR handle motion-sensing data corresponding to the operation, the first determining device takes the point in the three-dimensional plane corresponding to the region of the play button as the location point to be processed; the first acquisition device converts the three-dimensional coordinates corresponding to that location point into corresponding two-dimensional coordinates to obtain the corresponding two-dimensional coordinates of the location point in the two-dimensional interface; and the sending device sends the two-dimensional coordinates of the location point to be processed and the operation-related information "click" corresponding to the operation to the mobile phone, so that the mobile phone determines therefrom that the point to be processed corresponds to the play button in its interface and that the operation to be executed is "play", whereby the music player on the mobile phone responds to the user's operation performed through the VR handle and plays the music.
Preferably, the interface processing apparatus determines, based on a pre-stored correspondence between operations of the VR device and operations of the other device, the operation on the other device that corresponds to the operation performed by the user through the VR device, and uses it as the operation-related information.
For example, each type of operation of the VR handle is mapped to a touch operation on the smart phone, such as a tap or a swipe, and this correspondence is stored in advance, so that when the user performs a certain operation through the VR device, the corresponding touch operation on the smart phone is determined based on the correspondence.
According to the solution of the present invention, by converting the two-dimensional interface from another device into a three-dimensional plane with depth information presented in the VR device, a fast conversion from a two-dimensional plane to a three-dimensional plane is achieved; moreover, the solution according to the present invention enables user devices such as mobile phones to respond to operations that the user performs with VR devices such as the handle, meeting the need to interact in the VR device with a two-dimensional interface from another device and improving the user experience.
The software program of the present invention may be executed by a processor to implement the steps or functions described above. Likewise, the software program of the present invention (including related data structures) may be stored in a computer-readable recording medium, for example a RAM memory, a magnetic or optical drive, a floppy disk and similar devices. In addition, some steps or functions of the present invention may be implemented in hardware, for example as a circuit that cooperates with the processor to perform each function or step.
In addition, a part of the present invention may be embodied as a computer program product, for example computer program instructions which, when executed by a computer, may invoke or provide the method and/or technical solution according to the present invention through the operation of that computer. The program instructions that invoke the method of the present invention may be stored in a fixed or removable recording medium, and/or transmitted through broadcast or a data stream in another signal-bearing medium, and/or stored in a working memory of a computer device running according to the program instructions. One embodiment of the present invention includes an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein, when the computer program instructions are executed by the processor, the apparatus is triggered to run the method and/or technical solution based on the aforementioned embodiments of the present invention.
It is obvious to those skilled in the art that the present invention is not limited to the details of the above exemplary embodiments, and that the present invention can be implemented in other specific forms without departing from the spirit or essential characteristics of the present invention. The embodiments should therefore be regarded in all respects as illustrative and not restrictive, and the scope of the present invention is defined by the appended claims rather than by the above description; it is therefore intended that all changes falling within the meaning and range of equivalents of the claims are embraced in the present invention. Any reference signs in the claims should not be construed as limiting the claims involved. Furthermore, it is clear that the word "comprising" does not exclude other units or steps, and the singular does not exclude the plural. A plurality of units or devices recited in the system claims may also be implemented by one unit or device through software or hardware. Words such as first and second are used to denote names and do not denote any particular order.

Claims (15)

1. A method for processing a two-dimensional interface in a VR device, wherein the VR device comprises a VR handle, a VR control device and a VR screen, and the method comprises the following steps:
a. obtaining the two-dimensional interface to be presented from another device;
b. determining the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle;
c. determining, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane.
2. The method according to claim 1, wherein, when the user performs an operation with the VR handle, the method comprises the following steps:
determining, according to the motion-sensing data corresponding to the operation, the location point to be processed in the three-dimensional plane;
obtaining, according to the location point to be processed, the corresponding two-dimensional coordinates of the location point to be processed in the two-dimensional interface;
sending the two-dimensional coordinates of the location point to be processed and the operation-related information corresponding to the operation to the other device, so that the corresponding operation is executed on the other device.
3. The method according to claim 1 or 2, wherein, if the three-dimensional plane corresponding to the two-dimensional interface to be presented is fixed, step b comprises the following steps:
solving the plane equation of the three-dimensional plane and the ray equation of the VR handle ray;
determining, according to the plane equation and the ray equation, the coordinates of the intersection point between the three-dimensional plane and the VR handle ray.
4. The method according to claim 1 or 2, wherein, if the three-dimensional plane corresponding to the two-dimensional interface is moving, the initial plane corresponds to the initial position of the three-dimensional plane and the current plane corresponds to the current position of the three-dimensional plane, and the method comprises the following steps:
solving the plane equation of the current plane and the ray equation of the VR handle ray;
determining, according to the plane equation and the ray equation of the current plane, the coordinates of the intersection point between the current plane and the VR handle ray;
determining, according to the coordinates of the intersection point and the motion data of the VR control device, the corresponding point of the intersection point in the initial plane.
5. The method according to any one of claims 1 to 4, wherein the method takes the position, in the virtual scene, of the eyes of the user wearing the VR control device as the origin of the space coordinate system.
6. The method according to claim 1, wherein the method comprises the following step:
presenting the two-dimensional interface from the other device in a partial region of the VR screen.
7. An interface processing apparatus for processing a two-dimensional interface in a VR device, wherein the interface processing apparatus comprises:
an interface acquisition device, for obtaining the two-dimensional interface to be presented from another device;
an intersection determining device, for determining the coordinates of the intersection point between the presented two-dimensional interface and the ray cast by the VR handle;
a plane conversion device, for determining, according to the coordinates of the intersection point, the depth information corresponding to the two-dimensional interface, so as to convert the two-dimensional interface into a corresponding three-dimensional plane.
8. The interface processing apparatus according to claim 7, wherein, when the user performs an operation with the VR handle, the interface processing apparatus comprises:
a first determining device, for determining, according to the motion-sensing data corresponding to the operation, the location point to be processed in the three-dimensional plane;
a first acquisition device, for obtaining, according to the location point to be processed, the corresponding two-dimensional coordinates of the location point to be processed in the two-dimensional interface;
a sending device, for sending the two-dimensional coordinates of the location point to be processed and the operation-related information corresponding to the operation to the other device, so that the corresponding operation is executed on the other device.
9. The interface processing apparatus according to claim 7 or 8, wherein, if the three-dimensional plane corresponding to the two-dimensional interface to be presented is fixed, the intersection determining device is configured to:
solve the plane equation of the three-dimensional plane and the ray equation of the VR handle ray;
determine, according to the plane equation and the ray equation, the coordinates of the intersection point between the three-dimensional plane and the VR handle ray.
10. The interface processing apparatus according to claim 7 or 8, wherein, if the three-dimensional plane corresponding to the two-dimensional interface to be presented is moving, the initial plane corresponds to the initial position of the three-dimensional plane and the current plane corresponds to the current position of the three-dimensional plane, the intersection determining device is configured to:
solve the plane equation of the current plane and the ray equation of the VR handle ray;
determine, according to the plane equation and the ray equation of the current plane, the coordinates of the intersection point between the current plane and the VR handle ray;
determine, according to the coordinates of the intersection point and the motion data of the VR control device, the corresponding point of the intersection point in the initial plane.
11. The interface processing apparatus according to any one of claims 7 to 10, wherein the interface processing apparatus takes the position, in the virtual scene, of the eyes of the user wearing the VR control device as the origin of the space coordinate system.
12. The interface processing apparatus according to claim 7, wherein the interface processing apparatus comprises:
a partial presentation device, for presenting the two-dimensional interface from the other device in a partial region of the VR screen.
13. A VR device, wherein the VR device comprises the interface processing apparatus according to any one of claims 7 to 14.
14. A computer device, comprising a memory, a processor and a computer program stored in the memory and runnable on the processor, characterized in that the processor, when executing the program, implements the method according to any one of claims 1 to 6.
15. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1 to 6.
CN201811426537.9A 2018-11-27 2018-11-27 Method and apparatus for processing a two-dimensional interface in a VR device Pending CN109584148A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811426537.9A CN109584148A (en) 2018-11-27 2018-11-27 Method and apparatus for processing a two-dimensional interface in a VR device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811426537.9A CN109584148A (en) 2018-11-27 2018-11-27 Method and apparatus for processing a two-dimensional interface in a VR device

Publications (1)

Publication Number Publication Date
CN109584148A true CN109584148A (en) 2019-04-05

Family

ID=65924515

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811426537.9A Pending CN109584148A (en) Method and apparatus for processing a two-dimensional interface in a VR device

Country Status (1)

Country Link
CN (1) CN109584148A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111243103A (en) * 2020-01-07 2020-06-05 青岛小鸟看看科技有限公司 Method and device for setting safety area, VR equipment and storage medium
CN111803930A (en) * 2020-07-20 2020-10-23 网易(杭州)网络有限公司 Multi-platform interaction method and device and electronic equipment
CN114470769A (en) * 2022-04-01 2022-05-13 苏州达家迎信息技术有限公司 Interactive data processing method and device, electronic equipment and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6111583A (en) * 1997-09-29 2000-08-29 Skyline Software Systems Ltd. Apparatus and method for three-dimensional terrain rendering
CN101422352A (en) * 2008-12-10 2009-05-06 华北电力大学(保定) Interactive coronary artery virtual angioscope implementation method
CN101945685A (en) * 2007-12-13 2011-01-12 Oraya治疗公司 Methods and devices for orthovoltage ocular radiotherapy and treatment planning
CN102722106A (en) * 2011-03-29 2012-10-10 上海晟昊信息科技有限公司 Immersive virtual reality emulation interaction display method and display system
US20140104311A1 (en) * 2012-10-12 2014-04-17 Infinitt Healthcare Co., Ltd. Medical image display method using virtual patient model and apparatus thereof
CN103793060A (en) * 2014-02-14 2014-05-14 杨智 User interaction system and method
CN104939850A (en) * 2014-03-27 2015-09-30 西门子公司 Imaging tomosynthesis system, in particular mammography system
CN105511618A (en) * 2015-12-08 2016-04-20 北京小鸟看看科技有限公司 3D input device, head-mounted device and 3D input method
CN106598250A (en) * 2016-12-19 2017-04-26 北京星辰美豆文化传播有限公司 VR display method and apparatus, and electronic device
CN106681506A (en) * 2016-12-26 2017-05-17 惠州Tcl移动通信有限公司 Interaction method of non-VR application in terminal equipment and terminal equipment
CN107272889A (en) * 2017-05-23 2017-10-20 武汉秀宝软件有限公司 A kind of AR interface alternation method and system based on three-dimensional coordinate
CN206961066U (en) * 2017-02-28 2018-02-02 深圳市未来感知科技有限公司 A kind of virtual reality interactive device

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6111583A (en) * 1997-09-29 2000-08-29 Skyline Software Systems Ltd. Apparatus and method for three-dimensional terrain rendering
CN101945685A (en) * 2007-12-13 2011-01-12 Oraya治疗公司 Methods and devices for orthovoltage ocular radiotherapy and treatment planning
CN101422352A (en) * 2008-12-10 2009-05-06 华北电力大学(保定) Interactive coronary artery virtual angioscope implementation method
CN102722106A (en) * 2011-03-29 2012-10-10 上海晟昊信息科技有限公司 Immersive virtual reality emulation interaction display method and display system
US20140104311A1 (en) * 2012-10-12 2014-04-17 Infinitt Healthcare Co., Ltd. Medical image display method using virtual patient model and apparatus thereof
CN103793060A (en) * 2014-02-14 2014-05-14 杨智 User interaction system and method
CN104939850A (en) * 2014-03-27 2015-09-30 西门子公司 Imaging tomosynthesis system, in particular mammography system
CN105511618A (en) * 2015-12-08 2016-04-20 北京小鸟看看科技有限公司 3D input device, head-mounted device and 3D input method
CN106598250A (en) * 2016-12-19 2017-04-26 北京星辰美豆文化传播有限公司 VR display method and apparatus, and electronic device
CN106681506A (en) * 2016-12-26 2017-05-17 惠州Tcl移动通信有限公司 Interaction method of non-VR application in terminal equipment and terminal equipment
CN206961066U (en) * 2017-02-28 2018-02-02 深圳市未来感知科技有限公司 A kind of virtual reality interactive device
CN107272889A (en) * 2017-05-23 2017-10-20 武汉秀宝软件有限公司 A kind of AR interface alternation method and system based on three-dimensional coordinate

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111243103A (en) * 2020-01-07 2020-06-05 青岛小鸟看看科技有限公司 Method and device for setting safety area, VR equipment and storage medium
CN111243103B (en) * 2020-01-07 2023-04-28 青岛小鸟看看科技有限公司 Method and device for setting security area, VR equipment and storage medium
CN111803930A (en) * 2020-07-20 2020-10-23 网易(杭州)网络有限公司 Multi-platform interaction method and device and electronic equipment
CN114470769A (en) * 2022-04-01 2022-05-13 苏州达家迎信息技术有限公司 Interactive data processing method and device, electronic equipment and storage medium
CN114470769B (en) * 2022-04-01 2022-08-05 苏州达家迎信息技术有限公司 Interactive data processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US10890983B2 (en) Artificial reality system having a sliding menu
Frati et al. Using Kinect for hand tracking and rendering in wearable haptics
CN106873767B (en) Operation control method and device for virtual reality application
Stuerzlinger et al. The value of constraints for 3D user interfaces
US20200387214A1 (en) Artificial reality system having a self-haptic virtual keyboard
Rautaray et al. Real time multiple hand gesture recognition system for human computer interaction
CN110476142A (en) Virtual objects user interface is shown
US20200387286A1 (en) Arm gaze-driven user interface element gating for artificial reality systems
US10955929B2 (en) Artificial reality system having a digit-mapped self-haptic input method
CN106873886B (en) Control method and device for stereoscopic display and electronic equipment
US10921879B2 (en) Artificial reality systems with personal assistant element for gating user interface elements
Grandi et al. Design and assessment of a collaborative 3D interaction technique for handheld augmented reality
US10990240B1 (en) Artificial reality system having movable application content items in containers
CN106445157B (en) Method and device for adjusting picture display direction
CN109584148A (en) A kind of method and apparatus handling two-dimentional interface in VR equipment
CN108431734A (en) Touch feedback for non-touch surface interaction
US11043192B2 (en) Corner-identifiying gesture-driven user interface element gating for artificial reality systems
JP2018142313A (en) System and method for touch of virtual feeling
CN107735758A (en) Synchronous digital ink strokes are presented
JPWO2015011741A1 (en) Image processing program, server device, image processing system, and image processing method
CN108776544A (en) Exchange method and device, storage medium, electronic equipment in augmented reality
Cho et al. Xave: Cross-platform based asymmetric virtual environment for immersive content
JP5876600B1 (en) Information processing program and information processing method
Tran et al. Easy-to-use virtual brick manipulation techniques using hand gestures
CN108803862B (en) Account relation establishing method and device used in virtual reality scene

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190405