CN105975259A - Implementation method and device of 3D (Three-dimensional) space user interface - Google Patents
Implementation method and device of 3D (Three-dimensional) space user interface
- Publication number
- CN105975259A CN105975259A CN201610272110.2A CN201610272110A CN105975259A CN 105975259 A CN105975259 A CN 105975259A CN 201610272110 A CN201610272110 A CN 201610272110A CN 105975259 A CN105975259 A CN 105975259A
- Authority
- CN
- China
- Prior art keywords
- user interface
- space
- user
- spatial
- input instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An embodiment of the invention provides an implementation method and device for a 3D (three-dimensional) space user interface, belonging to the technical field of virtual reality. The method comprises the following steps: obtaining a 2D space user interface and rendering the 2D space user interface as an original picture; and rendering the original picture into 3D space to form the 3D space user interface. The embodiment of the invention lowers the design difficulty of 3D space user interfaces and facilitates the design of complex 3D space user interfaces.
Description
Technical field
The present application belongs to the technical field of virtual reality, and specifically relates to an implementation method and device for a 3D space user interface.
Background technology
Virtual reality technology combines computer graphics, computer simulation, sensor technology, display technology and several other disciplines. It creates a virtual information environment in a multi-dimensional information space, gives the user an immersive sense of presence, provides seamless interaction with the environment, and helps to inspire design.
Owing to these advantages, virtual reality technology improves the user experience of existing audio and video equipment and has spread to wider fields such as video conferencing, network technology and distributed computing, developing toward distributed virtual reality. Virtual reality technology has become an important means of new product design and development.
Compared with the user interface (UI) of traditional 2D space, the user interface (User Interface, abbreviated UI) of virtual reality technology is more complicated to design: designers need more specialized skills and more time to design a UI for 3D space. Furthermore, 2D space UI design has its own UI libraries and UI development schemes, which can provide UI designs of various complexity and reduce rendering complexity, whereas with current 3D space UI design it is extremely difficult to realize such complex UI designs.
Therefore, there is an urgent need for an implementation method and device for a 3D space user interface that reduce the difficulty of 3D space user interface design.
Summary of the invention
One technical problem solved by the embodiments of the present application is to provide an implementation method and device for a 3D space user interface, which can reduce the difficulty of 3D space user interface design and facilitate the design of complex 3D space user interfaces.
An embodiment of the present application provides an implementation method for a 3D space user interface, comprising:
obtaining a 2D space user interface, and rendering the 2D space user interface as an original picture;
rendering the original picture into 3D space to form a 3D space user interface.
In one specific embodiment of the present application, obtaining the 2D space user interface and rendering it as an original picture comprises:
generating the 2D space user interface according to a user interface library and a user interface development scheme;
rendering the 2D space user interface as an original picture.
In one specific embodiment of the present application, the original picture is a bitmap.
In one specific embodiment of the present application, the user interface library and user interface development scheme comprise any one of: the Android user interface library and development scheme, the iOS user interface library and development scheme, or a third-party user interface library and development scheme.
In one specific embodiment of the present application, the method further comprises:
receiving an input instruction of the user through the 3D space user interface, and producing a control operation corresponding to the input instruction.
In one specific embodiment of the present application, receiving the input instruction of the user through the 3D space user interface and producing the control operation corresponding to the input instruction comprises:
parsing the input instruction sent by the user through the 3D space user interface to obtain the operating position point of the input instruction in 3D space;
converting the operating position point of the input instruction in 3D space into 2D space to obtain its position coordinates in 2D space;
obtaining the operation object of the input instruction according to the position coordinates, and carrying out the corresponding control operation.
The present application also provides an implementation device for a 3D space user interface, comprising:
a picture rendering module, configured to obtain a 2D space user interface and render the 2D space user interface as an original picture;
a 3D conversion module, configured to render the original picture into 3D space to form a 3D space user interface.
In one specific embodiment of the present application, the picture rendering module comprises:
an interface generating unit, configured to generate a 2D space user interface according to a user interface library and a user interface development scheme;
a picture storage unit, configured to render the 2D space user interface as an original picture.
In one specific embodiment of the present application, the original picture is a bitmap.
In one specific embodiment of the present application, the user interface library and user interface development scheme comprise any one of: the Android user interface library and development scheme, the iOS user interface library and development scheme, or a third-party user interface library and development scheme.
In one specific embodiment of the present application, the device further comprises:
an instruction operating module, configured to receive an input instruction of the user through the 3D space user interface and produce a control operation corresponding to the input instruction.
In one specific embodiment of the present application, the instruction operating module comprises:
an instruction parsing unit, configured to parse the input instruction sent by the user through the 3D space user interface and obtain the operating position point of the input instruction in 3D space;
a space conversion unit, configured to convert the operating position point of the input instruction in 3D space into 2D space and obtain its position coordinates in 2D space;
a control operating unit, configured to obtain the operation object of the input instruction according to the position coordinates and carry out the corresponding control operation.
In the embodiments of the present application, a 2D space user interface is rendered as an original picture, and the original picture is rendered into 3D space to form a 3D space user interface. The user sends input instructions through the 3D space user interface, thereby producing the control operations corresponding to those input instructions. The present application thus converts a user interface whose design was completed in 2D space into a 3D space user interface, and enables the user to carry out input control operations in the 3D space user interface, reducing the difficulty of 3D space user interface design. Furthermore, complex user interface design need only be carried out in 2D space rather than directly in 3D space, which facilitates the design of complex 3D space user interfaces.
Accompanying drawing explanation
In order to illustrate the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present application; those of ordinary skill in the art may also derive other drawings from them.
Fig. 1 is a hardware structure diagram of a computer device to which the present application is applied;
Fig. 2 is a flow chart of one embodiment of the implementation method for a 3D space user interface provided by the present application;
Fig. 3 is a flow chart of step S1 in another embodiment of the implementation method for a 3D space user interface provided by the present application;
Fig. 4 is a flow chart of another embodiment of the implementation method for a 3D space user interface provided by the present application;
Fig. 5 is a flow chart of step S3 in yet another embodiment of the implementation method for a 3D space user interface provided by the present application;
Fig. 6 is a structure diagram of one embodiment of the implementation device for a 3D space user interface provided by the present application;
Fig. 7 is a structure diagram of the picture rendering module in another embodiment of the implementation device for a 3D space user interface provided by the present application;
Fig. 8 is a structure diagram of another embodiment of the implementation device for a 3D space user interface provided by the present application;
Fig. 9 is a structure diagram of the instruction operating module in yet another embodiment of the implementation device for a 3D space user interface provided by the present application;
Fig. 10 is a flow chart of one concrete application scenario of the present application.
Detailed description of the invention
In the embodiments of the present application, a 2D space user interface is rendered as an original picture, and the original picture is rendered into 3D space to form a 3D space user interface. The user sends input instructions through the 3D space user interface, thereby producing the control operations corresponding to those input instructions. The present application thus converts a user interface whose design was completed in 2D space into a 3D space user interface, and enables the user to carry out input control operations in the 3D space user interface, reducing the difficulty of 3D space user interface design. Furthermore, complex user interface design need only be carried out in 2D space rather than directly in 3D space, which facilitates the design of complex 3D space user interfaces.
Although the present application may be embodied in many different forms, specific embodiments are shown in the drawings and described in detail herein, with the understanding that the disclosure of these embodiments is to be considered as an example of the principles and is not intended to limit the application to the specific embodiments shown and described. In the description below, like reference numerals describe identical, similar or corresponding parts in the several views of the drawings.
As used herein, the term "a" or "an" is defined as one or more than one. The term "plurality" is defined as two or more than two. The term "another" is defined as at least a second or more. The terms "including" and/or "having" are defined as comprising (i.e., open language). The term "coupled" is defined as connected, although not necessarily directly, and not necessarily mechanically. The term "program" or "computer program" and similar terms are defined as a sequence of instructions designed for execution on a computer system. A "program" or "computer program" may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, source code, object code, a shared library/dynamic load library and/or any other sequence of instructions designed for execution on a computer system.
The term "program", as used herein, may also be used in a second context (the above definition being for the first context). In the second context, the term is used in the sense of a "television program". In this context, the term means any coherent sequence of audiovisual content, such as content that would be interpreted and reported in an electronic program guide (EPG) as a single television program, regardless of whether the content is a movie, a sporting event, a segment of a multi-part series, a news broadcast, etc. The term may also be interpreted to encompass commercial breaks and other program-like content which may not be reported as a program in an electronic program guide.
To " embodiment ", " some embodiment ", " embodiment " or similar terms in whole presents
Mention and represent that the special characteristic, structure or the characteristic that describe are included in the present invention at least in conjunction with the embodiments
In one embodiment.Therefore, the appearance at this word in the various places of whole this specification need not be complete
Portion represents identical embodiment.It addition, described special characteristic, structure or characteristic can be without limitation one
Individual or multiple embodiment combines in any suitable manner.
As used herein, term " or " should be construed as inclusive or represent any one
Or any combination.Therefore, " A, B or C " expression " following any one: A;B;C;A and
B;A and C;B and C;A, B and C ".Only when the combination of element, function, step or action is with certain
When the mode of kind is the most mutually exclusive, it will the exception of this definition occurs.
As used herein, the terms "3D" or "three-dimensional" are intended to apply to a stereoscopic three-dimensional visual experience. Such an experience can be created in a number of ways, including the use of images that are differently polarized or color-filtered for each eye. In particular, in the context of the present invention, a 3D visual experience is created by the generation and display of separate left-eye and right-eye images. Such images are viewed on a display device that presents a separate image to each eye, where either an active technique (e.g., synchronized blocking and passing of the image viewed by each eye) or a passive technique (e.g., polarized or colored glasses) is used to separate the left-eye and right-eye images, thereby producing the illusion of a stereoscopic three-dimensional visual experience.
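The left-eye/right-eye separation described above can be illustrated with a minimal sketch. The camera position, orientation and the 64 mm interpupillary distance below are illustrative assumptions, not values taken from this application:

```python
def eye_positions(center, right_dir, ipd=0.064):
    """Offset a virtual camera by half the interpupillary distance (IPD,
    about 64 mm for an average adult) along its right vector to obtain
    the left-eye and right-eye viewpoints used for stereo rendering."""
    half = ipd / 2.0
    left = tuple(c - half * r for c, r in zip(center, right_dir))
    right = tuple(c + half * r for c, r in zip(center, right_dir))
    return left, right

# Camera at eye height looking down -Z; its right vector is +X.
left_eye, right_eye = eye_positions((0.0, 1.6, 0.0), (1.0, 0.0, 0.0))
```

Rendering the scene once from each of the two positions and presenting the results to the corresponding eyes (separated actively or passively, as described above) produces the stereoscopic effect.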
To help those skilled in the art better understand the technical solutions of the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application shall fall within the scope of protection of the present application.
The implementation of the present application is further illustrated below with reference to the drawings.
One embodiment of the present application provides an implementation method for a 3D space user interface, typically applied to a computer device.
Referring to Fig. 1, the computer device generally includes: a main control chip 11, a memory 12, an input/output device 13 and other hardware 14. The main control chip 11 controls each functional module, and the memory 12 stores the application programs and data.
Seeing Fig. 2, described method includes:
Step S1: obtain a 2D space user interface, and render the 2D space user interface as an original picture.
In one implementation of the present application, referring to Fig. 3, step S1 includes:
S11: generate a 2D space user interface according to a user interface library and a user interface development scheme.
A user interface is the medium for interaction and information exchange between a system and its users; it converts between the internal form of information and a form acceptable to humans. A user interface is the interaction-related software designed for communication between the user and the hardware, whose purpose is to enable the user to operate the hardware conveniently and efficiently so as to achieve two-way interaction and complete the work expected of the hardware. The definition of user interface is broad, covering human-computer interaction and graphical user interfaces: a user interface exists in every field where humans exchange information with machines.
The 2D space user interface is designed and obtained through an existing 2D space user interface library and 2D space user interface development scheme, generally using any one of the Android user interface library and development scheme, the iOS user interface library and development scheme, or a third-party user interface library and development scheme.
Specifically, the third-party user interface library and development scheme may be the Qt user interface library and development scheme.
Existing 2D space user interface libraries and development schemes have their own layout rules, and complex user interfaces can be designed according to their respective development schemes. How to use an existing 2D space user interface library and development scheme to design a 2D space user interface is prior art, so it is not repeated here.
S12: render the 2D space user interface as an original picture, and save it to memory.
User interface rendering refers to displaying the HTML code of the user interface as an original picture according to the rules defined by its CSS. Rendering is generally completed by a rendering engine, and the rendering steps are as follows:
First step: the rendering engine parses the HTML document and converts it into a DOM tree.
Second step: all CSS styles — whether inline, externally linked or embedded — are parsed, and another tree, the render tree, is built from the DOM tree. The render tree contains rectangles with display attributes such as color and size, and the order of these rectangles is consistent with the display order.
Third step: each node of the render tree is laid out, determining its display position on the screen.
Fourth step: the render tree is traversed and each node is drawn out through the UI back-end layer.
The above steps are a progressive process. In order to improve the user experience, the rendering engine tries to display results to the end user as quickly as possible: it does not wait until all HTML has been parsed before creating and laying out the render tree, but may first display the content already received while still fetching the rest of the document from the network.
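The four stages above can be caricatured in a few lines of Python. This is only a toy model to make the data flow concrete — a real rendering engine is vastly more involved, and the element names, sizes and colors here are invented:

```python
def parse(html_tags):                  # step 1: build a "DOM tree" (here a flat list)
    return [{"tag": t} for t in html_tags]

def attach_style(dom, styles):         # step 2: combine DOM with CSS into a render tree
    return [{**node, **styles.get(node["tag"], {})} for node in dom]

def layout(render_tree):               # step 3: assign each box a screen position
    y = 0
    for box in render_tree:
        box["x"], box["y"] = 0, y
        y += box.get("height", 0)
    return render_tree

def paint(render_tree):                # step 4: emit boxes in display order
    return [(b["tag"], b["x"], b["y"], b.get("color")) for b in render_tree]

dom = parse(["header", "p"])
tree = attach_style(dom, {"header": {"height": 40, "color": "blue"},
                          "p": {"height": 20, "color": "black"}})
display_list = paint(layout(tree))
```

The progressive behaviour mentioned above corresponds to running these stages repeatedly on whatever prefix of the document has arrived so far.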
Specifically, in the present application the original picture is a bitmap (bit map). A bitmap image, also called a dot-matrix or raster image, is made up of individual points called pixels (picture elements). These points can be arranged and colored differently to form a pattern. When a bitmap is enlarged, the countless individual squares that make up the whole image become visible. Enlarging the bitmap increases the size of each individual pixel, making lines and shapes appear jagged. However, viewed from a sufficient distance, the colors and shapes of the bitmap image appear continuous again.
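The "jagged when enlarged" behaviour follows directly from nearest-neighbour enlargement, sketched below on a tiny invented 2×2 bitmap:

```python
def enlarge(bitmap, factor):
    """Nearest-neighbour enlargement: every pixel becomes a
    factor-by-factor block, which is why an up-scaled bitmap looks
    blocky up close yet continuous from a distance."""
    out = []
    for row in bitmap:
        wide = [px for px in row for _ in range(factor)]
        out.extend(list(wide) for _ in range(factor))
    return out

tiny = [[0, 1],
        [1, 0]]
big = enlarge(tiny, 2)   # each pixel becomes a 2x2 block
```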
Step S2: render the original picture into 3D space to form a 3D space user interface.
Existing software such as 3ds Max is generally used to complete the rendering of the original picture and obtain the 3D space user interface.
The present application uses the mature 2D space user interface libraries and 2D space user interface development schemes of existing 2D space to carry out the user interface design, then renders the completed 2D space user interface design as an original picture and renders the original picture into 3D space, thereby obtaining the 3D space user interface.
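One common way to render the picture into 3D space is to use it as a texture on a quad placed in the scene. The sketch below merely computes the world-space corners of such a quad; the centre position and dimensions are illustrative assumptions, and a tool such as 3ds Max or a game engine would perform the actual texturing:

```python
def ui_quad(center, width, height):
    """World-space corners of an upright quad that carries the rendered
    UI picture as a texture, each paired with its (u, v) texture
    coordinate (v grows downward, matching picture coordinates)."""
    cx, cy, cz = center
    hw, hh = width / 2.0, height / 2.0
    return [
        ((cx - hw, cy - hh, cz), (0.0, 1.0)),  # bottom-left
        ((cx + hw, cy - hh, cz), (1.0, 1.0)),  # bottom-right
        ((cx + hw, cy + hh, cz), (1.0, 0.0)),  # top-right
        ((cx - hw, cy + hh, cz), (0.0, 0.0)),  # top-left
    ]

# A 2 m x 1 m UI panel floating 2 m in front of the viewer.
quad = ui_quad((0.0, 1.5, -2.0), 2.0, 1.0)
```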
Therefore, the present application converts a user interface whose design was completed in 2D space into a 3D space user interface, and enables the user to carry out input control operations in the 3D space user interface, reducing the difficulty of 3D space user interface design. Furthermore, complex user interface design need only be carried out in 2D space rather than directly in 3D space, which facilitates the design of complex 3D space user interfaces.
In another implementation of the present application, referring to Fig. 4, the method further includes:
Step S3: receive an input instruction of the user through the 3D space user interface, and produce a control operation corresponding to the input instruction.
In another implementation of the present application, referring to Fig. 5, step S3 includes:
S31: parse the input instruction sent by the user through the 3D space user interface, and obtain the operating position point of the input instruction in 3D space.
The 3D space user interface is displayed, and the user sends input instructions through it, operating the buttons, pictures, text and the like of the 3D space user interface to perform actions such as clicking a button, selecting a picture or text, or dragging.
The present application parses the input instruction, in particular taking the intersection of the user's input instruction with the plane where the 3D space user interface lies as the operating position point of the user's input instruction in 3D space.
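Taking the intersection with the UI plane can be written as standard ray-plane intersection. This is a generic geometric sketch, not code from this application; the ray and plane values are illustrative:

```python
def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Intersection of the user's pointing ray with the plane holding
    the 3D UI -- the 'operating position point'. Returns None if the
    ray is parallel to, or points away from, the plane."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:
        return None
    t = sum((p - o) * n for p, o, n in zip(plane_point, origin, plane_normal)) / dot
    if t < 0:
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# Viewer at the origin points straight ahead (-Z) at a UI plane z = -2.
hit = ray_plane_hit((0.0, 0.0, 0.0), (0.0, 0.0, -1.0),
                    (0.0, 0.0, -2.0), (0.0, 0.0, 1.0))
```

Here the operating position point comes out as (0, 0, -2), the centre of the example plane.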
S32: convert the operating position point of the input instruction in 3D space into 2D space, and obtain its position coordinates in 2D space.
The present application uses an existing 3D-to-2D space conversion algorithm to convert the operating position point in 3D space into 2D space, which can specifically be realized by the D3DXVec3Unproject function. This is prior art and is therefore not repeated here.
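The application names the DirectX helper D3DXVec3Unproject; for the special case of a point already known to lie on an axis-aligned UI quad, the conversion reduces to the planar change of basis sketched below. The quad placement and picture size are illustrative assumptions:

```python
def plane_point_to_pixel(hit, quad_origin, right, up, quad_w, quad_h, img_w, img_h):
    """Map a 3D operating position point lying on the UI quad back to
    pixel coordinates of the original 2D picture. A DirectX build would
    use D3DXVec3Unproject instead; this is the equivalent planar math."""
    # Express the hit point in the quad's local (right, up) basis.
    rel = tuple(h - o for h, o in zip(hit, quad_origin))
    u = sum(r * b for r, b in zip(rel, right)) / quad_w    # 0..1 across
    v = sum(r * b for r, b in zip(rel, up)) / quad_h       # 0..1 upward
    # Picture pixel coordinates: origin at top-left, y grows downward.
    return (u * img_w, (1.0 - v) * img_h)

# Quad with bottom-left corner at (-1, 1, -2), 2 units wide, 1 tall,
# showing a 1920x1080 picture; hit point at the quad's centre.
px = plane_point_to_pixel((0.0, 1.5, -2.0), (-1.0, 1.0, -2.0),
                          (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), 2.0, 1.0, 1920, 1080)
```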
S33: obtain the operation object of the input instruction according to the position coordinates, and carry out the corresponding control operation.
The present application obtains the operation object into which the position coordinates fall, thereby determining that the user's input instruction is directed at that operation object, and carries out the corresponding control operation on it.
For example, if the user's input instruction is to click an "OK" button in the 3D space user interface, the input instruction is received, the operating position point of the "OK" button in the 3D space user interface is converted into position coordinates in 2D space, and the operation object of the 2D space user interface into which the position coordinates fall is determined, so that the click operation is carried out on that operation object.
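Determining which operation object the position coordinates fall into is a plain rectangle hit test. The element names and rectangles below are invented for illustration; a real 2D UI toolkit would perform this dispatch itself:

```python
def find_operation_object(objects, x, y):
    """Return the name of the UI element whose rectangle (left, top,
    width, height) contains the 2D coordinate, or None if no element
    is hit."""
    for name, (left, top, w, h) in objects.items():
        if left <= x < left + w and top <= y < top + h:
            return name
    return None

ui = {"ok_button": (860, 500, 200, 80),      # a confirm/"OK" button
      "cancel_button": (860, 600, 200, 80)}
target = find_operation_object(ui, 960.0, 540.0)
```

Once the target is known, the corresponding control operation (here, a click on the confirm button) is carried out on it.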
Corresponding to the above method, another embodiment of the present application provides an implementation device for a 3D space user interface, typically applied to a computer device.
Referring to Fig. 1, the computer device generally includes: a main control chip 11, a memory 12, an input/output device 13 and other hardware 14. The main control chip 11 controls each functional module, and the memory 12 stores the application programs and data.
Referring to Fig. 6, the device includes:
a picture rendering module 61, configured to obtain a 2D space user interface and render the 2D space user interface as an original picture;
a 3D conversion module 62, configured to render the original picture into 3D space to form a 3D space user interface.
In one implementation of the present application, referring to Fig. 7, the picture rendering module 61 includes:
an interface generating unit 611, configured to generate a 2D space user interface according to a user interface library and a user interface development scheme;
a picture storage unit 612, configured to render the 2D space user interface as an original picture and save it to memory.
A user interface is the medium for interaction and information exchange between a system and its users; it converts between the internal form of information and a form acceptable to humans. A user interface is the interaction-related software designed for communication between the user and the hardware, whose purpose is to enable the user to operate the hardware conveniently and efficiently so as to achieve two-way interaction and complete the work expected of the hardware. The definition of user interface is broad, covering human-computer interaction and graphical user interfaces: a user interface exists in every field where humans exchange information with machines.
The 2D space user interface is designed and obtained through an existing 2D space user interface library and 2D space user interface development scheme, generally using any one of the Android user interface library and development scheme, the iOS user interface library and development scheme, or a third-party user interface library and development scheme.
Specifically, the third-party user interface library and development scheme may be the Qt user interface library and development scheme.
Existing 2D space user interface libraries and development schemes have their own layout rules, and complex user interfaces can be designed according to their respective development schemes. How to use an existing 2D space user interface library and development scheme to design a 2D space user interface is prior art, so it is not repeated here.
User interface rendering refers to displaying the HTML code of the user interface as an original picture according to the rules defined by its CSS. Rendering is generally completed by a rendering engine, and the rendering steps are as follows:
First step: the rendering engine parses the HTML document and converts it into a DOM tree.
Second step: all CSS styles — whether inline, externally linked or embedded — are parsed, and another tree, the render tree, is built from the DOM tree. The render tree contains rectangles with display attributes such as color and size, and the order of these rectangles is consistent with the display order.
Third step: each node of the render tree is laid out, determining its display position on the screen.
Fourth step: the render tree is traversed and each node is drawn out through the UI back-end layer.
The above steps are a progressive process. In order to improve the user experience, the rendering engine tries to display results to the end user as quickly as possible: it does not wait until all HTML has been parsed before creating and laying out the render tree, but may first display the content already received while still fetching the rest of the document from the network.
Specifically, in the present application the original picture is a bitmap (bit map). A bitmap image, also called a dot-matrix or raster image, is made up of individual points called pixels (picture elements). These points can be arranged and colored differently to form a pattern. When a bitmap is enlarged, the countless individual squares that make up the whole image become visible. Enlarging the bitmap increases the size of each individual pixel, making lines and shapes appear jagged. However, viewed from a sufficient distance, the colors and shapes of the bitmap image appear continuous again.
Existing software such as 3ds Max is generally used to complete the rendering of the original picture and obtain the 3D space user interface.
The present application uses the mature 2D space user interface libraries and 2D space user interface development schemes of existing 2D space to carry out the user interface design, then renders the completed 2D space user interface design as an original picture and renders the original picture into 3D space, thereby obtaining the 3D space user interface.
In another implementation of the present application, referring to Fig. 8, the device further includes:
an instruction operating module 63, configured to receive an input instruction of the user through the 3D space user interface and produce a control operation corresponding to the input instruction.
Referring to Fig. 9, the instruction operating module 63 includes:
an instruction parsing unit 631, configured to parse the input instruction sent by the user through the 3D space user interface and obtain the operating position point of the input instruction in 3D space;
a space conversion unit 632, configured to convert the operating position point of the input instruction in 3D space into 2D space and obtain its position coordinates in 2D space;
a control operating unit 633, configured to obtain the operation object of the input instruction according to the position coordinates and carry out the corresponding control operation.
Described 3d space user interface shows, user sends defeated by described 3d space user interface
Enter instruction, the operation button of 3d space user interface, picture or text etc., carry out button click,
Picture or text selecting, the action such as pull.
The application resolves described input instruction, and the input instruction especially by described user is used with 3d space
The intersection point of the plane at place, interface, family, as user input instruction at the operating position point of 3d space.
The application uses an existing 3D-to-2D space conversion algorithm to convert the operating position point from 3D space to 2D space; this can be implemented with the D3DXVec3Unproject function. As this is prior art, it is not described further here.
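The patent relies on the existing D3DXVec3Unproject function and treats the conversion as prior art. As an illustration only, assuming the 3D interface is a textured quad, the 3D hit point can be mapped back to 2D bitmap coordinates roughly as follows (all names here are hypothetical, not the patent's implementation):

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def quad_point_to_2d(hit, origin, u_axis, v_axis, width_px, height_px):
    """Map a 3D point lying on the interface quad to 2D pixel coordinates.

    origin is the quad corner matching the bitmap's (0, 0) pixel; u_axis and
    v_axis are the quad's world-space edge vectors along its width and height.
    """
    offset = tuple(h - o for h, o in zip(hit, origin))
    s = dot(offset, u_axis) / dot(u_axis, u_axis)  # normalized 0..1 across width
    t = dot(offset, v_axis) / dot(v_axis, v_axis)  # normalized 0..1 down height
    return s * width_px, t * height_px

# A 2x1 quad at z = -5 textured with an 800x400 bitmap; hit at the quad center:
x, y = quad_point_to_2d((1.0, 0.5, -5.0), (0, 0, -5), (2, 0, 0), (0, 1, 0), 800, 400)
# (x, y) == (400.0, 200.0)
```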
The application obtains the operation object into which the position coordinates fall, thereby determining which operation object the user's input instruction is directed at, and carries out the corresponding control operation on that operation object.
For example, if the user's input instruction is a click on the "OK" button in the 3D space user interface, the input instruction is received, the operating position point of the "OK" button operation in the 3D space user interface is converted to position coordinates in 2D space, and it is determined which operation object of the 2D space user interface the position coordinates fall on, so that a click operation is carried out on that operation object.
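The hit test implied here (finding which operation object the 2D position coordinates fall on) can be sketched as follows; the rectangle representation and all names are assumptions for illustration, not the patent's implementation:

```python
def find_operation_object(x, y, objects):
    """Return the topmost operation object whose rectangle contains (x, y).

    objects: list of (name, left, top, width, height), topmost object last.
    """
    for name, left, top, w, h in reversed(objects):
        if left <= x < left + w and top <= y < top + h:
            return name
    return None  # the coordinates fall on no operation object

ui = [
    ("background", 0, 0, 800, 400),
    ("ok_button", 350, 300, 100, 40),  # the "OK" button of the example
]
target = find_operation_object(400, 320, ui)
# target == "ok_button", so the click operation is dispatched to that button
```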
Therefore, the application converts a user interface whose design was completed in 2D space into a 3D space user interface, and enables the user to carry out input control operations on that 3D space user interface; this reduces the difficulty of 3D space user interface design. Furthermore, the application only needs to design complex user interfaces in 2D space rather than directly in 3D space, which makes it convenient to design complex 3D space user interfaces.
A concrete application scenario of the application is described below to further illustrate its implementation.
The application is applied on a computer device; user interface designers use it to carry out 3D user interface design and to realize input control of the 3D space user interface.
Referring to Figure 10, the method includes:
1001. Generate a 2D space user interface according to the Android user interface library and the Android user interface development pattern.
1002. Render the 2D space user interface as a bitmap and save it to memory.
The rendering steps are as follows:
Step 1: The rendering engine first parses the HTML document and converts it into a DOM tree.
Step 2: Next, the CSS styles, whether inline, externally linked, or embedded, are parsed, and another tree is produced for rendering the DOM tree: the render tree. The render tree contains rectangles with display attributes such as color and size, in an order consistent with the display order.
Step 3: Each node of the render tree is then laid out, determining its display position on the screen.
Step 4: Finally, the render tree is traversed and each node is drawn out through the UI back-end layer.
The steps above form a gradual process: to improve the user experience, the rendering engine tries to display results to the end user as quickly as possible. It does not wait until all the HTML has been parsed before creating and laying out the render tree; it can display the parts of the content that have already been received while it is still obtaining the rest of the document content from the network.
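The four rendering steps above can be illustrated with a toy pipeline. This is a deliberately simplified sketch of the parse/style/layout/paint stages, not the code of any real rendering engine; every name below is hypothetical:

```python
def parse_html(tokens):
    """Step 1: build a flat 'DOM' from (tag, text) tokens."""
    return [{"tag": tag, "text": text} for tag, text in tokens]

def build_render_tree(dom, styles):
    """Step 2: attach resolved CSS (color/size) to each visible DOM node."""
    default = {"color": "black", "height": 20}
    return [dict(node, **styles.get(node["tag"], default))
            for node in dom
            if styles.get(node["tag"], {}).get("display") != "none"]

def layout(render_tree):
    """Step 3: assign each node a screen position (simple vertical flow)."""
    y = 0
    for node in render_tree:
        node["x"], node["y"] = 0, y
        y += node["height"]
    return render_tree

def paint(render_tree):
    """Step 4: emit draw commands for each node in display order."""
    return [f'fill {n["color"]} rect at ({n["x"]},{n["y"]})'
            for n in layout(render_tree)]

dom = parse_html([("h1", "Title"), ("p", "Body"), ("script", "...")])
styles = {"h1": {"color": "blue", "height": 40}, "script": {"display": "none"}}
commands = paint(build_render_tree(dom, styles))
# commands == ['fill blue rect at (0,0)', 'fill black rect at (0,40)']
```

Note how the `script` node never reaches the render tree: like a real engine, only nodes with visual output are laid out and painted.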
1003. Use 3D Max to render the bitmap into 3D space, forming the 3D space user interface.
The application uses the mature 2D user interface libraries and 2D user interface development patterns that already exist for 2D space to carry out the user interface design, renders the completed 2D space user interface design as an original picture, and then renders the original picture into 3D space, thereby obtaining the 3D space user interface.
1004. Resolve the input instruction sent by the user through the 3D space user interface, and obtain the operating position point of the input instruction in 3D space.
The 3D space user interface is displayed, and the user sends input instructions through it to operate the buttons, pictures, text, and other elements of the 3D space user interface, performing actions such as clicking a button, selecting a picture or text, or dragging.
The application resolves the input instruction; specifically, the intersection of the user's input instruction with the plane in which the 3D space user interface lies is taken as the operating position point of the user's input instruction in 3D space.
1005. Use the D3DXVec3Unproject function to convert the operating position point of the input instruction from 3D space to 2D space, obtaining its position coordinates in 2D space.
1006. Obtain the operation object of the input instruction according to the position coordinates, and carry out the corresponding control operation.
The application obtains the operation object into which the position coordinates fall, thereby determining which operation object the user's input instruction is directed at, and carries out the corresponding control operation on that operation object.
For example, if the user's input instruction is a click on the "OK" button in the 3D space user interface, the input instruction is received, the operating position point of the "OK" button operation in the 3D space user interface is converted to position coordinates in 2D space, and it is determined which operation object of the 2D space user interface the position coordinates fall on, so that a click operation is carried out on that operation object.
For another example, if the user's input instruction is a drag of a picture in the 3D space user interface, the input instruction is received, the user's drag operating position point in the 3D space user interface is converted to position coordinates in 2D space, and it is determined which operation object (the picture) of the 2D space user interface the position coordinates fall on, so that a drag operation is carried out on that operation object.
Therefore, the application converts a user interface whose design was completed in 2D space into a 3D space user interface, and enables the user to carry out input control operations on that 3D space user interface; this reduces the difficulty of 3D space user interface design. Furthermore, the application only needs to design complex user interfaces in 2D space rather than directly in 3D space, which makes it convenient to design complex 3D space user interfaces.
Those skilled in the art will understand that the embodiments of the application may be provided as a method, a device (apparatus), or a computer program product. Therefore, the application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The application is described with reference to flowcharts and/or block diagrams of the method, device (apparatus), and computer program product of its embodiments. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a means for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a specific way, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction means which realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a sequence of operational steps is performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the application have been described, those skilled in the art can make further changes and modifications to these embodiments once they know the basic inventive concept. The appended claims are therefore intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the application.
Obviously, those skilled in the art can make various changes and variations to the application without departing from its spirit and scope. If these modifications and variations of the application fall within the scope of the claims of the application and their technical equivalents, the application is also intended to cover them.
Claims (12)
1. An implementation method of a 3D space user interface, characterized by including:
obtaining a 2D space user interface, and rendering the 2D space user interface as an original picture;
rendering the original picture into 3D space, forming the 3D space user interface.
2. the method for claim 1, it is characterised in that described acquisition 2D spatial user interface,
And described 2D spatial user interface rendered include for original image:
According to user interface database and user interface development mode, generate 2D spatial user interface;
Described 2D spatial user interface is rendered as original image.
3. The method as claimed in claim 2, characterized in that the original picture is a bitmap.
4. The method as claimed in claim 2, characterized in that the user interface library and user interface development pattern include any one of:
the Android user interface library and user interface development pattern, the IOS user interface library and user interface development pattern, and a third-party user interface library and user interface development pattern.
5. the method for claim 1, it is characterised in that described method also includes:
By described 3d space user interface, receive the input instruction of user, produce and refer to described input
The control operation that order is corresponding.
6. The method as claimed in claim 5, characterized in that receiving the user's input instruction through the 3D space user interface and producing the control operation corresponding to the input instruction includes:
resolving the input instruction sent by the user through the 3D space user interface, and obtaining the operating position point of the input instruction in 3D space;
converting the operating position point of the input instruction from 3D space to 2D space, obtaining its position coordinates in 2D space;
obtaining the operation object of the input instruction according to the position coordinates, and carrying out the corresponding control operation.
7. An implementation device of a 3D space user interface, characterized by including:
a picture rendering module, configured to obtain a 2D space user interface and render the 2D space user interface as an original picture;
a 3D conversion module, configured to render the original picture into 3D space, forming the 3D space user interface.
8. The device as claimed in claim 6, characterized in that the picture rendering module includes:
an interface generating unit, configured to generate the 2D space user interface according to a user interface library and a user interface development pattern;
a picture storage unit, configured to render the 2D space user interface as an original picture.
9. The device as claimed in claim 7, characterized in that the original picture is a bitmap.
10. The device as claimed in claim 8, characterized in that the user interface library and user interface development pattern include any one of:
the Android user interface library and user interface development pattern, the IOS user interface library and user interface development pattern, and a third-party user interface library and user interface development pattern.
11. The device as claimed in claim 7, characterized in that the device further includes:
an instruction operating module, configured to receive a user's input instruction through the 3D space user interface, and to produce the control operation corresponding to the input instruction.
12. The device as claimed in claim 6, characterized in that the instruction operating module includes:
an instruction resolution unit, configured to resolve the input instruction sent by the user through the 3D space user interface, and to obtain the operating position point of the input instruction in 3D space;
a space conversion unit, configured to convert the operating position point of the input instruction from 3D space to 2D space, obtaining its position coordinates in 2D space;
a control operating unit, configured to obtain the operation object of the input instruction according to the position coordinates, and to carry out the corresponding control operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610272110.2A CN105975259A (en) | 2016-04-27 | 2016-04-27 | Implementation method and device of 3D (Three-dimensional) space user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105975259A true CN105975259A (en) | 2016-09-28 |
Family
ID=56994114
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610272110.2A Pending CN105975259A (en) | 2016-04-27 | 2016-04-27 | Implementation method and device of 3D (Three-dimensional) space user interface |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105975259A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102087598A (en) * | 2011-02-22 | 2011-06-08 | 深圳市同洲电子股份有限公司 | Method, device and browsing device for displaying 3D interface |
CN103136781A (en) * | 2011-11-30 | 2013-06-05 | 国际商业机器公司 | Method and system of generating three-dimensional virtual scene |
CN104350487A (en) * | 2012-06-08 | 2015-02-11 | Lg电子株式会社 | Rendering method of 3d web-page and terminal using the same |
CN105096368A (en) * | 2015-04-30 | 2015-11-25 | 华为技术有限公司 | Three-dimensional object processing method and related apparatus |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108228121A (en) * | 2016-12-15 | 2018-06-29 | 中科创达软件股份有限公司 | A kind of method, device and mobile terminal of browser split screen |
CN108228121B (en) * | 2016-12-15 | 2021-05-07 | 中科创达软件股份有限公司 | Browser split screen method and device and mobile terminal |
CN108513671A (en) * | 2017-01-26 | 2018-09-07 | 华为技术有限公司 | A kind of 2D applies display methods and terminal in VR equipment |
US11294533B2 (en) | 2017-01-26 | 2022-04-05 | Huawei Technologies Co., Ltd. | Method and terminal for displaying 2D application in VR device |
CN116483358A (en) * | 2023-04-13 | 2023-07-25 | 江西骏学数字科技有限公司 | Method and system for realizing pseudo 3D user interface of desktop VR |
CN116483358B (en) * | 2023-04-13 | 2024-04-12 | 江西骏学数字科技有限公司 | Method and system for realizing pseudo 3D user interface of desktop VR |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11145134B2 (en) | Augmented virtual reality object creation | |
US10587871B2 (en) | 3D User Interface—360-degree visualization of 2D webpage content | |
US12093704B2 (en) | Devices, methods, systems, and media for an extended screen distributed user interface in augmented reality | |
US11003305B2 (en) | 3D user interface | |
Duval et al. | Improving awareness for 3D virtual collaboration by embedding the features of users’ physical environments and by augmenting interaction tools with cognitive feedback cues | |
CN105955935A (en) | Text control realization method and apparatus | |
CN105528207A (en) | Virtual reality system, and method and apparatus for displaying Android application images therein | |
CN105975264A (en) | Implementation method and device of character control | |
CN106227327B (en) | A kind of display converting method, device and terminal device | |
CN103177467A (en) | Method for creating naked eye 3D (three-dimensional) subtitles by using Direct 3D technology | |
CN104598035B (en) | Cursor display method, smart machine and the system shown based on 3D stereo-pictures | |
CN105975259A (en) | Implementation method and device of 3D (Three-dimensional) space user interface | |
CN108038916A (en) | A kind of display methods of augmented reality | |
US10623713B2 (en) | 3D user interface—non-native stereoscopic image conversion | |
CN105975169A (en) | Method and apparatus for displaying text in 3D space | |
CN105955738A (en) | User interface display method and user interface display device corresponding to 3D list data | |
CN105975179A (en) | Method and apparatus for determining operation object in 3D spatial user interface | |
Shumaker et al. | Virtual, Augmented and Mixed Reality | |
Shumaker | Virtual, Augmented and Mixed Reality: Designing and Developing Augmented and Virtual Environments: 5th International Conference, VAMR 2013, Held as Part of HCI International 2013, Las Vegas, NV, USA, July 21-26, 2013, Proceedings, Part I | |
CN105976422A (en) | Method and device for optimizing texture of 3D space by using 9.png images | |
Saveljev et al. | Three-dimensional interactive cursor based on voxel patterns for autostereoscopic displays | |
CN105975263A (en) | Method and device for realizing control in 3D space | |
CN113129419B (en) | Intelligent visual interaction method and system based on semantics | |
CN106484397A (en) | The generation method of user interface controls and its device in a kind of 3d space | |
Walker | Improving everyday computing tasks with head-mounted displays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20160928 |