CN105975179A - Method and apparatus for determining operation object in 3D spatial user interface - Google Patents
- Publication number: CN105975179A
- Application number: CN201610274489.0A
- Authority
- CN
- China
- Prior art keywords
- user interface
- space
- position coordinates
- focus
- plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the invention provide a method and an apparatus for determining an operation object in a 3D spatial user interface, belonging to the technical field of virtual reality. The method comprises: obtaining a viewpoint projection ray and its intersection point (focal point) with the 2D plane in which the operation object of the 3D spatial user interface lies; converting the position coordinates of the intersection point in 3D space into position coordinates in 2D space; and obtaining the user interface object corresponding to the 2D position coordinates and determining that user interface object as the operation object. With these embodiments, the operation object in a 3D spatial user interface can be determined quickly, and the design of the 3D spatial user interface is simplified.
Description
Technical field
The present application belongs to the technical field of virtual reality, and specifically relates to a method and an apparatus for determining an operation object in a 3D spatial user interface.
Background art
Virtual reality technology combines computer graphics, computer simulation, sensor technology, display technology, and other disciplines. It creates a virtual information environment in a multi-dimensional information space, gives the user an immersive sense of presence, provides seamless interaction with the environment, and helps inspire design.
Because of these advantages, virtual reality technology improves the user experience of existing audio and video equipment and has reached into broader fields such as video conferencing, networking, and distributed computing, evolving toward distributed virtual reality. It has become an important means of new product design and development.
In the user interface (UI) of current virtual reality technology, the operation object in 3D space can only be determined through complex algorithms. When the user operates on a 3D spatial user interface, the operation object cannot be determined quickly, nor can the corresponding operation be produced promptly. Therefore, how to quickly determine the operation object of a 3D spatial user interface is a technical problem that the prior art urgently needs to solve.
Summary of the invention
One of the technical problems solved by the embodiments of the present application is to provide a method and an apparatus for determining the operation object of a 3D spatial user interface, which can determine the operation object of the 3D spatial user interface quickly and simplify its design.
An embodiment of the present application provides a method for determining the operation object of a 3D spatial user interface, comprising: obtaining a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies; converting the position coordinates of the intersection point in 3D space into position coordinates in 2D space; and obtaining the user interface object corresponding to the 2D position coordinates and determining that user interface object as the operation object.
In one implementation, obtaining the viewpoint projection ray and its intersection point with the 2D plane in which the operation object lies comprises: projecting a ray from the viewpoint toward the operation object in the 3D spatial user interface; and obtaining the intersection point of the projection ray with the 2D plane in which the operation object lies.
In one implementation, converting the position coordinates of the intersection point in 3D space into position coordinates in 2D space means: taking the position coordinates of the intersection point within the 2D plane in which the operation object of the user interface lies in 3D space as the position coordinates of the intersection point in 2D space.
In one implementation, the position coordinates are XY two-dimensional coordinates.
In one implementation, obtaining the user interface object corresponding to the 2D position coordinates and determining it as the operation object comprises: traversing the user interface objects in the 2D space according to their layout rules; and, if the position coordinates fall within the extent of a user interface object, determining that user interface object as the operation object.
Corresponding to the above method, the present application also provides an apparatus for determining the operation object of a 3D spatial user interface, comprising: an intersection acquisition module, configured to obtain a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies; a coordinate conversion module, configured to convert the position coordinates of the intersection point in 3D space into position coordinates in 2D space; and an object determination module, configured to obtain the user interface object corresponding to the 2D position coordinates and determine that user interface object as the operation object.
In one implementation, the intersection acquisition module comprises: a viewpoint projection unit, configured to project a ray from the viewpoint toward the operation object in the 3D spatial user interface; and an intersection determination unit, configured to obtain the intersection point of the projection ray with the 2D plane in which the operation object lies.
In one implementation, the coordinate conversion module is specifically configured to take the position coordinates of the intersection point within the 2D plane in which the operation object lies in 3D space as the position coordinates of the intersection point in 2D space.
In one implementation, the position coordinates are XY two-dimensional coordinates.
In one implementation, the object determination module comprises: an object traversal unit, configured to traverse the user interface objects in the 2D space according to their layout rules; and an object judgment unit, configured to determine a user interface object as the operation object when the position coordinates fall within its extent.
The embodiments of the present application obtain a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies, and convert the position coordinates of the intersection point in 3D space into position coordinates in 2D space. The user interface object corresponding to the 2D position coordinates is then obtained directly and determined as the operation object. The embodiments therefore need no calculation in 3D space to determine the operation object in the user interface; they only obtain the operation object corresponding to the intersection point in 2D space, which simplifies the calculation of determining the operation object. As a result, the operation object of the 3D spatial user interface can be determined quickly, the design of the 3D spatial user interface is simplified, and the user experience of operating the user interface is improved.
Brief description of the drawings
To explain the embodiments of the present application or the technical solutions in the prior art more clearly, the accompanying drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them.
Fig. 1 is a hardware structure diagram of a computer device to which the present application is applied;
Fig. 2 is a flow chart of an embodiment of the method for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 3 is a flow chart of step S1 of an embodiment of the method for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 4 is a flow chart of step S3 of an embodiment of the method for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 5 is a structure diagram of an embodiment of the apparatus for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 6 is a structure diagram of the intersection acquisition module of an embodiment of the apparatus for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 7 is a structure diagram of the object determination module of an embodiment of the apparatus for determining the operation object of a 3D spatial user interface provided by the present application;
Fig. 8 is a flow chart of a specific application scenario of the present application.
Detailed description of the invention
The embodiments of the present application obtain a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies, and convert the position coordinates of the intersection point in 3D space into position coordinates in 2D space. The user interface object corresponding to the 2D position coordinates is then obtained directly and determined as the operation object. The embodiments therefore need no calculation in 3D space to determine the operation object in the user interface; they only obtain the operation object corresponding to the intersection point in 2D space, which simplifies the calculation of determining the operation object. As a result, the operation object of the 3D spatial user interface can be determined quickly, the design of the 3D spatial user interface is simplified, and the user experience of operating the user interface is improved.
Although the present application can be embodied in many different forms, specific embodiments are shown in the drawings and described in detail herein, with the understanding that the present disclosure is to be considered an example of the principles and is not intended to limit the application to the specific embodiments shown and described. In the description below, like reference numerals are used to describe identical, similar, or corresponding parts in the several views of the drawings.
As used herein, the term "a" or "an" is defined as one or more than one. The term "plurality" is defined as two or more than two. The term "another" is defined as at least a second or more. The terms "comprising" and/or "having" are defined as including (that is, open language). The term "coupled" is defined as connected, although not necessarily directly, and not necessarily mechanically. The term "program" or "computer program" and similar terms are defined as a sequence of instructions designed for execution on a computer system. A "program" or "computer program" may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, source code, object code, a shared library/dynamic load library, and/or another sequence of instructions designed for execution on a computer system.
The term "program," as used herein, may also be used in a second context (the above definition being for the first context). In the second context, the term is used in the sense of a "television program." In this context, the term means any coherent sequence of audiovisual content, such as content that would be interpreted and reported in an electronic program guide (EPG) as a single television program, regardless of whether the content is a movie, a sporting event, a segment of a multi-part series, a news broadcast, etc. The term may also be interpreted to encompass commercial breaks and other program-like content that may not be reported as a program in an electronic program guide.
To " embodiment ", " some embodiment ", " embodiment " or similar terms in whole presents
Mention and represent that the special characteristic, structure or the characteristic that describe are included in the present invention at least in conjunction with the embodiments
In one embodiment.Therefore, the appearance at this word in the various places of whole this specification need not be complete
Portion represents identical embodiment.It addition, described special characteristic, structure or characteristic can be without limitation one
Individual or multiple embodiment combines in any suitable manner.
As used herein, term " or " should be construed as inclusive or represent any one
Or any combination.Therefore, " A, B or C " expression " following any one: A;B;C;A and
B;A and C;B and C;A, B and C ".Only when the combination of element, function, step or action is with certain
When the mode of kind is the most mutually exclusive, it will the exception of this definition occurs.
As used herein, the term 3D or three-dimensional is intended to apply to a stereoscopic three-dimensional visual experience. Such an experience can be created in a number of ways, including by using images that are differently polarized for each eye, or differently color filtered for each eye. In particular, in the context of the present invention, the 3D visual experience is created by the generation and display of separate left-eye and right-eye images. Such images are viewed on a display device that presents a separate image to each eye, where either an active technique (e.g., synchronized blocking and passing of the image viewed by each eye in alternation) or a passive technique (e.g., polarized or colored glasses) is used to separate the left-eye and right-eye images, thereby producing the illusion of a stereoscopic three-dimensional visual experience.
To help those skilled in the art better understand the technical solutions of the present application, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application shall fall within the protection scope of the present application.
The implementation of the present application is further illustrated below with reference to the drawings.
An embodiment of the present application provides a method for determining the operation object of a 3D spatial user interface, which is generally applied to a computer device.
Referring to Fig. 1, the computer device generally includes: a main control chip 11, a memory 12, input/output devices 13, and other hardware 14. The main control chip 11 controls each functional module, and the memory 12 stores each application program and its data.
Referring to Fig. 2, the method includes:
Step S1: obtain a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies.
In one implementation, referring to Fig. 3, step S1 includes:
S11: project a ray from the viewpoint toward the operation object in the 3D spatial user interface.
A user interface is the medium for interaction and information exchange between a system and its users; it converts between the internal form of information and a form acceptable to humans. A user interface is the interaction-related software designed for communication between the user and the hardware, whose purpose is to enable the user to operate the hardware conveniently and efficiently so as to achieve bidirectional interaction and complete the work expected of the hardware. The definition of user interface is broad, covering human-computer interaction and graphical user interfaces; a user interface exists in every field in which humans and machines exchange information.
The viewpoint may be a human eye or an image capture apparatus such as a camera or video camera. A ray is projected from the viewpoint toward the operation object in the 3D spatial user interface, thereby obtaining the operation intended for that operation object.
S12: obtain the intersection point of the projection ray with the 2D plane in which the operation object lies.
The projection ray intersects the 2D plane in which the operation object lies, yielding the intersection point. The intersection point is the location at which operation control is performed on the operation object in the 3D spatial user interface. Through the intersection point of the projection ray with that 2D plane, the present application obtains the location at which the user's input instruction performs operation control on the operation object in the 3D spatial user interface.
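As a rough illustration only (not part of the patent's disclosure), the ray-plane intersection of step S1 can be sketched as follows. The UI plane is described by a point on it and its normal vector; all names are hypothetical:

```python
def intersect_ray_plane(origin, direction, plane_point, plane_normal):
    """Return the intersection point of a ray with a plane, or None.

    origin, direction: the viewpoint and the projection ray's direction (3D tuples).
    plane_point, plane_normal: any point on the UI plane and its normal vector.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:       # ray parallel to the plane: no intersection
        return None
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:                   # plane lies behind the viewpoint
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# A viewpoint at the origin looking down -Z at a UI plane at z = -5:
hit = intersect_ray_plane((0, 0, 0), (0, 0, -1), (0, 0, -5), (0, 0, 1))
```

Any standard ray-casting routine of a 3D engine could serve the same role; the sketch merely shows that the intersection point is a single closed-form computation.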
Step S2: convert the position coordinates of the intersection point in 3D space into position coordinates in 2D space.
Specifically, step S2 is: take the position coordinates of the intersection point within the 2D plane in which the operation object of the user interface lies in 3D space as the position coordinates of the intersection point in 2D space.
The present application sets the 2D plane as an XY two-dimensional coordinate plane, with the origin at the upper-left corner, the X axis pointing right, and the Y axis pointing down. The position coordinates of the intersection point in this XY two-dimensional coordinate plane are taken as its position coordinates in 2D space.
The present application can also use an existing 3D-to-2D space conversion algorithm to convert the operation location from 3D space to 2D space, for example via the D3DXVec3Unproject function; as this is prior art, it is not repeated here.
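A minimal sketch of step S2, under the convention stated above (plane origin at the upper-left corner, X right, Y down); the function and parameter names are hypothetical, not from the patent. Projecting the 3D intersection point onto the plane's own axes yields its 2D coordinates:

```python
def to_plane_2d(point, plane_origin, x_axis, y_axis):
    """Express a 3D point lying on the UI plane in the plane's own 2D frame.

    plane_origin: 3D position of the plane's upper-left corner.
    x_axis, y_axis: unit vectors of the plane pointing right and down in
    world space, matching the origin-at-upper-left convention.
    """
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    rel = [p - o for p, o in zip(point, plane_origin)]
    return (dot(rel, x_axis), dot(rel, y_axis))

# The intersection point (2, -3, -5) on a plane whose upper-left corner is
# (0, 0, -5), with X right and Y down (here world -Y) on the plane:
uv = to_plane_2d((2, -3, -5), (0, 0, -5), (1, 0, 0), (0, -1, 0))
```

Because the intersection point already lies in the plane, this is just two dot products; no general 3D projection pipeline is required, which is the simplification the embodiment relies on.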
Step S3: obtain the user interface object corresponding to the position coordinates in 2D space, and determine that user interface object as the operation object.
In another implementation, referring to Fig. 4, step S3 includes:
S31: traverse the user interface objects in the 2D space according to their layout rules.
In the user interface of the 2D space, the layout of the user interface objects follows preset layout rules, and the present application traverses the user interface objects in the 2D space according to those rules. For example, when the user interface objects are buttons, their layout is a button tree, and the button tree is traversed.
S32: if the position coordinates fall within the extent of a user interface object, determine that user interface object as the operation object.
Each user interface object has its own position and extent in the two-dimensional coordinate plane. For example, if a user interface object is a square button, its position is the position of that button and its extent is the square region the button occupies. By traversing the user interface objects in the 2D space, the object whose extent contains the position coordinates is found, and that user interface object is determined as the one corresponding to the position coordinates.
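Steps S31 and S32 can be sketched as a recursive walk over a button tree; the `Button` class, its fields, and the preference for the deepest matching child are illustrative assumptions, not details given by the patent:

```python
class Button:
    """A hypothetical rectangular UI object: position (x, y) plus width/height."""
    def __init__(self, name, x, y, w, h, children=()):
        self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
        self.children = list(children)

    def contains(self, px, py):
        # 2D containment test of step S32
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def hit_test(node, px, py):
    """Traverse the button tree (S31); return the deepest object containing (px, py)."""
    if not node.contains(px, py):
        return None
    for child in node.children:      # prefer a more specific child match
        hit = hit_test(child, px, py)
        if hit is not None:
            return hit
    return node

root = Button("root", 0, 0, 100, 100,
              [Button("ok", 10, 10, 30, 20), Button("cancel", 50, 10, 30, 20)])
target = hit_test(root, 55, 15)      # 2D coordinates produced by step S2
```

The traversal order and tie-breaking rule (e.g., front-to-back z-order) would be fixed by the layout rules of the actual interface.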
The embodiments of the present application need no calculation in 3D space to determine the operation object in the user interface; they only obtain the operation object corresponding to the intersection point in 2D space, which simplifies the calculation of determining the operation object. The embodiments can therefore determine the operation object of the 3D spatial user interface quickly, simplify the design of the 3D spatial user interface, and improve the user experience of operating the user interface.
Corresponding to the above method, another embodiment of the present application provides an apparatus for determining the operation object of a 3D spatial user interface, which is generally applied to a computer device.
Referring to Fig. 1, the computer device generally includes: a main control chip 11, a memory 12, input/output devices 13, and other hardware 14. The main control chip 11 controls each functional module, and the memory 12 stores each application program and its data.
Referring to Fig. 5, the apparatus includes:
An intersection acquisition module 51, configured to obtain a viewpoint projection ray and its intersection point with the 2D plane in which the operation object of the 3D spatial user interface lies.
A coordinate conversion module 52, configured to convert the position coordinates of the intersection point in 3D space into position coordinates in 2D space.
An object determination module 53, configured to obtain the user interface object corresponding to the 2D position coordinates and determine that user interface object as the operation object.
In one implementation, referring to Fig. 6, the intersection acquisition module 51 includes: a viewpoint projection unit 511, configured to project a ray from the viewpoint toward the operation object in the 3D spatial user interface; and an intersection determination unit 512, configured to obtain the intersection point of the projection ray with the 2D plane in which the operation object lies.
A user interface is the medium for interaction and information exchange between a system and its users; it converts between the internal form of information and a form acceptable to humans. A user interface is the interaction-related software designed for communication between the user and the hardware, whose purpose is to enable the user to operate the hardware conveniently and efficiently so as to achieve bidirectional interaction and complete the work expected of the hardware. The definition of user interface is broad, covering human-computer interaction and graphical user interfaces; a user interface exists in every field in which humans and machines exchange information.
The viewpoint may be a human eye or an image capture apparatus such as a camera or video camera. A ray is projected from the viewpoint toward the operation object in the 3D spatial user interface, thereby obtaining the operation intended for that operation object.
The projection ray intersects the 2D plane in which the operation object lies, yielding the intersection point, which is the location at which operation control is performed on the operation object in the 3D spatial user interface. Through the intersection point of the projection ray with that 2D plane, the present application obtains the location at which the user's input instruction performs operation control on the operation object.
Specifically, the coordinate conversion module 52 is configured to take the position coordinates of the intersection point within the 2D plane in which the operation object of the user interface lies in 3D space as the position coordinates of the intersection point in 2D space.
The present application sets the 2D plane as an XY two-dimensional coordinate plane, with the origin at the upper-left corner, the X axis pointing right, and the Y axis pointing down. The position coordinates of the intersection point in this XY two-dimensional coordinate plane are taken as its position coordinates in 2D space.
The present application can also use an existing 3D-to-2D space conversion algorithm to convert the operation location from 3D space to 2D space, for example via the D3DXVec3Unproject function; as this is prior art, it is not repeated here.
In another implementation, referring to Fig. 7, the object determination module 53 includes:
An object traversal unit 531, configured to traverse the user interface objects in the 2D space according to their layout rules.
An object judgment unit 532, configured to determine a user interface object as the operation object when the position coordinates fall within its extent.
In the user interface of the 2D space, the layout of the user interface objects follows preset layout rules, and the present application traverses the user interface objects in the 2D space according to those rules. For example, when the user interface objects are buttons, their layout is a button tree, and the button tree is traversed.
Each user interface object has its own position and extent in the two-dimensional coordinate plane. For example, if a user interface object is a square button, its position is the position of that button and its extent is the square region the button occupies. By traversing the user interface objects in the 2D space, the object whose extent contains the position coordinates is found, and that user interface object is determined as the one corresponding to the position coordinates.
The embodiments of the present application need no calculation in 3D space to determine the operation object in the user interface; they only obtain the operation object corresponding to the intersection point in 2D space, which simplifies the calculation of determining the operation object. The embodiments can therefore determine the operation object of the 3D spatial user interface quickly, simplify the design of the 3D spatial user interface, and improve the user experience of operating the user interface.
The implementation of the present application is further illustrated below through a specific application scenario.
The present application runs on a computer device and supports controlling, through the 3D spatial user interface, the corresponding operation on a user interface object.
Referring to Fig. 8, the method includes:
801: project a ray from the viewpoint toward the operation object in the 3D spatial user interface.
802: obtain the intersection point of the projection ray with the 2D plane in which the operation object lies.
803: take the XY two-dimensional coordinates of the intersection point in the 2D plane as its position coordinates in 2D space.
804: traverse the user interface objects in the 2D space according to their layout rules.
805: if the XY two-dimensional coordinates fall within the extent of a user interface object, determine that user interface object as the operation object.
Each user interface object has its own position and extent in the two-dimensional coordinate plane. By traversing the user interface objects in the 2D space, the object whose extent contains the position coordinates is found, and that user interface object is determined as the one corresponding to the position coordinates.
In the user interface of the 2D space, the layout of the user interface objects follows preset layout rules, and the present application traverses the user interface objects in the 2D space according to those rules.
For example, when the user interface objects are buttons, their layout is a button tree; the button tree is traversed to obtain the user interface object into whose extent the XY two-dimensional coordinates fall, and that user interface object is determined as the operation object.
The embodiments of the present application need no calculation in 3D space to determine the operation object in the user interface; they only obtain the operation object corresponding to the intersection point in 2D space, which simplifies the calculation of determining the operation object. The embodiments can therefore determine the operation object of the 3D spatial user interface quickly, simplify the design of the 3D spatial user interface, and improve the user experience of operating the user interface.
Those skilled in the art will understand that embodiments of the present application may be provided as a method, an apparatus (device), or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk memory, CD-ROM, optical memory, etc.) containing computer-usable program code.
The present application is described with reference to flow charts and/or block diagrams of the method, apparatus (device), and computer program product of the embodiments. It should be understood that each flow and/or block of the flow charts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific way, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, causing a sequence of operational steps to be performed on the computer or other programmable device to produce a computer-implemented process, such that the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the application have been described, those skilled in the art, once aware of the basic inventive concept, can make other changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the application.
Obviously, those skilled in the art can make various changes and variations to the application without departing from its spirit and scope. Thus, if these modifications and variations fall within the scope of the claims of the application and their technical equivalents, the application is also intended to include them.
Claims (10)
1. A method for determining an operation object in a 3D spatial user interface, characterized in that the method comprises:
obtaining a viewpoint projection ray and its focal point with the 2D plane in which the operation object in the 3D spatial user interface is located;
converting the position coordinates of the focal point in the 3D space into position coordinates in a 2D space; and
obtaining the user interface object corresponding to the position coordinates in the 2D space, and determining the user interface object as the operation object.
2. The method of claim 1, characterized in that obtaining the viewpoint projection ray and its focal point with the 2D plane in which the operation object in the 3D spatial user interface is located comprises:
projecting a ray from the viewpoint to the operation object in the 3D spatial user interface; and
obtaining the focal point of the projected ray with the 2D plane in which the operation object is located.
3. The method of claim 1, characterized in that converting the position coordinates of the focal point in the 3D space into position coordinates in a 2D space comprises:
taking the position coordinates of the focal point within the 2D plane in which the operation object of the user interface in the 3D space is located as the position coordinates of the focal point in the 2D space.
4. The method of claim 3, characterized in that the position coordinates are XY two-dimensional coordinates.
5. The method of claim 1, characterized in that obtaining the user interface object corresponding to the position coordinates in the 2D space and determining the user interface object as the operation object comprises:
traversing the user interface objects in the 2D space according to a placement rule of the user interface objects; and
if the position coordinates fall within the range of a user interface object, determining that user interface object as the operation object.
6. An apparatus for determining an operation object in a 3D spatial user interface, characterized in that the apparatus comprises:
a focal point acquisition module, configured to obtain a viewpoint projection ray and its focal point with the 2D plane in which the operation object in the 3D spatial user interface is located;
a coordinate conversion module, configured to convert the position coordinates of the focal point in the 3D space into position coordinates in a 2D space; and
an object determination module, configured to obtain the user interface object corresponding to the position coordinates in the 2D space, and to determine the user interface object as the operation object.
7. The apparatus of claim 6, characterized in that the focal point acquisition module comprises:
a viewpoint projection unit, configured to project a ray from the viewpoint to the operation object in the 3D spatial user interface; and
a focal point determination unit, configured to obtain the focal point of the projected ray with the 2D plane in which the operation object is located.
8. The apparatus of claim 6, characterized in that the coordinate conversion module is specifically configured to take the position coordinates of the focal point within the 2D plane in which the operation object of the user interface in the 3D space is located as the position coordinates of the focal point in the 2D space.
9. The apparatus of claim 8, characterized in that the position coordinates are XY two-dimensional coordinates.
10. The apparatus of claim 6, characterized in that the object determination module comprises:
an object traversal unit, configured to traverse the user interface objects in the 2D space according to a placement rule of the user interface objects; and
an object judging unit, configured to determine a user interface object as the operation object when the position coordinates fall within the range of that user interface object.
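For illustration only (not part of the claims; rectangular bounds, the dictionary layout, and a front-to-back placement order are assumptions of this sketch), the traversal and range check described in claims 5 and 10 could look like:

```python
def find_operation_object(xy, ui_objects):
    """Traverse the user interface objects in the 2D space according to
    their placement rule (assumed here: front-to-back order) and return
    the first object whose range contains the XY position coordinates."""
    x, y = xy
    for obj in ui_objects:
        left, top, width, height = obj["bounds"]  # assumed rectangular range
        if left <= x <= left + width and top <= y <= top + height:
            return obj  # this object is determined as the operation object
    return None  # the focal point hit no user interface object
```

With a small button placed in front of a large panel, the coordinates (3, 4) would select the button, while (50, 50) would fall through to the panel behind it.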
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610274489.0A | 2016-04-27 | 2016-04-27 | Method and apparatus for determining operation object in 3D spatial user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105975179A | 2016-09-28 |
Family
ID=56993297
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610274489.0A (Pending) | Method and apparatus for determining operation object in 3D spatial user interface | 2016-04-27 | 2016-04-27 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105975179A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080013860A1 (en) * | 2006-07-14 | 2008-01-17 | Microsoft Corporation | Creation of three-dimensional user interface |
CN101770324A (en) * | 2008-12-31 | 2010-07-07 | 商泰软件(上海)有限公司 | Method for realizing interactive operation of 3D graphical interface |
CN102270035A (en) * | 2010-06-04 | 2011-12-07 | Samsung Electronics Co., Ltd. | Apparatus and method for selecting and operating object in non-touch mode |
CN102981616A (en) * | 2012-11-06 | 2013-03-20 | ZTE Corporation | Method and system for identifying augmented reality objects, and computer |
- 2016-04-27: CN application CN201610274489.0A filed (publication CN105975179A, status Pending)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108513671A (en) * | 2017-01-26 | 2018-09-07 | Huawei Technologies Co., Ltd. | Method and terminal for displaying a 2D application in a VR device |
US11294533B2 (en) | 2017-01-26 | 2022-04-05 | Huawei Technologies Co., Ltd. | Method and terminal for displaying 2D application in VR device |
CN113449546A (en) * | 2020-03-24 | 2021-09-28 | Nanning Fugui Precision Industrial Co., Ltd. | Indoor positioning method and device and computer readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107018336B (en) | Image processing method and apparatus, and video processing method and apparatus | |
US10192363B2 (en) | Math operations in mixed or virtual reality | |
US20150301596A1 (en) | Method, System, and Computer for Identifying Object in Augmented Reality | |
KR101096617B1 (en) | Spatial multi interaction-based 3d stereo interactive vision system and method of the same | |
Duval et al. | Improving awareness for 3D virtual collaboration by embedding the features of users’ physical environments and by augmenting interaction tools with cognitive feedback cues | |
CN105955935A (en) | Text control realization method and apparatus | |
CN103959340A (en) | Graphics rendering technique for autostereoscopic three dimensional display | |
CN106201259A (en) | Method and apparatus for sharing a panoramic image in a virtual reality system | |
CN105159522A (en) | Method for response of virtual reality display device to operation of peripheral device | |
CN105975264A (en) | Implementation method and device of character control | |
CN112184356A (en) | Guided retail experience | |
Agarwal et al. | The evolution and future scope of augmented reality | |
CN106598250A (en) | VR display method and apparatus, and electronic device | |
CN104598035B (en) | Cursor display method, smart device and system based on 3D stereoscopic image display | |
WO2017042070A1 (en) | A gazed virtual object identification module, a system for implementing gaze translucency, and a related method | |
Ohl | Tele-immersion concepts | |
CN106485789A (en) | 3D model loading method and apparatus | |
CN105975179A (en) | Method and apparatus for determining operation object in 3D spatial user interface | |
CN105975169A (en) | Method and apparatus for displaying text in 3D space | |
Gotsch et al. | Holoflex: A flexible light-field smartphone with a microlens array and a p-oled touchscreen | |
CN105975259A (en) | Implementation method and device of 3D (Three-dimensional) space user interface | |
CN107635132A (en) | Display control method and apparatus for a naked-eye 3D display terminal, and display terminal | |
CN105955738A (en) | User interface display method and user interface display device corresponding to 3D list data | |
Narducci et al. | Enabling consistent hand-based interaction in mixed reality by occlusions handling | |
CN105931285A (en) | Control realization method and apparatus in 3D space |
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20160928 |