CN106502376A - 3D touch operation method, electronic device and 3D glasses - Google Patents

3D touch operation method, electronic device and 3D glasses

Info

Publication number
CN106502376A
CN106502376A (application CN201510570539.5A, filed as CN201510570539A)
Authority
CN
China
Prior art keywords
coordinates
display object
axis coordinate
display
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510570539.5A
Other languages
Chinese (zh)
Inventor
冯雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Samsung Electronics Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Tianjin Samsung Electronics Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Samsung Electronics Co Ltd, Samsung Electronics Co Ltd filed Critical Tianjin Samsung Electronics Co Ltd
Priority to CN201510570539.5A priority Critical patent/CN106502376A/en
Publication of CN106502376A publication Critical patent/CN106502376A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses a 3D touch operation method applied to an electronic device, including: outputting a 3D image through a display unit, the 3D image containing M display objects, which a user can view with a 3D effect by wearing 3D glasses; receiving in real time the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses; and controlling a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects. The invention also discloses an electronic device and 3D glasses.

Description

3D touch operation method, electronic device and 3D glasses
Technical field
The present invention relates to the field of electronic technology, and in particular to a 3D touch operation method, an electronic device and 3D glasses.
Background technology
3D (three-dimensional) imaging is produced by the parallax between a person's two eyes. There is typically a distance of about 8 cm between the two pupils, and for a person to see a 3D image the left eye and the right eye must be shown different images, i.e. the actual viewing situation of the two eyes is simulated. The two images have a certain parallax, and a stereoscopic 3D image is finally formed in the human brain.
At present, many electronic devices (for example, smart phones, tablet computers and smart TVs) provide a 3D display function, allowing the user to see images with a 3D effect. Many displayed scenes in a 3D image have a depth of field (for example, a positive depth of field), so that the user sees the scene located a certain distance in front of the screen; the picture is more realistic and the user experience is better.
However, in the course of devising the technical solution of the embodiments of the present application, the inventor found that the above technology has at least the following technical problem:
At present, under a 3D display environment, when the user performs human-computer interaction through a screen (for example, a resistive touch screen or a capacitive touch screen), the user still needs to actually touch the screen with a hand. But under the 3D display environment all 3D scenes have a depth of field, so the position of the screen is inconsistent with the visual position of the 3D image seen by the user: the 3D scenes the user sees are located a certain distance in front of the screen. Performing human-computer interaction by touching the screen with a hand therefore feels unnatural, and the user experience is poor.
Content of the invention
By providing a 3D touch operation method, an electronic device and 3D glasses, the embodiments of the present application solve the technical problem in the prior art that, when a user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is therefore unnatural.
In a first aspect, an embodiment of the present application provides the following technical solution:
A 3D touch operation method, applied to an electronic device, the electronic device including a display unit, the method including:
outputting a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device;
receiving in real time the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses;
controlling a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
Preferably, before the 3D image is output through the display unit, the method further includes:
obtaining a 2D image corresponding to the 3D image;
performing 3D modeling on the 2D image and adding a depth of field to obtain the 3D image.
Preferably, controlling the specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects includes:
detecting, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a selection operation, wherein the selection operation is used to select the specified display object from the M display objects;
when the selection operation is detected, determining the 3D coordinates of a first position selected by the selection operation;
determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects.
Preferably, determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects includes:
extracting, from the 3D coordinates of the first position, the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position;
extracting, from the 3D coordinates of the M display objects, the X-axis coordinate interval, the Y-axis coordinate interval and the Z-axis coordinate of each of the M display objects;
determining the specified display object from the M display objects based on the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position and the X-axis coordinate intervals, the Y-axis coordinate intervals and the Z-axis coordinates of the M display objects, wherein the Z-axis coordinate of the first position is the same as the Z-axis coordinate of the specified display object, the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
Preferably, after the specified display object is determined from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, the method further includes:
detecting, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display;
when the drag operation is detected, determining the 3D coordinates of the second position based on the second group of 3D coordinates;
adjusting, based on the 3D coordinates of the second position, the 3D coordinates of the current display position of the specified display object, so that the specified display object is displayed at the second position.
Preferably, the Z-axis coordinate in the 3D coordinates of the first position is different from the Z-axis coordinate in the 3D coordinates of the second position.
In a second aspect, based on the same inventive concept, an embodiment of the present application provides the following technical solution:
A 3D touch operation method, applied to 3D glasses used in cooperation with an electronic device, the 3D glasses including two image acquisition units respectively located at the left and right ends of the 3D glasses, the method including:
while the electronic device outputs a 3D image containing M display objects, acquiring in real time, through the two image acquisition units, two 2D images containing an operating body and the M display objects, wherein M is a positive integer and a user wearing the 3D glasses can view the M display objects with a 3D effect through the two 2D images;
determining the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
sending the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
In a third aspect, based on the same inventive concept, an embodiment of the present application provides the following technical solution:
An electronic device, the electronic device including a display unit, the electronic device further including:
an output unit, configured to output a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device;
a receiving unit, configured to receive in real time the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses;
a control unit, configured to control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
Preferably, the electronic device further includes:
a first acquisition unit, configured to obtain a 2D image corresponding to the 3D image before the 3D image is output through the display unit;
a 3D modeling unit, configured to perform 3D modeling on the 2D image and add a depth of field to obtain the 3D image.
Preferably, the control unit is specifically configured to:
detect, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a selection operation, wherein the selection operation is used to select the specified display object from the M display objects; when the selection operation is detected, determine the 3D coordinates of the first position selected by the selection operation; and determine the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects.
Preferably, the control unit is specifically configured to:
extract, from the 3D coordinates of the first position, the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position; extract, from the 3D coordinates of the M display objects, the X-axis coordinate interval, the Y-axis coordinate interval and the Z-axis coordinate of each of the M display objects; and determine the specified display object from the M display objects based on the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position and the X-axis coordinate intervals, the Y-axis coordinate intervals and the Z-axis coordinates of the M display objects, wherein the Z-axis coordinate of the first position is the same as the Z-axis coordinate of the specified display object, the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
Preferably, the control unit is further configured to:
after the specified display object is determined from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, detect, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display; when the drag operation is detected, determine the 3D coordinates of the second position based on the second group of 3D coordinates; and adjust, based on the 3D coordinates of the second position, the 3D coordinates of the current display position of the specified display object, so that the specified display object is displayed at the second position.
Preferably, the Z-axis coordinate in the 3D coordinates of the first position is different from the Z-axis coordinate in the 3D coordinates of the second position.
In a fourth aspect, based on the same inventive concept, an embodiment of the present application provides the following technical solution:
3D glasses, used in cooperation with the electronic device of any one of claims 8 to 13, the 3D glasses including two image acquisition units respectively located at the left and right ends of the 3D glasses, the 3D glasses further including:
a second acquisition unit, configured to acquire in real time, through the two image acquisition units, two 2D images containing an operating body and the M display objects while the electronic device outputs a 3D image containing M display objects, wherein M is a positive integer and a user wearing the 3D glasses can view the M display objects with a 3D effect through the two 2D images;
a determining unit, configured to determine the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
a transmitting unit, configured to send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
The one or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
1. In the embodiments of the present application, on the electronic device side, the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses are received in real time, and a specified display object among the M display objects is controlled based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects. Thus, when performing human-computer interaction under a 3D display environment, the user can perform touch operations on a display object at the visual position of the 3D image seen by the user, without needing to touch the screen. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
2. In the embodiments of the present application, on the 3D glasses side, the 3D coordinates of the operating body and the 3D coordinates of the M display objects are determined based on two 2D images and sent to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that, during human-computer interaction under a 3D display environment, the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
Description of the drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flow chart of the 3D touch operation method (electronic device side) in an embodiment of the present application;
Fig. 2 is a schematic diagram of the electronic device outputting a 3D image under a 3D display environment in an embodiment of the present application;
Fig. 3 is a schematic diagram of shutter-type 3D glasses in an embodiment of the present application;
Fig. 4 is a schematic diagram of the electronic device outputting a 2D image under a 2D display environment in an embodiment of the present application;
Fig. 5 is a schematic diagram of the 3D glasses acquiring 2D images of the operating body and performing 3D synthesis in an embodiment of the present application;
Fig. 6 is a schematic diagram of the superposition of the Z coordinates of the operating body and a display object in an embodiment of the present application;
Fig. 7 is a schematic diagram of determining the specified display object selected by a selection operation performed by the operating body in an embodiment of the present application;
Fig. 8 is a schematic diagram of the operating body performing a drag operation in an embodiment of the present application;
Fig. 9 is a flow chart of the 3D touch operation method (3D glasses side) in an embodiment of the present application;
Fig. 10 is a schematic structural diagram of the electronic device in an embodiment of the present application;
Fig. 11 is a schematic structural diagram of the 3D glasses in an embodiment of the present application.
Specific embodiment
By providing a 3D touch operation method, an electronic device and 3D glasses, the embodiments of the present application solve the technical problem in the prior art that, when a user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is therefore unnatural.
To solve the above technical problem, the general idea of the technical solutions of the embodiments of the present application is as follows:
A 3D touch operation method, applied to an electronic device that includes a display unit, the method including: outputting a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device; receiving in real time the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses; and controlling a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
For a better understanding of the above technical solution, it is described in detail below with reference to the accompanying drawings and specific embodiments.
It should be noted that the term "depth of field" herein always refers to the "positive depth of field", i.e. the 3D display effect in which the image is located in front of the screen.
Embodiment one
This embodiment, from the perspective of the electronic device, provides a 3D touch operation method applied to an electronic device. The electronic device may be a smart phone, a tablet computer, a smart TV or the like; this embodiment does not specifically limit which kind of electronic device it is. The electronic device includes a display unit, which may be an LCD screen, an LED (Light-Emitting Diode) screen or the like; the display unit may or may not have a touch function.
As shown in Fig. 1, the 3D touch operation method includes:
Step S101: outputting a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device.
For example, as shown in Fig. 2, the electronic device outputs a 3D image through the display unit. The 3D image contains 9 display objects, which may specifically be 9 application icons (i.e. APP1, APP2, APP3, APP4, APP5, APP6, APP7, APP8, APP9); a display object may also be a picture, a file, a menu, a button or any other touchable object. Before outputting the 3D image, the electronic device first needs to obtain the 2D (two-dimensional) image (as shown in Fig. 4) corresponding to the 3D image (as shown in Fig. 2), and then perform 3D modeling on the 2D image and add a depth of field to obtain the 3D image (as shown in Fig. 2).
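Purely as an illustration, the following is a minimal sketch of one way a flat 2D interface image could be turned into a left/right stereo pair by assigning each display object a depth of field and shifting it horizontally by the resulting disparity. The class, the parameter values (eye separation, viewing distance) and the similar-triangles disparity formula are assumptions made for this sketch and are not specified by the present application.

```python
# Illustrative sketch (not from the patent): build left/right views of a 2D UI
# by giving each display object a depth and shifting it by the resulting disparity.
from dataclasses import dataclass

@dataclass
class DisplayObject:
    name: str
    x: float      # horizontal position in the 2D image (cm)
    y: float      # vertical position (cm)
    depth: float  # assumed depth of field: distance in front of the screen (cm)

EYE_SEPARATION_CM = 8.0      # inter-pupil distance mentioned in the description
VIEWING_DISTANCE_CM = 50.0   # assumed distance from the eyes to the screen

def disparity(depth_cm: float) -> float:
    """On-screen horizontal offset between the left and right views that makes
    an object appear depth_cm in front of the screen (similar-triangles model)."""
    return EYE_SEPARATION_CM * depth_cm / (VIEWING_DISTANCE_CM - depth_cm)

def make_stereo_pair(objects):
    left_view, right_view = [], []
    for obj in objects:
        half = disparity(obj.depth) / 2.0
        left_view.append((obj.name, obj.x + half, obj.y))   # left-eye image
        right_view.append((obj.name, obj.x - half, obj.y))  # right-eye image
    return left_view, right_view

icons = [DisplayObject(f"APP{i}", x=(i - 1) % 3 * 5.0, y=(i - 1) // 3 * 5.0, depth=10.0)
         for i in range(1, 10)]
left, right = make_stereo_pair(icons)
```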
In the embodiments of the present application, 3D image display is implemented using the shutter 3D technique. Specifically, the electronic device alternately outputs two groups of 2D images at a certain frequency; the two groups of 2D images simulate the parallax between a person's left and right eyes and therefore differ slightly. The user needs to wear shutter-type 3D glasses (as shown in Fig. 3). The 3D glasses are provided with two shutter lenses 501, which become transparent alternately at the same frequency as the electronic device, so that the group of 2D images corresponding to the left eye reaches the user's left eye through the shutter lens 501 on the left side, and the group of 2D images corresponding to the right eye reaches the user's right eye through the shutter lens 501 on the right side. Because the two groups of 2D images have parallax, a single 3D image is finally formed in the user's brain, so that when viewing the 3D image through the 3D glasses the user can see the M display objects with a 3D effect (as shown in Fig. 2).
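As an illustration of the alternation described above, a minimal sketch of a frame loop that alternates the left-eye and right-eye 2D image groups at a fixed frequency and emits a sync signal for the shutter lenses; the 60 Hz value and the callable names are assumptions, not part of the patent.

```python
# Illustrative sketch (not from the patent): alternate the left-eye and right-eye
# 2D image groups at a fixed frequency and tell the shutter glasses which lens
# should be open for each frame.
import itertools
import time

ALTERNATION_HZ = 60.0          # assumed per-eye alternation frequency
FRAME_PERIOD_S = 1.0 / ALTERNATION_HZ

def run_shutter_output(left_frames, right_frames, show_frame, send_sync, n_frames=600):
    """show_frame(frame) drives the display unit; send_sync('left'/'right') tells
    the glasses which shutter lens 501 to make transparent for that frame."""
    eyes = itertools.cycle(("left", "right"))
    sources = {"left": itertools.cycle(left_frames), "right": itertools.cycle(right_frames)}
    for _ in range(n_frames):
        eye = next(eyes)
        show_frame(next(sources[eye]))  # this eye's 2D image group
        send_sync(eye)                  # keep the glasses in phase with the display
        time.sleep(FRAME_PERIOD_S)
```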
Step S102: receiving in real time the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses.
In a specific implementation, the user can perform a touch operation (for example, a selection operation or a drag operation) on any of the M display objects through an operating body (for example, a fingertip or the tip of a stylus).
In a specific implementation, the user wearing the 3D glasses sees a 3D image (as shown in Fig. 2), in which the M display objects (for example, the 9 application icons in Fig. 2) are all stereoscopic images with a certain depth of field; the display position of each display object is located a certain distance in front of the screen.
In a specific implementation, when the user performs a touch operation on the specified display object (i.e. any display object the user wants to operate on), the operating body (for example, a fingertip or the tip of a stylus) does not need to touch the screen; instead, the touch operation is performed directly at the visual position of the specified display object seen by the eyes. This requires the electronic device and the 3D glasses to cooperate. Specifically, the 3D glasses need to send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device in real time. These 3D coordinates may be defined in a 3D coordinate system whose origin is the midpoint of the line connecting the two shutter lenses 501 of the 3D glasses (approximating the midpoint of the line connecting the wearer's two eyes). The electronic device receives, in real time, the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses, and thereby determines the relative position between the operating body and each display object.
Specifically, as shown in Fig. 3, two image acquisition units (i.e. a binocular camera) are provided at the left and right ends of the 3D glasses. As shown in Fig. 5, the 3D glasses first acquire in real time, through the two image acquisition units, two 2D images containing the operating body and the M display objects (i.e. the left-eye 2D image and the right-eye 2D image in Fig. 5), perform 3D modeling based on the two 2D images to synthesize a 3D image, then calculate the 3D coordinates of the operating body and the 3D coordinates of the M display objects, and finally send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device. The 3D coordinates include coordinates along the three directions X, Y and Z, and the Z-axis coordinate represents the depth of field.
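As an illustration of how 3D coordinates could be recovered from the two 2D images captured by the binocular camera, the following is a minimal sketch based on standard binocular disparity, assuming rectified cameras with a known baseline and focal length; the parameter values and function names are assumptions for this sketch, not the algorithm specified by the present application.

```python
# Illustrative sketch (not from the patent): recover a point's 3D coordinates from its
# pixel positions in the left and right camera images, assuming rectified cameras
# separated by a known baseline. All parameter values are assumptions.
BASELINE_CM = 14.0      # assumed distance between the two image acquisition units
FOCAL_PX = 800.0        # assumed focal length in pixels
CX, CY = 320.0, 240.0   # assumed principal point of each camera

def triangulate(u_left, v_left, u_right):
    """Return (X, Y, Z) in cm in a frame centred between the two cameras.
    u/v are pixel coordinates of the same point (e.g. the fingertip) in each image."""
    d = u_left - u_right                  # horizontal pixel shift between the views
    if d <= 0:
        raise ValueError("point must be in front of the cameras")
    z = FOCAL_PX * BASELINE_CM / d        # depth from disparity
    x = (u_left - CX) * z / FOCAL_PX - BASELINE_CM / 2.0  # shift to mid-camera origin
    y = (v_left - CY) * z / FOCAL_PX
    return x, y, z

# e.g. fingertip detected at (400, 260) in the left image and (360, 260) in the right:
fingertip_xyz = triangulate(400.0, 260.0, 360.0)
```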
Step S103: controlling (for example, selecting, opening or dragging) the specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
In a specific implementation, step S103 specifically includes:
First step: detecting, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a selection operation, wherein the selection operation is used to select the specified display object from the M display objects. Specifically, while the operating body performs the selection operation, the 3D glasses send the 3D coordinates of the operating body during the whole selection process (i.e. the first group of 3D coordinates) to the electronic device in real time. Based on the first group of 3D coordinates, the electronic device can identify the touch operation from the relationship between the 3D coordinate track and time and detect whether the operating body is performing a selection operation. In general, the selection operation includes a long-press operation and a click operation: the long-press operation is used to select a display object and drag it, and the click operation is used to select a display object (for example, an application icon) and open the corresponding application program. One possible way of classifying the track is sketched below.
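A minimal sketch, offered only as an illustration, of how the coordinate track could be classified as a click or a long press from how long the operating body dwells near the same point; the thresholds and names are assumptions, since the present application does not prescribe a particular classification rule.

```python
# Illustrative sketch (not from the patent): classify a selection gesture from the
# time-stamped 3D coordinate track of the operating body. Thresholds are assumptions.
import math

DWELL_RADIUS_CM = 1.0    # how far the fingertip may wander and still count as "held"
CLICK_MAX_S = 0.4        # dwell shorter than this -> click
LONG_PRESS_MIN_S = 0.8   # dwell at least this long -> long press (selects for dragging)

def classify_selection(track):
    """track: list of (t_seconds, (x, y, z)) samples within the first time period.
    Returns 'click', 'long_press', or None if no selection gesture is detected."""
    if len(track) < 2:
        return None
    t0, p0 = track[0]
    dwell = 0.0
    for t, p in track[1:]:
        if math.dist(p, p0) > DWELL_RADIUS_CM:
            break                    # the fingertip moved away: the dwell ends here
        dwell = t - t0
    if dwell >= LONG_PRESS_MIN_S:
        return "long_press"
    if 0.0 < dwell <= CLICK_MAX_S:
        return "click"
    return None
```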
Second step: when the selection operation is detected, determining the 3D coordinates of the first position selected when the operating body performs the selection operation.
Third step: determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects. Specifically, the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position can first be extracted from the 3D coordinates of the first position; then the X-axis coordinate interval, the Y-axis coordinate interval and the Z-axis coordinate of each display object are extracted from the 3D coordinates of the M display objects; finally, the specified display object (i.e. the display object selected by the user's selection operation) is determined from the M display objects based on the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position and the X-axis coordinate intervals, the Y-axis coordinate intervals and the Z-axis coordinates of the M display objects. Here the Z-axis coordinate of the first position is the same as the Z-axis coordinate of the specified display object (a certain error is allowed, for example a difference of several millimetres or several centimetres; the specific error can be determined by the size of the display unit and the size of the display objects and is not specifically limited here), the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
For example, as shown in Fig. 6, the electronic device extracts the Z-axis coordinate Z of the operating body (for example, a fingertip or the tip of a stylus) from the 3D coordinates of the operating body, and the Z-axis coordinate of each display object in the 3D image (for example, the application icons APP1–APP9) is Z'. If Z = Z', it can be determined that the operating body is currently at the same spatial height as these display objects. Of course, an error interval (Z1–Z2) with Z1 < Z' < Z2 may be used: if Z lies within the error interval (Z1–Z2), the operating body can also be considered to be at the same spatial height as these display objects. As shown in Fig. 7, when the operating body is recognized to be performing a selection operation, the 3D coordinates of the first position selected by the selection operation can be determined. If the 3D coordinates of the first position lie within the range (X1–X2, Y1–Y2, Z1–Z2), and (X1–X2, Y1–Y2, Z1–Z2) is the 3D coordinate range corresponding to APP8, it can be determined that the specified display object selected by the operating body is APP8.
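A minimal sketch of this hit test, assuming each display object carries its X and Y coordinate intervals and a Z value with a tolerance band; the data layout, the tolerance value and the example coordinates are assumptions used only for illustration.

```python
# Illustrative sketch (not from the patent): map the selected first position to a
# display object by interval containment in X/Y and a tolerance band in Z.
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

@dataclass
class DisplayObject3D:
    name: str
    x_interval: Tuple[float, float]  # (X1, X2)
    y_interval: Tuple[float, float]  # (Y1, Y2)
    z: float                         # Z' of the displayed object

Z_TOLERANCE_CM = 2.0  # assumed error band: Z1 = z - tolerance, Z2 = z + tolerance

def hit_test(first_position: Tuple[float, float, float],
             objects: Sequence[DisplayObject3D]) -> Optional[DisplayObject3D]:
    x, y, z = first_position
    for obj in objects:
        in_x = obj.x_interval[0] <= x <= obj.x_interval[1]
        in_y = obj.y_interval[0] <= y <= obj.y_interval[1]
        in_z = abs(z - obj.z) <= Z_TOLERANCE_CM
        if in_x and in_y and in_z:
            return obj   # this is the specified display object, e.g. APP8
    return None

icons = [DisplayObject3D(f"APP{i}",
                         ((i - 1) % 3 * 5.0, (i - 1) % 3 * 5.0 + 4.0),
                         ((i - 1) // 3 * 5.0, (i - 1) // 3 * 5.0 + 4.0),
                         z=10.0)
         for i in range(1, 10)]
selected = hit_test((6.0, 11.0, 10.5), icons)  # falls inside APP8's intervals and Z band
```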
If the selection operation performed by the operating body is recognized to be a click operation, the application program corresponding to the specified display object can be opened directly, for example APP8 is opened.
If the selection operation performed by the operating body is recognized to be a long-press operation, then after the specified display object is determined from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, the method further includes:
Fourth step: detecting, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display;
Fifth step: when the drag operation is detected, determining the 3D coordinates of the second position based on the second group of 3D coordinates;
Sixth step: adjusting, based on the 3D coordinates of the second position, the 3D coordinates of the current display position of the specified display object, so that the specified display object is displayed at the second position.
For example, when the operating body performs a drag operation and moves to the second position, the 3D glasses send the 3D coordinates of the operating body during the whole drag process (i.e. the second group of 3D coordinates) to the electronic device in real time. The electronic device can first identify the touch operation from the relationship between the track of the second group of 3D coordinates and time, and detect whether the operating body is performing a drag operation. If it is determined that the operating body is performing a drag operation, the 3D coordinates of the second position are determined based on the second group of 3D coordinates, and finally the 3D coordinates of the current display position of the specified display object (the coordinates along the X, Y and Z axes are adjusted separately) are adjusted based on the 3D coordinates of the second position, so that the specified display object is displayed at the second position (as shown in Fig. 8). The Z-axis coordinate in the 3D coordinates of the first position may differ from the Z-axis coordinate in the 3D coordinates of the second position, i.e. the user can drag the specified display object to be displayed at a different depth of field (i.e. at a different distance in front of the screen).
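A minimal sketch of this drag handling, assuming the long press has already selected an object and the second group of coordinates has been received; the movement threshold and the display.move placeholder are assumptions for illustration.

```python
# Illustrative sketch (not from the patent): follow the second group of 3D coordinates,
# decide whether a drag is happening, and move the selected object to the drop point.
import math

DRAG_START_CM = 2.0  # assumed minimum movement from the first position to count as a drag

def handle_drag(selected_obj, first_position, second_group_track, display):
    """second_group_track: list of (t, (x, y, z)) samples in the second time period.
    display.move(obj, (x, y, z)) stands in for updating the object's display coordinates."""
    if not second_group_track:
        return False
    _, last_point = second_group_track[-1]
    if math.dist(last_point, first_position) < DRAG_START_CM:
        return False                              # operating body barely moved: not a drag
    second_position = last_point                  # drop point = where the track ends
    display.move(selected_obj, second_position)   # X, Y and Z (depth of field) all updated
    return True
```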
In the embodiments of the present application, under a 3D display environment the user does not need to touch the screen; the user only needs to directly "touch" a display object in the 3D image (a display object whose depth of field is given by its Z coordinate and whose visual position is in front of the screen) to interact with it. The touch position and the visual position are completely consistent in 3D space, which greatly enhances the user experience.
The technical solutions in the above embodiments of the present application have at least the following technical effects or advantages:
In the embodiments of the present application, the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses are received in real time, and a specified display object among the M display objects is controlled based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects, so that when performing human-computer interaction under a 3D display environment the user can perform touch operations on a display object at the visual position of the 3D image seen by the user, without needing to touch the screen. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
Embodiment two
Based on the same inventive concept, as shown in Fig. 9, this embodiment, from the perspective of the 3D glasses, provides a 3D touch operation method applied to 3D glasses. The 3D glasses are used in cooperation with the electronic device in Embodiment One and include two image acquisition units (i.e. a binocular camera) respectively located at the left and right ends of the 3D glasses. The method includes:
Step S201: while the electronic device outputs a 3D image containing M display objects, acquiring in real time, through the two image acquisition units, two 2D images containing the operating body (a fingertip or the tip of a stylus) and the M display objects (for example, the 9 application icons APP1–APP9 shown in Fig. 4), wherein M is a positive integer and a user wearing the 3D glasses can view the M display objects with a 3D effect through the two 2D images;
Step S202: determining the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
Step S203: sending the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
In the embodiments of the present application, 3D image display is implemented using the shutter 3D technique. Specifically, the electronic device alternately outputs two groups of 2D images at a certain frequency; the two groups of 2D images simulate the parallax between a person's left and right eyes and therefore differ slightly. The user needs to wear shutter-type 3D glasses (as shown in Fig. 3), which are provided with two shutter lenses 501 that become transparent alternately at the same frequency as the electronic device, so that the group of 2D images corresponding to the left eye reaches the user's left eye through the shutter lens 501 on the left side and the group of 2D images corresponding to the right eye reaches the user's right eye through the shutter lens 501 on the right side. Because the two groups of 2D images have parallax, a single 3D image is finally formed in the user's brain, so that when viewing the 3D image through the 3D glasses the user can see the M display objects with a 3D effect (as shown in Fig. 2).
In the embodiments of the present application, as shown in Fig. 3, two image acquisition units (i.e. a binocular camera) are provided at the left and right ends of the 3D glasses. As shown in Fig. 5, the 3D glasses first acquire in real time, through the two image acquisition units, two 2D images containing the operating body and the M display objects (i.e. the left-eye 2D image and the right-eye 2D image in Fig. 5), perform 3D modeling based on the two 2D images to synthesize a 3D image, then calculate the 3D coordinates of the operating body and the 3D coordinates of the M display objects, and finally send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device. The 3D coordinates include coordinates along the three directions X, Y and Z, and the Z-axis coordinate represents the depth of field.
In the embodiments of the present application, after the electronic device receives, in real time, the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses, it can know the positions of the operating body and each display object in space and identify the touch operation performed by the operating body. Thus, when performing human-computer interaction under a 3D display environment, the user does not need to touch the screen; instead, the touch operation is performed directly at the visual position of the specified display object seen by the eyes. The touch position of the user is consistent with the visual position of the display object seen by the user, so the operation is more natural. The glasses-side loop implied by steps S201 to S203 is sketched below.
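A minimal sketch, for illustration only, of the glasses-side real-time loop implied by steps S201 to S203, reusing a triangulation helper like the one sketched earlier; the capture, detection and transport callables and the message format are assumptions, not part of the present application.

```python
# Illustrative sketch (not from the patent): the glasses-side loop of steps S201-S203.
# capture_stereo_pair, detect_point, triangulate and send_to_device are placeholder
# callables supplied by the caller.
import json
import time

def glasses_loop(capture_stereo_pair, detect_point, triangulate, send_to_device,
                 object_names=("APP1", "APP2", "APP3"), period_s=1.0 / 30.0):
    """Each cycle: grab both 2D images (S201), compute 3D coordinates of the
    operating body and every display object (S202), and stream them out (S203)."""
    while True:
        left_img, right_img = capture_stereo_pair()
        message = {"timestamp": time.time(), "objects": {}}

        ul, vl = detect_point(left_img, "operating_body")
        ur, _ = detect_point(right_img, "operating_body")
        message["operating_body"] = triangulate(ul, vl, ur)

        for name in object_names:
            ul, vl = detect_point(left_img, name)
            ur, _ = detect_point(right_img, name)
            message["objects"][name] = triangulate(ul, vl, ur)

        send_to_device(json.dumps(message))   # e.g. over a wireless link to the device
        time.sleep(period_s)
```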
The technical solutions in the above embodiments of the present application have at least the following technical effects or advantages:
In the embodiments of the present application, the 3D coordinates of the operating body and the 3D coordinates of the M display objects are determined based on the two 2D images and sent to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that, during human-computer interaction under a 3D display environment, the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
Embodiment three
Based on the same inventive concept, as shown in Fig. 10, this embodiment provides an electronic device that implements the 3D touch operation method in Embodiment One.
An electronic device, including a display unit and further including:
an output unit 301, configured to output a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device;
a receiving unit 302, configured to receive in real time the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses;
a control unit 303, configured to control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
Further, the electronic device also includes:
a first acquisition unit, configured to obtain a 2D image corresponding to the 3D image before the 3D image is output through the display unit;
a 3D modeling unit, configured to perform 3D modeling on the 2D image and add a depth of field to obtain the 3D image.
Further, the control unit 303 is specifically configured to:
detect, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a selection operation, wherein the selection operation is used to select the specified display object from the M display objects; when the selection operation is detected, determine the 3D coordinates of the first position selected by the selection operation; and determine the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects.
Further, the control unit 303 is specifically configured to:
extract, from the 3D coordinates of the first position, the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position; extract, from the 3D coordinates of the M display objects, the X-axis coordinate interval, the Y-axis coordinate interval and the Z-axis coordinate of each of the M display objects; and determine the specified display object from the M display objects based on the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position and the X-axis coordinate intervals, the Y-axis coordinate intervals and the Z-axis coordinates of the M display objects, wherein the Z-axis coordinate of the first position is the same as the Z-axis coordinate of the specified display object, the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
Further, the control unit 303 is also configured to:
after the specified display object is determined from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, detect, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display; when the drag operation is detected, determine the 3D coordinates of the second position based on the second group of 3D coordinates; and adjust, based on the 3D coordinates of the second position, the 3D coordinates of the current display position of the specified display object, so that the specified display object is displayed at the second position.
Further, the Z-axis coordinate in the 3D coordinates of the first position is different from the Z-axis coordinate in the 3D coordinates of the second position.
The technical solutions in the above embodiments of the present application have at least the following technical effects or advantages:
In the embodiments of the present application, the 3D coordinates of the operating body and the 3D coordinates of the M display objects sent by the 3D glasses are received in real time, and a specified display object among the M display objects is controlled based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects, so that when performing human-computer interaction under a 3D display environment the user can perform touch operations on a display object at the visual position of the 3D image seen by the user, without needing to touch the screen. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
Embodiment Four
Based on the same inventive concept, as shown in Fig. 11, this embodiment provides 3D glasses that implement the 3D touch operation method in Embodiment Two.
3D glasses, used in cooperation with the electronic device in Embodiment Three, the 3D glasses including two image acquisition units (i.e. a binocular camera) respectively located at the left and right ends of the 3D glasses, the 3D glasses further including:
a second acquisition unit 401, configured to acquire in real time, through the two image acquisition units, two 2D images containing the operating body and the M display objects while the electronic device outputs a 3D image containing M display objects, wherein M is a positive integer and a user wearing the 3D glasses can view the M display objects with a 3D effect through the two 2D images;
a determining unit 402, configured to determine the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
a transmitting unit 403, configured to send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
Since the 3D glasses introduced in this embodiment are the 3D glasses used when implementing the 3D touch operation method of Embodiment Two of the present application, a person skilled in the art can, based on the 3D touch operation method described in Embodiment Two, understand the specific implementation of the 3D glasses of this embodiment and its various variations; therefore, how the 3D glasses implement the method of the embodiments of the present application is not described in detail here. Any device adopted by a person skilled in the art to implement the 3D touch operation methods of the embodiments of the present application falls within the scope of protection intended by the present application.
The technical solutions in the above embodiments of the present application have at least the following technical effects or advantages:
In the embodiments of the present application, the 3D coordinates of the operating body and the 3D coordinates of the M display objects are determined based on the two 2D images and sent to the electronic device, so that the electronic device can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects. This effectively solves the prior-art technical problem that, when the user performs human-computer interaction under a 3D display environment, the user needs to touch the screen with a hand, the touch position does not correspond to the visual position, and the touch operation is unnatural, and achieves the technical effect that, during human-computer interaction under a 3D display environment, the touch position of the user is consistent with the visual position of the display object seen by the user, the operation is more natural, and the user experience is improved.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage and the like) containing computer-usable program code.
The present invention is described with reference to flow charts and/or block diagrams of methods, devices (systems) and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flow charts and/or block diagrams, and combinations of flows and/or blocks in the flow charts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for realizing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that realizes the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps is executed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present invention have been described, those skilled in the art, once they learn of the basic inventive concept, may make further changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the present invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from the spirit and scope of the present invention. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include these changes and modifications.

Claims (14)

1. A 3D touch operation method, applied to an electronic device, the electronic device including a display unit, characterized in that the method includes:
outputting a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user can view the M display objects with a 3D effect by wearing 3D glasses corresponding to the electronic device;
receiving in real time the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses;
controlling a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
2. The method according to claim 1, characterized in that, before the 3D image is output through the display unit, the method further includes:
obtaining a 2D image corresponding to the 3D image;
performing 3D modeling on the 2D image and adding a depth of field to obtain the 3D image.
3. The method according to claim 1, characterized in that controlling the specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects includes:
detecting, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a selection operation, wherein the selection operation is used to select the specified display object from the M display objects;
when the selection operation is detected, determining the 3D coordinates of a first position selected by the selection operation;
determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects.
4. The method according to claim 3, characterized in that determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects includes:
extracting, from the 3D coordinates of the first position, the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position;
extracting, from the 3D coordinates of the M display objects, the X-axis coordinate interval, the Y-axis coordinate interval and the Z-axis coordinate of each of the M display objects;
determining the specified display object from the M display objects based on the X-axis coordinate, the Y-axis coordinate and the Z-axis coordinate of the first position and the X-axis coordinate intervals, the Y-axis coordinate intervals and the Z-axis coordinates of the M display objects, wherein the Z-axis coordinate of the first position is the same as the Z-axis coordinate of the specified display object, the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
5. The method of claim 3, characterized in that, after determining the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, the method further comprises:
detecting, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display;
when the drag operation is detected, determining the 3D coordinates of the second position based on the second group of 3D coordinates;
adjusting the 3D coordinates of the current display position of the specified display object based on the 3D coordinates of the second position, so that the specified display object is displayed at the second position.
6. The method of claim 5, characterized in that the Z-axis coordinate in the 3D coordinates of the first position is different from the Z-axis coordinate in the 3D coordinates of the second position.
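Claims 5 and 6 add a drag phase: a second group of 3D coordinates in a later time period moves the selected object to a second position, possibly at a different Z-axis coordinate (a different depth). The application does not define how a drag is recognised; the sketch below assumes a minimum-travel rule and takes the last sample of the group as the second position, both of which are assumptions of the sketch.

```python
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

def detect_drag(second_samples: List[Vec3], min_travel: float = 0.05) -> Optional[Vec3]:
    """Decide whether the second group of 3D coordinates looks like a drag
    under a hypothetical minimum-travel rule, and if so return the 3D
    coordinates of the second position (the last sample of the group)."""
    if len(second_samples) < 2:
        return None
    sx, sy, sz = second_samples[0]
    ex, ey, ez = second_samples[-1]
    travel = ((ex - sx) ** 2 + (ey - sy) ** 2 + (ez - sz) ** 2) ** 0.5
    return (ex, ey, ez) if travel >= min_travel else None

def move_object(current_xyz: Vec3, second_position: Vec3) -> Vec3:
    """Adjust the specified object's display coordinates to the second
    position; as claim 6 notes, the Z-axis coordinate may change, i.e. the
    object can be dragged to a different depth of field."""
    return second_position
```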
7. A 3D touch operation method applied to 3D glasses, the 3D glasses being used in cooperation with electronic equipment and comprising two image acquisition units located at the left and right ends of the 3D glasses respectively, characterized in that the method comprises:
while the electronic equipment outputs a 3D image containing M display objects, acquiring in real time, through the two image acquisition units respectively, two 2D images containing an operating body and the M display objects, wherein M is a positive integer and a user wearing the 3D glasses can view, through the two 2D images, the M display objects with a 3D effect;
determining the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
sending the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic equipment, so that the electronic equipment can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
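Claim 7 has the glasses recover 3D coordinates from the two 2D images captured by the left and right image acquisition units, without naming a reconstruction method. For a rectified camera pair, the textbook binocular-disparity relation Z = f·B/(u_left − u_right) is one way to do this; the sketch below uses it, assuming pixel coordinates measured from the principal point, a known focal length in pixels and a known baseline — all assumptions of the sketch rather than details of the application.

```python
from typing import Tuple

def triangulate(u_left: float, u_right: float, v: float,
                focal_px: float, baseline_m: float) -> Tuple[float, float, float]:
    """Recover a 3D point from its pixel column in the left image (u_left),
    its pixel column in the right image (u_right) and its shared pixel row (v),
    for a rectified camera pair.  Pixel coordinates are assumed to be measured
    from the principal point; the principal-point offset is omitted for brevity."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity (lie in front of the cameras)")
    z = focal_px * baseline_m / disparity   # depth from binocular disparity
    x = u_left * z / focal_px
    y = v * z / focal_px
    return (x, y, z)

# Example: a fingertip seen 20 px apart by two cameras 60 mm apart, f = 800 px.
print(triangulate(u_left=110.0, u_right=90.0, v=40.0, focal_px=800.0, baseline_m=0.06))
```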
8. Electronic equipment comprising a display unit, characterized in that the electronic equipment further comprises:
an output unit configured to output a 3D image through the display unit, wherein the 3D image contains M display objects, M is a positive integer, and a user wearing 3D glasses corresponding to the electronic equipment can view the M display objects with a 3D effect;
a receiving unit configured to receive, in real time, the 3D coordinates of an operating body and the 3D coordinates of the M display objects sent by the 3D glasses;
a control unit configured to control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
9. The electronic equipment of claim 8, characterized in that the electronic equipment further comprises:
a first acquisition unit configured to obtain a 2D image corresponding to the 3D image before the 3D image is output through the display unit;
a 3D modeling unit configured to perform 3D modeling on the 2D image and add a depth of field to obtain the 3D image.
10. The electronic equipment of claim 8, characterized in that the control unit is specifically configured to:
detect, based on a first group of 3D coordinates of the operating body within a first time period, whether the operating body is performing a select operation, wherein the select operation is used to choose the specified display object from the M display objects; when the select operation is detected, determine the 3D coordinates of a first position selected by the select operation; and determine the specified display object from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects.
11. The electronic equipment of claim 10, characterized in that the control unit is specifically configured to:
extract the X-axis coordinate, Y-axis coordinate and Z-axis coordinate of the first position from the 3D coordinates of the first position; extract, from the 3D coordinates of the M display objects, the X-axis coordinate interval, Y-axis coordinate interval and Z-axis coordinate of each of the M display objects; and determine the specified display object from the M display objects based on the X-axis coordinate, Y-axis coordinate and Z-axis coordinate of the first position and the X-axis coordinate interval, Y-axis coordinate interval and Z-axis coordinate of each of the M display objects, wherein the Z-axis coordinate of the first position is identical to the Z-axis coordinate of the specified display object, the X-axis coordinate of the first position lies within the X-axis coordinate interval of the specified display object, and the Y-axis coordinate of the first position lies within the Y-axis coordinate interval of the specified display object.
12. The electronic equipment of claim 10, characterized in that the control unit is further configured to:
after the specified display object is determined from the M display objects based on the 3D coordinates of the first position and the 3D coordinates of the M display objects, detect, based on a second group of 3D coordinates of the operating body within a second time period, whether the operating body is performing a drag operation, wherein the second time period follows the first time period and the drag operation is used to drag the specified display object to a second position for display; when the drag operation is detected, determine the 3D coordinates of the second position based on the second group of 3D coordinates; and adjust the 3D coordinates of the current display position of the specified display object based on the 3D coordinates of the second position, so that the specified display object is displayed at the second position.
13. The electronic equipment of claim 12, characterized in that the Z-axis coordinate in the 3D coordinates of the first position is different from the Z-axis coordinate in the 3D coordinates of the second position.
14. 3D glasses used in cooperation with the electronic equipment of any one of claims 8 to 13, the 3D glasses comprising two image acquisition units located at the left and right ends of the 3D glasses respectively, characterized in that the 3D glasses further comprise:
a second acquisition unit configured to, while the electronic equipment outputs a 3D image containing M display objects, acquire in real time, through the two image acquisition units respectively, two 2D images containing an operating body and the M display objects, wherein M is a positive integer and a user wearing the 3D glasses can view, through the two 2D images, the M display objects with a 3D effect;
a determining unit configured to determine the 3D coordinates of the operating body and the 3D coordinates of the M display objects based on the two 2D images;
a transmitting unit configured to send the 3D coordinates of the operating body and the 3D coordinates of the M display objects to the electronic equipment, so that the electronic equipment can control a specified display object among the M display objects based on the 3D coordinates of the operating body and the 3D coordinates of the M display objects.
CN201510570539.5A 2015-09-08 2015-09-08 A kind of 3D touch operation methods, electronic equipment and 3D glasses Pending CN106502376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510570539.5A CN106502376A (en) 2015-09-08 2015-09-08 A kind of 3D touch operation methods, electronic equipment and 3D glasses

Publications (1)

Publication Number Publication Date
CN106502376A 2017-03-15

Family

ID=58287014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510570539.5A Pending CN106502376A (en) 2015-09-08 2015-09-08 A kind of 3D touch operation methods, electronic equipment and 3D glasses

Country Status (1)

Country Link
CN (1) CN106502376A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110083106A1 (en) * 2009-10-05 2011-04-07 Seiko Epson Corporation Image input system
CN102378033A (en) * 2010-08-11 2012-03-14 Lg电子株式会社 Method for operating image display apparatus
US8836768B1 (en) * 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
CN103336575A (en) * 2013-06-27 2013-10-02 深圳先进技术研究院 Man-machine interaction intelligent glasses system and interaction method
CN103530060A (en) * 2013-10-31 2014-01-22 京东方科技集团股份有限公司 Display device and control method thereof and gesture recognition method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112925430A (en) * 2019-12-05 2021-06-08 北京芯海视界三维科技有限公司 Method for realizing suspension touch control, 3D display equipment and 3D terminal

Similar Documents

Publication Publication Date Title
CN105190477B (en) Head-mounted display apparatus for user&#39;s interaction in augmented reality environment
CN103336575B (en) The intelligent glasses system of a kind of man-machine interaction and exchange method
US9886102B2 (en) Three dimensional display system and use
EP3564788A1 (en) Three-dimensional user input
CN110456907A (en) Control method, device, terminal device and the storage medium of virtual screen
CN106681512A (en) Virtual reality device and corresponding display method
CN107615214A (en) Interface control system, interface control device, interface control method and program
CN106708270A (en) Display method and apparatus for virtual reality device, and virtual reality device
CN102819385A (en) Information processing device, information processing method and program
JPWO2014016992A1 (en) 3D user interface device and 3D operation method
CN102760308B (en) Method and device for node selection of object in three-dimensional virtual reality scene
CN102426486A (en) Stereo interaction method and operated apparatus
CN104765156B (en) A kind of three-dimensional display apparatus and 3 D displaying method
CN106980377B (en) A kind of interactive system and its operating method of three-dimensional space
KR100971667B1 (en) Apparatus and method for providing realistic contents through augmented book
KR20200051937A (en) Method for automatic display of GUI objects according to user&#39;s behavior in virtual reality environment and VR system using it
CN104363345A (en) Displaying method and electronic equipment
CN104598035B (en) Cursor display method, smart machine and the system shown based on 3D stereo-pictures
CN114138106A (en) Transitioning between states in a mixed virtual reality desktop computing environment
Caggianese et al. Situated visualization in augmented reality: Exploring information seeking strategies
JP2022058753A (en) Information processing apparatus, information processing method, and program
CN102508561B (en) Operating rod
CN106131533A (en) A kind of method for displaying image and terminal
CN104052981A (en) Information processing method and electronic equipment
CN106066689B (en) Man-machine interaction method and device based on AR or VR system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20170315