WO2015062251A1 - Display device, control method thereof, and gesture recognition method - Google Patents
- Publication number: WO2015062251A1 (application PCT/CN2014/078074)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual
- user
- unit
- eye
- control screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- the invention belongs to the technical field of gesture recognition, and particularly relates to a display device, a control method thereof, and a gesture recognition method. Background Art
- a display device having a gesture recognition function includes a display unit for performing display and an image collection unit (camera, webcam, etc.) for collecting gestures; the device analyzes the collected images to determine the operation the user wants to perform.
- in existing gesture recognition, however, the "select" and "determine" operations must be performed separately through different gestures, which makes operation cumbersome. For example, to change channels by gesture, a first gesture (such as waving from left to right) first selects the channel, each wave changing the station number by one; when the desired station number is reached, a second gesture (such as waving from top to bottom) confirms it and switches to that station. That is to say, the gesture recognition technology of existing display devices cannot accomplish "selecting" and "determining" in a single operation; it cannot, like a tablet computer, "click" one of a plurality of candidate icons to both select and execute an instruction at once. This is because a "click" operation must accurately determine the click location.
- on a touch screen, the hand acts directly on the screen, so determining the click position by touch technology is feasible.
- in gesture recognition, by contrast, the hand usually cannot touch the display unit (especially for a TV, since the user is far from the display during normal use) and can only "point" at a certain position of the display unit (such as an icon displayed by it).
- the accuracy of this long-distance "pointing" is very poor.
- moreover, the pointing gestures of different users may differ: some point to the left of a target and some to the right, so it is impossible to determine where the user intends to point, and the "click" operation cannot be realized.
- the technical problem to be solved by the present invention is that the "select" and "determine" operations must be performed separately in existing gesture recognition; the invention provides a display device, a control method thereof, and a gesture recognition method capable of achieving the "selection" and "determination" operations in one step through gesture recognition.
- the technical solution adopted to solve the above technical problem is a control method of a display device, comprising: a naked-eye 3D display unit displaying a virtual 3D control screen, wherein the virtual distance between the virtual 3D control screen and the user's eyes is equal to a first distance, the first distance being smaller than the distance between the naked-eye 3D display unit and the user's eyes; an image collection unit collecting an image of the user's click action on the virtual 3D control screen; and a gesture recognition unit determining, according to the image collected by the image collection unit, the user's click position on the virtual 3D control screen, and sending a control instruction corresponding to the click position to the corresponding execution unit.
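The claimed flow (display the virtual screen, collect the click image, resolve it to an instruction, dispatch to an execution unit) can be sketched minimally as follows; all class and function names here are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the claimed control flow: a recognized click region
# is resolved to a control instruction and routed to its execution unit.

class VirtualScreen:
    """Virtual 3D control screen divided into named regions, each region
    mapped to one control instruction."""
    def __init__(self, regions):
        self.regions = regions

    def command_for(self, region):
        return self.regions.get(region)

def dispatch_click(screen, clicked_region, executors):
    """Resolve the region reported by the gesture recognition unit to a
    control instruction and forward it to the matching execution unit."""
    command = screen.command_for(clicked_region)
    if command is None:
        return None  # click outside any control region: ignore it
    executors[command](command)
    return command

screen = VirtualScreen({"top_left": "VOLUME_UP", "top_right": "CHANNEL_UP"})
received = []
executors = {"VOLUME_UP": received.append, "CHANNEL_UP": received.append}
dispatch_click(screen, "top_left", executors)
print(received)  # ['VOLUME_UP']
```

In this sketch the "execution units" are plain callables keyed by instruction; in the device they would be the display unit, sound-emitting unit, and so on.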
- the first distance is less than or equal to the length of the user's arm.
- the first distance is less than or equal to 0.5 meters and greater than or equal to 0.25 meters.
- the virtual 3D control screen is distributed over the entire display screen for displaying the virtual 3D control screen; or the virtual 3D control screen is a part of a display screen for displaying the virtual 3D control screen.
- the virtual 3D control screen is divided into at least two areas, and each area corresponds to one control instruction.
- the method further includes: a positioning unit determining the position of the user relative to the naked-eye 3D display unit; in this case, the gesture recognition unit determining the user's click position on the virtual 3D control screen includes: the gesture recognition unit determining the user's click position on the virtual 3D control screen according to the image collected by the image collection unit and the position of the user relative to the naked-eye 3D display unit.
- the positioning unit determining the position of the user relative to the naked-eye 3D display unit comprises: the positioning unit analyzing the image collected by the image collection unit to determine the position of the user relative to the naked-eye 3D display unit.
- the technical solution adopted to solve the above technical problem is a display device, comprising: a naked-eye 3D display unit capable of displaying a virtual 3D control screen, the virtual distance between the virtual 3D control screen and the user's eyes being equal to a first distance, the first distance being smaller than the distance between the naked-eye 3D display unit and the user's eyes; an image collection unit for collecting an image of the user's click action on the virtual 3D control screen; and a gesture recognition unit for determining the user's click position on the virtual 3D control screen according to the image collected by the image collection unit, and sending the control instruction corresponding to the click position to the corresponding execution unit.
- the naked-eye 3D display unit is a television display or a computer display.
- the naked-eye 3D display unit is any one of a grating (parallax barrier) type 3D display unit, a prism film type 3D display unit, and a directional light source type 3D display unit.
- the display device further includes: a positioning unit configured to determine the position of the user relative to the naked-eye 3D display unit.
- the positioning unit is configured to analyze the image collected by the image collection unit to determine the position of the user relative to the naked-eye 3D display unit.
- the technical solution adopted to solve the above technical problem is a gesture recognition method, comprising: a naked-eye 3D display unit displaying a virtual 3D control screen, wherein the virtual distance between the virtual 3D control screen and the user's eyes is equal to a first distance, the first distance being smaller than the distance between the naked-eye 3D display unit and the user's eyes; an image collection unit collecting an image of the user's click action on the virtual 3D control screen; and a gesture recognition unit determining the user's click position on the virtual 3D control screen according to the image collected by the image collection unit, and sending a control command corresponding to the click position to the corresponding execution unit.
- the "naked-eye 3D display unit" refers to a display unit that allows a user to see a stereoscopic 3D image without wearing 3D glasses.
- the "virtual 3D control screen" refers to a stereoscopic control screen displayed by the naked-eye 3D display unit for realizing control of the display device.
- the "virtual distance" refers to the perceived distance between the virtual 3D control screen and the user himself.
- this sense of distance is part of the stereoscopic effect, produced by the difference (parallax) between the pictures seen by the left and right eyes. Through it, the user feels that the virtual 3D control screen is located at a certain distance in front of himself; even if the user moves away from or closer to the naked-eye 3D display unit, the perceived distance between the virtual 3D control screen and himself remains the same.
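The virtual distance follows from binocular parallax: by similar triangles, placing the virtual plane at distance d1 from the eyes while the display sits at distance D requires a crossed on-screen disparity p = e·(D − d1)/d1 between the left- and right-eye images, where e is the interpupillary distance. A hedged numeric sketch (the formula is standard stereoscopy geometry, not quoted from the patent; all numbers are illustrative):

```python
def crossed_disparity(eye_sep_m, viewer_dist_m, virtual_dist_m):
    """On-screen offset between the left- and right-eye images (crossed
    disparity) needed to place the virtual plane at virtual_dist_m from
    the eyes, derived from similar triangles: p = e * (D - d1) / d1."""
    assert 0 < virtual_dist_m < viewer_dist_m
    return eye_sep_m * (viewer_dist_m - virtual_dist_m) / virtual_dist_m

# Illustrative numbers: 65 mm interpupillary distance, viewer 2.5 m from
# the display, virtual control screen 0.5 m in front of the eyes.
p = crossed_disparity(0.065, 2.5, 0.5)
print(round(p, 3))  # 0.26 m of screen disparity
```

Note how the required disparity shrinks as the virtual plane moves toward the physical display (d1 → D gives p → 0), which is why the perceived first distance can be held constant for a given rendered disparity and viewing distance.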
- the "execution unit" refers to any unit that can execute the corresponding control instruction; for example, the execution unit may be the naked-eye 3D display unit, or it may be the sound-emitting unit.
- in the present invention, the naked-eye 3D display unit can present a virtual 3D control screen to the user, and the distance between the virtual 3D control screen and the user is smaller than the distance between the naked-eye 3D display unit and the user, so the user feels that the virtual 3D control screen is very close to himself (just in front of him) and can accurately "touch" it directly by hand. Thus, when different users click the same position of the virtual 3D control screen, their actions are the same or similar, so that the gesture recognition unit can accurately determine the click position desired by the user, thereby implementing a "click" operation that accomplishes "selecting" and "determining" in one step.
- the invention is used for the control of a display device, and is particularly suitable for the control of a television.
- Fig. 1 is a flow chart showing a method of controlling a display device according to a first embodiment of the present invention.
- Fig. 2 is a view showing a state in which the display device of the first embodiment of the present invention displays a virtual 3D control screen.
- Embodiment 1
- this embodiment provides a control method for a display device.
- the display device to which the method is applied includes a naked-eye 3D display unit, an image collection unit, a gesture recognition unit, and preferably a positioning unit.
- the naked-eye 3D display unit refers to any display unit that enables a user to see a stereoscopic 3D image without wearing 3D glasses.
- the naked-eye 3D display unit is any one of a grating (parallax barrier) type 3D display unit, a prism film type 3D display unit, and a directional light source type 3D display unit.
- the above three display units are all known naked-eye 3D display units.
- in the grating (parallax barrier) type 3D display unit, a grating is arranged outside a 2D display device; the grating blocks different areas of the display device from the user's left eye and right eye respectively, so that the left eye and the right eye see different areas of the display device, that is, the two eyes see different content, thereby achieving the effect of 3D display.
- in the prism film type 3D display unit, a prism sheet is disposed outside a 2D display device; through the refraction of the small prisms in the prism sheet, light from different positions of the display device is directed to the user's left and right eyes respectively, so that the two eyes see different content to achieve the 3D effect.
- in the directional light source type 3D display unit, the display module has a special structure in which light sources at different positions (such as in the backlight) emit light in different directions; the light emitted at different positions is directed to the user's left and right eyes respectively, so that the left and right eyes see different content to achieve the 3D effect.
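All three techniques route alternating screen regions to different eyes. As a toy illustration of the column interleaving used by a grating (parallax barrier) display, assuming a simple even/odd column assignment (a simplification: real panels account for viewer position, subpixel layout, and slanted barriers):

```python
def interleave_columns(left_img, right_img):
    """Build one panel row: even columns carry the left-eye view, odd
    columns the right-eye view (toy even/odd scheme)."""
    assert len(left_img) == len(right_img)
    return [left_img[c] if c % 2 == 0 else right_img[c]
            for c in range(len(left_img))]

# Each "image" here is a single pixel row, as a list of column values.
row = interleave_columns(["L"] * 6, ["R"] * 6)
print(row)  # ['L', 'R', 'L', 'R', 'L', 'R']
```

The barrier (or prism sheet, or directional backlight) then ensures each eye sees only its own set of columns.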
- the image collection unit is used to collect images of the user; it may be a known device such as a CCD (Charge-Coupled Device) camera or webcam. For convenience, the image collection unit may be disposed near the naked-eye 3D display unit (e.g., fixed above or to the side of it), or may be formed integrally with the naked-eye 3D display unit.
- the above control method includes the following steps S01 to S04.
- S01: the naked-eye 3D display unit displays a virtual 3D control screen.
- the virtual distance between the virtual 3D control screen and the user's eyes is equal to the first distance, and the first distance is smaller than the distance between the naked-eye 3D display unit and the user's eyes.
- the virtual 3D control screen is a screen specifically used to control the display device; it includes various control instructions, and the user can realize different control of the display device by selecting different control instructions.
- the naked-eye 3D display unit 1 displays the virtual 3D control screen 4, and the user feels that the virtual 3D control screen 4 is located at a certain distance (the first distance) in front of himself, the first distance being smaller than the distance between the naked-eye 3D display unit 1 and the user. Since the user feels that the virtual 3D control screen 4 is close to himself, he can accurately "click" a certain position of the screen, so that the display device can more accurately determine what operation the user wants to perform, realizing "click" control.
- the first distance is less than or equal to the length of the user's arm.
- the user feels that he can "touch" the virtual 3D control screen 4 by hand, thus maximally ensuring the accuracy of the click action.
- the first distance is less than or equal to 0.5 meters and greater than or equal to 0.25 meters. With the first distance in this range, most people do not have to straighten their arms to "reach" the virtual 3D control screen 4, nor will they feel that the virtual 3D control screen 4 is too close to themselves.
- optionally, the virtual 3D control screen 4 is spread over the entire display screen used to display it. That is to say, when the virtual 3D control screen 4 is displayed, it constitutes the entire display content and the user sees only the virtual 3D control screen 4; this way the virtual 3D control screen 4 has a larger area, can accommodate more candidate control instructions, and the click accuracy is higher.
- alternatively, the virtual 3D control screen 4 may be a part of the entire display screen. That is to say, the virtual 3D control screen 4 is displayed together with a normal picture (such as a 3D movie), and the virtual 3D control screen 4 seen by the user may be located at the side or corner of the display picture, so that the user can view the regular picture and the virtual 3D control screen 4 simultaneously and thus perform control at any time (such as adjusting the volume or changing channels).
- when the virtual 3D control screen 4 is spread over the entire display screen, it is preferably displayed only when certain conditions are met, such as when the user issues an instruction, while the normal picture is displayed otherwise.
- when the virtual 3D control screen 4 is a part of the display screen, it can be displayed continuously.
- preferably, the virtual 3D control screen 4 is divided into at least two areas, each area corresponding to one control instruction. That is to say, the virtual 3D control screen 4 can be divided into a plurality of different areas, and different control instructions can be executed by clicking different areas, so that a plurality of different operations can be performed through one virtual 3D control screen 4. For example, the virtual 3D control screen 4 can be equally divided into 9 rectangular regions in 3 rows and 3 columns, each rectangular region corresponding to one control instruction (such as changing the volume, changing the station number, changing the brightness, or exiting the virtual 3D control screen 4).
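The 3-row × 3-column division can be sketched as a lookup from a click position (in normalized virtual-screen coordinates) to a control instruction; the command names and layout below are illustrative assumptions, not specified by the patent:

```python
def region_index(x, y, cols=3, rows=3):
    """Map normalized click coordinates (0..1) on the virtual screen to a
    (row, col) cell in a rows x cols grid; clamp the edges into range."""
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return row, col

# Hypothetical instruction layout for the 9 regions (y grows downward).
COMMANDS = [
    ["VOLUME+", "CHANNEL+", "BRIGHT+"],
    ["VOLUME-", "EXIT",     "BRIGHT-"],
    ["MUTE",    "CHANNEL-", "SOURCE"],
]

row, col = region_index(0.9, 0.1)   # a click near the top-right corner
print(COMMANDS[row][col])           # BRIGHT+
```

Because each region spans a third of the screen, a user pointing anywhere inside a cell triggers the same instruction, which is what makes the one-step "click" robust to small pointing errors.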
- of course, it is also feasible for the virtual 3D control screen 4 to correspond to only one control instruction (for example, when the virtual 3D control screen 4 is a part of the display screen, the corresponding instruction may be "enter the full-screen control screen").
- S02: the image collection unit collects an image of the user's click action on the virtual 3D control screen.
- specifically, the image collection unit 5 fixed above the naked-eye 3D display unit 1 collects the image of the click action performed by the user's hand 3 on the virtual 3D control screen 4. That is, when the naked-eye 3D display unit 1 displays the virtual 3D control screen 4, the image collection unit 5 is turned on to collect images of the user's motion, specifically images of the user's hand 3 performing the click action on the virtual 3D control screen 4.
- at other times, the image collection unit 5 can also be turned on to collect images of other gestures of the user, or images used for determining the user's position.
- S03: preferably, the positioning unit determines the position (distance and/or angle) of the user relative to the naked-eye 3D display unit.
- specifically, the positioning unit (not shown) can determine the position of the user relative to the naked-eye 3D display unit 1 by analyzing the images collected by the image collection unit 5. For example, when the virtual 3D control screen 4 is displayed, the first image collected by the image collection unit 5 can be used to determine the user's position relative to the naked-eye 3D display unit 1, and the images collected afterwards are used for gesture recognition.
- there are various methods of judging the position of the user relative to the naked-eye 3D display unit 1 from the collected image; for example, the contour of the user, or of the user's eyes 2, can be obtained by contour analysis and the user's position determined from it.
- alternatively, the user's position can be determined by other means; for example, infrared rangefinders can be set at two different positions, and the user's location calculated from the distances to the user measured by the two infrared rangefinders.
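The two-rangefinder scheme amounts to intersecting two circles: each rangefinder, at a known position, constrains the user to a circle of the measured radius. A hedged sketch (coordinate frame and numbers are illustrative; it assumes the rangefinders lie on the x-axis and the user stands in front of the display, i.e. y ≥ 0):

```python
import math

def locate_user(baseline_m, r1, r2):
    """Intersect two circles: rangefinders at (0, 0) and (baseline_m, 0)
    measure distances r1 and r2 to the user; return (x, y) with y >= 0
    (the user is assumed to be in front of the display)."""
    x = (r1**2 - r2**2 + baseline_m**2) / (2 * baseline_m)
    y_sq = r1**2 - x**2
    if y_sq < 0:
        raise ValueError("inconsistent distance measurements")
    return x, math.sqrt(y_sq)

# Rangefinders 1 m apart; a user 2 m from each sits centered between them.
x, y = locate_user(1.0, 2.0, 2.0)
print(round(x, 2), round(y, 2))  # 0.5 1.94
```

The same two-distance intersection underlies the image-based variant: any two independent distance or angle constraints pin down the user's position in the display plane.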
- S04: the gesture recognition unit determines the user's click position on the virtual 3D control screen according to the image collected by the image collection unit (and the position of the user relative to the naked-eye 3D display unit), and sends a control instruction corresponding to the click position to the corresponding execution unit.
- specifically, since the position of the user relative to the naked-eye 3D display unit 1 is known, the gesture recognition unit (not shown) can confirm the spatial position of the virtual 3D control screen 4 relative to the naked-eye 3D display unit 1 (since the virtual 3D control screen 4 is necessarily located on the line connecting the naked-eye 3D display unit 1 and the user). Meanwhile, when the user reaches out the hand 3 to click the virtual 3D control screen 4, the gesture recognition unit can confirm the spatial position of the click (i.e., the position of the hand 3) from the collected image (the position of the image collection unit 5 relative to the naked-eye 3D display unit 1 also being known), and can thereby confirm which position of the virtual 3D control screen 4 the click corresponds to, that is, determine the control instruction corresponding to the user's gesture, so that the gesture recognition unit can send the control instruction to the corresponding execution unit for execution.
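This click-position mapping can be sketched as a projection of the detected hand position into virtual-screen coordinates. The sketch below is a deliberate simplification under stated assumptions (all positions already expressed in the display's coordinate frame, the virtual screen taken as centered in front of the user's eyes); the real unit would also use the camera geometry mentioned above:

```python
def click_to_screen_coords(hand_xy, eye_xy, screen_w, screen_h):
    """Map a detected hand position to normalized (0..1) virtual-screen
    coordinates, taking the virtual screen as centered in front of the
    user's eyes. All quantities are hypothetical, in metres."""
    u = (hand_xy[0] - eye_xy[0]) / screen_w + 0.5
    v = (hand_xy[1] - eye_xy[1]) / screen_h + 0.5
    if not (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0):
        return None  # the click landed outside the virtual screen
    return u, v

# User's eyes at x=0.2, y=1.1; hand 0.1 m right of and 0.15 m below the
# centre of a 0.6 m x 0.4 m virtual control screen.
uv = click_to_screen_coords((0.3, 0.95), (0.2, 1.1), 0.6, 0.4)
print(round(uv[0], 3), round(uv[1], 3))  # 0.667 0.125
```

The normalized (u, v) pair can then be fed to a region lookup to pick the control instruction for that click.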
- here, the "execution unit" refers to any unit that can execute the corresponding control instruction; for example, the execution unit may be the naked-eye 3D display unit 1, or it may be the sound-emitting unit.
- obviously, if the above positioning step is not performed, the user's position may be taken to be a default position, or the position the user wants to click may be determined from the relative positional relationship between the user's hand and body (because the relative positional relationship between the virtual 3D control screen 4 and the user is known).
- this embodiment further provides a display device controllable by the above method, comprising: a naked-eye 3D display unit 1 capable of displaying a virtual 3D control screen 4, the virtual distance between the virtual 3D control screen 4 and the user's eyes 2 being equal to a first distance, the first distance being smaller than the distance between the naked-eye 3D display unit 1 and the user's eyes 2; an image collection unit 5 for collecting an image of the user's click action on the virtual 3D control screen 4; and a gesture recognition unit for determining the user's click position on the virtual 3D control screen 4 according to the image collected by the image collection unit 5, and sending the control instruction corresponding to the click position to the corresponding execution unit.
- the naked-eye 3D display unit 1 is a television display or a computer display.
- the naked-eye 3D display unit 1 is any one of a grating (parallax barrier) type 3D display unit, a prism film type 3D display unit, and a directional light source type 3D display unit.
- the display device further includes: a positioning unit configured to determine the position of the user relative to the naked-eye 3D display unit 1.
- the positioning unit is configured to analyze the image collected by the image collection unit 5 to determine the position of the user relative to the naked-eye 3D display unit 1.
- Embodiment 2
- this embodiment provides a gesture recognition method, comprising: a naked-eye 3D display unit displaying a virtual 3D control screen, wherein the virtual distance between the virtual 3D control screen and the user's eyes is equal to a first distance, the first distance being smaller than the distance between the naked-eye 3D display unit and the user's eyes; an image collection unit collecting an image of the user's click action on the virtual 3D control screen; and a gesture recognition unit determining the user's click position on the virtual 3D control screen according to the image collected by the image collection unit, and sending the control command corresponding to the click position to the corresponding execution unit.
- the above-described gesture recognition method is not limited to controlling the display device; it can also be used to control other devices, as long as the gesture recognition unit transmits (e.g., wirelessly) the control command to the corresponding device.
- thus, a dedicated gesture recognition system can be used to control a wide range of devices such as televisions, computers, air conditioners, and washing machines. It can be understood that the above embodiments are merely exemplary; the invention is not limited thereto, and various modifications and improvements can be made by those skilled in the art without departing from the spirit and scope of the invention. Such modifications and improvements are also considered to be within the scope of the invention.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/426,012 US20160041616A1 (en) | 2013-10-31 | 2014-05-22 | Display device and control method thereof, and gesture recognition method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310529219.6A CN103529947A (zh) | 2013-10-31 | 2013-10-31 | Display device and control method thereof, gesture recognition method |
CN201310529219.6 | 2013-10-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015062251A1 true WO2015062251A1 (zh) | 2015-05-07 |
Family
ID=49932020
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/078074 WO2015062251A1 (zh) | 2013-10-31 | 2014-05-22 | Display device, control method thereof, and gesture recognition method |
Country Status (3)
Country | Link |
---|---|
US (1) | US20160041616A1 (zh) |
CN (1) | CN103529947A (zh) |
WO (1) | WO2015062251A1 (zh) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103529947A (zh) * | 2013-10-31 | 2014-01-22 | 京东方科技集团股份有限公司 | Display device and control method thereof, gesture recognition method |
CN103530060B (zh) * | 2013-10-31 | 2016-06-22 | 京东方科技集团股份有限公司 | Display device and control method thereof, gesture recognition method |
CN109961478A (zh) * | 2017-12-25 | 2019-07-02 | 深圳超多维科技有限公司 | Naked-eye stereoscopic display method, apparatus and device |
CN112089589B (zh) * | 2020-05-22 | 2023-04-07 | 未来穿戴技术有限公司 | Control method of a neck massager, neck massager and storage medium |
CN112613384B (zh) * | 2020-12-18 | 2023-09-19 | 安徽鸿程光电有限公司 | Gesture recognition method, gesture recognition device, and control method of an interactive display device |
CN112613389A (zh) * | 2020-12-18 | 2021-04-06 | 上海影创信息科技有限公司 | Eye gesture control method and system, and VR glasses thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101465957A (zh) * | 2008-12-30 | 2009-06-24 | 应旭峰 | System for realizing remote-control interaction in a virtual three-dimensional scene |
CN101655739A (zh) * | 2008-08-22 | 2010-02-24 | 原创奈米科技股份有限公司 | Three-dimensional virtual input and simulation device |
CN102508546A (zh) * | 2011-10-31 | 2012-06-20 | 冠捷显示科技(厦门)有限公司 | User interaction interface for 3D virtual projection and virtual touch, and implementation method |
CN102769802A (zh) * | 2012-06-11 | 2012-11-07 | 西安交通大学 | Human-computer interaction system for a smart television and interaction method thereof |
CN103529947A (zh) * | 2013-10-31 | 2014-01-22 | 京东方科技集团股份有限公司 | Display device and control method thereof, gesture recognition method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2296400A (en) * | 1994-12-16 | 1996-06-26 | Sharp Kk | Autostereoscopic display having a high resolution 2D mode |
JP2004334590A (ja) * | 2003-05-08 | 2004-11-25 | Denso Corp | Operation input device |
DE102005017313A1 (de) * | 2005-04-14 | 2006-10-19 | Volkswagen Ag | Method for displaying information in a means of transport, and instrument cluster for a motor vehicle |
EP2372512A1 (en) * | 2010-03-30 | 2011-10-05 | Harman Becker Automotive Systems GmbH | Vehicle user interface unit for a vehicle electronic device |
US20110310003A1 (en) * | 2010-05-21 | 2011-12-22 | Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. | Image display device and method of displaying images |
- 2013-10-31: CN CN201310529219.6A patent/CN103529947A/zh active Pending
- 2014-05-22: WO PCT/CN2014/078074 patent/WO2015062251A1/zh active Application Filing
- 2014-05-22: US US14/426,012 patent/US20160041616A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN103529947A (zh) | 2014-01-22 |
US20160041616A1 (en) | 2016-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015062247A1 (zh) | Display device and control method thereof, gesture recognition method, and head-mounted display device | |
JP6480434B2 (ja) | System and method for direct pointing detection for interaction with a digital device | |
WO2015062251A1 (zh) | Display device, control method thereof, and gesture recognition method | |
JP4900741B2 (ja) | Image recognition device, operation determination method, and program | |
RU2455676C2 (ru) | Method for controlling a device using gestures, and 3D sensor for implementing it | |
KR101074940B1 (ko) | Stereoscopic image interworking system | |
US20180136466A1 (en) | Glass type terminal and control method therefor | |
US20150035752A1 (en) | Image processing apparatus and method, and program therefor | |
US20110304650A1 (en) | Gesture-Based Human Machine Interface | |
US20120056989A1 (en) | Image recognition apparatus, operation determining method and program | |
US20130154913A1 (en) | Systems and methods for a gaze and gesture interface | |
CN106919294B (zh) | 3D touch interaction device, touch interaction method thereof, and display device | |
JP5114795B2 (ja) | Image recognition device, operation determination method, and program | |
WO2015062248A1 (zh) | Display device, control method thereof, and gesture recognition method | |
WO2015027574A1 (zh) | 3D glasses, 3D display system and 3D display method | |
KR20140107229A (ko) | Method and system for responding to a user selection gesture on an object displayed in three dimensions | |
US20150341626A1 (en) | 3D display device and method for controlling the same | |
WO2020019548A1 (zh) | Naked-eye 3D display method, apparatus, device and medium based on human-eye tracking | |
CN103176605A (zh) | Gesture recognition control device and control method | |
JP2012238293A (ja) | Input device | |
WO2013149475A1 (zh) | User interface control method and device | |
CN106327583A (zh) | Virtual reality device for panoramic image capture and implementation method thereof | |
WO2018161564A1 (zh) | Gesture recognition system and method, and display device | |
TW202018486A (zh) | Multi-screen operation method and electronic system using the same | |
CN111176425A (zh) | Multi-screen operation method and electronic system using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 14426012 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14857941 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 07/07/2016) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14857941 Country of ref document: EP Kind code of ref document: A1 |