WO2018076720A1 - One-handed operation method and control system - Google Patents
One-handed operation method and control system
- Publication number
- WO2018076720A1 (PCT/CN2017/089027)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- manipulation
- touch
- display screen
- control
- depth image
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Definitions
- the present invention relates to the field of electronic technology, and in particular to a one-handed control method and a control system.
- adding a function key to the back of the mobile phone is one solution for one-handed operation, but it inevitably affects the appearance of the back of the phone, and thus this solution has not been widely accepted by users.
- Another option is to add an additional touch screen to the back of the phone, allowing the area of the screen that cannot be reached with one hand to be controlled by a finger on the back. However, this solution is costly and has not become a mainstream one-handed solution.
- in the existing scheme, the touch operation is performed using a depth image: the position of the manipulation object on the display screen is simply the pixel-coordinate mapping of the object in the acquired depth image. With this direct pixel-coordinate mapping, the display area outside the region between the manipulation object and the display screen cannot be touch-operated, so only simple functions such as page turning can be implemented, and the one-handed control problem of a large screen is not well solved.
- the existing scheme of using a depth image for touch obtains the position of the manipulation object on the display screen by placing the depth camera on the display-screen side, facing the manipulation object. This method is suitable for a display without a touch function.
- for a display with a touch function, when the finger touches the display, the screen itself generates a touch at that location while the depth-image scheme uses the position information to touch other areas of the display. Since two touch commands are generated, a touch conflict results, and other areas of the display can no longer be touched reliably. Avoiding the conflict requires that the manipulation object never touch the display, which complicates the mapping relationship between the position of the manipulation object and the position obtained on the display screen through the depth image, and greatly weakens the experience.
- the object of the present invention is to provide a one-handed control method and control system that solve the above problems of the prior art: one-handed control is difficult to achieve, and a display screen with a touch function easily produces touch conflicts.
- the present invention provides a one-handed control method in which the display screen and the control surface used for the touch operation are on different planes, including the following steps: S1: acquiring a depth image of the control surface and of the manipulation object on the control surface; S2: obtaining a first position of the manipulation object on the control surface from the depth image; S3: locating a second position on the display screen according to the first position of the manipulation object on the control surface; S4: recognizing a predetermined manipulation action determined by the shape and motion of the manipulation object, and converting it into a touch command to be executed; S5: performing a touch operation at the second position according to the touch command.
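As a concrete illustration, the following is a minimal Python sketch of one iteration of the S1–S5 pipeline. The `camera` and `screen` objects, the helper functions, and the 5 mm background-difference threshold are hypothetical assumptions, not from the patent; they only show how the five steps chain together (the helpers are sketched under the corresponding steps below).

```python
import numpy as np

def one_handed_step(camera, screen, surface_bg):
    """One S1-S5 iteration; `camera`, `screen`, and helpers are assumed APIs."""
    # S1: acquire a depth image of the control surface and manipulation object
    depth = camera.capture_depth()
    # Isolate the object by differencing against the surface-only background
    obj = np.where(np.abs(depth.astype(np.int32) - surface_bg) > 5, depth, 0)
    # S2: first position of the manipulation object on the control surface
    first_pos = fingertip_position(obj, depth, surface_bg)  # see S21-S22 below
    if first_pos is None:
        return
    # S3: locate the second position on the display screen
    second_pos = map_to_screen(first_pos)     # linear mapping, see S31-S32 below
    # S4: recognize the predetermined action and convert it to a touch command
    command = recognize_action(obj)           # e.g. "click", from a preset table
    # S5: perform the touch operation at the second position
    if command is not None:
        screen.inject_touch(second_pos, command)
```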
- the manipulation method of the present invention may further have the following technical features:
- the control surface comprises at least one control area, at least one of which is automatically delineated from the acquired depth image and the obtained position of the manipulation object on the control surface.
- the automatically delineated control area is sized so that it is easily reachable on the control surface when the manipulation object operates with one hand.
- the display screen includes a near touch area that the manipulation object can easily reach, and a far touch area, outside the near touch area, that is not easily reached. The manipulation object performs one-handed touch in the near touch area using the display screen's own touch function, and touches in the far touch area at the second position obtained by positioning.
- the near touch area and the far touch area are automatically delineated according to the position of the manipulation object on the display screen.
- the step of acquiring the depth image includes: S11: acquiring a first depth image that includes the control surface but not the manipulation object; S12: acquiring a second depth image that includes both the control surface and the manipulation object; S13: obtaining a third depth image of the manipulation object alone from the difference between the second depth image and the first depth image.
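A minimal numpy sketch of this background-subtraction step (S11–S13); the 5 mm tolerance is an assumed value, not specified by the patent:

```python
import numpy as np

def third_depth_image(first_depth, second_depth, tol_mm=5):
    """S13: keep only the pixels that changed between the surface-only image
    (S11) and the surface-plus-object image (S12)."""
    diff = np.abs(second_depth.astype(np.int32) - first_depth.astype(np.int32))
    return np.where(diff > tol_mm, second_depth, 0)  # zero marks background
```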
- obtaining the first position includes the following steps: S21: determining from the second depth image whether the manipulation object is in contact with the control surface, and if so, performing the next step; S22: obtaining from the third depth image the spatial position of the manipulation object in the coordinate system of the display screen, and taking the coordinates of the vertex of the manipulation object as the first position.
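A sketch of S21–S22 under the same assumptions (depth in millimetres, zero meaning background). The contact tolerance and the choice of the topmost object pixel as the "vertex" are illustrative assumptions, not details fixed by the patent:

```python
import numpy as np

def fingertip_position(third_depth, second_depth, first_depth, contact_mm=3):
    """S21: contact test; S22: vertex of the manipulation object as position 17."""
    ys, xs = np.nonzero(third_depth)
    if ys.size == 0:
        return None                                # no manipulation object in view
    # S21: the object counts as 'in contact' if somewhere it lies within
    # contact_mm of the bare control surface recorded in the first depth image
    gap = first_depth[ys, xs].astype(np.int32) - second_depth[ys, xs]
    if np.min(np.abs(gap)) > contact_mm:
        return None                                # hovering, not touching
    # S22: take an extreme pixel of the object as its vertex (assumption:
    # the fingertip is the object pixel closest to the top of the image)
    i = np.argmin(ys)
    return int(xs[i]), int(ys[i])
```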
- obtaining the second position includes the following steps: S31: establishing a mapping relationship between the control surface and the display screen; S32: obtaining the second position on the display screen from the mapping relationship and the first position.
- a linear mapping relationship is established according to the horizontal and vertical dimensions of the control surface and the display screen.
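For a rectangular control area, the linear mapping of S31–S32 reduces to scaling by the ratio of the horizontal and vertical dimensions. A minimal sketch, with illustrative dimensions:

```python
def map_to_screen(first_pos, ctrl_w, ctrl_h, scr_w, scr_h):
    """S31-S32: linear map from control-surface coordinates to screen pixels."""
    x, y = first_pos
    return x * scr_w / ctrl_w, y * scr_h / ctrl_h

# e.g. a 60 x 100 unit control area driving a 1080 x 1920 px screen:
# map_to_screen((30, 50), 60, 100, 1080, 1920) -> (540.0, 960.0)
```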
- the present invention also provides a one-handed control system for performing the above manipulation method, including an image acquisition unit, a processor, a control surface, and a display screen, wherein the control surface and the display screen are on different planes;
- the image acquisition unit is configured to acquire a depth image of the control surface and of the manipulation object, together with the depth information of the manipulation object;
- the processor includes an identification module, a positioning module, a conversion module, and an execution module. The identification module is configured to acquire a first position of the manipulation object on the control surface from the depth image and to identify a predetermined manipulation action of the manipulation object; the positioning module is configured to locate the second position on the display screen to be manipulated according to the first position; the conversion module generates a corresponding touch instruction according to the predefined manipulation action; the execution module executes the touch command at the second position to complete the touch operation on the display screen.
- in the control method of the present invention, a depth image is used to implement a one-handed touch operation.
- the control surface and the display screen are arranged on different planes, and the manipulation object completes the touch operation on the control surface: the depth image is used to obtain the first position of the manipulation object on the control surface, the first position is mapped to a second position on the display screen, and the touch operation is performed at the second position in combination with a predetermined manipulation action. When operating a large-screen electronic device, the user can therefore complete the entire control on the control surface, avoiding the touch conflict between this control and the display's own touch function.
- the manipulation object can remain in contact with the control surface at any time, which facilitates accurate and fast acquisition of its position on the control surface.
- the present invention can not only realize simple gesture operations such as page turning for a display screen without a touch function, but also realize precise one-handed touch through the control surface for a touch-capable screen.
- the control surface includes at least one control area, to handle depth-image acquisition and manipulation under the different touch habits of the manipulation object, such as left-handed and right-handed use; the control area is automatically delineated from the depth image.
- the control area is a region on the control surface that is automatically delineated from the depth image and the position of the manipulation object, rather than a fixed preset area, so the manipulation object does not have to stay within a predetermined region to perform the touch operation, which improves the manipulation experience.
- the control area is sized to be easily reachable on the control surface during one-handed operation, and its shape is not limited to a rectangle: it can be determined by the region of the control surface most easily reached by the manipulation object, generally an irregular fan shape.
- a hybrid control mode is implemented that uses both the display's own touch function and the pointing-type control method, for example:
- the manipulation object performs one-handed touch in the near touch area using the touch function of the display screen, and touches in the far touch area at the second position obtained by positioning.
- this hybrid control method compensates for the lower precision inherent in pointing-type manipulation and provides a better experience for the user.
- the near touch area and far touch area are also automatically delineated according to the user's touch habits, to adapt to the user's manipulation habits and to optimize the shapes of the near and far touch areas.
- through the linear mapping relationship, the second position on the display screen can be quickly obtained from the first position; when the control surface or the control area is a relatively regular shape such as a rectangle, the mapping relationship is established quickly from the horizontal and vertical dimensions.
- FIG. 1 is a schematic structural view of a control system according to the first embodiment of the present invention.
- FIG. 2 is a schematic structural diagram of a processor according to the first embodiment of the present invention.
- FIG. 3 is a schematic structural diagram of an electronic device according to the third embodiment of the present invention.
- FIG. 4 is a schematic rear view of one-handed operation of a mobile phone according to the third and fourth embodiments of the present invention.
- FIG. 5 is a schematic side view of one-handed operation of a mobile phone according to the third and fourth embodiments of the present invention.
- FIG. 6 is a schematic diagram of the hybrid manipulation of Embodiments 2 and 5 of the present invention.
- FIG. 7 is a first operation flowchart of the fourth embodiment of the present invention.
- FIG. 8 is a second operation flowchart of the fourth embodiment of the present invention.
- Embodiment 1:
- This embodiment provides a one-handed control system, as shown in Figure 1, including an image acquisition unit 1, a processor 2, a control surface 3 and a display screen 4;
- the image acquisition unit 1 is configured to acquire a depth image of the control surface 3 and of the manipulation object, together with the depth information of the manipulation object; the control surface 3 and the display screen 4 are on different planes;
- the processor 2 includes an identification module 21, a positioning module 22, a conversion module 23, and an execution module 24.
- the identification module 21 is configured to acquire a first position of the manipulation object on the control surface 3 from the depth image, and to identify a predetermined manipulation action of the manipulation object;
- the positioning module 22 is configured to position a second position on the display screen 4 to be manipulated according to the first position;
- the conversion module 23 is configured to generate a corresponding touch command according to a predefined manipulation action;
- the execution module 24 is configured to perform a touch operation on the display screen 4 by executing a touch command at the second position;
- the system can acquire the depth image of the control surface 3 through the image acquisition unit, as well as the depth image of the manipulation object in contact with the control surface 3; from these, the position and motion of the manipulation object on the control surface 3 can be recognized and converted into a corresponding position and instruction on the display screen 4, so that manipulation of the device is realized.
- consider the display screen of an e-book reader: touch manipulation directly on the display affects the display effect. For example, when pages are constantly flipped with a finger on the e-book, the hand inevitably blocks part of the screen, affecting the reading experience. Using the manipulation object on the control surface alleviates this problem and enhances the user experience.
- the image acquisition unit 1 herein is a depth camera based on the structured light or TOF principle, and generally includes a receiving unit and a transmitting unit.
- manipulating on the control surface effectively avoids the problem of touch conflicts.
- on the control surface, the manipulation object can remain in contact with the surface at any time, which facilitates accurate and fast acquisition of the object's position on the control surface.
- Embodiment 2:
- the difference from Embodiment 1 is that in this embodiment the control surface 3 and the display screen 4 are combined to complete the touch operation, the display screen 4 being a touch screen.
- the touch screen includes a near touch area and a far touch area. The manipulation object 16 performs touch control in the near touch area using the touch function of the touch screen itself, while in the far touch area the touch operation is completed in a pointing manner through the control surface 3.
- direct touch is still used in the near touch area to ensure better precision; in the far touch area, the first position on the control surface 3 is used and the operation is performed according to the shape and motion of the manipulation object.
- this embodiment is particularly suitable for touch-enabled devices such as mobile phones whose screens are currently too large to be completely controlled with one hand.
- Embodiment 3:
- this embodiment provides an electronic device, which may be a mobile phone or a tablet, and includes a bus 5 for data transmission.
- the bus 5 is divided into a data bus, an address bus, and a control bus.
- connected to the bus 5 are a CPU 6, a display 7, an IMU 8 (inertial measurement unit), a memory 9, a camera 11, a sound device 12, a network interface 13, and a mouse/keyboard 14.
- the mouse and keyboard may be replaced by the display 7, and the IMU 8 is used for positioning, tracking, and other functions.
- the memory 9 is used to store an operating system, an application, etc., and can also be used to store temporary data during operation.
- in the manipulation system provided on the electronic device, the image acquisition unit 1 corresponds to the camera 11, the processor 2 corresponds to the CPU 6 (it may also be a separate processor), and the display screen 4 corresponds to the display 7.
- the control surface is disposed on the back of the mobile phone, and is in a different plane from the display screen.
- the camera 11 is typically fixed to the electronic device, but its imaging direction differs greatly from that of the cameras on existing devices: existing cameras typically face front or rear, and such configurations cannot capture an image of the device itself.
- the camera 11 can be configured in various ways: one is a rotatable camera turned about 90 degrees toward the back of the device; another is an external camera attached to the device by some fixing means and connected via an interface such as USB.
- Those skilled in the art can select the form of the camera on the electronic device according to the actual situation, without limitation.
- the camera 11 of the present embodiment is a depth camera for acquiring a depth image of the target area.
- FIG. 4 shows the back of the phone during one-handed operation: a camera 11 is arranged at the top of the mobile phone and images downward along the phone, so that the control surface on the back of the phone and the finger (manipulation object) can be captured.
- FIG. 5 shows a side view of one-handed operation of the phone, in which 17 is the first position on the control surface and 16 is the manipulation object.
- the display 7 of the electronic device may or may not have a touch function.
- when no touch function is provided, the device is controlled only through the control surface; when a touch function is provided, the manipulation object can directly touch the reachable areas of the display, with control-surface manipulation used for the areas that cannot be touched.
- the camera 11 in this embodiment may be a normal RGB camera for capturing RGB images, an infrared camera for capturing infrared images, or a depth camera, such as one based on the structured-light or TOF principle.
- depth images acquired with a depth camera are unaffected by low light and can be measured even in the dark.
- positioning and motion recognition using depth images are also more accurate than with RGB images. The following description therefore uses a depth camera and depth images, although the invention is not limited to depth cameras.
- Embodiment 4:
- a one-handed control method, as shown in FIGS. 4-5 and 7, includes the following steps:
- the control surface 3 comprises at least one control area 15, which is automatically delineated from the acquired depth image and the position of the manipulation object 16 on the control surface 3.
- the automatically delineated control area 15 is sized to be easily reachable on the control surface 3 when the manipulation object 16 operates with one hand.
- the contact points where the manipulation object is in contact with the control surface 3 are determined; all points contacted during this operation are taken together as the control area 15.
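A minimal sketch of this delineation, assuming the contact points have already been collected as (x, y) pixel coordinates; the margined bounding rectangle is an illustrative stand-in for the irregular reachable region described below:

```python
import numpy as np

def delineate_control_area(contact_points, margin=5):
    """Delineate control area 15 from all points touched in one operation.
    A convex hull would better capture the 'irregular fan' shape; a margined
    bounding box is the simplest stand-in."""
    pts = np.asarray(contact_points)        # shape (N, 2): (x, y) per contact
    x0, y0 = pts.min(axis=0) - margin
    x1, y1 = pts.max(axis=0) + margin
    return int(x0), int(y0), int(x1), int(y1)
```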
- the control surface 3 includes at least one control area 15, to handle depth-image acquisition and manipulation under the different touch habits of the manipulation object 16, such as left-handed and right-handed use; the control area 15 is automatically delineated from the depth image.
- the control area 15 is a region on the control surface 3 that is automatically delineated from the depth image and the position of the manipulation object 16, rather than a fixed preset area, so the manipulation object 16 is not required to stay within a predetermined region to perform the touch operation, which improves the manipulation experience.
- the control area 15 is sized to be easily reachable on the control surface 3 during one-handed operation, and its delineated shape is not limited to a rectangle: it can be determined by the region of the control surface 3 most easily reached by the manipulation object 16, generally an irregular fan shape.
- the acquired depth image includes other, unrelated parts in addition to the control surface 3 and the finger.
- the measurement range of the depth camera can therefore be limited: a threshold is set, and depth information beyond the threshold is removed.
- a depth image acquired in this way contains only the back of the phone and the finger, which reduces the amount of computation required for recognition.
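A one-line sketch of this range clipping; the 150 mm threshold is an assumed value, not from the patent:

```python
import numpy as np

def clip_range(depth, max_mm=150):
    """Discard depth readings beyond the threshold so the image retains only
    the phone back and the finger, reducing recognition cost."""
    return np.where(depth <= max_mm, depth, 0)
```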
- an image segmentation method is used to obtain the depth image of the manipulation object; one such method follows the background-segmentation steps S11-S13 described above.
- the manipulation object 16 of this embodiment is a finger, and the depth image of the front-end portion of the finger obtained by the background segmentation in the above steps reduces the computation required for modeling and increases the calculation speed.
- acquiring the first position 17 follows steps S21-S22 described above.
- obtaining the second position follows steps S31-S32 described above.
- a linear mapping relationship is established according to the horizontal and vertical dimensions of the control surface 3 and the display screen 4.
- through this linear mapping, the second position on the display screen 4 can be quickly obtained from the first position 17; when the control surface 3 or the control area 15 is a relatively regular shape such as a rectangle, the mapping relationship is established quickly from the horizontal and vertical dimensions.
- positioning is performed first; the manipulation action may then be a change in the shape of the finger, such as a change in the angle between the finger and the display screen 4, or a motion of the finger, such as a click.
- shape and motion recognition of a finger are prior art and are not described in detail here.
- the correspondence between finger shapes or motions and manipulation commands must be preset; when a predefined action such as a click is completed, the processor converts it into the corresponding manipulation command.
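A minimal sketch of such a preset table; the action labels and commands are illustrative assumptions, not names from the patent:

```python
# Hypothetical preset mapping from recognized finger shapes/motions to commands.
COMMAND_TABLE = {
    "tap": "click",
    "angle_change": "long_press",   # change of finger/screen angle
    "swipe_up": "scroll_up",
}

def to_command(action_label):
    """Convert a recognized predetermined action into its preset command."""
    return COMMAND_TABLE.get(action_label)   # None for undefined actions
```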
- Embodiment 5:
- the display screen 4 is a touch screen with a touch function. As shown in FIG. 6, the display screen 4 includes a near touch area that the manipulation object 16 can easily reach, and a far touch area, outside the near touch area, that is not easily reached. The manipulation object 16 performs one-handed touch in the near touch area using the touch function of the display screen 4, and touches in the far touch area at the second position obtained by positioning.
- the near touch area and the far touch area can be automatically recognized and delineated by the system: when the finger on the display side is in contact with the touch display screen 4, the processor processes the signal fed back by the touch screen and executes the corresponding touch command; when the finger is not in contact with the touch screen, the depth camera recognizes the non-contact motion and feeds it back to the processor, which processes the acquired depth image and performs the touch operation at the second position.
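The dispatch logic can be sketched as follows; the event objects and their fields are hypothetical:

```python
def dispatch(touch_event, depth_event):
    """Hybrid control: screen contact is handled natively; non-contact motion
    seen by the depth camera drives the far touch area via the second position."""
    if touch_event is not None:                   # touch screen reported contact
        return "native_touch", touch_event.position
    if depth_event is not None:                   # depth camera saw the finger
        return "pointed_touch", depth_event.second_position
    return None                                   # nothing to do this frame
```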
- this embodiment thus implements a hybrid control mode that uses both the screen's own touch function and the pointing-type control method, for example:
- the manipulation object 16 performs one-handed touch in the near touch area using the touch function of the display screen 4, and touches in the far touch area at the second position obtained by positioning.
- this hybrid control method compensates for the lower precision inherent in pointing-type manipulation by the object 16 and provides a better experience for the user.
- the above near touch area and far touch area are likewise automatically delineated according to the user's touch habits, to adapt to the user's manipulation habits and to optimize the shapes of the near and far touch areas.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present invention relates to a one-handed operation method and an operation system, the one-handed operation method comprising the following steps: S1: acquiring a depth image of a control surface and of a manipulation object on the control surface; S2: obtaining a first position of the manipulation object on the control surface by means of the depth image; S3: locating a second position on a display screen according to the first position of the manipulation object on the control surface; S4: according to a predetermined manipulation action determined by the shape and motion of the manipulation object, recognizing and converting the predetermined manipulation action into a touch instruction to be executed; S5: performing a touch operation at the second position according to the touch instruction. By means of the control surface, the present invention can well achieve one-handed operation of a large-screen electronic device and avoid touch control conflicts.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610942444.6 | 2016-10-25 | ||
CN201610942444.6A CN106569716B (zh) | 2016-10-25 | 2016-10-25 | One-handed manipulation method and manipulation system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018076720A1 true WO2018076720A1 (fr) | 2018-05-03 |
Family
ID=58536395
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/089027 WO2018076720A1 (fr) | 2017-06-19 | One-handed operation method and control system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106569716B (fr) |
WO (1) | WO2018076720A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112204511A (zh) * | 2018-08-31 | 2021-01-08 | Shenzhen Royole Technologies Co., Ltd. | Input control method and electronic device |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106569716B (zh) | 2016-10-25 | 2020-07-24 | Shenzhen Orbbec Co., Ltd. | One-handed manipulation method and manipulation system |
CN107613094A (zh) * | 2017-08-17 | 2018-01-19 | Gree Electric Appliances, Inc. of Zhuhai | Method for operating a mobile terminal with one hand, and mobile terminal |
WO2023220983A1 (fr) * | 2022-05-18 | 2023-11-23 | Beijing Xiaomi Mobile Software Co., Ltd. | Control method and apparatus for switching to one-handed mode, device, and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102929388A (zh) * | 2011-09-30 | 2013-02-13 | Microsoft Corporation | Full-space gesture input |
CN103440033A (zh) * | 2013-08-19 | 2013-12-11 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Method and apparatus for human-computer interaction based on bare hands and a monocular camera |
CN105824553A (zh) * | 2015-08-31 | 2016-08-03 | Vivo Mobile Communication Co., Ltd. | Touch control method and mobile terminal |
CN106569716A (zh) * | 2016-10-25 | 2017-04-19 | Shenzhen Orbbec Co., Ltd. | One-handed manipulation method and manipulation system |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10216286B2 (en) * | 2012-03-06 | 2019-02-26 | Todd E. Chornenky | On-screen diagonal keyboard |
CN102789568B (zh) * | 2012-07-13 | 2015-03-25 | Zhejiang Jieshang Vision Technology Co., Ltd. | Gesture recognition method based on depth information |
CN102937822A (zh) * | 2012-12-06 | 2013-02-20 | Guangzhou Shisheng Electronic Technology Co., Ltd. | Rear-side control structure and method for a mobile device |
CN103176605A (zh) * | 2013-03-27 | 2013-06-26 | Liu Renjun | Gesture recognition control apparatus and control method |
CN103777701A (zh) * | 2014-01-23 | 2014-05-07 | Shenzhen Guohua Optoelectronics Research Institute | Large-screen touch-screen electronic device |
CN104331182B (zh) * | 2014-03-06 | 2017-08-25 | Guangzhou Samsung Telecommunication Technology Research Co., Ltd. | Portable terminal with auxiliary touch screen |
CN104750188A (zh) * | 2015-03-26 | 2015-07-01 | Xiaomi Inc. | Mobile terminal |
2016
- 2016-10-25 CN CN201610942444.6A patent/CN106569716B/zh active Active
2017
- 2017-06-19 WO PCT/CN2017/089027 patent/WO2018076720A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102929388A (zh) * | 2011-09-30 | 2013-02-13 | Microsoft Corporation | Full-space gesture input |
CN103440033A (zh) * | 2013-08-19 | 2013-12-11 | Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences | Method and apparatus for human-computer interaction based on bare hands and a monocular camera |
CN105824553A (zh) * | 2015-08-31 | 2016-08-03 | Vivo Mobile Communication Co., Ltd. | Touch control method and mobile terminal |
CN106569716A (zh) * | 2016-10-25 | 2017-04-19 | Shenzhen Orbbec Co., Ltd. | One-handed manipulation method and manipulation system |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112204511A (zh) * | 2018-08-31 | 2021-01-08 | Shenzhen Royole Technologies Co., Ltd. | Input control method and electronic device |
Also Published As
Publication number | Publication date |
---|---|
CN106569716B (zh) | 2020-07-24 |
CN106569716A (zh) | 2017-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210096651A1 (en) | Vehicle systems and methods for interaction detection | |
EP2972727B1 (fr) | Affichage non occulté pour interactions par survol | |
EP2864932B1 (fr) | Positionnement d'extrémité de doigt pour une entrée de geste | |
US8619049B2 (en) | Monitoring interactions between two or more objects within an environment | |
US20140300542A1 (en) | Portable device and method for providing non-contact interface | |
US9454260B2 (en) | System and method for enabling multi-display input | |
WO2018076720A1 (fr) | One-handed operation method and control system | |
JP2015516624A (ja) | Scheme for emphasizing active interface elements | |
CN104081307A (zh) | Image processing apparatus, image processing method, and program | |
US11886643B2 (en) | Information processing apparatus and information processing method | |
US20180196530A1 (en) | Method for controlling cursor, device for controlling cursor and display apparatus | |
CN106598422B (zh) | Hybrid manipulation method, manipulation system, and electronic device | |
US10162501B2 (en) | Terminal device, display control method, and non-transitory computer-readable recording medium | |
US9041689B1 (en) | Estimating fingertip position using image analysis | |
WO2021004413A1 (fr) | Handheld input device, and method and apparatus for controlling the turn-off of an indication icon of the handheld input device | |
US11294510B2 (en) | Method, system and non-transitory computer-readable recording medium for supporting object control by using a 2D camera | |
JP6555958B2 (ja) | Information processing apparatus, control method therefor, program, and storage medium | |
WO2019100547A1 (fr) | Projection control method, apparatus, projection interaction system, and storage medium | |
TWI444875B (zh) | Multi-touch input apparatus and its interface method using data fusion of a single-touch sensing panel and an image sensor | |
JP3201596U (ja) | Operation input device | |
TW201419087A (zh) | Micro-somatosensory detection module and micro-somatosensory detection method thereof | |
JP2018181169A (ja) | Information processing apparatus, control method of information processing apparatus, computer program, and storage medium | |
CN112433624B (zh) | Inclination angle acquisition method and apparatus, and electronic device | |
US9116573B2 (en) | Virtual control device | |
CN105528059A (zh) | Method and system for three-dimensional space gesture operation | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17866199 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 17866199 Country of ref document: EP Kind code of ref document: A1 |