WO2017041433A1 - Touch response method and apparatus for a wearable device, and wearable device - Google Patents
- Publication number
- WO2017041433A1 (application PCT/CN2016/073771)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- wearable device
- target fingertip
- fingertip
- target
- touch
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G04—HOROLOGY
- G04G—ELECTRONIC TIME-PIECES
- G04G21/00—Input or output devices integrated in time-pieces
- G04G21/08—Touch switches specially adapted for time-pieces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- the present invention relates to the field of wearable device technologies, and in particular, to a touch response method, device, and wearable device for a wearable device.
- wearable products tend to be smaller in size for wear comfort, and correspondingly, the display screens of wearable products are also relatively small.
- wearable products mainly adopt a touch input mode, and the small screen gives the user a relatively poor human-computer interaction experience.
- smart watches are often equipped with relatively rich software resources, and correspondingly, dozens of software icons are displayed on the dial screen. In actual use, a finger touching the screen blocks the relevant icon, so the user cannot accurately determine the touch position, which often causes wrong clicks.
- although some smart watches can alleviate the above problems to some extent by adding buttons or defining touch operations such as up/down and left/right swipes, satisfactory results have not been achieved.
- An object of the present invention is to provide a touch response method, a device, and a wearable device, so that the wearable device can feed back the touch operation effect to the user in real time and the touch accuracy of the wearable device is improved.
- the embodiment of the invention provides a touch response method of a wearable device, including: acquiring location information of a target fingertip collected by a binocular recognition device in a set touch action occurrence area; determining, according to the location information of the target fingertip, a mapping point on the wearable device screen to which the target fingertip is mapped; and identifying the mapping point on the wearable device screen.
- with this method, when the target fingertip is located in the set touch action occurrence area, there is a mapping relationship between the position of the target fingertip and the cursor displayed on the screen of the wearable device. In this way, the user can touch-control the screen of the wearable device without contact, and the wearable device can feed back the touch operation effect to the user in real time. Therefore, the solution improves the touch accuracy of the wearable device and improves the user experience.
- before acquiring the location information of the target fingertip in the set touch action occurrence area, the method further includes setting the touch action occurrence area.
- the mapping point is determined according to the location information of the target fingertip and the reference point.
- the determining a standard circle trajectory corresponding to the target fingertip circle trajectory includes:
- the standard circle trajectory corresponding to the target fingertip circle trajectory is determined.
- the method further includes: acquiring a gesture action of the target fingertip in the set touch action occurrence area; and, when the gesture action is a set gesture action, performing a control operation that matches the set gesture action and/or causing the cursor to present a set change effect that matches the set gesture action.
- the method further includes: stopping the touch response when it is detected that the target fingertip located in the set touch action occurrence area has moved out of the touch action occurrence area for more than a preset threshold.
- before determining the mapping point on the screen of the wearable device to which the target fingertip is mapped, the method further includes: acquiring spatial acceleration information of the wearable device collected by an acceleration sensor; and correcting the position information of the target fingertip based on the spatial acceleration information.
- the position and posture of the wearable device may change according to the human body motion.
- the acceleration sensor is used to detect the spatial acceleration information of the wearable device, and the position information of the target fingertip is corrected according to the spatial acceleration information, so that the relevant calculation is more accurate, which reduces erroneous operations and further improves the touch accuracy of wearable devices.
- the embodiment of the invention further provides a touch response device of the wearable device, comprising:
- a first acquiring module configured to acquire location information of a target fingertip collected by the binocular recognition device in the set touch action occurrence region
- a determining module configured to determine, according to location information of the target fingertip, a mapping point on the screen of the wearable device to which the target fingertip is mapped;
- a processing module configured to identify the mapping point on the wearable device screen.
- with the touch response device of the wearable device, when the target fingertip is located in the set touch action occurrence area, the position of the target fingertip is mapped to the cursor displayed on the screen of the wearable device.
- in this way, the user can touch-control the screen of the wearable device without contact, and the wearable device can feed back the touch operation effect to the user in real time. Therefore, the solution improves the touch accuracy of the wearable device and improves the user experience.
- An embodiment of the present invention further provides a wearable device, including: a binocular identification device, configured to collect location information of a target fingertip;
- a controller communicatively connected with the binocular identification device, configured to: acquire the position information of the target fingertip collected by the binocular identification device in the set touch action occurrence region; determine, according to the position information of the target fingertip, a mapping point on the screen of the wearable device to which the fingertip is mapped; and identify the mapping point on the wearable device screen.
- an acceleration sensor communicatively coupled to the controller is further included.
- the acceleration sensor is a gyroscope or a three-axis acceleration sensor.
- the wearable device is a smart watch
- the smart watch includes a watch case
- the controller is disposed in the watch case
- the binocular identification device is disposed on an outer edge of the watch case.
- the binocular identification device includes: a main camera, an auxiliary camera, and a processing unit respectively connected to the main camera and the auxiliary camera and configured to obtain the location information of the target fingertip according to images captured by the main camera and the auxiliary camera.
- the wearable device has a floating touch function and can feed back the touch operation effect to the user in real time; its touch precision is relatively high, so the user experience is greatly improved compared with the prior art.
- FIG. 1 is a schematic structural diagram of a wearable device according to an embodiment of the present invention.
- FIG. 2 is a schematic flow chart showing a touch response method of a wearable device according to an embodiment of the present invention
- FIG. 3 is a schematic diagram showing the mapping of a target fingertip to a screen of a wearable device
- Figure 4 is a schematic diagram showing the principle of determining a standard circle drawing track
- FIG. 5 is a schematic structural diagram of a touch response device of a wearable device according to an embodiment of the invention.
- FIG. 6 is a schematic structural diagram of a touch response device of a wearable device according to another embodiment of the present invention.
- the embodiment of the invention provides a touch response method, device and wearable device for the wearable device.
- FIG. 1 illustrates a wearable device 100 in accordance with an embodiment of the present invention.
- the wearable device 100 includes: a binocular identification device, such as binocular cameras 4a, 4b, for collecting position information of a target fingertip; and a controller 5 communicatively connected with the binocular identification device, configured to: obtain the location information of the target fingertip collected by the binocular recognition device in the set touch action occurrence region; determine, according to the location information of the target fingertip, a mapping point on the screen of the wearable device to which the target fingertip is mapped; and identify the mapping point on the wearable device screen.
- the mapping point may be identified by displaying a cursor at a mapping point location or highlighting a valid area corresponding to the mapping point (such as an application icon corresponding to the mapping point, an optional item, etc.).
- the correspondence between the position in the touch action occurrence area and the position on the wearable device screen may be preset.
- wearable devices include, but are not limited to, smart watches, smart wristbands, and other smart accessories and the like.
- the wearable device is specifically a smart watch, which includes a hardware device such as a watch band 1, a watch case 2, a screen 3, and a binocular camera 4a, 4b.
- the binocular identification device may include a binocular camera 4a, 4b and a processing unit communicatively coupled to the binocular camera.
- the processing unit of the binocular identification device may be disposed inside the watch case 2 together with the controller 5.
- the processing unit of the binocular identification device is integrated with the controller 5 and disposed inside the watch case 2.
- the binocular cameras 4a, 4b include two cameras, which are a main camera and a secondary camera, respectively, and the coordinate positions of the auxiliary camera and the main camera have a certain positional deviation.
- the processing unit of the binocular recognition device can run an associated recognition program to derive, by calculation, the spatial position of a target body (such as the target fingertip) from the images captured by the two cameras.
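The patent does not specify the recognition algorithm. As an illustration of how such a processing unit might recover a fingertip's 3D position, here is a minimal rectified-stereo triangulation sketch; the pinhole-camera model, rectification assumption, and all parameter names are illustrative, not taken from the patent:

```python
import numpy as np

def fingertip_depth(x_main, x_aux, focal_px, baseline_mm):
    """Depth from horizontal disparity between the main and auxiliary
    camera images (assumes a rectified stereo pair)."""
    disparity = x_main - x_aux  # pixels
    if disparity <= 0:
        raise ValueError("non-positive disparity: point not triangulable")
    return focal_px * baseline_mm / disparity  # depth in mm

def fingertip_position(x_main, y_main, x_aux, focal_px, baseline_mm):
    """Back-project the main-camera pixel to a 3D point (mm), with the
    main camera at the origin and its optical axis along +Z."""
    z = fingertip_depth(x_main, x_aux, focal_px, baseline_mm)
    return np.array([x_main * z / focal_px, y_main * z / focal_px, z])
```

In practice the fingertip pixel would first be located in each image (e.g. by skin-color or contour detection) and the cameras calibrated, steps omitted here.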
- the processing unit of the binocular recognition device can recognize not only the user's hand, gesture actions, and the fingertip position of a specific finger, but also the fingertip positions of two or more fingers simultaneously.
- the case 2 is rectangular, the controller 5 (optionally together with the processing unit of the binocular identification device) is disposed inside the case 2, and the main camera and the auxiliary camera are respectively disposed on the edges of the case 2, for example on opposite sides.
- the case may also take other shapes than a rectangle, such as a circle, an ellipse or the like.
- the target fingertip may be any target fingertip or a specific target fingertip.
- if the index fingertip is defined as the target fingertip, the binocular recognition device uses the index fingertip as the recognition target, and the user can perform the floating touch operation in the spatial region only with the index fingertip; if any fingertip is defined as the target fingertip, the binocular recognition device uses any fingertip as the recognition target, and the user can perform the floating touch operation with any fingertip.
- when the target fingertip is located in the set touch action occurrence area, the wearable device screen has a mapping point corresponding to the position of the target fingertip.
- a cursor may be displayed at the mapping point location to identify, or feed back to the user, the relative position of the target fingertip on the wearable device screen; the cursor includes (but is not limited to) a mouse pointer shape, a small red dot, a small circle, etc.
- when the target fingertip is located in the set touch action occurrence area, the position of the target fingertip is mapped to the cursor displayed on the screen of the wearable device, and the user can move the target fingertip to control the movement of the cursor and perform a floating touch operation on the screen without contact, so that the wearable device can feed back the touch operation effect to the user in real time. Therefore, the solution improves the touch precision of the wearable device and improves the user experience.
- the set touch action occurrence area may be a fixed area, or an area set by the user when the floating touch function of the wearable device is enabled.
- the controller may further: acquire the target fingertip circle-drawing trajectory collected by the binocular recognition device after receiving a trigger instruction for setting the touch action occurrence area; determine a standard circle trajectory corresponding to the target fingertip circle-drawing trajectory; determine, according to the standard circle trajectory and the boundary of the wearable device screen, a reference point for establishing a mapping between the target fingertip and the wearable device screen; and set the touch action occurrence area according to the reference point, the standard circle trajectory, and the boundary of the wearable device screen.
- the controller may determine the mapping point according to the location information of the target fingertip and the reference point, for example, determining the mapping point as the intersection of the reference point and the target fingertip connection with the wearable device screen.
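The intersection of the reference-point-to-fingertip line with the screen is an ordinary line-plane intersection. The sketch below models the screen as a plane given by a point on it and its normal; the function and parameter names are illustrative assumptions, not the patent's:

```python
import numpy as np

def mapping_point(ref_point, fingertip, screen_origin, screen_normal):
    """Intersect the line through reference point O and fingertip M with
    the screen plane; returns the 3D mapping point, or None if the line
    is (nearly) parallel to the screen."""
    d = fingertip - ref_point              # direction of line OM
    denom = np.dot(screen_normal, d)
    if abs(denom) < 1e-9:
        return None                        # no well-defined intersection
    t = np.dot(screen_normal, screen_origin - ref_point) / denom
    return ref_point + t * d
```

The resulting 3D point would then be converted to screen pixel coordinates using the known screen geometry.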
- the user may press a physical button, click the screen, or double-click the screen as the trigger condition for setting the touch action occurrence area (and optionally enabling the floating touch function); after the trigger, the touch action occurrence area can be set.
- the user can set or update the touch action occurrence area in any posture at any time without adjusting posture for touching, thereby making subsequent floating touch operations easy to perform and highly accurate.
- the wearable device further includes an acceleration sensor 6 communicatively coupled to the controller 5.
- the acceleration sensor 6 can be used to collect spatial acceleration information of the wearable device; and the controller 5 can correct the position information of the target fingertip according to the spatial acceleration information of the wearable device.
- An acceleration sensor is an electronic device capable of measuring acceleration force; applied to a wearable device, it detects the spatial acceleration information of the device, which is equivalent to sensing changes in the device's position and posture.
- Acceleration sensors include, but are not limited to, gyroscopes or three-axis acceleration sensors.
- the controller 5 can correct the position information of the target fingertip according to the spatial acceleration information of the wearable device, thereby reducing the deviation of the target fingertip position information due to the user's posture instability, thereby reducing the false touch. Control action occurs.
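The patent does not specify the correction algorithm. One hypothetical form, sketched below, double-integrates the device acceleration between frames and adds the resulting device displacement back onto the measured fingertip position (since a device shift makes a stationary fingertip appear to shift the opposite way in the device frame). Real implementations would also need drift suppression, which is omitted here:

```python
import numpy as np

class FingertipCorrector:
    """Illustrative compensation of apparent fingertip motion caused by
    the watch itself moving (a sketch, not the patent's algorithm)."""

    def __init__(self):
        self.velocity = np.zeros(3)  # device velocity estimate (mm/s)

    def correct(self, fingertip_pos, accel, dt):
        # device displacement over this frame (constant-acceleration model)
        displacement = self.velocity * dt + 0.5 * accel * dt * dt
        self.velocity = self.velocity + accel * dt
        # device moved by +d, so the fingertip appeared shifted by -d;
        # add the displacement back to undo that apparent shift
        return fingertip_pos + displacement
```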
- the controller 5 may also, when the target fingertip is located in the set touch action occurrence area and performs a specific gesture action, perform a control operation matching that gesture action and/or cause the cursor to exhibit a specific change effect.
- for example, when the cursor is located on an application icon button on the wearable device screen and the target fingertip makes a click action toward the screen and then moves away within a set period of time (for example, within 1 second), the application can be launched while the cursor presents a specific change effect, such as changing the pointer shape, presenting a fireworks-bloom effect, presenting a water-ripple effect, etc., to prompt the user that the application is starting up.
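A click of this kind could, for instance, be detected from a sequence of fingertip-to-screen distances: a dip toward the screen followed by a retreat within the set time window. The thresholds, sampling rate, and function name below are illustrative assumptions, not values from the patent:

```python
def detect_click(z_samples, approach_mm=15.0, retreat_mm=7.5,
                 window_s=1.0, rate_hz=30):
    """Classify fingertip-to-screen distance samples (mm) as a 'click':
    the fingertip dips toward the screen by at least approach_mm and
    retreats by at least retreat_mm within window_s."""
    max_frames = max(1, int(window_s * rate_hz))
    window = z_samples[:max_frames]
    z0 = window[0]
    for i, z in enumerate(window):
        if z0 - z >= approach_mm:  # fingertip moved in far enough
            # ...and must move back out within the same window
            return any(z2 - z >= retreat_mm for z2 in window[i:])
    return False
```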
- the type of specific gesture action can be set in combination with ergonomics or usage habits.
- the controller 5 can also, when the location information of the target fingertip is not acquired, cancel the cursor display on the screen of the wearable device and/or, after a set time period has elapsed, clear the touch action occurrence area setting record.
- the binocular recognition device can operate only after the user enables the floating touch function of the wearable device.
- controller of the wearable device can be implemented by using a control chip, and can also be implemented by a processor, a microprocessor, a circuit, a circuit unit, an integrated circuit, an application specific integrated circuit (ASIC), or the like.
- a programmable logic device (PLD)
- a digital signal processor (DSP)
- the controller may obtain the location information of the target fingertip from any other applicable type of target recognition device, including but not limited to devices based on 3D detection technologies such as time-of-flight (ToF), structured light, or millimeter-wave radar.
- FIG. 2 illustrates a touch response method 200 of a wearable device according to an embodiment of the invention.
- the touch response method 200 includes the following steps:
- Step 201 Obtain location information of a target fingertip collected by the binocular recognition device in the set touch action occurrence area;
- Step 202 Determine, according to location information of the target fingertip, a mapping point on a screen of the wearable device to which the target fingertip is mapped;
- Step 203 Identify a mapping point on the wearable device screen.
- with the touch response method, when the target fingertip is located in the set touch action occurrence area, the position of the target fingertip is mapped to the cursor displayed on the screen of the wearable device, so that the user can touch-control the screen of the wearable device without contact, and the wearable device can feed back the touch operation effect to the user in real time. Therefore, the solution improves the touch accuracy of the wearable device and improves the user experience.
- the set touch action occurrence area may be a fixed area or an area set by the user when the wearable device's floating touch function is enabled.
- the touch response method before the step 201, further includes: setting a touch action occurrence area.
- the touch action occurrence area can, for example, be set as follows.
- Step 1 After receiving the triggering instruction for setting the touch action occurrence area, acquiring the target fingertip circle drawing track 12 (shown in FIG. 4) collected by the binocular recognition device.
- the target fingertip circle-drawing trajectory can be roughly circular, square, or another shape. Since circling is a virtual drawing operation performed by the user in the air, the trajectory is not necessarily an ideal shape, not necessarily a closed track, and not necessarily a plane figure.
- Step 2 Determine a standard circle trajectory 11 corresponding to the target fingertip circle trajectory 12.
- the standard circle trajectory 11 is a boundary of the set touch action occurrence area.
- the standard circle track 11 can be determined by the following substeps:
- Sub-step 1 Envelope calculation is performed on the target fingertip circle trajectory 12 to determine a minimum envelope space 13 that can accommodate the target fingertip circle trajectory 12.
- the shape of the minimum envelope space includes, but is not limited to, a cylindrical space, a rectangular space, and the like.
- the cross-sectional shape of the smallest envelope space is adapted to the shape of the wearable device screen.
- the minimum envelope space 13 is a cylindrical space with the smallest diameter and the smallest height capable of accommodating the target fingertip circle trajectory 12.
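A simple way to approximate such a cylindrical envelope for the traced points is sketched below; this centroid-based version is an illustration under the assumption that the cylinder axis is aligned with z, not a true minimal-enclosing-cylinder solver and not the patent's method:

```python
import numpy as np

def envelope_cylinder(points):
    """Approximate upright cylinder (axis along z) containing the traced
    fingertip points; returns (center_xy, radius, z_min, z_max)."""
    pts = np.asarray(points, dtype=float)
    center_xy = pts[:, :2].mean(axis=0)                       # axis position
    radius = np.max(np.linalg.norm(pts[:, :2] - center_xy, axis=1))
    return center_xy, radius, pts[:, 2].min(), pts[:, 2].max()
```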
- Sub-step 2 According to the minimum envelope space 13, the standard circle trajectory 11 corresponding to the target fingertip circle trajectory 12 is determined.
- either circular end face of the cylindrical space may be defined as the standard circle trajectory.
- if the minimum envelope space is a box or another shape, the standard circle trajectory can be defined by similar rules.
- the shape and size of the standard circle trajectory depend on the shape and size of the wearable device screen; for example, the area of the region surrounded by the standard circle trajectory may be proportional to the area of the wearable device screen.
- data such as the shape and size of the standard circle trajectory 11 and its position relative to the screen can be determined by geometric calculation.
- Step 3 Determine a reference point for establishing a mapping between the target fingertip and the screen of the wearable device according to the standard circle track and the boundary of the wearable device screen.
- the reference point may be determined as follows: draw a straight line l1 through the center point A of the standard circle trajectory and the screen center B; take any point C on the standard circle trajectory and the corresponding intersection point D on the screen boundary; the line through C and D is l2; the intersection point O of l1 and l2 is the reference point.
- the mapping point can be determined as the intersection of the line through the reference point and the target fingertip with the wearable device screen; in FIG. 3, this is the intersection of the line through the target fingertip M and the point O with the screen 3.
- any geometric calculation method suitable for determining the mapping relationship between two spatial planes may be used to determine the reference point.
- alternatively, the reference point can be determined based on the distance between the region surrounded by the standard circle trajectory and the wearable device screen, together with their area ratio.
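The Step 3 construction, intersecting line l1 (through A and B) with line l2 (through C and D), can be computed as the closest-approach point of two 3D lines, which also tolerates slightly skew lines from noisy measurements. This is one standard formulation, offered as an illustrative sketch:

```python
import numpy as np

def reference_point(a, b, c, d):
    """Reference point O: intersection of line l1 through points a, b and
    line l2 through points c, d, computed as the midpoint of the
    closest-approach segment between the two lines."""
    u, v = b - a, d - c            # direction vectors of l1 and l2
    w0 = a - c
    uu, uv, vv = u @ u, u @ v, v @ v
    wu, wv = w0 @ u, w0 @ v
    denom = uu * vv - uv * uv      # zero only for parallel lines
    s = (uv * wv - vv * wu) / denom
    t = (uu * wv - uv * wu) / denom
    return 0.5 * ((a + s * u) + (c + t * v))
```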
- Step 4 The touch action occurrence area 14 is set according to the reference point O, the standard circle trajectory 11, and the boundary of the screen 3. Within the set touch action occurrence area, whatever the position of the target fingertip, it can be mapped to a corresponding mapping point on the wearable device screen. It should be understood that the touch action occurrence area 14 in FIG. 3 is only schematic; the touch action occurrence area is the set of target fingertip positions for which the target fingertip can be mapped to a point on the wearable device screen.
- the user can press a physical button, tap the screen, or double-tap the screen as the trigger condition for setting the touch action occurrence area and, optionally, enabling the floating touch function.
- the touch action occurrence area can be set according to the above steps.
- the user can set or update the touch action occurrence area in any posture at any time, making the subsequent floating touch operation easy to perform and highly accurate.
- the touch response method further includes: acquiring a gesture action of the target fingertip in the set touch action occurrence area; and, when the gesture action is a set gesture action, performing a control operation that matches the set gesture action and/or causing the cursor to present a set change effect that matches the set gesture action.
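Matching a recognized gesture to its control operation and cursor effect can be sketched as a simple dispatch table. The gesture names and effects below are hypothetical examples, not gestures defined by the patent:

```python
# Hypothetical gesture handlers for illustration.
def single_click(cursor):
    cursor["effect"] = "pressed"     # e.g. briefly shrink the cursor

def swipe_left(cursor):
    cursor["effect"] = "page-back"   # e.g. animate a page turn

GESTURE_HANDLERS = {
    "single_click": single_click,
    "swipe_left": swipe_left,
}

def on_gesture(name, cursor):
    """Run the control operation matching a set gesture; ignore others."""
    handler = GESTURE_HANDLERS.get(name)
    if handler is None:
        return False                 # not a set gesture: no response
    handler(cursor)
    return True

cursor = {"effect": None}
handled = on_gesture("single_click", cursor)  # → True; effect becomes "pressed"
```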
- the touch response method further includes: stopping the touch response when it is detected that the target fingertip located in the set touch action occurrence area has moved out of the touch action occurrence area for longer than a preset threshold. Stopping the touch response may include canceling the cursor display on the wearable device screen, clearing the touch action occurrence area setting record, and/or disabling the floating touch function.
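The timeout logic above can be sketched as a small session tracker; the class name, the frame-update interface, and the 0.5 s threshold are illustrative assumptions:

```python
class HoverSession:
    """Stops the touch response once the fingertip has stayed outside the
    touch action occurrence area for longer than `threshold` seconds."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.left_at = None          # timestamp when the fingertip left the area
        self.active = True

    def update(self, timestamp, inside):
        if not self.active:
            return False
        if inside:
            self.left_at = None      # fingertip came back: reset the timer
        elif self.left_at is None:
            self.left_at = timestamp
        elif timestamp - self.left_at > self.threshold:
            self.active = False      # e.g. hide cursor, disable floating touch
        return self.active

session = HoverSession(threshold=0.5)
a = session.update(0.0, inside=True)    # → True
b = session.update(0.2, inside=False)   # → True (timer starts)
c = session.update(0.9, inside=False)   # → False (outside for 0.7 s > 0.5 s)
```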
- before the mapping point on the wearable device screen to which the target fingertip is mapped is determined, the touch response method further includes: acquiring spatial acceleration information of the wearable device collected by an acceleration sensor; and correcting the position information of the target fingertip according to the spatial acceleration information of the wearable device.
- the position and posture of the wearable device may change according to the human body motion.
- the acceleration sensor is used to detect the spatial acceleration information of the wearable device, and the position information of the target fingertip is corrected according to that spatial acceleration information, so that the determined mapping-point position is more accurate, thereby reducing misoperation and further improving the touch precision of the wearable device.
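The patent does not give the correction formula, so the sketch below assumes a simple motion model: the device's displacement over one frame is estimated as v·dt + ½·a·dt² (constant acceleration, velocity obtained elsewhere, e.g. by integrating earlier samples) and subtracted from the fingertip measurement made in the device frame:

```python
def correct_fingertip(fingertip, velocity, accel, dt):
    """Compensate a fingertip position measured in the device frame
    for the device's own motion over one frame interval dt.

    Assumption: displacement ≈ v*dt + 0.5*a*dt² during the frame.
    """
    return tuple(
        p - (v * dt + 0.5 * a * dt * dt)
        for p, v, a in zip(fingertip, velocity, accel)
    )

# Device drifting at 1 m/s along x, no acceleration, 100 ms frame:
corrected = correct_fingertip((0.5, 0.2, 0.1), (1.0, 0.0, 0.0),
                              (0.0, 0.0, 0.0), dt=0.1)
# → approximately (0.4, 0.2, 0.1)
```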
- the target fingertip is any target fingertip, or the target fingertip is a specific target fingertip.
- FIG. 5 illustrates a touch response device 500 of a wearable device in accordance with an embodiment of the present invention.
- the touch response device 500 includes:
- the first obtaining module 7a is configured to acquire position information of the target fingertip collected by the binocular recognition device in the set touch action occurrence region;
- a determining module 8a configured to determine, according to location information of the target fingertip, a mapping point on the screen of the wearable device to which the target fingertip is mapped;
- the processing module 9 is configured to identify a mapping point on the wearable device screen.
- the touch response device of the wearable device provided by the above embodiment of the present invention can establish a mapping relationship between the position of the target fingertip and the screen of the wearable device, and display a cursor, when the target fingertip is located in the set touch action occurrence area. In this way, the user can touch the screen of the wearable device in a floating (hover) manner, and the wearable device can feed back the touch operation effect to the user in real time. Therefore, the solution improves the touch accuracy of the wearable device and improves the user experience.
- FIG. 6 illustrates a touch response device 600 of a wearable device in accordance with another embodiment of the present invention.
- the touch response device 600 includes, in addition to the first acquisition module 7a, the determination module 8a, and the processing module 9 (the same as those shown in FIG. 5), a setting module 8b configured to set the touch action occurrence area before the position information of the target fingertip in the set touch action occurrence area is acquired. The touch action occurrence area is set as follows: after receiving a trigger instruction for setting the touch action occurrence area, acquiring the target fingertip loop trajectory collected by the binocular recognition device and determining the standard loop track corresponding to that trajectory; determining, according to the standard loop track and the boundary of the wearable device screen, a reference point for establishing a mapping between the target fingertip and the wearable device screen; and setting the touch action occurrence area according to the reference point, the standard loop track, and the boundary of the wearable device screen.
- the mapping point is determined based on the reference point and the position information of the target fingertip.
- the setting module 8b is further configured to perform an envelope calculation on the target fingertip loop trajectory to determine a minimum envelope space that can accommodate the target fingertip loop trajectory, and to determine, according to the minimum envelope space, the standard loop track corresponding to the target fingertip loop trajectory.
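The patent does not fix the shape of the "minimum envelope space", so the sketch below makes one simple choice: an axis-aligned bounding box of the drawn loop, regularised into a rectangle with the screen's aspect ratio to serve as the standard loop track. Function names and the aspect-ratio rule are assumptions:

```python
def min_bounding_box(points):
    """Axis-aligned bounding box of a fingertip loop trajectory —
    one simple choice of 'minimum envelope space'."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys)), (max(xs), max(ys))

def standard_loop_track(points, screen_aspect):
    """Regularise the envelope into a rectangle centred on the drawn
    loop, with the screen's aspect ratio (width / height)."""
    (x0, y0), (x1, y1) = min_bounding_box(points)
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    w, h = x1 - x0, y1 - y0
    h = max(h, w / screen_aspect)   # grow the short side to match the ratio
    w = h * screen_aspect
    return (cx - w / 2, cy - h / 2), (cx + w / 2, cy + h / 2)

track = standard_loop_track([(0, 0), (4, 1), (2, 3)], screen_aspect=2.0)
# → ((-1.0, 0.0), (5.0, 3.0))
```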
- the first acquisition module 7a is further configured to acquire a gesture action of the target fingertip in the set touch action occurrence area; and the processing module 9 is further configured to, when the gesture action is a set gesture action, perform a control operation that matches the set gesture action and/or cause the cursor to display a set change effect that matches the set gesture action.
- the processing module 9 is further configured to stop the touch response when it is detected that the target fingertip located in the set touch action occurrence area has moved out of the touch action occurrence area for longer than a preset threshold.
- the touch response device 600 further includes a second acquisition module 7b.
- the second obtaining module 7b is configured to acquire the spatial acceleration information of the wearable device collected by the acceleration sensor; and the determining module 8a is further configured to correct the position information of the target fingertip according to the spatial acceleration information of the wearable device.
- the first obtaining module, the determining module, the processing module, and the second obtaining module and the setting module may be implemented by using a hardware unit, a software unit, or a combination of the two.
- these modules may each be implemented by providing computer program instructions to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the functions and actions corresponding to each module are implemented when the instructions are executed by the processor.
- the target fingertip is any target fingertip, or the target fingertip is a specific target fingertip.
- An embodiment of the present invention further provides a wearable device, including: a binocular recognition device disposed on the wearable device (for example, on its main body structure) for collecting position information of a target fingertip; and a controller communicatively connected with the binocular recognition device, configured to receive the position information of the target fingertip collected by the binocular recognition device and to use the area on the wearable device screen corresponding to that position information as the touch area.
- the wearable device is a smart watch
- the main body structure comprises a watch case and a screen.
- the binocular identification device comprises: a main camera, a secondary camera, and a processing unit respectively connected in communication with the main camera and the secondary camera.
- the wearable device screen is rectangular, and the main camera and the auxiliary camera are respectively disposed on opposite sides of the edge of the screen.
- the watch case and the screen are circular, and the main camera and the auxiliary camera are respectively disposed at the edge of the watch case or the screen, located on a diameter of the circular case or screen.
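For a rectified main/auxiliary camera pair like the one described, the fingertip's 3D position follows from the standard disparity relation Z = f·B / (x_l − x_r). The function and parameter names below are illustrative; a real device also needs calibration and rectification first:

```python
def triangulate(xl, xr, y, focal_px, baseline_m):
    """3D position (X, Y, Z) in the main camera's frame from a matched
    fingertip pixel in a rectified stereo pair.

    xl, xr: horizontal pixel coordinates in main/auxiliary images;
    y: vertical pixel coordinate (equal in both after rectification);
    focal_px: focal length in pixels; baseline_m: camera separation.
    """
    disparity = xl - xr
    if disparity <= 0:
        return None                      # point at infinity or bad match
    z = focal_px * baseline_m / disparity
    x = xl * z / focal_px
    y3 = y * z / focal_px
    return (x, y3, z)

# f = 500 px, 6 cm baseline, 50 px disparity → 0.6 m depth:
pos = triangulate(xl=300.0, xr=250.0, y=100.0, focal_px=500.0, baseline_m=0.06)
# → approximately (0.36, 0.12, 0.6)
```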
- the accuracy of the touch area of the wearable device can be effectively improved.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims (17)
- A touch response method for a wearable device, comprising: acquiring position information of a target fingertip, collected by a binocular recognition device, in a set touch action occurrence area; determining, according to the position information of the target fingertip, a mapping point on a wearable device screen to which the target fingertip is mapped; and identifying the mapping point on the wearable device screen.
- The method according to claim 1, wherein before acquiring the position information of the target fingertip in the set touch action occurrence area, the method further comprises setting the touch action occurrence area by: after receiving a trigger instruction for setting the touch action occurrence area, acquiring a target fingertip loop trajectory collected by the binocular recognition device, and determining a standard loop track corresponding to the target fingertip loop trajectory; determining, according to the standard loop track and a boundary of the wearable device screen, a reference point for establishing a mapping between the target fingertip and the wearable device screen; and setting the touch action occurrence area according to the reference point, the standard loop track, and the boundary of the wearable device screen; and wherein the mapping point is determined according to the position information of the target fingertip and the reference point.
- The method according to claim 2, wherein determining the standard loop track corresponding to the target fingertip loop trajectory comprises: performing an envelope calculation on the target fingertip loop trajectory to determine a minimum envelope space that can accommodate the target fingertip loop trajectory; and determining, according to the minimum envelope space, the standard loop track corresponding to the target fingertip loop trajectory.
- The method according to claim 2 or 3, further comprising: acquiring a gesture action of the target fingertip in the set touch action occurrence area; and when the gesture action is a set gesture action, performing a control operation matching the set gesture action, and/or causing a cursor to display a set change effect matching the set gesture action.
- The method according to claim 4, further comprising: stopping the touch response when it is detected that the target fingertip located in the set touch action occurrence area has moved out of the touch action occurrence area for longer than a preset threshold.
- The method according to claim 1, wherein before determining the mapping point on the wearable device screen to which the target fingertip is mapped, the method further comprises: acquiring spatial acceleration information of the wearable device collected by an acceleration sensor; and correcting the position information of the target fingertip according to the spatial acceleration information.
- A touch response apparatus for a wearable device, comprising: a first acquisition module configured to acquire position information of a target fingertip, collected by a binocular recognition device, in a set touch action occurrence area; a determination module configured to determine, according to the position information of the target fingertip, a mapping point on a wearable device screen to which the target fingertip is mapped; and a processing module configured to identify the mapping point on the wearable device screen.
- The apparatus according to claim 7, further comprising: a setting module configured to, before the position information of the target fingertip in the set touch action occurrence area is acquired, set the touch action occurrence area by: after receiving a trigger instruction for setting the touch action occurrence area, acquiring a target fingertip loop trajectory collected by the binocular recognition device, and determining a standard loop track corresponding to the target fingertip loop trajectory; determining, according to the standard loop track and a boundary of the wearable device screen, a reference point for establishing a mapping between the target fingertip and the wearable device screen; and setting the touch action occurrence area according to the reference point, the standard loop track, and the boundary of the wearable device screen; and wherein the determination module is configured to determine the mapping point according to the position information of the target fingertip and the reference point.
- The apparatus according to claim 8, wherein the setting module is further configured to: perform an envelope calculation on the target fingertip loop trajectory to determine a minimum envelope space that can accommodate the target fingertip loop trajectory; and determine, according to the minimum envelope space, the standard loop track corresponding to the target fingertip loop trajectory.
- The apparatus according to claim 8 or 9, wherein: the first acquisition module is further configured to acquire a gesture action of the target fingertip in the set touch action occurrence area; and the processing module is further configured to, when the gesture action is a set gesture action, perform a control operation matching the set gesture action, and/or cause a cursor to display a set change effect matching the set gesture action.
- The apparatus according to claim 10, wherein the processing module is further configured to stop the touch response when it is detected that the target fingertip located in the set touch action occurrence area has moved out of the touch action occurrence area for longer than a preset threshold.
- The apparatus according to claim 7, further comprising a second acquisition module configured to acquire spatial acceleration information of the wearable device collected by an acceleration sensor; and wherein the determination module is further configured to correct the position information of the target fingertip according to the spatial acceleration information of the wearable device.
- A wearable device, comprising: a binocular recognition device configured to collect position information of a target fingertip; and a controller communicatively connected with the binocular recognition device, configured to: acquire the position information of the target fingertip, collected by the binocular recognition device, in a set touch action occurrence area; determine, according to the position information of the target fingertip, a mapping point on a wearable device screen to which the target fingertip is mapped; and identify the mapping point on the wearable device screen.
- The wearable device according to claim 13, further comprising an acceleration sensor communicatively connected with the controller.
- The wearable device according to claim 14, wherein the acceleration sensor is a gyroscope or a three-axis acceleration sensor.
- The wearable device according to claim 13, wherein the wearable device is a smart watch, the smart watch comprises a watch case, the controller is disposed within the watch case, and the binocular recognition device is disposed on an outer edge of the watch case.
- The wearable device according to claim 16, wherein the binocular recognition device comprises: a main camera, an auxiliary camera, and a processing unit communicatively connected with each of the main camera and the auxiliary camera, the processing unit being configured to derive the position information of the target fingertip from images captured by the main camera and the auxiliary camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/325,057 US10185433B2 (en) | 2015-09-10 | 2016-02-14 | Method and apparatus for touch responding of wearable device as well as wearable device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510575198.0 | 2015-09-10 | ||
CN201510575198.0A CN105159539B (zh) | 2015-09-10 | 2015-09-10 | 可穿戴设备的触控响应方法、装置及可穿戴设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017041433A1 true WO2017041433A1 (zh) | 2017-03-16 |
Family
ID=54800415
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/073771 WO2017041433A1 (zh) | 2015-09-10 | 2016-02-14 | 可穿戴设备的触控响应方法、装置及可穿戴设备 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10185433B2 (zh) |
CN (1) | CN105159539B (zh) |
WO (1) | WO2017041433A1 (zh) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105159539B (zh) | 2015-09-10 | 2018-06-01 | 京东方科技集团股份有限公司 | 可穿戴设备的触控响应方法、装置及可穿戴设备 |
US20170225690A1 (en) * | 2016-02-09 | 2017-08-10 | General Motors Llc | Wearable device controlled vehicle systems |
CN105589607B (zh) * | 2016-02-14 | 2018-09-07 | 京东方科技集团股份有限公司 | 触控系统、触控显示系统和触控交互方法 |
CN108027648A (zh) * | 2016-07-29 | 2018-05-11 | 华为技术有限公司 | 一种可穿戴设备的手势输入方法及可穿戴设备 |
CN207115040U (zh) * | 2017-05-08 | 2018-03-16 | 北京搜狗科技发展有限公司 | 一种表体和一种穿戴设备 |
EP3647908B1 (en) * | 2017-08-31 | 2021-12-29 | Huawei Technologies Co., Ltd. | Input method and intelligent terminal device |
CN107450672B (zh) * | 2017-09-19 | 2024-03-29 | 曾泓程 | 一种高识别率的腕式智能装置 |
CN108737720B (zh) * | 2018-04-11 | 2020-12-04 | 努比亚技术有限公司 | 可穿戴设备拍摄方法、可穿戴设备及计算机可读存储介质 |
CN109960448B (zh) * | 2019-03-21 | 2020-01-17 | 掌阅科技股份有限公司 | 场景特效显示方法、电子设备及计算机存储介质 |
CN111722703A (zh) * | 2019-03-22 | 2020-09-29 | 上海博泰悦臻网络技术服务有限公司 | 手势交互方法及系统 |
CN111752380B (zh) * | 2019-04-08 | 2024-03-19 | 广东小天才科技有限公司 | 一种基于腕式穿戴设备的交互方法及腕式穿戴设备 |
CN110244843B (zh) * | 2019-06-03 | 2023-12-08 | 努比亚技术有限公司 | 可穿戴设备控制方法、可穿戴设备及计算机可读存储介质 |
CN111240472A (zh) * | 2019-12-31 | 2020-06-05 | Oppo广东移动通信有限公司 | 电子设备、手势识别装置和方法 |
CN113359995B (zh) * | 2021-07-02 | 2022-07-29 | 北京百度网讯科技有限公司 | 人机交互方法、装置、设备以及存储介质 |
US11816275B1 (en) * | 2022-08-02 | 2023-11-14 | International Business Machines Corporation | In-air control regions |
CN115640561B (zh) * | 2022-11-15 | 2023-03-14 | 季华实验室 | 屏幕控制方法、装置、终端及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103336575A (zh) * | 2013-06-27 | 2013-10-02 | 深圳先进技术研究院 | 一种人机交互的智能眼镜系统及交互方法 |
CN103677561A (zh) * | 2012-09-25 | 2014-03-26 | 三星电子株式会社 | 用于提供由便携式装置和其它装置使用的用户接口的系统 |
CN103713737A (zh) * | 2013-12-12 | 2014-04-09 | 中国科学院深圳先进技术研究院 | 用于智能眼镜的虚拟键盘系统 |
CN105159539A (zh) * | 2015-09-10 | 2015-12-16 | 京东方科技集团股份有限公司 | 可穿戴设备的触控响应方法、装置及可穿戴设备 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
WO2003032143A2 (en) * | 2001-10-12 | 2003-04-17 | Hrl Laboratories, Llc | Vision-based pointer tracking method and apparatus |
US10026177B2 (en) * | 2006-02-28 | 2018-07-17 | Microsoft Technology Licensing, Llc | Compact interactive tabletop with projection-vision |
CN102662462B (zh) * | 2012-03-12 | 2016-03-30 | 中兴通讯股份有限公司 | 电子装置、手势识别方法及手势应用方法 |
CN104471511B (zh) * | 2012-03-13 | 2018-04-20 | 视力移动技术有限公司 | 识别指点手势的装置、用户接口和方法 |
US9720505B2 (en) * | 2013-01-03 | 2017-08-01 | Meta Company | Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities |
CN103150020A (zh) * | 2013-03-14 | 2013-06-12 | 上海电机学院 | 一种三维指控操作方法及系统 |
US20140337621A1 (en) * | 2013-05-07 | 2014-11-13 | Serguei Nakhimov | Wearable communication device, security complex and user interface |
WO2015052588A2 (en) * | 2013-10-10 | 2015-04-16 | Itay Katz | Systems, devices, and methods for touch-free typing |
US9740296B2 (en) * | 2013-12-16 | 2017-08-22 | Leap Motion, Inc. | User-defined virtual interaction space and manipulation of virtual cameras in the interaction space |
US10031583B2 (en) * | 2014-03-21 | 2018-07-24 | Immersion Corporation | Systems and methods for force-based object manipulation and haptic sensations |
CA2948823C (en) * | 2014-05-12 | 2020-09-22 | Quest Diagnostics Investments Incorporated | Quantitation of tamoxifen and metabolites thereof by mass spectrometry |
CN103995592A (zh) * | 2014-05-21 | 2014-08-20 | 上海华勤通讯技术有限公司 | 穿戴式设备与终端进行信息交互的方法及终端 |
US9924143B2 (en) * | 2014-09-23 | 2018-03-20 | Intel Corporation | Wearable mediated reality system and method |
CA3213021A1 (en) * | 2014-11-11 | 2016-05-19 | Zerokey Inc. | A method of detecting user input in a 3d space and a 3d input system employing same |
CN205050078U (zh) * | 2015-09-10 | 2016-02-24 | 京东方科技集团股份有限公司 | 一种可穿戴设备 |
-
2015
- 2015-09-10 CN CN201510575198.0A patent/CN105159539B/zh active Active
-
2016
- 2016-02-14 US US15/325,057 patent/US10185433B2/en active Active
- 2016-02-14 WO PCT/CN2016/073771 patent/WO2017041433A1/zh active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103677561A (zh) * | 2012-09-25 | 2014-03-26 | 三星电子株式会社 | 用于提供由便携式装置和其它装置使用的用户接口的系统 |
CN103336575A (zh) * | 2013-06-27 | 2013-10-02 | 深圳先进技术研究院 | 一种人机交互的智能眼镜系统及交互方法 |
CN103713737A (zh) * | 2013-12-12 | 2014-04-09 | 中国科学院深圳先进技术研究院 | 用于智能眼镜的虚拟键盘系统 |
CN105159539A (zh) * | 2015-09-10 | 2015-12-16 | 京东方科技集团股份有限公司 | 可穿戴设备的触控响应方法、装置及可穿戴设备 |
Also Published As
Publication number | Publication date |
---|---|
CN105159539B (zh) | 2018-06-01 |
US10185433B2 (en) | 2019-01-22 |
US20170205939A1 (en) | 2017-07-20 |
CN105159539A (zh) | 2015-12-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017041433A1 (zh) | 可穿戴设备的触控响应方法、装置及可穿戴设备 | |
US10606441B2 (en) | Operation control device and operation control method | |
EP2817694B1 (en) | Navigation for multi-dimensional input | |
JP5412227B2 (ja) | 映像表示装置、および、その表示制御方法 | |
EP2677398A2 (en) | Virtual touch device without pointer on display surface | |
JP5658500B2 (ja) | 情報処理装置及びその制御方法 | |
US10234955B2 (en) | Input recognition apparatus, input recognition method using maker location, and non-transitory computer-readable storage program | |
EP2677399A2 (en) | Virtual touch device without pointer | |
WO2014106219A1 (en) | User centric interface for interaction with visual display that recognizes user intentions | |
US20140317576A1 (en) | Method and system for responding to user's selection gesture of object displayed in three dimensions | |
JP6344530B2 (ja) | 入力装置、入力方法、及びプログラム | |
KR101343748B1 (ko) | 포인터를 표시하지 않는 투명 디스플레이 가상 터치 장치 | |
CN205050078U (zh) | 一种可穿戴设备 | |
US20190156118A1 (en) | Information processing apparatus, control method, and program | |
CN106598422B (zh) | 混合操控方法及操控系统和电子设备 | |
US20130229348A1 (en) | Driving method of virtual mouse | |
TWI486815B (zh) | 顯示設備及其控制系統和方法 | |
CN106796462B (zh) | 确定输入对象的位置 | |
JP2016115310A (ja) | 電子機器 | |
WO2018161421A1 (zh) | 终端设备的触摸显示屏幕的性能测试方法和性能测试装置 | |
US20190196602A1 (en) | Information processing device, information processing method, and program | |
US9354706B2 (en) | Storage medium having stored therein information processing program, information processing apparatus, information processing system, and method of calculating designated position | |
JP2018181169A (ja) | 情報処理装置、及び、情報処理装置の制御方法、コンピュータプログラム、記憶媒体 | |
EP3059664A1 (en) | A method for controlling a device by gestures and a system for controlling a device by gestures | |
TWI566128B (zh) | 虛擬觸控裝置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 15325057 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16843374 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16843374 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 200918) |
|