WO2015165175A1 - Wearable Touch Device and Wearable Touch Method - Google Patents
Wearable touch device and wearable touch method
Info
- Publication number
- WO2015165175A1 (PCT/CN2014/084820)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- image
- information
- pattern
- unit
- Prior art date
Links
- method (title, claims, abstract, description)
- monitoring process (claims, abstract, description)
- processing (claims, abstract, description)
- process (claims, abstract, description)
- corresponding (claims, description)
- change (claims, description)
- mapping (claims, description)
- material (claims, description)
- carrier (claims, description)
- engineering process (description)
- optical (description)
- action (description)
- matrix material (description)
- detection method (description)
- development (description)
- irregular (description)
- modification (description)
- design (description)
- diagram (description)
- everyday (description)
- glass (description)
- information processing (description)
- patterning (description)
- smart glass (description)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F3/0383—Signal control means within the pointing device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1662—Details related to the integrated keyboard
- G06F1/1673—Arrangements for projecting a virtual keyboard
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0331—Finger worn pointing device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- the present invention relates to the field of control technologies, and in particular, to a wearable touch device and a wearable touch method.
Background Art
- wearable technology is the science and technology that explores and creates devices that can be worn directly on the body or integrated into the user's clothing or accessories.
- Wearable smart devices have become the new favorite of technology and the development direction of smart devices in the future.
- Wearable smart devices are a general term for wearable devices (such as glasses, gloves, watches, apparel, etc.) that have been developed using wearable technology to intelligently design everyday wear.
- Wearable smart devices include full-featured, large-sized devices that can achieve complete or partial intelligence without relying on smartphones (such as smart watches or smart glasses), and devices that focus on only one type of smart application function and must be used together with other devices such as smartphones (such as various smart bracelets for sign monitoring, smart jewelry, etc.).
- Existing wearable touch devices generally use a button-type control method, and even when a touch screen is used to implement touch, the small size of a wearable touch device means that a general touch screen cannot meet the user's needs well.
- Moreover, existing wearable touch devices generally require the touch end to directly contact the device (i.e., its button or touch screen) to perform a touch, so the touch experience of the wearable device is poor. Therefore, providing a convenient wearable touch device and method that enables a user to complete a touch without a button or a touch screen is a technical problem to be solved.
- The present invention provides a wearable touch device that can perform touch control without a button or a touch screen and is compact and portable, and a corresponding wearable touch method that is accurate in touch and has a wide application range.
- the technical solution for solving the technical problem to be solved by the present invention is a wearable touch device, comprising a carrier, a pattern transmitting unit, an image collecting unit, an image monitoring unit and a processing unit, wherein: the carrier can be worn;
- the pattern emitting unit is configured to emit a scanning pattern to a touch surface touchable by the touch end;
- the image collecting unit is configured to collect an image formed by the scanning pattern on the touch surface, and
- the image information of the captured image is sent to the processing unit;
- the image monitoring unit is configured to monitor the current light energy of the scan pattern in each region of the touch surface, and send current light energy information to the processing unit;
- the processing unit is configured to process the image information of the captured image and the current light energy information to determine a touch position of the touch end on the touch surface and generate corresponding command information.
- the pattern emitting unit and the image collecting unit are both disposed on the carrier and located in the same area of the carrier.
- the pattern emitting unit is an infrared projection pattern transmitting unit
- the image collecting unit is an infrared image collecting unit
- the image monitoring unit is an infrared image monitoring unit.
- the scanning pattern emitted by the pattern emitting unit is a grid-shaped pattern.
- the processing unit includes a comparison module and a command module, and the image information of the captured image includes image coordinate information of the captured image, and the processing unit pre-stores a position coordinate and a command information mapping table, and the scanning Initial light energy information and pattern coordinate information of the pattern;
- the comparison module is configured to compare the image coordinate information of the captured image and the current light energy information with the pattern coordinate information and the initial light energy information of the scan pattern, thereby obtaining the determined touch position coordinate information.
- the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position coordinate and command information mapping table.
- the wearable touch device further includes an execution unit, and the execution unit is configured to perform a corresponding action according to the received command information.
- the touch end is a human body finger
- the touch surface touched by the touch end is an area touched by a human finger.
- the pattern emitting unit, the image collecting unit, the image monitoring unit, and the processing unit are all disposed on the carrier.
- the pattern emitting unit, the image collecting unit and the image monitoring unit are all disposed on an outer surface of the carrier near the palm side.
- the outer surfaces of the pattern emitting unit, the image collecting unit and the image monitoring unit are respectively flush with the outer surface of the carrier.
- the wearable touch device includes a plurality of carriers; the pattern emitting unit and the image collecting unit are disposed on one carrier, and the image monitoring unit and the processing unit are disposed on another carrier.
- the carrier is a closed ring, and the carrier is formed from a PVC material.
- The technical solution for solving the technical problem to be solved by the present invention is also a wearable touch method, which includes the following steps: transmitting a scan pattern to a touch surface touchable by the touch end; collecting the image formed by the scan pattern projected on the touch surface, while monitoring the current light energy information of the scan pattern in each region of the touch surface; and processing the image information of the captured image and the current light energy information to determine the touch position of the touch end on the touch surface and generate corresponding command information.
- the image information of the captured image includes image coordinate information of the captured image, and a position coordinate and command information mapping table, as well as the initial light energy information and pattern coordinate information of the scan pattern, are provided;
- the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes the following steps: comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change of the current light energy information relative to the initial light energy information is greater than or equal to a set value, initially determining that the region includes an approximate touch point; comparing the image coordinate information of the captured image of the region with the pattern coordinate information of the scan pattern, and when a change occurs between the captured image and the scan pattern at a corresponding coordinate, determining that coordinate point as the actual touch point, thereby obtaining the coordinates of the touch position; and obtaining the command information corresponding to the coordinates of the touch position according to the position coordinate and command information mapping table.
- the set value is 5%.
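- To make these determination steps concrete, the following minimal Python sketch performs the two comparisons in order and then looks up the command; the array shapes, the per-region application of the 5% set value, and the `command_table` contents are illustrative assumptions, not details fixed by the disclosure.

```python
import numpy as np

SET_VALUE = 0.05  # the set value: a 5% change in light energy

def determine_touch(initial_energy, current_energy,
                    pattern_grid, captured_grid, command_table):
    """Two-stage touch determination sketched from the claimed steps.

    initial_energy, current_energy: per-region light energy (2D float arrays).
    pattern_grid, captured_grid: presence of the emitted scan pattern and of
        the captured image at each coordinate (2D boolean arrays, same shape).
    command_table: dict mapping (row, col) coordinates to command information.
    """
    # Step 1: a relative light energy change >= the set value marks a region
    # initially determined to include an approximate touch point.
    change = np.abs(current_energy - initial_energy) / initial_energy
    approx = change >= SET_VALUE
    if not approx.any():
        return None, None  # no touch action occurred

    # Step 2: within those regions, a coordinate where the captured image
    # differs from the emitted scan pattern is the actual touch point.
    differs = pattern_grid != captured_grid
    touched = np.argwhere(approx & differs)
    if touched.size == 0:
        return None, None
    coord = tuple(int(v) for v in touched[0])  # first actual touch point

    # Step 3: obtain the command information from the mapping table.
    return coord, command_table.get(coord)
```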
- the wearable touch method further includes the step of: performing a corresponding action according to the command information corresponding to the coordinates of the touch position.
- In the present invention, the wearable touch device can be a closed-ring smart wearable device worn on a human finger. The pattern emitting unit projects the scan pattern onto the touch surface touchable by the touch end (for example, a human finger), the image collecting unit collects the image formed by the pattern on the touch surface, and the image monitoring unit monitors the current light energy of the formed image.
- The processing unit analyzes and processes the current light energy information and the image information of the captured image, determines the coordinates of the touch position of the finger and thus the precise touch position of the finger, thereby performing touch feedback.
- The wearable touch device enables the user to complete a touch without a button or a touch screen, and is compact and portable; the corresponding wearable touch method is accurate in touch and has a wide application range.
- FIG. 1 is a schematic structural view of a wearable touch device according to Embodiment 1 of the present invention.
- FIG. 2 is a schematic view showing the operation of the wearable touch device of FIG. 1.
- FIG. 3 is a schematic diagram of touch control performed by the wearable touch device of FIG. 1. FIG. 4 is a flow chart of touch control of the wearable touch method in Embodiment 1 of the present invention.
Detailed Description
- A wearable touch device includes a carrier, a pattern emitting unit, an image collecting unit, an image monitoring unit, and a processing unit, wherein: the carrier is wearable; the pattern emitting unit is configured to emit a scan pattern to a touch surface touchable by the touch end; the image collecting unit is configured to collect the image formed by the scan pattern on the touch surface, and send image information of the captured image to the processing unit; the image monitoring unit is configured to monitor the current light energy of the scan pattern in each area of the touch surface, and send current light energy information to the processing unit; and the processing unit is configured to process the image information of the captured image and the current light energy information, to determine the touch position of the touch end on the touch surface and generate corresponding command information.
- A wearable touch method includes the steps of: transmitting a scan pattern to a touch surface touchable by a touch end; collecting the image formed by the scan pattern on the touch surface, while monitoring the current light energy of the scan pattern in each area of the touch surface; and processing the image information of the collected image and the current light energy information, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information.
- With the present invention, the user can perform touch without directly contacting the wearable touch device; the wearable touch device is compact and portable, and the corresponding wearable touch method is accurate in touch and widely applicable.
- Example 1
- This embodiment provides a wearable touch device and a corresponding wearable touch method.
- the wearable touch device in this embodiment includes a carrier 1, a pattern transmitting unit 2, an image collecting unit 3, an image monitoring unit 4, and a processing unit 5.
- the carrier 1 can be worn, for example, on a human finger.
- the pattern emitting unit 2 is configured to emit a scanning pattern to a touch surface that can be touched by the touch terminal.
- the image collection unit 3 is configured to collect an image formed by the scan pattern on the touch surface, and send the image information of the collected image to the processing unit 5.
- the image monitoring unit 4 is configured to monitor current light energy information of the scan pattern in each area of the touch surface, and send the current light energy information to the processing unit 5.
- the processing unit 5 is configured to process the image information of the captured image and the current light energy information to determine a touch position of the touch end on the touch surface and generate corresponding command information.
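- For orientation, the division of labour among these units can be summarized in a small structural sketch; the class, field, and method names below are assumptions introduced for illustration, not terms from the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

import numpy as np

Coord = Tuple[int, int]

@dataclass
class WearableTouchUnits:
    """Sketch of how the units on carrier 1 cooperate in one detection pass."""
    emit_pattern: Callable[[], None]             # pattern emitting unit 2
    capture_image: Callable[[], np.ndarray]      # image collecting unit 3
    read_light_energy: Callable[[], np.ndarray]  # image monitoring unit 4
    process: Callable[[np.ndarray, np.ndarray], Optional[Coord]]  # unit 5

    def scan_once(self) -> Optional[Coord]:
        self.emit_pattern()                  # project the scan pattern
        image = self.capture_image()         # image formed on the touch surface
        energy = self.read_light_energy()    # current light energy per region
        return self.process(image, energy)   # touch position, if any
```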
- Both the pattern emitting unit 2 and the image collecting unit 3 are disposed on the carrier 1 and located in the same area of the carrier 1; that is, the transmitting portion of the pattern and the receiving portion of the image are contained in the same area, to facilitate fixed-point recognition and comparison between the emitted scan pattern and the received captured image.
- the carrier 1 is preferably in a closed loop
- the image monitoring unit 4 is also disposed on the carrier 1
- the pattern emitting unit 2 and the image collecting unit 3 are disposed at one end of the carrier 1, and the image monitoring unit 4 is disposed at the opposite end.
- the pattern emitting unit 2 is an infrared projection type pattern emitting unit
- the image collecting unit 3 is an infrared image collecting unit, such as a camera. It can be understood that the camera can be a rotatable camera, so that when the relative position between the touch device and the touch surface changes, the normal operation of the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 is still ensured.
- the image monitoring unit 4 is an infrared image monitoring unit such as an optical detection sensor (optical sensor).
- the scanning pattern emitted by the pattern emitting unit 2 is, for example, a grid-shaped pattern.
- In this embodiment, the scanning pattern emitted by the pattern emitting unit 2 is an infrared grid pattern, that is, an infrared matrix pattern. Since the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 are matched infrared units, when the wearable touch device is used, the touch can be performed without direct contact between the touch end and the touch device, making the touch more flexible.
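- As a rough illustration of the pattern coordinate information that would be pre-stored for such a grid, the emitted infrared matrix can be represented by the coordinates of its grid intersections; the resolution and spacing below are assumed values.

```python
def grid_pattern_coordinates(rows: int, cols: int, spacing: float = 5.0):
    """Pattern coordinate information for a grid-shaped scan pattern: the
    (x, y) position of every grid intersection, in units of `spacing`."""
    return [(col * spacing, row * spacing)
            for row in range(rows)
            for col in range(cols)]

# e.g. a 16 x 16 infrared matrix pattern projected onto the touch surface
pattern_coords = grid_pattern_coordinates(16, 16)
```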
- Each of the above units can be implemented using small microchips so as to have a small volume, thereby ensuring the compact size and portability of the wearable touch device.
- It is preferable that the carrier 1 be formed of a PVC material.
- This material is strong and insulating, so that the structure and position of each unit disposed therein are stable, ensuring the effective operation of the wearable touch device and the safety of the human body.
- the touch end can be in direct contact with or not in contact with the touch surface
- The touch surface can be a regular plane, an irregular plane, or a curved surface. Any irregular plane or curved surface with bumps can be regarded as a plane when its area is small enough, and has corresponding coordinate information. It can be understood that when the touch end is not in direct contact with the touch surface, although the touch position can still be determined from the current light energy information and the image information of the captured image, the touch end may be at a certain distance from the touch surface, so the projection of the touch end on the touch surface deviates somewhat from the actual point to be touched, and the calculated touch position may have a certain positioning error. It is easy to understand that when the touch end is in direct contact with the touch surface, the projection of the touch end on the touch surface is substantially the same as the actual point to be touched, so the touch positioning is more accurate.
- In this embodiment, the touch end 7 (see FIG. 3) is a finger of one hand of the human body, the carrier 1 is worn on a finger of the other hand, and the area touched by the touch end 7 is the palm or the back of the hand wearing the carrier 1.
- Alternatively, the carrier 1 is worn on the finger serving as the touch end 7, and the area touched by the touch end 7 is the palm of the other hand of the human body.
- the pattern emitting unit 2, the image collecting unit 3, the image monitoring unit 4, and the processing unit 5 are all disposed in the carrier 1, and the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 are all disposed on the outer surface of the carrier 1 near the palm side.
- the outer surfaces of the pattern emitting unit 2, the image collecting unit 3 and the image monitoring unit 4 are flush with the outer surface of the carrier 1.
- The processing unit 5 further includes a comparison module and a command module (not shown in FIG. 1). The image information of the captured image includes image coordinate information of the captured image, and the processing unit 5 pre-stores the position coordinate and command information mapping table, the initial light energy information of the scan pattern, and the pattern coordinate information.
- The comparison module is configured to compare the image coordinate information and the current light energy information of the captured image with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, to obtain the determined touch position coordinate information.
- The command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position coordinate and command information mapping table.
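- The pre-stored position coordinate and command information mapping table can be as simple as a dictionary keyed by touch coordinates; the coordinates and command names below are invented for illustration only.

```python
# Hypothetical mapping table pre-stored in processing unit 5; each touch
# coordinate on the touch surface corresponds to one piece of command
# information (the coordinates and commands are illustrative assumptions).
position_command_map = {
    (2, 3): "answer_call",
    (2, 7): "hang_up",
    (6, 3): "volume_up",
    (6, 7): "volume_down",
}

def command_module(touch_coord):
    """Convert determined touch position coordinates into command info."""
    return position_command_map.get(touch_coord)
```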
- the processing unit 5 can be implemented using a microprocessor such as a microcontroller.
- the wearable touch device further includes an execution unit, and the execution unit is configured to perform a corresponding action according to the received command information.
- When touch detection is performed, the pattern emitting unit 2 (for example, an infrared mesh projection unit) continuously emits an infrared mesh projection onto the touch surface, and the image collecting unit 3 (for example, an infrared camera) collects the resulting image; for example, the infrared grid 6 is projected on the palm or the back of the hand wearing the carrier 1, which serves as the touch surface 8 (see FIG. 3).
- When the touch surface 8 is touched, that is, when the touch end 7 (for example, a human finger) falls into an area within the infrared grid, part of the infrared light that forms the infrared grid pattern is blocked.
- On the one hand, the infrared grid pattern where the finger is located is blocked and the light energy changes (i.e., a loss of light is produced), and the image monitoring unit 4 (for example, an optical detection sensor) monitors the current light energy, that is, collects the current light energy information; on the other hand, the image collecting unit 3 collects the change of the infrared grid pattern where the finger is located, that is, the image formed by the infrared grid pattern projected on the touch surface 8.
- The initial light energy information of the scan pattern and the pattern coordinate information of the scan pattern (in this embodiment, infrared grid coordinates) are pre-set in the processing unit 5. The current light energy information monitored by the image monitoring unit 4 and the image coordinate information of the image collected by the image collecting unit 3 are fed back to the processing unit 5, and the comparison module in the processing unit 5 processes and analyzes them: if the amount of change of the current light energy information of a certain region relative to the initial light energy information is greater than or equal to a set value (for example, 5%), it is initially determined that the region includes an approximate touch point; further, the image coordinate information of the captured image is compared with the pattern coordinate information of the scan pattern, and when a change occurs between the image and the scan pattern at a corresponding coordinate, that coordinate point is determined as the actual touch point, and the coordinates of the corresponding touch position are obtained; finally, the command information corresponding to the coordinates of the touch position is obtained according to the position coordinate and command information mapping table.
- In other words, when the finger touches, the image monitoring unit senses the light energy change and transmits the light energy change information to the processing unit, so that the touch action can be confirmed, and a region can be initially determined to include the touch position according to the relationship between the light energy change and the set value.
- The image collecting unit acquires the projection image formed by the finger occluding the scan pattern and accordingly records the image coordinate information of the captured image; this image coordinate information is compared with the pattern coordinate information of the scan pattern to determine the corresponding coordinates at which a change occurs between the captured image and the scan pattern, resulting in a precise touch position.
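- One way to realize this comparison of image coordinates against pattern coordinates is to look for grid pixels that the finger occludes, restricted to the region whose light energy change exceeded the set value; reporting the centroid of those pixels as the single touch point is an assumption made for this sketch.

```python
import numpy as np

def precise_touch_point(pattern_img: np.ndarray,
                        captured_img: np.ndarray,
                        approx_region: np.ndarray):
    """Locate the actual touch point inside the approximate touch region.

    pattern_img, captured_img: boolean arrays marking where the infrared grid
        is expected and where it is actually observed in the captured image.
    approx_region: boolean mask of the region whose light energy change was
        greater than or equal to the set value.
    """
    occluded = pattern_img & ~captured_img & approx_region
    ys, xs = np.nonzero(occluded)
    if ys.size == 0:
        return None  # no change between image and pattern in this region
    # Report the centroid of the occluded grid pixels as the touch position.
    return int(round(float(ys.mean()))), int(round(float(xs.mean())))
```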
- It can be understood that the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 may be of types other than infrared; they only need to be matched with each other, and their specific form is not limited.
- The specific pattern emitted by the pattern emitting unit 2 may also be a pattern other than the grid pattern; in a specific application, it may be set flexibly, and details are not described herein again.
- Correspondingly, this embodiment provides a wearable touch method, including the following steps: transmitting a scan pattern to the touch surface touchable by the touch end; collecting the image formed by the scan pattern projected on the touch surface, while monitoring the current light energy information of the scan pattern in each region of the touch surface; processing the image information of the captured image and the current light energy information to determine the touch position of the touch end on the touch surface and generate corresponding command information; and executing the corresponding command action according to the command information corresponding to the touch position.
- The image information of the captured image includes image coordinate information of the captured image, and a position coordinate and command information mapping table, the initial light energy information of the scan pattern, and the pattern coordinate information are provided. In the above method, the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes the following steps: comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change is greater than or equal to the set value, initially determining that the region includes an approximate touch point; comparing the image coordinate information of the captured image of the region with the pattern coordinate information of the scan pattern, and when a change occurs between them at a corresponding coordinate, determining that coordinate point as the actual touch point, thereby obtaining the coordinates of the touch position; and obtaining the command information corresponding to the coordinates of the touch position according to the mapping table.
- FIG. 4 shows a touch flow chart of the wearable touch method.
- The wearable touch device is worn on the human finger by means of the carrier 1, and the initial light energy information and the pattern coordinate information of the specific pattern are set during the initialization process.
- The pattern emitting unit 2 projects an infrared grid pattern toward the touch surface 8, for example, the palm or the back of the hand.
- When the touch end 7 (for example, a human finger) does not perform a touch, the wearable touch device remains in a standby state.
- When a human finger performs a touch, the infrared grid pattern changes: the image collecting unit 3 (for example, a camera) collects the image formed by the projection of the infrared grid pattern where the human finger is located, and the image monitoring unit 4 (for example, an optical detection sensor) monitors the current light energy information in the infrared grid pattern.
- the image coordinate information and the current light energy information of the captured image formed when the human finger performs the touch are transmitted to the processing unit 5 for processing and analysis.
- The current light energy information and the initial light energy information are compared to confirm that a touch action has occurred, and a region is initially determined to be the approximate area including the touch position according to the relationship between the change of the light energy and the set value; then, according to the image coordinate information of the captured image and the pattern coordinate information of the scan pattern, the coordinate point at which a change occurs is determined as the actual touch point, thereby obtaining the coordinates of the precise touch position.
- Finally, the touch command is executed according to the command information corresponding to the coordinates of the touch position.
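- Putting the flow of FIG. 4 together, a standby-and-detect loop might look like the following; it reuses the `precise_touch_point` sketch above, while `units`, `energy_changed`, `approximate_region`, and `execute_command` are hypothetical helpers, not names from the disclosure.

```python
import time

def touch_control_loop(units, initial_energy, pattern_img, command_map):
    """Event loop following the touch flow of FIG. 4 (a sketch only)."""
    while True:
        current_energy = units.read_light_energy()
        if not energy_changed(initial_energy, current_energy):
            time.sleep(0.01)      # no touch sensed: remain in standby
            continue
        captured = units.capture_image()   # image of the occluded grid
        region = approximate_region(initial_energy, current_energy)
        coord = precise_touch_point(pattern_img, captured, region)
        if coord is not None:
            execute_command(command_map.get(coord))  # execution unit acts
```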
Example 2
- Each unit in the wearable touch device can be disposed on one or more carriers. That is, whereas in Embodiment 1 the pattern emitting unit, the image collecting unit, the image monitoring unit, and the processing unit are disposed in the same carrier, in this embodiment the pattern emitting unit and the image collecting unit may be disposed in one carrier, and the image monitoring unit and the processing unit in another carrier.
- the carrier comprises a first carrier and a second carrier, the first carrier and the second carrier being formed of the same material.
- the pattern emitting unit and the image collecting unit may be disposed on the first carrier and located in the same area of the first carrier, and the image monitoring unit is disposed on the second carrier; the first carrier and the second carrier may be worn on the same hand or on two hands of one person, or may be worn on different people's hands.
- each unit in the wearable touch device of this embodiment is the same as the configuration of the corresponding unit in Embodiment 1, and details are not described herein again.
- The corresponding wearable touch method of this embodiment is the same as that of Embodiment 1: the infrared matrix pattern is emitted by the pattern emitting unit, and when a human finger touches the infrared matrix pattern on the touch surface, the image monitoring unit detects the loss of infrared light at those points (the change in light energy), the image collecting unit collects the change of the infrared grid pattern where the finger is located, and the processing unit then determines the specific touch position according to the change of light energy and the pattern change, obtains the coordinates of the touch point, and obtains the command information corresponding to those coordinates.
- the wearable touch device of the present invention may be a closed-loop smart wearable device that can be worn on a human finger.
- the pattern emitting unit projects a specific pattern to the touch surface touched by the touch end.
- The image collecting unit collects the image formed by the specific pattern where the finger is located, the image monitoring unit monitors the current light energy information of the image, and the processing unit analyzes and processes the image information of the captured image and the current light energy information, determining the touch position coordinates of the finger and hence the precise touch position, thereby performing touch feedback.
- The wearable touch device enables the user to complete a touch without a button or a touch screen, and is compact and portable; the corresponding wearable touch method is accurate in touch and has a wide application range. While exemplary embodiments are employed above, the invention is not limited thereto. Various modifications and improvements can be made by those skilled in the art without departing from the spirit and scope of the invention, and these modifications and improvements are also considered to be within the scope of the invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/436,911 US10042443B2 (en) | 2014-04-28 | 2014-08-20 | Wearable touch device and wearable touch method |
KR1020157012703A KR101790853B1 (ko) | 2014-04-28 | 2014-08-20 | 착용가능 터치 디바이스 및 착용가능 터치 방법 |
JP2017508720A JP6467495B2 (ja) | 2014-04-28 | 2014-08-20 | 着用型タッチ装置及び着用型タッチ方法 |
EP14859317.1A EP3139256B1 (en) | 2014-04-28 | 2014-08-20 | Wearable touch apparatus and wearable touch method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410175468.4 | 2014-04-28 | ||
CN201410175468.4A CN103995621B (zh) | 2014-04-28 | 2014-04-28 | 一种穿戴式触控装置和穿戴式触控方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015165175A1 true WO2015165175A1 (zh) | 2015-11-05 |
Family
ID=51309807
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/084820 WO2015165175A1 (zh) | 2014-04-28 | 2014-08-20 | 穿戴式触控装置和穿戴式触控方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US10042443B2 (zh) |
EP (1) | EP3139256B1 (zh) |
JP (1) | JP6467495B2 (zh) |
KR (1) | KR101790853B1 (zh) |
CN (1) | CN103995621B (zh) |
WO (1) | WO2015165175A1 (zh) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101524575B1 (ko) * | 2014-08-20 | 2015-06-03 | 박준호 | 웨어러블 디바이스 |
CN104536556B (zh) * | 2014-09-15 | 2021-01-15 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
WO2016060461A1 (en) | 2014-10-15 | 2016-04-21 | Jun Ho Park | Wearable device |
CN104317398B (zh) * | 2014-10-15 | 2017-12-01 | 天津三星电子有限公司 | 一种手势控制方法、穿戴式设备及电子设备 |
WO2016106481A1 (en) * | 2014-12-29 | 2016-07-07 | Empire Technology Development Llc | Quick command entry for wearable devices |
KR101586759B1 (ko) * | 2015-02-27 | 2016-01-19 | 주식회사 디엔엑스 | 웨어러블 장치 및 그 제어방법 |
US10599216B2 (en) | 2015-03-02 | 2020-03-24 | Tap Systems Inc. | Arbitrary surface and finger position keyboard |
KR20170028130A (ko) | 2015-09-03 | 2017-03-13 | 박준호 | 웨어러블 디바이스 |
CN105589607B (zh) * | 2016-02-14 | 2018-09-07 | 京东方科技集团股份有限公司 | 触控系统、触控显示系统和触控交互方法 |
US10638316B2 (en) * | 2016-05-25 | 2020-04-28 | Intel Corporation | Wearable computer apparatus with same hand user authentication |
US11009968B1 (en) | 2017-03-29 | 2021-05-18 | Tap Systems Inc. | Bi-directional tap communication device |
US10691205B1 (en) | 2017-08-29 | 2020-06-23 | Tap Systems Inc. | Tap device with dynamically switchable modes |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN201548938U (zh) * | 2009-10-29 | 2010-08-11 | 北京汇冠新技术股份有限公司 | 一种触摸屏及触摸系统 |
CN101859209A (zh) * | 2010-05-28 | 2010-10-13 | 程宇航 | 红外线检测装置和方法、红外线输入装置以及图形用户设备 |
CN101963868A (zh) * | 2009-07-22 | 2011-02-02 | 影来腾贸易(上海)有限公司 | 红外线扩展光源式多点触控系统 |
CN102915153A (zh) * | 2012-10-26 | 2013-02-06 | 苏州瀚瑞微电子有限公司 | 一种由非接触式手势动作实现鼠标功能的装置及方法 |
CN103546181A (zh) * | 2012-07-17 | 2014-01-29 | 高寿谦 | 可拆卸并可自由组合功能的穿戴式无线智能电子装置 |
US20140098067A1 (en) * | 2012-10-02 | 2014-04-10 | Autodesk, Inc. | Always-available input through finger instrumentation |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2740952B2 (ja) | 1988-04-13 | 1998-04-15 | オリエント時計株式会社 | 腕時計 |
US6771294B1 (en) | 1999-12-29 | 2004-08-03 | Petri Pulli | User interface |
FI20010387A0 (fi) * | 2001-02-27 | 2001-02-27 | Rto Holding Oy | Menetelmä ja järjestelmä tiedon tai valinnan syöttämiseksi |
DE102004044999A1 (de) * | 2004-09-16 | 2006-04-06 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Eingabesteuerung für Geräte |
US20080018591A1 (en) * | 2006-07-20 | 2008-01-24 | Arkady Pittel | User Interfacing |
KR20090061179A (ko) | 2007-12-11 | 2009-06-16 | 한국전자통신연구원 | 데이터 입력 장치 및 이를 이용한 데이터 처리 방법 |
SE534411C2 (sv) * | 2009-11-02 | 2011-08-09 | Stanley Wissmar | Elektronisk Finger Ring och tillverkning av densamme |
EP2378394A3 (en) * | 2010-04-15 | 2015-03-25 | Electronics and Telecommunications Research Institute | User interface device and method for recognizing user interaction using same |
CN203287855U (zh) * | 2010-05-24 | 2013-11-13 | 卡尼特·布迪派特 | 用于移动计算装置的虚拟输入装置的设备 |
KR20120071551A (ko) | 2010-12-23 | 2012-07-03 | 한국전자통신연구원 | 구조 영상을 이용한 사용자 인터랙션 장치 및 방법 |
JP2012208926A (ja) * | 2011-03-15 | 2012-10-25 | Nikon Corp | 検出装置、入力装置、プロジェクタ、及び電子機器 |
US10061387B2 (en) * | 2011-03-31 | 2018-08-28 | Nokia Technologies Oy | Method and apparatus for providing user interfaces |
KR101446902B1 (ko) | 2011-08-19 | 2014-10-07 | 한국전자통신연구원 | 사용자 인터랙션 장치 및 방법 |
JP2014048691A (ja) | 2012-08-29 | 2014-03-17 | Sharp Corp | リストバンド型入力デバイスおよびリストバンド型入力デバイスを用いた文字入力方法 |
CN105027030B (zh) | 2012-11-01 | 2018-10-23 | 艾卡姆有限公司 | 用于三维成像、映射、建网和界面连接的无线腕式计算和控制设备和方法 |
CN203178918U (zh) * | 2013-03-21 | 2013-09-04 | 联想(北京)有限公司 | 一种电子设备 |
JP2015041052A (ja) * | 2013-08-23 | 2015-03-02 | ソニー株式会社 | リストバンド型情報処理装置および記憶媒体 |
2014
- 2014-04-28 CN CN201410175468.4A patent/CN103995621B/zh active Active
- 2014-08-20 WO PCT/CN2014/084820 patent/WO2015165175A1/zh active Application Filing
- 2014-08-20 JP JP2017508720A patent/JP6467495B2/ja active Active
- 2014-08-20 US US14/436,911 patent/US10042443B2/en active Active
- 2014-08-20 EP EP14859317.1A patent/EP3139256B1/en active Active
- 2014-08-20 KR KR1020157012703A patent/KR101790853B1/ko active IP Right Grant
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101963868A (zh) * | 2009-07-22 | 2011-02-02 | 影来腾贸易(上海)有限公司 | 红外线扩展光源式多点触控系统 |
CN201548938U (zh) * | 2009-10-29 | 2010-08-11 | 北京汇冠新技术股份有限公司 | 一种触摸屏及触摸系统 |
CN101859209A (zh) * | 2010-05-28 | 2010-10-13 | 程宇航 | 红外线检测装置和方法、红外线输入装置以及图形用户设备 |
CN103546181A (zh) * | 2012-07-17 | 2014-01-29 | 高寿谦 | 可拆卸并可自由组合功能的穿戴式无线智能电子装置 |
US20140098067A1 (en) * | 2012-10-02 | 2014-04-10 | Autodesk, Inc. | Always-available input through finger instrumentation |
CN102915153A (zh) * | 2012-10-26 | 2013-02-06 | 苏州瀚瑞微电子有限公司 | 一种由非接触式手势动作实现鼠标功能的装置及方法 |
Non-Patent Citations (2)
Title |
---|
See also references of EP3139256A4 * |
XIANG, BINBIN: "Research on Touch Screen Key Technology Based on Image Detection Technology", Electronic Technology & Information Science, China Master's Theses Full-text Database, 15 July 2013 (2013-07-15), page 5, XP008182491 *
Also Published As
Publication number | Publication date |
---|---|
JP6467495B2 (ja) | 2019-02-13 |
CN103995621A (zh) | 2014-08-20 |
KR20150135761A (ko) | 2015-12-03 |
EP3139256B1 (en) | 2022-05-04 |
EP3139256A4 (en) | 2018-04-25 |
CN103995621B (zh) | 2017-02-15 |
JP2017514261A (ja) | 2017-06-01 |
US20160124524A1 (en) | 2016-05-05 |
EP3139256A1 (en) | 2017-03-08 |
US10042443B2 (en) | 2018-08-07 |
KR101790853B1 (ko) | 2017-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015165175A1 (zh) | 穿戴式触控装置和穿戴式触控方法 | |
KR101762050B1 (ko) | 착용가능 투영 장비 | |
US9978261B2 (en) | Remote controller and information processing method and system | |
EP3015955B1 (en) | Controlling multiple devices with a wearable input device | |
TWI546724B (zh) | 用以在通訊設備間傳送資訊項目之裝置及方法 | |
US20110090148A1 (en) | Wearable input device | |
KR102437106B1 (ko) | 마찰음을 이용하는 장치 및 방법 | |
US20170075433A1 (en) | Optical Projection Keyboard and Mouse | |
US9760758B2 (en) | Determining which hand is being used to operate a device using a fingerprint sensor | |
WO2019033322A1 (zh) | 手持式控制器、跟踪定位方法以及系统 | |
WO2015165187A1 (zh) | 穿戴式触控装置和穿戴式触控方法 | |
EP3139246A1 (en) | Control method and apparatus, electronic device, and computer storage medium | |
CN105630157A (zh) | 控制方法及控制装置、终端和控制系统 | |
JP6364790B2 (ja) | ポインティングデバイス | |
WO2016049842A1 (zh) | 一种便携或可穿戴智能设备的混合交互方法 | |
KR20160039589A (ko) | 손가락 센싱 방식을 이용한 무선 공간 제어 장치 | |
WO2012129958A1 (zh) | 一种手指鼠标 | |
KR20120051274A (ko) | 터치 감지영역모양 입력장치용 감지영역모양 | |
CN104423560B (zh) | 一种信息处理方法和电子设备 | |
CN110228065A (zh) | 机器人运动控制方法及装置 | |
TWI691888B (zh) | 觸控式輸入裝置 | |
KR100720647B1 (ko) | 반지형 포인팅 장치 | |
TW202427135A (zh) | 滑鼠裝置 | |
US20140111445A1 (en) | Cursor control device and cursor control system | |
KR101468273B1 (ko) | 다기능 입력 인터페이스 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 14436911 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2014859317 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014859317 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 20157012703 Country of ref document: KR Kind code of ref document: A Ref document number: 2017508720 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14859317 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |