WO2015165175A1 - Wearable touch device and wearable touch method - Google Patents

Wearable touch device and wearable touch method

Info

Publication number
WO2015165175A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
image
information
pattern
unit
Prior art date
Application number
PCT/CN2014/084820
Other languages
English (en)
French (fr)
Inventor
赵天月
陈炎顺
许秋实
李耀辉
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 filed Critical 京东方科技集团股份有限公司
Priority to US14/436,911 priority Critical patent/US10042443B2/en
Priority to KR1020157012703A priority patent/KR101790853B1/ko
Priority to JP2017508720A priority patent/JP6467495B2/ja
Priority to EP14859317.1A priority patent/EP3139256B1/en
Publication of WO2015165175A1 publication Critical patent/WO2015165175A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383Signal control means within the pointing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1662Details related to the integrated keyboard
    • G06F1/1673Arrangements for projecting a virtual keyboard
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0331Finger worn pointing device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present invention relates to the field of control technologies, and in particular to a wearable touch device and a wearable touch method.
  • wearable technology is the science and technology that explores and creates devices that can be worn directly on the body or integrated into the user's clothing or accessories.
  • Wearable smart devices have become the new favorite of technology and the development direction of smart devices in the future.
  • Wearable smart devices are a general term for wearable devices (such as glasses, gloves, watches, apparel, etc.) developed by applying wearable technology to the intelligent design of everyday wear.
  • Wearable smart devices include: full-featured, relatively large devices that can realize complete or partial smart functions without relying on a smartphone (such as smart watches or smart glasses); and devices that focus on only one type of smart application function and need to be used together with other devices such as smartphones (such as various smart bracelets for sign monitoring, smart jewelry, etc.).
  • existing wearable touch devices generally use button-based control, and even when a touch screen is used to realize touch control, given the small size required of a wearable touch device, an ordinary touch screen cannot satisfy users' needs well.
  • moreover, existing wearable touch devices generally require the touch end to directly contact the device (that is, its buttons or touch screen) to perform a touch, so the touch experience of such wearable devices is poor. Therefore, providing a convenient wearable touch device and method that enable a user to complete touch control without buttons or a touch screen is a technical problem to be solved.
  • the present invention provides a wearable touch device and a wearable touch method: the device enables the user to complete touch control without buttons or a touch screen and is compact and portable, and the corresponding wearable touch method is accurate in touch and widely applicable.
  • the technical solution for solving the technical problem to be solved by the present invention is a wearable touch device, comprising a carrier, a pattern transmitting unit, an image collecting unit, an image monitoring unit and a processing unit, wherein: the carrier can be worn;
  • the pattern emitting unit is configured to emit a scanning pattern to a touch surface touchable by the touch end;
  • the image collecting unit is configured to collect an image formed by the scanning pattern on the touch surface, and
  • the image information of the captured image is sent to the processing unit;
  • the image monitoring unit is configured to monitor the current light energy of the scan pattern in each region of the touch surface, and to send current light energy information to the processing unit;
  • the processing unit is configured to process the image information of the captured image and the current light energy information to determine a touch position of the touch end on the touch surface and generate corresponding command information.
  • the pattern emitting unit and the image collecting unit are both disposed on the carrier and located in the same area of the carrier.
  • the pattern emitting unit is an infrared projection pattern transmitting unit
  • the image collecting unit is an infrared image collecting unit
  • the image monitoring unit is an infrared image monitoring unit.
  • the scanning pattern emitted by the pattern emitting unit is a grid-shaped pattern.
  • the processing unit includes a comparison module and a command module, the image information of the captured image includes image coordinate information of the captured image, and the processing unit pre-stores a position-coordinate-to-command-information mapping table together with the initial light energy information and pattern coordinate information of the scan pattern;
  • the comparison module is configured to compare the image coordinate information of the captured image and the current light energy information with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, thereby obtaining determined touch position coordinate information;
  • the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position-coordinate-to-command-information mapping table.
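For illustration, the command module's lookup step can be sketched as follows; the coordinate keys and command names are invented placeholders, since the patent does not specify the table's contents:

```python
from typing import Dict, Optional, Tuple

Coordinate = Tuple[int, int]

# Hypothetical pre-stored position-coordinate-to-command-information table;
# the actual coordinates and commands are device-specific and not given in
# the patent.
COMMAND_TABLE: Dict[Coordinate, str] = {
    (2, 3): "VOLUME_UP",
    (2, 4): "VOLUME_DOWN",
    (5, 1): "NEXT_TRACK",
}

def to_command(touch_position: Optional[Coordinate]) -> Optional[str]:
    """Command module: convert determined touch position coordinates into
    the corresponding command information via the pre-stored table."""
    if touch_position is None:
        return None  # the comparison module found no touch
    return COMMAND_TABLE.get(touch_position)
```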
  • the wearable touch device further includes an execution unit, and the execution unit is configured to perform a corresponding action according to the received command information.
  • the touch end is a human finger, and the touch surface touchable by the touch end is an area that a human finger can touch.
  • the pattern emitting unit, the image collecting unit, the image monitoring unit, and the processing unit are all disposed on the carrier.
  • the pattern emitting unit, the image collecting unit and the image monitoring unit are all disposed on an outer surface of the carrier near the palm side.
  • the outer surfaces of the pattern emitting unit, the image collecting unit and the image monitoring unit are respectively flush with the outer surface of the carrier.
  • the wearable touch device includes a plurality of carriers, the pattern emitting unit and the image collection unit are disposed on one carrier, and the image monitoring unit and the processing unit are disposed on another carrier.
  • the carrier is a closed ring, and the carrier is formed from a PVC material.
  • the technical solution for solving the technical problem to be solved by the present invention is also a wearable touch method, which includes the following steps: emitting a scan pattern onto a touch surface touchable by the touch end; collecting an image formed by the scan pattern projected on the touch surface, and simultaneously monitoring current light energy information of the scan pattern in each region of the touch surface; and processing image information of the captured image and the current light energy information to determine the touch position of the touch end on the touch surface and generate corresponding command information.
  • the image information of the captured image includes image coordinate information of the captured image, a position-coordinate-to-command-information mapping table, the initial light energy information of the scan pattern and the pattern coordinate information are preset, and the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes the following steps: comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change of the current light energy information relative to the initial light energy information is greater than or equal to a set value, preliminarily determining that the region includes an approximate touch point; comparing the image coordinate information of the captured image of that region with the pattern coordinate information of the scan pattern, and when a change occurs between the captured image and the scan pattern at a corresponding coordinate, determining that coordinate point to be the actual touch point, thereby obtaining the coordinates of the touch position; and obtaining the command information corresponding to the coordinates of the touch position according to the position-coordinate-to-command-information mapping table.
  • the set value is 5%.
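A minimal sketch of the first stage of this determination, assuming the 5% set value is measured as a relative change against the region's initial light energy (the claims state the threshold but not the normalization):

```python
def is_approximate_touch_region(initial_energy: float,
                                current_energy: float,
                                set_value: float = 0.05) -> bool:
    """Stage 1: a region is preliminarily determined to contain an
    approximate touch point when its light energy changes by at least
    the set value (5%) relative to the initial light energy."""
    if initial_energy <= 0.0:
        return False  # region received no scan-pattern light initially
    change = abs(current_energy - initial_energy) / initial_energy
    return change >= set_value
```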
  • the wearable touch method further includes the step of: performing a corresponding action according to the command information corresponding to the coordinates of the touch position.
  • the wearable touch device can be a closed-ring smart wearable device worn on a human finger; the pattern emitting unit projects a specific pattern onto the touch surface touchable by the touch end (for example, a human finger).
  • when the finger performs a touch, the image collection unit collects the image formed by the specific pattern where the finger is located, and the image monitoring unit monitors the current light energy of the formed image.
  • the processing unit analyzes and processes the current light energy information and the image information of the captured image, determines the coordinates of the touch position of the finger, and judges the precise touch position of the finger, thereby performing touch feedback.
  • the wearable touch device enables the user to complete touch control without buttons or a touch screen, and is compact and portable; the corresponding wearable touch method is accurate in touch and widely applicable.
  • FIG. 1 is a schematic structural view of a wearable touch device according to Embodiment 1 of the present invention.
  • FIG. 2 is a schematic view showing the operation of the wearable touch device of FIG. 1.
  • FIG. 3 is a schematic diagram of touch control performed with the wearable touch device of FIG. 1.
  • FIG. 4 is a flow chart of touch control of the wearable touch method in Embodiment 1 of the present invention.
  • a wearable touch device is provided, including a carrier, a pattern emitting unit, an image collecting unit, an image monitoring unit, and a processing unit, wherein: the carrier is wearable; the pattern emitting unit is configured to emit a scan pattern onto a touch surface touchable by the touch end; the image collection unit is configured to collect an image formed by the scan pattern on the touch surface, and to send image information of the captured image to the processing unit; the image monitoring unit is configured to monitor the current light energy of the scan pattern in each area of the touch surface, and to send current light energy information to the processing unit; and the processing unit is configured to process the image information of the captured image and the current light energy information, to determine the touch position of the touch end on the touch surface and generate corresponding command information.
  • a wearable touch method is provided, including the steps of: emitting a scan pattern onto a touch surface touchable by a touch end; collecting the image formed by the scan pattern projected on the touch surface, and simultaneously monitoring the current light energy of the scan pattern in each area of the touch surface; and processing the image information of the collected image and the current light energy information, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information.
  • the user can thus perform touch without directly contacting the wearable touch device; the wearable touch device is compact and portable, and the corresponding wearable touch method is accurate in touch and widely applicable.
  • Embodiment 1
  • This embodiment provides a wearable touch device and a corresponding wearable touch method.
  • the wearable touch device in this embodiment includes a carrier 1, a pattern emitting unit 2, an image collecting unit 3, an image monitoring unit 4, and a processing unit 5.
  • the carrier 1 can be worn, for example, on a human finger.
  • the pattern emitting unit 2 is configured to emit a scanning pattern onto a touch surface that can be touched by the touch end.
  • the image collection unit 3 is configured to collect an image formed by the scan pattern on the touch surface, and send the image information of the collected image to the processing unit 5.
  • the image monitoring unit 4 is configured to monitor current light energy information of the scan pattern in each area of the touch surface, and send the current light energy information to the processing unit 5.
  • the processing unit 5 is configured to process the image information of the captured image and the current light energy information to determine a touch position of the touch end on the touch surface and generate corresponding command information.
  • both the pattern emitting unit 2 and the image collecting unit 3 are disposed on the carrier 1 and located in the same area of the carrier 1; that is, the emitting part of the pattern and the receiving part of the image are included in the same area, to facilitate fixed-point recognition and comparison between the emitted scan pattern and the received captured image.
  • the carrier 1 is preferably a closed ring, the image monitoring unit 4 is also disposed on the carrier 1, the pattern emitting unit 2 and the image collecting unit 3 are disposed at one end of the carrier 1, and the image monitoring unit 4 is disposed at the opposite end.
  • the pattern emitting unit 2 is an infrared projection type pattern emitting unit
  • the image collecting unit 3 is an infrared image collecting unit, such as a camera. It can be understood that the camera can be a rotatable camera, so that normal operation of the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 is ensured when the relative position of the touch device and the touch surface changes.
  • the image monitoring unit 4 is an infrared image monitoring unit, such as an optical detection sensor.
  • the scanning pattern emitted by the pattern emitting unit 2 is, for example, a grid-shaped pattern.
  • that is, in this embodiment, the scanning pattern emitted by the pattern emitting unit 2 is an infrared grid pattern, i.e., an infrared matrix pattern. Since matching infrared units are used for the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4, the touch end can perform touch without direct contact with the touch device when the wearable touch device is used, making touch more flexible.
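As a toy stand-in for the pre-stored pattern coordinate information of such a grid (the pitch and coordinate convention below are assumptions, not from the patent), the grid intersections could be enumerated as:

```python
import itertools
from typing import List, Tuple

def grid_pattern_coordinates(rows: int, cols: int,
                             pitch: float) -> List[Tuple[float, float]]:
    """Enumerate the (x, y) intersection coordinates of a rows x cols
    infrared grid with a uniform pitch between grid lines."""
    return [(c * pitch, r * pitch)
            for r, c in itertools.product(range(rows), range(cols))]

# Example: a 10 x 10 grid with an assumed 4 mm spacing projected on the palm.
palm_grid = grid_pattern_coordinates(10, 10, 4.0)
```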
  • each of the above units can be implemented with a small microchip so as to have a small volume, thereby ensuring the compact size and portability of the wearable touch device.
  • it is also preferable that the carrier 1 be formed of a PVC material.
  • this material is strong and insulating, so that the structure and position of each unit disposed in it are stable, ensuring the effective operation of the wearable touch device and the safety of the human body.
  • the touch end may or may not be in direct contact with the touch surface,
  • and the touch surface can be a regular plane, an irregular plane, or a curved surface. Any plane, irregular plane with bumps, or curved surface can be regarded as a plane with corresponding coordinate information when its area is small enough. It can be understood that when the touch end is not in direct contact with the touch surface, although the touch position can be determined from the current light energy information and the image information of the captured image, the touch end may be at some distance from the touch surface, and the projection of the touch end on the touch surface deviates somewhat from the point actually to be touched, so the calculated touch position may have a certain positioning error. It is easy to understand that when the touch end is in direct contact with the touch surface, the projection of the touch end on the touch surface essentially coincides with the point actually to be touched, so the touch positioning is more accurate.
  • in one application example, the touch end 7 (see FIG. 3) is a finger of one hand, the carrier 1 is worn on a finger of the other hand, and the area touchable by the touch end 7 is the palm of the hand wearing the carrier 1.
  • in another application example, the touch end 7 is a finger of one hand, the carrier 1 is worn on the finger serving as the touch end 7, and the area touchable by the touch end 7 is the palm of the other hand. In general, the closer the carrier 1 is to the area touchable by the touch end 7 or to the touch end 7 itself, the more accurate the light energy change obtained by the image monitoring unit 4, and the more precise the touch positioning.
  • the pattern emitting unit 2, the image collecting unit 3, the image monitoring unit 4, and the processing unit 5 are all disposed in the carrier 1, and the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 are all disposed on the outer surface layer of the carrier 1 on the side close to the palm.
  • the outer surfaces of the pattern emitting unit 2, the image collecting unit 3 and the image monitoring unit 4 are flush with the outer surface of the carrier 1.
  • the processing unit 5 further includes a comparison module and a command module (not shown in FIG. 1). The image information of the captured image includes image coordinate information of the captured image, and the processing unit 5 pre-stores the position-coordinate-to-command-information mapping table, the initial light energy information of the scan pattern, and the pattern coordinate information. The comparison module is configured to compare the image coordinate information and the current light energy information of the captured image with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, to obtain determined touch position coordinate information;
  • the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position-coordinate-to-command-information mapping table.
  • the processing unit 5 can be implemented using a microprocessor such as a microcontroller.
  • the wearable touch device further includes an execution unit, and the execution unit is configured to perform a corresponding action according to the received command information.
  • for example, as shown in FIG. 2, the pattern emitting unit 2 (for example, an infrared grid projection unit) and the image collecting unit 3 (for example, an infrared camera) are both mounted on the palm-facing side of the carrier 1. When the wearable touch device is in a standby or working state, the pattern emitting unit 2 continuously emits an infrared grid projection toward the touch surface (for example, the palm of the hand wearing the carrier 1); for example, the infrared grid 6 is projected onto the palm of the hand wearing the carrier 1, which serves as the touch surface 8 (see FIG. 3).
  • as shown in FIG. 3, when the touch surface 8 is touched and the touch end 7 (for example, a human finger) falls into an area within the infrared grid, it blocks part of the infrared light that forms the infrared grid pattern.
  • on the one hand, the infrared grid pattern where the finger is located is occluded and its light energy changes (i.e., a loss of light is produced), and the current light energy is collected by the image monitoring unit 4 (for example, an optical detection sensor), that is, the current light energy information is monitored; on the other hand, the image collection unit 3 collects the change of the infrared grid pattern where the finger is located, that is, the image formed by the infrared grid pattern projected on the touch surface 8.
  • the initial light energy information of the scan pattern and the pattern coordinate information of the scan pattern (in this embodiment, infrared grid coordinates) are preset in the processing unit 5. The current light energy information monitored by the image monitoring unit 4 and the image coordinate information of the image collected by the image collection unit 3 are both fed back to the processing unit 5, and the comparison module in the processing unit 5 processes and analyzes them: if the amount of change of the current light energy information of a certain region relative to the initial light energy information is greater than or equal to a set value (for example, 5%), it is preliminarily determined that the region includes an approximate touch point; further, the comparison module compares the image coordinate information of the captured image with the pattern coordinate information of the scan pattern, and when a change occurs between the captured image and the scan pattern at a corresponding coordinate, that coordinate point is determined to be the actual touch point and the coordinates of the corresponding touch position are obtained; then, according to the position-coordinate-to-command-information mapping table, the command information corresponding to those coordinates is acquired.
  • with this wearable touch device, when the finger performs a touch, the image monitoring unit senses the light energy change and transmits the light energy change information to the processing unit, so that the touch action can be confirmed, and a region is preliminarily determined to include the touch position according to the relationship between the light energy change and the set value.
  • at the same time, the image collection unit acquires the projected image formed by the finger occluding the scan pattern and accordingly records the image coordinate information of the captured image; this image coordinate information is compared with the pattern coordinate information of the scan pattern to determine the corresponding coordinates at which the captured image changes relative to the scan pattern, resulting in a precise touch position.
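A rough sketch of this second stage, assuming the captured image and the stored scan pattern are both reduced to per-grid-coordinate intensities (a simplification; the patent leaves the image representation open):

```python
from typing import Dict, Optional, Tuple

Coordinate = Tuple[int, int]

def locate_touch_point(pattern: Dict[Coordinate, float],
                       captured: Dict[Coordinate, float],
                       min_diff: float = 0.05) -> Optional[Coordinate]:
    """Stage 2: compare the captured image with the scan pattern at each
    grid coordinate; the first coordinate whose intensity changed is
    taken as the actual touch point."""
    for coordinate, expected in pattern.items():
        observed = captured.get(coordinate, 0.0)
        if abs(observed - expected) >= min_diff:
            return coordinate
    return None  # no change: no actual touch point in this region
```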
  • it should be understood that the pattern emitting unit 2, the image collecting unit 3, and the image monitoring unit 4 may be units of types other than infrared, as long as they match each other; their specific form is not limited.
  • the specific pattern emitted by the pattern emitting unit 2 may also be a pattern other than the grid pattern; in a specific application, it may be set flexibly, and details are not repeated here.
  • correspondingly, the present embodiment provides a wearable touch method, including the following steps: emitting a scan pattern onto the touch surface touchable by the touch end; collecting the image formed by the scan pattern projected on the touch surface, and simultaneously monitoring the current light energy information of the scan pattern in each region of the touch surface; processing the image information of the captured image and the current light energy information to obtain the image change and the light energy change, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information; and further, executing the corresponding command action according to the command information corresponding to the touch position.
  • the image information of the captured image includes image coordinate information of the captured image, a position-coordinate-to-command-information mapping table, the initial light energy information of the scan pattern and the pattern coordinate information are preset, and the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes the following steps: comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change is greater than or equal to the set value, preliminarily determining that the region includes an approximate touch point; comparing the image coordinate information of the captured image in that region with the pattern coordinate information of the scan pattern, and when a change occurs between them at a corresponding coordinate, determining that coordinate point to be the actual touch point, thereby obtaining the coordinates of the touch position; and acquiring the command information corresponding to those coordinates according to the mapping table.
  • FIG. 4 shows a touch flow chart of the wearable touch method.
  • the wearable touch device is worn, for example, on a human finger via the carrier 1, and the initial light energy information and pattern coordinate information of the specific pattern are set during power-on initialization.
  • the pattern emitting unit 2 projects an infrared grid pattern toward, for example, the palm serving as the touch surface 8.
  • when no touch end 7 (for example, a human finger) performs a touch, the wearable touch device remains in a standby state.
  • when a human finger performs a touch, the infrared grid pattern changes; the image collection unit 3 (for example, a camera) collects the image formed by the projection of the infrared grid pattern where the finger is located, while the image monitoring unit 4 (for example, an optical detection sensor) monitors the current light energy information in the infrared grid pattern.
  • the image coordinate information and the current light energy information of the captured image formed when the human finger performs the touch are transmitted to the processing unit 5 for processing and analysis.
  • first, the current light energy information is compared with the initial light energy information to confirm that a touch action has occurred, and a region is preliminarily determined to be an approximate region including the touch position according to the relationship between the light energy change and the set value; then, according to the comparison between the image coordinate information of the captured image and the pattern coordinate information of the scan pattern, when a change occurs between them at a corresponding coordinate (for example, a center coordinate), that coordinate point is determined to be the actual touch point, thereby obtaining the coordinates of the precise touch position.
  • finally, the touch command is executed according to the command information corresponding to the coordinates of the touch position.
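Combining the earlier sketches, one pass of this flow might look as follows; this is a hedged illustration reusing the helper functions defined above, and the data shapes are assumptions rather than the patent's specification:

```python
from typing import Dict, Optional, Tuple

Coordinate = Tuple[int, int]

def process_touch_frame(initial_energy: float,
                        current_energy: float,
                        pattern: Dict[Coordinate, float],
                        captured: Dict[Coordinate, float]) -> Optional[str]:
    """One pass of the touch flow: light-energy threshold check, then
    coordinate comparison, then mapping-table lookup. Returns command
    information, or None when the device should stay in standby."""
    if not is_approximate_touch_region(initial_energy, current_energy):
        return None  # no light energy change: remain in standby
    touch_position = locate_touch_point(pattern, captured)
    return to_command(touch_position)
```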
  • in Embodiment 2, the units in the wearable touch device can be disposed separately on more than one carrier; that is, in Embodiment 1 the pattern emitting unit, the image collecting unit, the image monitoring unit, and the processing unit are disposed in the same carrier, whereas in this embodiment the pattern emitting unit and the image collecting unit may be disposed in one carrier, while the image monitoring unit and the processing unit are disposed in another carrier.
  • the carrier comprises a first carrier and a second carrier, the first carrier and the second carrier being formed of the same material.
  • the pattern emitting unit and the image collecting unit may be disposed on the first carrier and located in the same area of the first carrier, while the image monitoring unit is disposed on the second carrier; the first carrier and the second carrier may be worn on the same hand or on two hands of the same person, or may be worn on different people's hands.
  • each unit in the wearable touch device of this embodiment is the same as the configuration of the corresponding unit in Embodiment 1, and details are not described herein again.
  • the corresponding wearable touch method of this embodiment is the same as that of Embodiment 1: in both, the pattern emitting unit emits an infrared matrix pattern; when a human finger touches certain points of the infrared matrix pattern on the touch surface, the image monitoring unit detects the loss of infrared light at these points (a light energy change); then, according to the change of the infrared grid pattern where the finger is located as collected by the image collection unit, the processing unit analyzes the light energy change and the pattern change to determine the specific touch position, obtain accurate touch point coordinates, and acquire the command information corresponding to those coordinates.
  • the wearable touch device of the present invention may be a closed-loop smart wearable device that can be worn on a human finger.
  • the pattern emitting unit projects a specific pattern to the touch surface touched by the touch end.
  • the image collection unit collects the image formed by the projection of the specific pattern where the finger is located, the image monitoring unit monitors the current light energy information of that image, and the processing unit analyzes and processes the image information of the captured image and the current light energy information, determines the touch position coordinates of the finger, and judges the precise touch position of the finger, thereby performing touch feedback.
  • the wearable touch device enables the user to complete touch control without buttons or a touch screen, and is compact and portable; the corresponding wearable touch method is accurate in touch and widely applicable. The above are exemplary embodiments, and the invention is not limited thereto; various modifications and improvements can be made by those skilled in the art without departing from the spirit and scope of the invention, and these modifications and improvements are also considered to be within the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A wearable touch device and a wearable touch method. The wearable touch device comprises a carrier (1), a pattern emitting unit (2), an image collection unit (3), an image monitoring unit (4) and a processing unit (5). The carrier (1) is wearable; the pattern emitting unit (2) emits a scan pattern onto a touch surface touchable by a touch end; the image collection unit (3) collects the image formed by the scan pattern projected on the touch surface and sends image information of the collected image to the processing unit (5); the image monitoring unit (4) monitors the current light energy of the scan pattern in each region of the touch surface and sends current light energy information to the processing unit; and the processing unit (5) processes the image information of the collected image and the current light energy information to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

Description

Wearable touch device and wearable touch method

Technical Field

The present invention belongs to the field of control technologies, and specifically relates to a wearable touch device and a wearable touch method.

Background Art

With the development of science and technology, wearable technology has emerged. In short, wearable technology is the science and technology of exploring and creating devices that can be worn directly on the body or integrated into the user's clothing or accessories. Wearable smart devices have become the new favorite of technology and the development direction of future smart devices.

Wearable smart devices are a general term for wearable devices (such as glasses, gloves, watches, apparel, etc.) developed by applying wearable technology to the intelligent design of everyday wear. Wearable smart devices include: full-featured, relatively large devices that can realize complete or partial smart functions without relying on a smartphone (such as smart watches or smart glasses); and devices that focus on only one type of smart application function and need to be used together with other devices such as smartphones (such as various smart bracelets for sign monitoring, smart jewelry, etc.). With advances in technology and changes in user demand, the forms and application hotspots of wearable smart devices are constantly changing.

However, existing wearable touch devices generally use button-based control. Even when a touch screen is used to realize touch control, given the small size required of a wearable touch device, an ordinary touch screen cannot satisfy users' needs well. Moreover, existing wearable touch devices generally require the touch end to be in direct contact with the device (that is, with its buttons or touch screen) to perform a touch, so the touch experience of such wearable devices is poor. Therefore, providing a convenient wearable touch device and method that enable a user to complete touch control without buttons or a touch screen has become a technical problem to be urgently solved.

Summary of the Invention

In view of the above deficiencies, the present invention provides a wearable touch device and a wearable touch method. The wearable touch device enables the user to complete touch control without buttons or a touch screen and is compact and portable; the corresponding wearable touch method is accurate in touch and widely applicable.

The technical solution adopted to solve the technical problem to be solved by the present invention is a wearable touch device, comprising a carrier, a pattern emitting unit, an image collection unit, an image monitoring unit and a processing unit, wherein: the carrier is wearable; the pattern emitting unit is configured to emit a scan pattern onto a touch surface touchable by a touch end; the image collection unit is configured to collect an image formed by the scan pattern projected on the touch surface, and to send image information of the collected image to the processing unit; the image monitoring unit is configured to monitor the current light energy of the scan pattern in each region of the touch surface, and to send current light energy information to the processing unit; and the processing unit is configured to process the image information of the collected image and the current light energy information, so as to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

Preferably, the pattern emitting unit and the image collection unit are both disposed on the carrier and located in the same region of the carrier.

Preferably, the pattern emitting unit is an infrared projection type pattern emitting unit, the image collection unit is an infrared image collection unit, and the image monitoring unit is an infrared image monitoring unit.

Preferably, the scan pattern emitted by the pattern emitting unit is a grid-shaped pattern.

Preferably, the processing unit includes a comparison module and a command module, the image information of the collected image includes image coordinate information of the collected image, and a position-coordinate-to-command-information mapping table, initial light energy information of the scan pattern and pattern coordinate information are pre-stored in the processing unit;

the comparison module is configured to compare the image coordinate information of the collected image and the current light energy information with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, so as to obtain determined touch position coordinate information;

the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position-coordinate-to-command-information mapping table.

Preferably, the wearable touch device further includes an execution unit, and the execution unit is configured to execute a corresponding action according to the received command information.

Preferably, the touch end is a human finger, and the touch surface touchable by the touch end is a region that a human finger can touch.

Preferably, the pattern emitting unit, the image collection unit, the image monitoring unit and the processing unit are all disposed on the carrier.

Preferably, the pattern emitting unit, the image collection unit and the image monitoring unit are all disposed on the outer surface layer of the carrier on the side close to the palm.

Preferably, the outer surfaces of the pattern emitting unit, the image collection unit and the image monitoring unit are respectively flush with the outer surface of the carrier.

Preferably, the wearable touch device includes a plurality of carriers, the pattern emitting unit and the image collection unit are disposed on one carrier, and the image monitoring unit and the processing unit are disposed on another carrier.

Preferably, the carrier is a closed ring, and the carrier is formed of a PVC material.

The technical solution adopted to solve the technical problem to be solved by the present invention is also a wearable touch method, comprising the following steps: emitting a scan pattern onto a touch surface touchable by a touch end; collecting an image formed by the scan pattern projected on the touch surface, and simultaneously monitoring current light energy information of the scan pattern in each region of the touch surface; and processing image information of the collected image and the current light energy information, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information.

Preferably, the image information of the collected image includes image coordinate information of the collected image, a position-coordinate-to-command-information mapping table, initial light energy information of the scan pattern and pattern coordinate information are preset, and the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes the following steps: comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change of the current light energy information relative to the initial light energy information is greater than or equal to a set value, preliminarily determining that the region includes an approximate touch point; comparing the image coordinate information of the collected image of that region with the pattern coordinate information of the scan pattern, and when a change occurs between the collected image and the scan pattern at a corresponding coordinate, determining that coordinate point to be the actual touch point, thereby obtaining the coordinates of the touch position; and acquiring, according to the position-coordinate-to-command-information mapping table, the command information corresponding to the coordinates of the touch position.

Preferably, the set value is 5%.

Preferably, the wearable touch method further includes the step of: executing a corresponding action according to the command information corresponding to the coordinates of the touch position.

In embodiments of the present invention, the wearable touch device may be a closed-ring smart wearable device wearable on a human finger. The pattern emitting unit projects a specific pattern onto the touch surface touchable by the touch end (for example, a human finger); when the finger performs a touch, the image collection unit collects the image formed by the specific pattern projected on the touch surface where the finger is located, and the image monitoring unit monitors the current light energy information of the formed image; the processing unit analyzes and processes the current light energy information and the image information of the collected image, determines the coordinates of the touch position of the finger, and judges the precise touch position of the finger, thereby performing touch feedback. The wearable touch device enables the user to complete touch control without buttons or a touch screen and is compact and portable; the corresponding wearable touch method is accurate in touch and widely applicable.

Brief Description of the Drawings

FIG. 1 is a schematic structural view of the wearable touch device in Embodiment 1 of the present invention. FIG. 2 is a schematic view of the operation of the wearable touch device in FIG. 1.

FIG. 3 is a schematic view of touch control performed with the wearable touch device in FIG. 1. FIG. 4 is a flow chart of touch control of the wearable touch method in Embodiment 1 of the present invention.

Detailed Description

To enable those skilled in the art to better understand the technical solutions of the present invention, the wearable touch device and wearable touch method of the present invention are described in further detail below with reference to the accompanying drawings and specific embodiments.

According to one aspect of the present invention, a wearable touch device is provided, comprising a carrier, a pattern emitting unit, an image collection unit, an image monitoring unit and a processing unit, wherein: the carrier is wearable; the pattern emitting unit is configured to emit a scan pattern onto a touch surface touchable by a touch end; the image collection unit is configured to collect an image formed by the scan pattern projected on the touch surface, and to send image information of the collected image to the processing unit; the image monitoring unit is configured to monitor the current light energy of the scan pattern in each region of the touch surface, and to send current light energy information to the processing unit; and the processing unit is configured to process the image information of the collected image and the current light energy information, so as to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

According to another aspect of the present invention, a wearable touch method is provided, comprising the following steps: emitting a scan pattern onto a touch surface touchable by a touch end; collecting an image formed by the scan pattern projected on the touch surface, and simultaneously monitoring the current light energy information of the scan pattern in each region of the touch surface; and processing image information of the collected image and the current light energy information, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information.

In embodiments of the present invention, the user can perform touch control without directly contacting the wearable touch device; the wearable touch device is compact and portable, and the corresponding wearable touch method is accurate in touch and widely applicable.

Embodiment 1:

This embodiment provides a wearable touch device and a corresponding wearable touch method.

As shown in FIG. 1, the wearable touch device in this embodiment includes a carrier 1, a pattern emitting unit 2, an image collection unit 3, an image monitoring unit 4 and a processing unit 5.

The carrier 1 is wearable, for example, on a human finger.

The pattern emitting unit 2 is configured to emit a scan pattern onto a touch surface touchable by the touch end.

The image collection unit 3 is configured to collect the image formed by the scan pattern projected on the touch surface, and to send image information of the collected image to the processing unit 5. The image monitoring unit 4 is configured to monitor the current light energy information of the scan pattern in each region of the touch surface, and to send the current light energy information to the processing unit 5.

The processing unit 5 is configured to process the image information of the collected image and the current light energy information, so as to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

Preferably, the pattern emitting unit 2 and the image collection unit 3 are both disposed on the carrier 1 and located in the same region of the carrier 1; that is, the emitting part of the pattern and the receiving part of the image are included in the same region, to facilitate fixed-point recognition and comparison between the emitted scan pattern and the received collected image. In this embodiment, as shown in FIG. 1, the carrier 1 is preferably a closed ring, the image monitoring unit 4 is also disposed on the carrier 1, the pattern emitting unit 2 and the image collection unit 3 are disposed at one end of the carrier 1, and the image monitoring unit 4 is disposed at the opposite end.

The pattern emitting unit 2 is an infrared projection type pattern emitting unit, and the image collection unit 3 is an infrared image collection unit, for example a camera. It can be understood that the camera may be a rotatable camera, so as to ensure the normal operation of the pattern emitting unit 2, the image collection unit 3 and the image monitoring unit 4 when the relative position of the touch device and the touch surface changes. The image monitoring unit 4 is an infrared image monitoring unit, for example an optical detection sensor. In addition, the scan pattern emitted by the pattern emitting unit 2 is, for example, a grid-shaped pattern; that is, in this embodiment, the scan pattern emitted by the pattern emitting unit 2 is an infrared grid pattern, i.e. an infrared matrix pattern. Since matching infrared-type units are used for the pattern emitting unit 2, the image collection unit 3 and the image monitoring unit 4, the touch end can perform touch without direct contact with the touch device when this wearable touch device is used, making touch more flexible.

Preferably, each of the above units can be implemented with a small microchip so as to have a small volume, thereby ensuring the compact size and portability of the wearable touch device.

It is also preferable that the carrier 1 be formed of a PVC material. This material is strong and insulating, so that the structure and position of each unit disposed in it are stable, ensuring the effective operation of the wearable touch device and the safety of the human body.

In this embodiment, the touch end may or may not be in direct contact with the touch surface, and the touch surface may be a regular plane, an irregular plane or a curved surface. Any plane, irregular plane with bumps, or curved surface can be regarded as a plane with corresponding coordinate information when its area is small enough. It can be understood that when the touch end is not in direct contact with the touch surface, although the touch position can be determined from the current light energy information and the image information of the collected image, the touch end may be at some distance from the touch surface, and the projection of the touch end on the touch surface deviates somewhat from the point actually to be touched, so the calculated touch position may have a certain positioning error. It is easy to understand that when the touch end is in direct contact with the touch surface, the projection of the touch end on the touch surface essentially coincides with the point actually to be touched, so the touch positioning is more accurate.

In one application example, the touch end 7 (see FIG. 3) is a finger of one hand, the carrier 1 is worn on a finger of the other hand, and the region touchable by the touch end 7 is the palm of the hand wearing the carrier 1. In another application example, the touch end 7 is a finger of one hand, the carrier 1 is worn on the finger serving as the touch end 7, and the region touchable by the touch end 7 is the palm of the other hand. In general, the closer the carrier 1 is to the region touchable by the touch end 7 or to the touch end 7 itself, the more accurate the light energy change obtained by the image monitoring unit 4, and the more precise the touch positioning.

Accordingly, the pattern emitting unit 2, the image collection unit 3, the image monitoring unit 4 and the processing unit 5 are all disposed in the carrier 1, and the pattern emitting unit 2, the image collection unit 3 and the image monitoring unit 4 are all disposed on the outer surface layer of the carrier 1 on the side close to the palm. Preferably, the outer surfaces of the pattern emitting unit 2, the image collection unit 3 and the image monitoring unit 4 are flush with the outer surface of the carrier 1.

In the wearable touch device of this embodiment, the processing unit 5 further includes a comparison module and a command module (not shown in FIG. 1); the image information of the collected image includes image coordinate information of the collected image; the position-coordinate-to-command-information mapping table, the initial light energy information of the scan pattern and the pattern coordinate information are pre-stored in the processing unit 5; the comparison module is configured to compare the image coordinate information of the collected image and the current light energy information with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, so as to obtain determined touch position coordinate information; and the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position-coordinate-to-command-information mapping table. Preferably, the processing unit 5 can be implemented with a microprocessor (for example, a microcontroller). Further, the wearable touch device also includes an execution unit, and the execution unit is configured to execute a corresponding action according to the received command information.

For example, as shown in FIG. 2, the pattern emitting unit 2 (for example, an infrared grid projection unit) and the image collection unit 3 (for example, an infrared camera) in the carrier 1 are both mounted on the side of the carrier 1 close to the palm. When the wearable touch device is in a standby or working state, the pattern emitting unit 2 continuously emits an infrared grid projection toward the touch surface (for example, the palm of the hand wearing the carrier 1); for example, the infrared grid 6 is projected onto the palm of the hand wearing the carrier 1, which serves as the touch surface 8 (see FIG. 3). As shown in FIG. 3, when the touch surface 8 is touched and the touch end 7 (for example, a human finger) falls into a region within the infrared grid, it blocks part of the infrared light forming the infrared grid pattern. On the one hand, the infrared grid pattern where the finger is located is occluded and a light energy change occurs (i.e. a loss of light is produced), and the current light energy is collected by the image monitoring unit 4 (for example, an optical detection sensor), that is, the current light energy information is monitored; on the other hand, the image collection unit 3 collects the change of the infrared grid pattern where the finger is located, that is, it collects the image formed by the infrared grid pattern projected on the touch surface 8.

The initial light energy information of the scan pattern and the pattern coordinate information of the scan pattern (in this embodiment, infrared grid coordinates) are preset in the processing unit 5. The current light energy information monitored by the image monitoring unit 4 and the image coordinate information of the image collected by the image collection unit 3 are both fed back to the processing unit 5, and the comparison module in the processing unit 5 processes and analyzes them. If the amount of change of the current light energy information of a certain region relative to the initial light energy information is greater than or equal to a set value (for example, 5%), it is preliminarily determined that the region includes an approximate touch point. Further, the comparison module compares the image coordinate information of the collected image with the pattern coordinate information of the scan pattern; when a change occurs between the collected image and the scan pattern at a corresponding coordinate, that coordinate point can be determined to be the actual touch point, and the coordinates of the corresponding touch position are obtained; then, according to the position-coordinate-to-command-information mapping table, the command information corresponding to the coordinates of the touch position is acquired.

With this wearable touch device, when the finger performs a touch, the image monitoring unit senses the light energy change and transmits the light energy change information to the processing unit, so that the occurrence of a touch action can be confirmed, and a region is preliminarily determined, according to the relationship between the light energy change and the set value, to be an approximate region including the touch position; at the same time, the image collection unit acquires the projected image formed by the finger occluding the scan pattern and accordingly records the image coordinate information of the collected image; this image coordinate information is compared with the pattern coordinate information of the scan pattern to determine the corresponding coordinates at which the collected image changes relative to the scan pattern, thereby obtaining a precise touch position.

It should be understood that the pattern emitting unit 2, the image collection unit 3 and the image monitoring unit 4 may be units of types other than infrared, as long as they match each other; their specific form is not limited. The specific pattern emitted by the pattern emitting unit 2 may also be a pattern other than a grid pattern; in specific applications it can be set flexibly, and details are not repeated here.

Correspondingly, this embodiment provides a wearable touch method, comprising the following steps:

emitting a scan pattern onto a touch surface touchable by the touch end;

collecting an image formed by the scan pattern projected on the touch surface, and simultaneously monitoring the current light energy information of the scan pattern in each region of the touch surface; and processing the image information of the collected image and the current light energy information to obtain the image change and the light energy change, thereby determining the touch position of the touch end on the touch surface and generating corresponding command information;

further, executing a corresponding command action according to the command information corresponding to the touch position.

Here, the image information of the collected image includes image coordinate information of the collected image, a position-coordinate-to-command-information mapping table, the initial light energy information of the scan pattern and the pattern coordinate information are preset, and the step of determining the touch position of the touch end on the touch surface and generating corresponding command information in the above method includes the following steps:

comparing the current light energy information in a certain region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change of the current light energy information relative to the initial light energy information is greater than or equal to the set value, preliminarily determining that the region includes an approximate touch point;

comparing the image coordinate information of the collected image in that region with the pattern coordinate information of the scan pattern, and when a change occurs between the collected image and the scan pattern at a corresponding coordinate, determining that coordinate point to be the actual touch point, thereby obtaining the coordinates of the touch position; and then acquiring, according to the position-coordinate-to-command-information mapping table, the command information corresponding to the coordinates of the touch position.

FIG. 4 shows a touch flow chart of this wearable touch method.

The wearable touch device is worn, for example, on a human finger via the carrier 1, and the initial light energy information and pattern coordinate information of the specific pattern are set during power-on initialization.

The pattern emitting unit 2 projects an infrared grid pattern onto, for example, the palm serving as the touch surface 8.

When no touch end 7 (for example, a human finger) performs a touch, the wearable touch device remains in a standby state.

When a human finger performs a touch, the infrared grid pattern changes; the image collection unit 3 (for example, a camera) collects the image formed by the projection of the infrared grid pattern where the finger is located, while the image monitoring unit 4 (for example, an optical detection sensor) monitors the current light energy information in the infrared grid pattern.

The image coordinate information of the collected image formed when the human finger performs the touch and the current light energy information are both transmitted to the processing unit 5 for processing and analysis. First, the current light energy information is compared with the initial light energy information to confirm that a touch action has occurred, and a region is preliminarily determined, according to the relationship between the light energy change and the set value, to be an approximate region including the touch position; then, further according to the comparison between the image coordinate information of the collected image and the pattern coordinate information of the scan pattern, when a change occurs between the collected image and the scan pattern at a corresponding coordinate (for example, a center coordinate), that coordinate point is determined to be the actual touch point, so that the coordinates of the precise touch position are obtained; the touch command is then executed according to the command information corresponding to the coordinates of that touch position.

Embodiment 2:

This embodiment provides a wearable touch device and a corresponding wearable touch method. Compared with Embodiment 1, the units in this wearable touch device can be disposed separately on more than one carrier. That is, in Embodiment 1, the pattern emitting unit, the image collection unit, the image monitoring unit and the processing unit are disposed in the same carrier, whereas in this embodiment the pattern emitting unit and the image collection unit may be disposed in one carrier, while the image monitoring unit and the processing unit are disposed in another carrier.

In this embodiment, the carriers include a first carrier and a second carrier, and the first carrier and the second carrier are formed of the same material. For example, the pattern emitting unit and the image collection unit may be disposed on the first carrier and located in the same region of the first carrier, while the image monitoring unit is disposed on the second carrier; the first carrier and the second carrier may be worn on the same hand or on two hands of the same person, or may be worn on the hands of different people.

The configuration of each unit in the wearable touch device of this embodiment is the same as that of the corresponding unit in Embodiment 1, and is not repeated here.

The corresponding wearable touch method of this embodiment is the same as the wearable touch method of Embodiment 1: in both, the pattern emitting unit emits an infrared matrix pattern; when a human finger touches certain points of the infrared matrix pattern on the touch surface, the image monitoring unit detects a loss of infrared light (a light energy change) at these points; then, according to the change of the infrared grid pattern where the finger is located as collected by the image collection unit, the processing unit analyzes and processes the light energy change and the pattern change, judges the specific touch position, obtains accurate touch point coordinates, and then acquires the command information corresponding to those touch point coordinates.

The wearable touch device in the present invention may be a closed-ring smart wearable device wearable on a human finger. A specific pattern is projected by the pattern emitting unit onto the touch surface touchable by the touch end; when a finger performs a touch, the image collection unit collects the image formed by the projection of the specific pattern where the finger is located, and the image monitoring unit monitors the current light energy information of that image; the processing unit analyzes and processes the image information of the collected image and the current light energy information, determines the coordinates of the finger's touch position, and judges the precise touch position of the finger, thereby performing touch feedback. The wearable touch device enables the user to complete touch control without buttons or a touch screen and is compact and portable; the corresponding wearable touch method is accurate in touch and widely applicable.

It can be understood that the above embodiments are merely exemplary embodiments employed to illustrate the principles of the present invention; however, the present invention is not limited thereto. Those of ordinary skill in the art can make various modifications and improvements without departing from the spirit and essence of the present invention, and these modifications and improvements are also regarded as falling within the protection scope of the present invention.

Claims

Claims

1. A wearable touch device, characterized by comprising:

a carrier, the carrier being wearable;

a pattern emitting unit, the pattern emitting unit being configured to emit a scan pattern onto a touch surface touchable by a touch end;

an image collection unit, the image collection unit being configured to collect an image formed by the scan pattern projected on the touch surface, and to send image information of the collected image to the processing unit;

an image monitoring unit, the image monitoring unit being configured to monitor the current light energy of the scan pattern in each region of the touch surface, and to send current light energy information to the processing unit; and

a processing unit, the processing unit being configured to process the image information of the collected image and the current light energy information, so as to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

2. The wearable touch device according to claim 1, characterized in that the pattern emitting unit and the image collection unit are both disposed on the carrier and located in the same region of the carrier.

3. The wearable touch device according to claim 1, characterized in that the pattern emitting unit is an infrared projection type pattern emitting unit, the image collection unit is an infrared image collection unit, and the image monitoring unit is an infrared image monitoring unit.

4. The wearable touch device according to claim 1, characterized in that the scan pattern emitted by the pattern emitting unit is a grid-shaped pattern.

5. The wearable touch device according to claim 1, characterized in that the processing unit includes a comparison module and a command module, the image information of the collected image includes image coordinate information of the collected image, and a position-coordinate-to-command-information mapping table, initial light energy information of the scan pattern and pattern coordinate information are pre-stored in the processing unit;

the comparison module is configured to compare the image coordinate information of the collected image and the current light energy information with the pattern coordinate information and the initial light energy information of the scan pattern, respectively, so as to obtain determined touch position coordinate information;

the command module is configured to convert the received determined touch position coordinate information into corresponding command information according to the position-coordinate-to-command-information mapping table.

6. The wearable touch device according to claim 1, characterized in that the wearable touch device further includes an execution unit, and the execution unit is configured to execute a corresponding action according to the received command information.

7. The wearable touch device according to claim 1, characterized in that the touch end is a human finger, and the touch surface touchable by the touch end is a region that a human finger can touch.

8. The wearable touch device according to claim 1, characterized in that the pattern emitting unit, the image collection unit, the image monitoring unit and the processing unit are all disposed on the carrier.

9. The wearable touch device according to claim 8, characterized in that the pattern emitting unit, the image collection unit and the image monitoring unit are all disposed on the outer surface layer of the carrier on the side close to the palm.

10. The wearable touch device according to claim 9, characterized in that the outer surfaces of the pattern emitting unit, the image collection unit and the image monitoring unit are respectively flush with the outer surface of the carrier.

11. The wearable touch device according to claim 1, characterized by comprising a plurality of carriers, wherein the pattern emitting unit and the image collection unit are disposed on one carrier, and the image monitoring unit and the processing unit are disposed on another carrier.

12. The wearable touch device according to claim 1, characterized in that the carrier is a closed ring, and the carrier is formed of a PVC material.

13. A wearable touch method, characterized by comprising the following steps:

emitting a scan pattern onto a touch surface touchable by a touch end;

collecting an image formed by the scan pattern projected on the touch surface, and simultaneously monitoring current light energy information of the scan pattern in each region of the touch surface; and

processing image information of the collected image and the current light energy information, so as to determine the touch position of the touch end on the touch surface and to generate corresponding command information.

14. The wearable touch method according to claim 13, characterized in that the image information of the collected image includes image coordinate information of the collected image, and a position-coordinate-to-command-information mapping table, initial light energy information of the scan pattern and pattern coordinate information are preset, and

the step of determining the touch position of the touch end on the touch surface and generating corresponding command information includes:

comparing the current light energy information in a region of the touch surface with the initial light energy information of the scan pattern, and when the amount of change of the current light energy information relative to the initial light energy information is greater than or equal to a set value, preliminarily determining that the region includes an approximate touch point;

comparing the image coordinate information of the collected image of that region with the pattern coordinate information of the scan pattern, and when a change occurs between the collected image and the scan pattern at a corresponding coordinate, determining that coordinate point to be the actual touch point, thereby obtaining the coordinates of the touch position; and

acquiring, according to the position-coordinate-to-command-information mapping table, the command information corresponding to the coordinates of the touch position.

15. The wearable touch method according to claim 14, characterized in that the set value is 5%.

16. The wearable touch method according to claim 13, characterized by further comprising the step of: executing a corresponding operation according to the command information corresponding to the coordinates of the touch position.
PCT/CN2014/084820 2014-04-28 2014-08-20 Wearable touch device and wearable touch method WO2015165175A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/436,911 US10042443B2 (en) 2014-04-28 2014-08-20 Wearable touch device and wearable touch method
KR1020157012703A KR101790853B1 (ko) 2014-04-28 2014-08-20 Wearable touch device and wearable touch method
JP2017508720A JP6467495B2 (ja) 2014-04-28 2014-08-20 Wearable touch device and wearable touch method
EP14859317.1A EP3139256B1 (en) 2014-04-28 2014-08-20 Wearable touch apparatus and wearable touch method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410175468.4 2014-04-28
CN201410175468.4A CN103995621B (zh) 2014-04-28 2014-04-28 Wearable touch device and wearable touch method

Publications (1)

Publication Number Publication Date
WO2015165175A1 true WO2015165175A1 (zh) 2015-11-05

Family

ID=51309807

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/084820 WO2015165175A1 (zh) 2014-04-28 2014-08-20 Wearable touch device and wearable touch method

Country Status (6)

Country Link
US (1) US10042443B2 (zh)
EP (1) EP3139256B1 (zh)
JP (1) JP6467495B2 (zh)
KR (1) KR101790853B1 (zh)
CN (1) CN103995621B (zh)
WO (1) WO2015165175A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101524575B1 (ko) * 2014-08-20 2015-06-03 박준호 Wearable device
CN104536556B (zh) * 2014-09-15 2021-01-15 联想(北京)有限公司 Information processing method and electronic device
WO2016060461A1 (en) 2014-10-15 2016-04-21 Jun Ho Park Wearable device
CN104317398B (zh) * 2014-10-15 2017-12-01 天津三星电子有限公司 Gesture control method, wearable device and electronic device
WO2016106481A1 (en) * 2014-12-29 2016-07-07 Empire Technology Development Llc Quick command entry for wearable devices
KR101586759B1 (ko) * 2015-02-27 2016-01-19 주식회사 디엔엑스 Wearable device and control method therefor
US10599216B2 (en) 2015-03-02 2020-03-24 Tap Systems Inc. Arbitrary surface and finger position keyboard
KR20170028130A (ko) 2015-09-03 2017-03-13 박준호 Wearable device
CN105589607B (zh) * 2016-02-14 2018-09-07 京东方科技集团股份有限公司 Touch system, touch display system and touch interaction method
US10638316B2 (en) * 2016-05-25 2020-04-28 Intel Corporation Wearable computer apparatus with same hand user authentication
US11009968B1 (en) 2017-03-29 2021-05-18 Tap Systems Inc. Bi-directional tap communication device
US10691205B1 (en) 2017-08-29 2020-06-23 Tap Systems Inc. Tap device with dynamically switchable modes


Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2740952B2 (ja) 1988-04-13 1998-04-15 オリエント時計株式会社 Wristwatch
US6771294B1 (en) 1999-12-29 2004-08-03 Petri Pulli User interface
FI20010387A0 (fi) * 2001-02-27 2001-02-27 Rto Holding Oy Method and system for inputting information or a selection
DE102004044999A1 (de) * 2004-09-16 2006-04-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Input control for devices
US20080018591A1 (en) * 2006-07-20 2008-01-24 Arkady Pittel User Interfacing
KR20090061179A (ko) 2007-12-11 2009-06-16 한국전자통신연구원 Data input device and data processing method using the same
SE534411C2 (sv) * 2009-11-02 2011-08-09 Stanley Wissmar Electronic finger ring and manufacture of the same
EP2378394A3 (en) * 2010-04-15 2015-03-25 Electronics and Telecommunications Research Institute User interface device and method for recognizing user interaction using same
CN203287855U (zh) * 2010-05-24 2013-11-13 卡尼特·布迪派特 Apparatus for a virtual input device for a mobile computing device
KR20120071551A (ko) 2010-12-23 2012-07-03 한국전자통신연구원 Apparatus and method for user interaction using a structured image
JP2012208926A (ja) * 2011-03-15 2012-10-25 Nikon Corp Detection device, input device, projector, and electronic apparatus
US10061387B2 (en) * 2011-03-31 2018-08-28 Nokia Technologies Oy Method and apparatus for providing user interfaces
KR101446902B1 (ko) 2011-08-19 2014-10-07 한국전자통신연구원 Apparatus and method for user interaction
JP2014048691A (ja) 2012-08-29 2014-03-17 Sharp Corp Wristband-type input device and character input method using the wristband-type input device
CN105027030B (zh) 2012-11-01 2018-10-23 艾卡姆有限公司 Wireless wrist-mounted computing and control device and method for 3D imaging, mapping, networking and interfacing
CN203178918U (zh) * 2013-03-21 2013-09-04 联想(北京)有限公司 Electronic device
JP2015041052A (ja) * 2013-08-23 2015-03-02 ソニー株式会社 Wristband-type information processing device and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101963868A (zh) * 2009-07-22 2011-02-02 影来腾贸易(上海)有限公司 Infrared extended-light-source multi-touch system
CN201548938U (zh) * 2009-10-29 2010-08-11 北京汇冠新技术股份有限公司 Touch screen and touch system
CN101859209A (zh) * 2010-05-28 2010-10-13 程宇航 Infrared detection device and method, infrared input device, and graphical user equipment
CN103546181A (zh) * 2012-07-17 2014-01-29 高寿谦 Detachable wearable wireless smart electronic device with freely combinable functions
US20140098067A1 (en) * 2012-10-02 2014-04-10 Autodesk, Inc. Always-available input through finger instrumentation
CN102915153A (zh) * 2012-10-26 2013-02-06 苏州瀚瑞微电子有限公司 Device and method for realizing mouse functions through contactless gesture actions

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
See also references of EP3139256A4 *
XIANG, BINBIN: "Research on Touch Screen Key Technology Based on Image Detection Technology", ELECTRONIC TECHNOLOGY & INFORMATION SCIENCE , CHINA MASTER'S THESES FULL-TEXT DATABASE, 15 July 2013 (2013-07-15), pages 5, XP008182491 *

Also Published As

Publication number Publication date
JP6467495B2 (ja) 2019-02-13
CN103995621A (zh) 2014-08-20
KR20150135761A (ko) 2015-12-03
EP3139256B1 (en) 2022-05-04
EP3139256A4 (en) 2018-04-25
CN103995621B (zh) 2017-02-15
JP2017514261A (ja) 2017-06-01
US20160124524A1 (en) 2016-05-05
EP3139256A1 (en) 2017-03-08
US10042443B2 (en) 2018-08-07
KR101790853B1 (ko) 2017-10-26

Similar Documents

Publication Publication Date Title
WO2015165175A1 (zh) 2015-11-05 Wearable touch device and wearable touch method
KR101762050B1 (ko) Wearable projection equipment
US9978261B2 (en) Remote controller and information processing method and system
EP3015955B1 (en) Controlling multiple devices with a wearable input device
TWI546724B (zh) Apparatus and method for transferring information items between communication devices
US20110090148A1 (en) Wearable input device
KR102437106B1 (ko) Apparatus and method using friction sound
US20170075433A1 (en) Optical Projection Keyboard and Mouse
US9760758B2 (en) Determining which hand is being used to operate a device using a fingerprint sensor
WO2019033322A1 (zh) Handheld controller, tracking and positioning method, and system
WO2015165187A1 (zh) Wearable touch device and wearable touch method
EP3139246A1 (en) Control method and apparatus, electronic device, and computer storage medium
CN105630157A (zh) Control method, control device, terminal and control system
JP6364790B2 (ja) Pointing device
WO2016049842A1 (zh) Hybrid interaction method for portable or wearable smart devices
KR20160039589A (ko) Wireless spatial control device using finger sensing
WO2012129958A1 (zh) Finger mouse
KR20120051274A (ko) Sensing area shape for a touch sensing-area-shape input device
CN104423560B (zh) Information processing method and electronic device
CN110228065A (zh) Robot motion control method and device
TWI691888B (zh) Touch input device
KR100720647B1 (ko) Ring-type pointing device
TW202427135A (zh) Mouse device
US20140111445A1 (en) Cursor control device and cursor control system
KR101468273B1 (ko) Multifunctional input interface device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 14436911

Country of ref document: US

REEP Request for entry into the european phase

Ref document number: 2014859317

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014859317

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20157012703

Country of ref document: KR

Kind code of ref document: A

Ref document number: 2017508720

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14859317

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE