WO2018036229A1 - Projection touch method, apparatus and device - Google Patents

Projection touch method, apparatus and device

Info

Publication number
WO2018036229A1
WO2018036229A1 · PCT/CN2017/085550 · CN2017085550W
Authority
WO
WIPO (PCT)
Prior art keywords
touch
reflected light
intensity
information image
pulse signal
Prior art date
Application number
PCT/CN2017/085550
Other languages
English (en)
French (fr)
Inventor
姜訢
曹腾
黄永顺
Original Assignee
青岛海尔股份有限公司
北京一数科技有限公司
Priority date
Filing date
Publication date
Application filed by 青岛海尔股份有限公司 and 北京一数科技有限公司
Publication of WO2018036229A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0421 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen

Definitions

  • The present invention relates to the field of projection technologies, and in particular to a projection touch method, apparatus, and device.
  • Projection equipment is widely used in work, study, entertainment, and other settings.
  • When a projection device is used, it is generally connected to electronic devices such as computers and mobile phones.
  • In this mode, the user can only control switching of the projected content through the mouse or keyboard of the connected electronic device, and must click the mouse or keyboard frequently to manipulate the projected screen content. Obviously, this kind of human-computer interaction is troublesome, and the user experience is poor.
  • The appearance of portable projection devices with a touch screen effectively improves on this interaction mode: the user can manipulate the projected content directly on the device's touch screen, which is controlled in the same way as the touch screens of electronic devices such as mobile phones and tablet computers and supports operations such as clicking and sliding.
  • However, because the touch screen is generally small, controlling the projected screen by clicking or sliding on it easily causes deviation when selecting interface content, so the user experience is still not good.
  • In recent years, research on manipulating projected content directly on the projection screen (projection touch) has yielded certain results. Although projection touch improves the user experience, existing methods generally recognize the user's click or slide by means of a camera, so the touch cannot be located accurately, and the camera's power consumption is large.
  • The embodiments of the invention disclose a projection touch method, apparatus, and device, which solve the problems of low accuracy and high power consumption in the prior projection touch technology.
  • the technical solutions are as follows:
  • An embodiment of the present invention provides a projection touch method, which is applied to a scanning projection device that projects an invisible laser and transmits an ultrasonic pulse signal to a projection area, wherein the invisible laser region formed by the invisible laser covers the projection area. The method includes:
  • obtaining, in a target time period, the touch track formed by the determined touch position coordinates, and determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the reflected signal of the ultrasonic pulse signal, wherein the target time period is the time period of a preset length ending at the current time;
  • wherein determining the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change includes:
  • the reflected light intensity information image of the touch object is detected by a tip detection algorithm, and the touch position coordinates of the touch object are determined.
  • obtaining the touch track formed by the determined touch position coordinates includes:
  • a touch track is determined according to the moving path and the moving direction.
  • determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal, the time sequence and the intensity of the reflected signal of the ultrasonic pulse signal including:
  • the third depth information image is detected by a tip detection algorithm to determine a distance between the touch object and the projection area.
  • the invisible laser is an infrared laser.
  • An embodiment of the present invention further provides a projection touch apparatus applied to a scanning projection device, wherein the scanning projection device projects an invisible laser and transmits an ultrasonic pulse signal to a projection area, and the invisible laser region formed by the invisible laser covers the projection area. The apparatus comprises:
  • a receiving module configured to receive the reflected light of the invisible laser, determine the intensity of the reflected light, and receive a reflected signal of the ultrasonic pulse signal to determine the intensity of the reflected signal;
  • a first determining module configured to determine the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change, when the intensity of the reflected light changes;
  • a second determining module configured to obtain the touch track formed by the determined touch position coordinates in a target time period, where the target time period is the time period of a preset length ending at the current time;
  • a third determining module configured to determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal in the target time period and the time sequence and intensity of the received reflected signal of the ultrasonic pulse signal;
  • a determining module configured to determine whether the distance is less than a preset threshold
  • the response module is configured to respond to the touch action corresponding to the touch track when the distance is less than a preset threshold.
  • the first determining module includes:
  • a first intensity information image determining unit configured to determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and to determine the reflected light intensity information image after the change according to the intensity of the reflected light after the change;
  • a second intensity information image determining unit configured to differentially process the reflected light intensity information image before the change and the reflected light intensity information image after the change to obtain the reflected light intensity information image of the touch object;
  • a touch position coordinate determining unit configured to detect the reflected light intensity information image of the touch object by a tip detection algorithm and determine the touch position coordinates of the touch object.
  • the second determining module includes:
  • a path and direction determining unit configured to obtain a moving path and a moving direction of the touch object according to the determined touch position coordinates in the target time period
  • the touch track determining unit is configured to determine the touch track according to the moving path and the moving direction.
  • the third determining module includes:
  • a propagation speed determining unit configured to determine a propagation speed of the current period ultrasonic pulse signal according to the current operating temperature
  • a reflection signal determining unit configured to determine a surface reflection signal of a surface of the projection area by an ultrasonic echo processing algorithm according to an intensity of the reflected signal of the ultrasonic pulse signal;
  • a propagation time determining unit configured to calculate a second propagation time sequence of the ultrasonic wave according to a time period of transmitting the ultrasonic pulse signal and receiving a time sequence of the surface reflection signal;
  • a depth of field information determining unit configured to determine a second depth information image of the projection area according to the second propagation time sequence and the propagation speed
  • a difference processing unit configured to perform differential processing on the first depth information image of the projection area determined in the previous cycle and the second depth information image to obtain a third depth information image
  • a distance determining unit configured to detect the third depth information image by a tip detection algorithm to determine a distance between the touch object and the projection area.
  • the invisible laser is an infrared laser.
  • the embodiment of the present invention further provides a projection touch device, including: a laser projection device, an ultrasonic pulse signal transmitting device, a reflective laser receiving device, a reflected ultrasonic pulse signal receiving device, and a data processing device;
  • the laser projection device is configured to scan a visible laser to a projection area to form a projected image, and to project an invisible laser to the projection area in a scanning manner to form an invisible laser region, wherein the invisible laser region covers the projection area;
  • the ultrasonic pulse signal transmitting device is configured to transmit an ultrasonic pulse signal to the projection area;
  • the reflective laser receiving device is configured to receive the reflected light of the invisible laser light, determine the intensity of the reflected light, and transmit the intensity of the reflected light to the data processing device;
  • the reflected ultrasonic pulse signal receiving device is configured to receive the reflected signal of the ultrasonic pulse signal, determine its intensity, obtain the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the reflected signal, and transmit the obtained time sequence and reflected-signal intensity to the data processing device;
  • the data processing device is configured to receive the intensity of the reflected light sent by the reflective laser receiving device and the time sequence and reflected-signal intensity sent by the reflected ultrasonic pulse signal receiving device; when the intensity of the reflected light changes, determine the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change; obtain the touch track formed by the touch position coordinates determined in the target time period; determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic signal in the target time period and the time sequence and intensity of the received reflected signal; and determine whether the distance is less than a preset threshold and, if so, respond to the touch action corresponding to the touch track, wherein the target time period is the time period of a preset length ending at the current time.
  • the data processing device is specifically configured to: determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and the reflected light intensity information image after the change according to the intensity of the reflected light after the change; perform differential processing on the two images to obtain the reflected light intensity information image of the touch object; and detect that image by a tip detection algorithm to determine the touch position coordinates of the touch object.
  • the data processing device is configured to obtain a moving path and a moving direction of the touch object according to the determined touch position coordinates in the target time period; according to the moving path and the moving direction, Determine the touch track.
  • the projection touch device further includes a temperature sensor configured to measure the current operating temperature and send it to the data processing device;
  • the data processing device is specifically configured to: receive the current operating temperature sent by the temperature sensor and determine, according to it, the propagation speed of the ultrasonic pulse signal in the current period; determine the surface reflection signal of the projection area surface by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal; calculate, according to the time of transmitting the ultrasonic pulse signal in the target time period and the time sequence of receiving the surface reflection signal, a first propagation time sequence for the projection area and a second propagation time sequence for the touch object; determine the first depth information image of the projection area according to the first propagation time sequence and the propagation speed, and the second depth information image according to the second propagation time sequence and the propagation speed; perform differential processing on the first depth information image determined in the previous cycle and the second depth information image to obtain a third depth information image; and detect the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
  • the invisible laser is an infrared laser.
  • In the solution provided by the embodiments of the present invention, the reflected light of the invisible laser is received and its intensity determined, and the reflected signal of the ultrasonic pulse signal is received; when the intensity of the reflected light changes, the touch position coordinates of the touch object are determined according to the intensity of the reflected light before and after the change; the touch track formed by the touch position coordinates determined in the target time period is then obtained, and the distance between the touch object and the projection area is determined; finally, it is determined whether the distance is less than a preset threshold and, if so, the touch action corresponding to the touch track is responded to.
  • Because the touch position coordinates are determined from the intensity of the reflected light and the distance between the touch object and the projection area is determined from the ultrasonic pulse signal, no camera is required: the touch action can be determined quickly and accurately, the power consumption of the projection device is effectively reduced, and touch can be located accurately even when the distance between the projection area and the projection device is only 10 cm, which greatly improves the accuracy of projection touch.
  • FIG. 1 is a flowchart of a first projection touch method according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a projection touch device according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a projection touch device according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a difference image according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a projection touch according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a depth of field image according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a touch action according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a touch track provided by an embodiment of the present invention.
  • The embodiments of the invention provide a projection touch method, apparatus, and device.
  • a projection touch method provided by an embodiment of the present invention is first introduced.
  • The projection touch method provided by the present invention is applied to a scanning projection device, which scans a visible laser onto a solid plane to form a projected image (that is, a projection area) and scans an invisible laser onto the same solid plane to form an invisible laser region, wherein the invisible laser region formed by the invisible laser covers the projection area.
  • The visible laser and the invisible laser are combined into one beam by optical design, so that each pixel in the projected image displayed in the projection area is superimposed with an invisible laser element in addition to the visible laser element; that is to say, each pixel in the projected picture contains two parts, a visible laser and an invisible laser.
  • the visible laser light may be a visible laser of three colors of R, G, or B, or may be a visible laser of any color, and is not specifically limited herein.
  • a projection touch method includes the following steps:
  • When the invisible laser is projected onto the projection area, reflected light of the invisible laser is formed.
  • The intensity of the reflected light can be determined. Since each pixel in the projected picture contains both a visible laser part and an invisible laser part, the reflected light corresponds one-to-one with the pixels containing the visible laser in the projected picture.
  • the invisible laser is preferably an infrared laser.
  • the reflected light is an invisible laser light reflected by the entire projection area on the solid plane, instead of reflected light at a certain position. It should be further noted that the manner of determining the intensity of the reflected light may be any existing manner, and is not specifically limited herein.
  • When the ultrasonic pulse signal is reflected, the scanning projection device can receive the reflected signal and thereby determine its intensity. It should be noted that the reflected signal comprises all reflected signals of the ultrasonic pulse signal.
  • For example, when the projection area is the back of a hand, the ultrasonic pulse signal is reflected when it encounters the skin of the back of the hand, but it also passes through the skin and is reflected by blood vessels and other structures inside the body; therefore, in order to calculate the result more accurately, all reflected signals of the ultrasonic pulse signal need to be received.
  • When the intensity of the reflected light changes, a touch object may have entered the projection area. Since the touch object is generally closer to the scanning projection device than the projection area is, the reflected light of the invisible laser projected onto the touch object is stronger than the reflected light of the invisible laser projected onto the projection area.
  • the specific manner of determining the touch position coordinates of the touch object may be:
  • the reflected light intensity information image of the touch object is detected by a tip detection algorithm, and the touch position coordinates of the touch object are determined.
  • Specifically, the reflected light intensity information images before and after the change may be determined from the intensity of the reflected light before and after the change, respectively. Since the reflected light of the invisible laser projected onto the touch object is stronger than that projected onto the projection area, differential processing of the two images yields the reflected light intensity information image of the touch object.
  • The differential processing operates on the reflected light intensity information images before and after the change; for example, a difference algorithm such as an interframe difference algorithm or a background difference algorithm can be used. As shown in FIG. 4, if the touch object is a finger, the reflected light intensity information images of several typical touch objects in FIG. 4 may be obtained through the differential processing.
  • The reflected light intensity information images of the touch objects shown in FIG. 4 are only those of several common touch objects that may be obtained in the present invention and do not constitute a limitation of the present invention.
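The interframe-difference step described above can be sketched in a few lines. This is an illustrative toy, not the patented implementation: the image size, intensity values, and the threshold are assumptions, and the patent does not commit to a specific difference algorithm.

```python
def touch_object_image(before, after, threshold=30):
    """Interframe-difference sketch: keep only pixels whose reflected-light
    intensity rose by more than `threshold`, since the touch object sits
    closer to the device and reflects the invisible laser more strongly."""
    rows, cols = len(before), len(before[0])
    return [[after[r][c] if after[r][c] - before[r][c] > threshold else 0
             for c in range(cols)] for r in range(rows)]

# Toy 5x5 frames: a "finger" brightens three pixels in one column.
before = [[40] * 5 for _ in range(5)]
after = [row[:] for row in before]
for r in (2, 3, 4):
    after[r][2] = 120  # touch object enters the projection area
obj = touch_object_image(before, after)
```

Only the three brightened pixels survive the differencing, which is exactly the isolated touch-object image the subsequent tip detection operates on.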
  • Because each pixel in the projected picture contains both a visible laser part and an invisible laser part, once the reflected light intensity information image of the touch object is obtained, the area of the projected picture blocked by the touch object can be determined.
  • Touch objects such as fingers and stylus pens are generally operated with a tip when touching, so the touch position coordinates of the tip of a common strip- or stick-shaped touch object can easily be found by a simple image processing method such as a tip detection algorithm.
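A minimal version of such tip detection might look as follows. The bottom-up entry direction and the simple scan order are assumptions made for illustration; the patent does not spell out the algorithm.

```python
def tip_coordinates(obj_img):
    """Tip-detection sketch: for a strip- or stick-shaped touch object
    entering from the bottom of the image, take the topmost non-zero
    pixel as the tip, i.e. the touch position (x, y)."""
    for y, row in enumerate(obj_img):        # scan rows top to bottom
        for x, v in enumerate(row):
            if v:
                return (x, y)
    return None                              # no touch object present

# Toy isolated-object image: a vertical strip whose tip is at (2, 1).
img = [[0, 0, 0, 0],
       [0, 0, 90, 0],   # tip of the strip
       [0, 0, 95, 0],
       [0, 0, 98, 0]]
tip = tip_coordinates(img)
```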
  • The solid plane where the projection area lies is defined as the xy plane, and the touch position coordinates (x, y) of the tip of the touch object (e.g., a finger) can be determined by the above method.
  • The z-axis points along the line connecting the solid plane of the projection area with the scanning projection device, that is, along the direction of the reflected light of the invisible laser. The method of determining the z coordinate is explained in the subsequent steps.
  • The touch position coordinates may also be coordinates in the reflected light intensity information image: since the coordinates in that image correspond one-to-one with coordinates in the projection area, determining the coordinates in the image also determines the position of the touch object in the projection area.
  • The image difference algorithm and the tip detection algorithm are both commonly used image processing methods in the art and are not described in detail herein.
  • The target time period is the time period of a preset length ending at the current time; the preset length can be set according to actual needs, for example 150 milliseconds or 200 milliseconds, and is not specifically limited herein.
  • The distance between the touch object and the projection area is often small, especially when projecting at close range; for example, when the wrist projection device shown in FIG. 5 projects onto the back of the hand, the distance between the projection device and the projection area is only a dozen centimeters, and the distance between the touch object and the projection area is even smaller.
  • In this case, the distance between the touch object and the projection area cannot be determined accurately by other existing methods such as camera shooting. Since the ultrasonic pulse signal supports close-range distance measurement, the distance between the touch object and the projection area can be determined more accurately from the time of transmitting the ultrasonic pulse signal and the time of receiving its reflected signal.
  • the manner of determining the touch track may include:
  • a touch track is determined according to the moving path and the moving direction.
  • a plurality of touch position coordinates (x, y) may be determined, and the moving path and the moving direction of the touch object may be obtained by using the touch position coordinates (x, y).
  • The touch position coordinates at each pair of adjacent moments determine the moving path and moving direction of the touch object during the corresponding sub-interval; combining the moving paths and moving directions for all moments in the target time period then determines the touch trajectory in the xy plane.
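As a sketch of how a trajectory could be assembled from timestamped touch coordinates: the sample format, the window length, and the timestamps below are all illustrative assumptions, not values from the patent.

```python
def touch_track(samples, window_ms=200, now_ms=1000):
    """Track-construction sketch: keep the (t, x, y) samples that fall
    inside the target time period [now - window, now], then derive a
    movement vector between each pair of adjacent samples; the ordered
    vectors describe the moving path and direction, i.e. the track."""
    recent = [(t, x, y) for t, x, y in samples if now_ms - window_ms <= t <= now_ms]
    recent.sort()  # order by timestamp
    return [(x2 - x1, y2 - y1)
            for (_, x1, y1), (_, x2, y2) in zip(recent, recent[1:])]

# Four fingertip samples; the first falls outside the 200 ms window.
samples = [(700, 0, 0), (850, 1, 0), (900, 2, 0), (950, 3, 1)]
track = touch_track(samples)
```

The resulting vectors (rightward, then right-and-up) are the kind of path/direction sequence that would then be matched against gestures such as a slide.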
  • While determining the touch trajectory, the scanning projection device also receives the reflected signal of the ultrasonic pulse signal in real time, so the distance between the touch object and the projection area can be determined from the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of its reflected signal. Determining this distance may include the following steps:
  • the third depth information image is detected by a tip detection algorithm to determine a distance between the touch object and the projection area.
  • Since the propagation speed of the ultrasonic pulse signal is affected by temperature, in order to ensure the accuracy of the data, the propagation speed for the current period can first be determined according to the current operating temperature. The relationship between the propagation speed of ultrasonic waves and temperature is common knowledge and is not described in detail herein.
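One standard form of that speed/temperature relationship, for ultrasound in air, is the linear approximation below. The patent does not specify which formula the device uses, so this is only a plausible choice for illustration.

```python
def sound_speed_air(temp_c):
    """Common linear approximation for the speed of sound in air:
    c ≈ 331.3 + 0.606 * T  [m/s], with T in degrees Celsius."""
    return 331.3 + 0.606 * temp_c

v20 = sound_speed_air(20.0)  # roughly 343.4 m/s at room temperature
```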
  • Because the distance between the touch object and the scanning projection device differs from the distance between the projection area and the scanning projection device, the ultrasonic pulse signals are reflected at different times, and the reflected signals of the two are therefore received at different times.
  • Here, the distance between the touch object and the scanning projection device and the distance between the projection area and the scanning projection device refer to distances along the straight line determined by the ultrasonic pulse signal emission position and the touch object, that is, along the z-axis in FIG. 5, not along the normal direction of the plane in which the projection area lies.
  • It should be noted that the received reflected signal includes both the signal reflected from the surface of the projection area and signals reflected from inside it, and the internal reflections cannot be used to calculate the distance between the touch object and the projection area. For example, when the projection area is the back of a hand, the ultrasonic pulse signal reflects off the skin of the back of the hand, but also passes through the skin and reflects off blood vessels inside the body, and those blood-vessel reflections cannot be used to calculate the distance. Therefore, the surface reflection signal of the projection area surface must be determined by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal.
  • From the intensity of the reflected signal, the surface reflection signal can still be distinguished; for example, when the touch object is a finger, the reflected signal from the surface of the finger can be determined by the ultrasonic echo processing algorithm. It should be noted that the surface reflection signal determined in this way comprises the reflection signal from the surface of the touch object together with the surface reflection signal of the projection area not blocked by the touch object.
  • After the surface reflection signal is determined, the second propagation time sequence of the ultrasonic wave can be calculated from the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal. It should be noted that the second propagation time sequence does not correspond to a single point in the projection area: it is the set of time differences between the transmission time of the ultrasonic pulse signal in the current period and the times at which the surface reflection signal from the entire projection area is received. Because the projection area may be a slope or a rugged surface, the surface reflection signal is received at a series of different times, and this series constitutes the above time sequence.
  • After the second propagation time sequence and the propagation speed are obtained, the second depth information image of the projection area can be determined according to Time-Of-Flight (TOF) ranging and ultrasonic imaging techniques.
  • For example, the second depth information image may be obtained from the second depth information by an ultrasonic imaging technique, as shown in (2) of FIG. 5, where a darker color indicates a shorter distance to the scanning projection device. The image shown corresponds to projection onto a slope, such as the second depth information image obtained when the wrist projection device projects onto the back of the hand. The depth information of the touch object appears on the right side of the image: because the touch object is closer to the scanning projection device than the projection area is, the region of the second depth information image corresponding to the touch object is darker. It should be emphasized that the second depth information image shown in (2) of FIG. 5 is only one possible situation provided by the present invention and does not constitute a limitation of the second depth information image provided by the present invention.
  • Similarly, the first depth information image of the projection area in the previous period can be determined in the same manner, using the formula d1 = v1·t1/2, where v1 is the propagation speed of the ultrasonic pulse signal in the previous period and t1 is the first propagation time sequence, i.e. the set of time differences between the time of transmitting the ultrasonic pulse signal in the previous period and the times of receiving the surface reflection signal. Since t1 is also a series of different times, the calculated first depth information d1 is likewise a series of different depth values corresponding to the entire projection area.
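The time-of-flight computation above can be sketched as follows. This is a minimal illustrative example, not taken from the patent; the function name and the numeric values are hypothetical.

```python
# Illustrative sketch: each round-trip propagation time t in the time
# sequence gives a depth value d = v * t / 2 (the wave travels the
# distance twice, out and back).

def tof_depths(propagation_times, speed):
    """Map a propagation time sequence to a sequence of depth values."""
    return [speed * t / 2.0 for t in propagation_times]

# Example: speed of sound ~346 m/s, round-trip times (in seconds) for a
# slanted surface -- a series of different times, i.e. the time sequence.
times = [0.0006, 0.0008, 0.0010]
depths = tof_depths(times, 346.0)  # a series of different depth values
```

Because the input is a whole time sequence rather than a single time, the output is the series of depth values from which the depth information image is assembled.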
  • The first depth information image may then be determined from the first depth information; since the specific determination manner is the same as for the second depth information image, the details are not repeated here.
  • For example, the first depth information image shown in (1) of FIG. 5 corresponds to (2) of FIG. 5 when no touch object is present: the depth information image obtained when projecting onto a slope, for example by a wrist projection device projecting onto the back of the hand. It should be emphasized that the first depth information image shown in (1) of FIG. 5 is only one possible situation provided by the present invention and does not constitute a limitation of the first depth information image provided by the present invention.
  • After the first and second depth information images are obtained, subtracting one from the other by a differential processing algorithm yields the third depth information image, which corresponds to the distance between the touch object and the projection area. If the depth information in the third depth information image is zero, no touch object has entered the projection area in the current period; the reflected signal of the next period is then received and the above process is repeated.
  • If the third depth information image obtained by the differential processing contains depth information of a touch object, the distance obtained is that between the entire touch object and the projection area, not between a single point on the touch object and the projection area. Since a touch object (such as a finger or a stylus) generally operates with its tip, a tip detection algorithm can detect the position near the tip of the touch object from the third depth information image, thereby obtaining the distance between the position near the tip of the touch object and the projection area.
  • Since there may be multiple, possibly unequal, distances between the position near the tip of the touch object and the projection area, the minimum, the maximum, or the average of all these distances may be used as the distance between the touch object and the projection area, depending on the actual projection situation and the manipulation manner; this is not specifically limited herein. It can be understood that once the distance is determined, the coordinate z of the touch object in FIG. 5 is also determined.
  • It should further be noted that, within the target time period, the depth information of the touch object may appear in the third depth information images of several periods; in that case the average, the minimum, or the maximum of the distances obtained over those periods may be used as the final distance between the touch object and the projection area, which is not specifically limited herein.
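The choice of aggregate described above can be sketched as follows. This is a hypothetical illustration; the function name, the `mode` parameter, and the sample values are not from the patent.

```python
# Illustrative sketch: collapse the several unequal tip-to-surface
# distances near the tip of the touch object into a single distance,
# using the minimum, maximum, or average as the text describes.

def tip_distance(tip_region_depths, mode="min"):
    if mode == "min":
        return min(tip_region_depths)
    if mode == "max":
        return max(tip_region_depths)
    # default fallback: the average of all distances
    return sum(tip_region_depths) / len(tip_region_depths)

ds = [0.8, 1.0, 1.2]  # distances (cm) near the tip, illustrative values
```

Which aggregate is appropriate depends on the projection situation and manipulation manner, exactly as the text leaves open.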
  • Step S104: determine whether the distance is less than a preset threshold; if so, execute step S105.
  • To prevent accidental user operations from degrading the accuracy of projection touch, after the distance between the touch object and the projection area is calculated, it may be compared with the preset threshold to determine whether the touch action is effective, and the touch operation is then performed based on the comparison result.
  • The preset threshold may be set according to actual operation requirements and user habits. Since a touch object generally has a certain thickness, the threshold may further be set according to the touch object used, for example to 1 cm or 1.5 cm, which is not specifically limited herein. Different thresholds can of course be set for different touch tracks to achieve more precise touch.
  • When the distance is less than the preset threshold, the touch action corresponding to the touch track is effective and is responded to; the touch action may be a click, a slide, a long press, a double tap, and so on, which is not specifically limited herein. When the distance is not less than the preset threshold, the touch action is ineffective and is most likely an accidental operation, so no operation is performed and step S101 is continued.
  • For example, as shown in FIG. 7: 701 is a wrist projection device, 702 is a device for projecting visible and invisible laser light, 703 is a reflected light receiving device, 704 is a touch object (a finger), and 706 is an ultrasonic pulse signal transmitting and receiving device. It can be seen that the distances between the touch object and the projection area differ in (1) and (2) of FIG. 7.
  • In (1), the tip of the touch object 704, i.e. the fingertip, is very close to the projection area: the calculated distance is 0.8 cm. In (2), the tip of the touch object 704 is farther from the projection area: the calculated distance is 2.5 cm. If the preset threshold is 1 cm, then since 0.8 cm is less than 1 cm while 2.5 cm is greater, the touch action of the touch object 704 in (1) is an effective action and is responded to, while the touch action of the touch object 704 in (2) is an ineffective action and no operation is performed. Clearly, following the habitual clicking motion of a finger, the position actually clicked by the touch object 704 in (2) would be the shaded portion on the back of the hand in the figure; if the distance between the tip of the touch object 704 and the projection area were not compared with the preset threshold and the action were deemed effective, the touch position coordinates of the touch object 704 in (2) would be used directly as the click position, and the touch action performed would be wrong.
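The validity check in the FIG. 7 example reduces to a single comparison. A minimal sketch, using the 1 cm threshold and the two distances from the example (the function name is hypothetical):

```python
# Illustrative sketch of the threshold check: a touch action is effective
# only when the tip-to-surface distance is below the preset threshold.

def is_valid_touch(distance_cm, threshold_cm=1.0):
    return distance_cm < threshold_cm

# FIG. 7: (1) fingertip at 0.8 cm -> effective; (2) at 2.5 cm -> ignored.
```

This single comparison is what prevents the hovering finger in (2) from being misread as a click at the wrong position.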
  • In this solution, the reflected light of the invisible laser is first received and its intensity determined, and the reflected signal of the ultrasonic pulse signal is received; when the intensity of the reflected light changes, the touch position coordinates of the touch object are determined according to the intensity of the reflected light before and after the change. The touch track formed by the touch position coordinates determined within the target time period is then obtained, the distance between the touch object and the projection area is determined, and finally it is determined whether the distance is less than the preset threshold; if so, the touch action corresponding to the touch track is responded to. Since the touch position coordinates of the touch object are determined from the intensity of the reflected light and the distance between the touch object and the projection area is determined by the ultrasonic pulse signal, no camera is required: the touch action can be determined quickly and accurately while the power consumption of the projection device is effectively reduced, and precise touch is possible even when the projection area is only 10 cm from the projection device, greatly improving the accuracy of projection touch.
  • Embodiment 1: a projection touch method applied to a scanning projection device may include the following steps:
  • the touch track formed by the touch position coordinates determined in the target time period is a point, as shown in FIG. 8(1).
  • the minimum value of the distance between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • Embodiment 2 differs from Embodiment 1 in that:
  • the touch track formed by the touch position coordinates determined in the target time period is a straight line in the set area, as shown in FIG. 8 (2); the set area can be set according to user operation habits, for example a circular area with a diameter of 10 pixels or a square area with a side length of 15 pixels, which is not specifically limited herein.
  • the average value of the distances between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • Embodiment 3 differs from Embodiment 1 in that:
  • the touch track formed by the touch position coordinates determined in the target time period is a curve in the set area, as shown in FIG. 8 (3); the set area can be set according to user operation habits, for example a circular area with a diameter of 10 pixels or a square area with a side length of 15 pixels, which is not specifically limited herein.
  • the maximum value of the distances between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • Embodiment 4: a projection touch method applied to a scanning projection device may include the following steps:
  • the touch track formed by the touch position coordinates determined in the target time period is a straight line, as shown in FIG. 8(4).
  • the range of the y values of the touch position coordinates on the line does not exceed a preset value; the preset value can be set according to user operation habits, for example 100 pixels or 150 pixels, which is not specifically limited herein.
  • the minimum value of the distance between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • the direction of the line connecting the position coordinates of the start point of the touch track and the position coordinate of the end point is used as the sliding direction of the sliding touch action.
  • Embodiment 5 differs from Embodiment 4 in that:
  • the touch track formed by the touch position coordinates determined in the target time period is a broken line, as shown in FIG. 8(5).
  • the range of the y values of the touch position coordinates on the fold line does not exceed a preset value; the preset value can be set according to user operation habits, for example 100 pixels or 150 pixels, which is not specifically limited herein.
  • the maximum value of the distances between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • Embodiment 6 differs from Embodiment 4 in that:
  • the touch track formed by the touch position coordinates determined in the target time period is a curve, as shown in FIG. 8 (6).
  • the range of the y values of the touch position coordinates on the curve does not exceed a preset value; the preset value can be set according to user operation habits, for example 100 pixels or 150 pixels, which is not specifically limited herein.
  • the average value of the distances between the tip position of the touch object and the projection area is determined as the distance between the touch object and the projection area.
  • An embodiment of the present invention further provides a projection touch apparatus, which is described below. The projection touch apparatus provided by the present invention is applied to a scanning projection device that projects an invisible laser and transmits an ultrasonic pulse signal to a projection area, wherein the invisible laser region formed by the invisible laser covers the projection area.
  • A projection touch apparatus applied to a scanning projection device may include:
  • the receiving module 210 is configured to receive the reflected light of the invisible laser, determine the intensity of the reflected light, and receive a reflected signal of the ultrasonic pulse signal to determine the intensity of the reflected signal;
  • the first determining module 220 is configured to determine a touch position coordinate of the touch object according to the intensity of the reflected light before and after the change when the intensity of the reflected light changes;
  • the second determining module 230 is configured to obtain a touch track formed by the determined touch position coordinates during the target time period;
  • the target time period is: a time period of a preset duration extending back from the current moment.
  • a third determining module 240, configured to determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal within the target time period and the time sequence and intensity of the reflected signal of the ultrasonic pulse signal;
  • a judging module 250, configured to determine whether the distance is less than a preset threshold;
  • the response module 260 is configured to respond to the touch action corresponding to the touch track when the distance is less than a preset threshold.
  • In this solution, the reflected light of the invisible laser is first received and its intensity determined, and the reflected signal of the ultrasonic pulse signal is received; when the intensity of the reflected light changes, the touch position coordinates of the touch object are determined according to the intensity of the reflected light before and after the change. The touch track formed by the touch position coordinates determined within the target time period is then obtained, the distance between the touch object and the projection area is determined, and finally it is determined whether the distance is less than the preset threshold; if so, the touch action corresponding to the touch track is responded to. Since the touch position coordinates of the touch object are determined from the intensity of the reflected light and the distance between the touch object and the projection area is determined by the ultrasonic pulse signal, no camera is required: the touch action can be determined quickly and accurately while the power consumption of the projection device is effectively reduced, and precise touch is possible even when the projection area is only 10 cm from the projection device, greatly improving the accuracy of projection touch.
  • the invisible laser is an infrared laser.
  • the first determining module 220 may include:
  • a first intensity information image determining unit, configured to determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and to determine the reflected light intensity information image after the change according to the intensity of the reflected light after the change;
  • a second intensity information image determining unit, configured to perform differential processing on the reflected light intensity information image before the change and the reflected light intensity information image after the change to obtain a reflected light intensity information image of the touch object;
  • a touch position coordinate determining unit, configured to detect the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
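The differencing performed by these units can be sketched as follows. This is a hedged illustration, not the patent's implementation: pixels covered by the touch object reflect the invisible laser more strongly, so they stand out in the difference of the two intensity images. The extremal-pixel search here is only a stand-in for the tip detection algorithm, which the patent does not spell out; all names and values are hypothetical.

```python
# Illustrative sketch: subtract the pre-change intensity image from the
# post-change one and report the pixel with the largest positive change
# as the touch position (a crude stand-in for tip detection).

def touch_position(before, after, min_delta=10):
    best, pos = min_delta, None
    for y, (row_b, row_a) in enumerate(zip(before, after)):
        for x, (b, a) in enumerate(zip(row_b, row_a)):
            if a - b > best:          # stronger reflection after the change
                best, pos = a - b, (x, y)
    return pos                        # None if no touch object appeared

before = [[10, 10, 10], [10, 10, 10]]  # intensities before the change
after  = [[10, 80, 10], [10, 10, 10]]  # touch object covering pixel (1, 0)
```

A real implementation would difference whole frames (e.g. an inter-frame or background difference algorithm, as the description mentions) and then locate the tip of the resulting blob.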
  • the second determining module 230 may include:
  • a path and direction determining unit, configured to obtain the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period;
  • a touch track determining unit, configured to determine the touch track according to the moving path and the moving direction.
  • the third determining module 240 may include:
  • a propagation speed determining unit, configured to determine the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature;
  • a reflection signal determining unit, configured to determine the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal;
  • a propagation time determining unit, configured to calculate the second propagation time sequence of the ultrasonic wave according to the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal;
  • a depth information determining unit, configured to determine the second depth information image of the projection area according to the second propagation time sequence and the propagation speed;
  • a differential processing unit, configured to perform differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image;
  • a distance determining unit, configured to detect the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
  • An embodiment of the present invention further provides a projection touch device, which is introduced below.
  • A projection touch device includes: a laser projection device 310, an ultrasonic pulse signal transmitting device 320, a reflected laser receiving device 330, a reflected ultrasonic pulse signal receiving device 340 and a data processing device 350;
  • the laser projection device 310 is configured to project a visible laser in a scanning manner to a projection area to form a projected image, and to project an invisible laser in a scanning manner to the projection area to form an invisible laser region, wherein the invisible laser region covers the projection area;
  • the ultrasonic pulse signal transmitting device 320 is configured to transmit an ultrasonic pulse signal to the projection area;
  • the reflected laser receiving device 330 is configured to receive the reflected light of the invisible laser light, determine the intensity of the reflected light, and send the intensity of the reflected light to the data processing device 350;
  • the reflected ultrasonic pulse signal receiving device 340 is configured to receive the reflected signal of the ultrasonic pulse signal, determine the intensity of the reflected signal, obtain the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the reflected signal, and send the obtained time sequence and the intensity of the reflected signal to the data processing device 350;
  • the data processing device 350 is configured to receive the intensity of the reflected light sent by the reflected laser receiving device and the time sequence and intensity of the reflected signal sent by the reflected ultrasonic pulse signal receiving device; when the intensity of the reflected light changes, to determine the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change, obtain the touch track formed by the touch position coordinates determined within the target time period, determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal within the target time period and the time sequence and intensity of the received reflected signal, and determine whether the distance is less than a preset threshold; and if so, to respond to the touch action corresponding to the touch track.
  • the target time period is: a time period of a preset duration extending back from the current moment.
  • In this solution, the laser projection device projects the invisible laser to the projection area, the ultrasonic pulse signal transmitting device transmits the ultrasonic pulse signal to the projection area, the reflected laser receiving device receives the reflected light of the invisible laser, and the reflected ultrasonic pulse signal receiving device receives the reflected signal of the ultrasonic pulse signal. When the intensity of the reflected light changes, the data processing device determines the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change, obtains the touch track formed by the touch position coordinates determined within the target time period, determines the distance between the touch object and the projection area, and responds to the touch action corresponding to the touch track if the distance is less than the preset threshold.
  • the invisible laser is an infrared laser.
  • The data processing device 350 is specifically configured to: determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and determine the reflected light intensity information image after the change according to the intensity of the reflected light after the change; perform differential processing on the two images to obtain a reflected light intensity information image of the touch object; and detect the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
  • The data processing device 350 is further specifically configured to obtain the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period, and to determine the touch track according to the moving path and the moving direction.
  • the projection touch device may further include:
  • a temperature sensor for measuring a current operating temperature and transmitting the current operating temperature to the data processing device 350;
  • The data processing device 350 specifically receives the current operating temperature sent by the temperature sensor and determines the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature; determines the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal; calculates the second propagation time sequence of the ultrasonic wave according to the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal; determines the second depth information image of the projection area according to the second propagation time sequence and the propagation speed; performs differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image; and detects the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
  • The data processing device 350 can send a first control signal to the laser projection device 310 to control it to project visible laser light to the projection area to form a projected image and to project invisible laser light to the projection area to form an invisible laser region. It can also send a second control signal to the ultrasonic pulse signal transmitting device 320 to control it to transmit an ultrasonic pulse signal to the projection area.
  • Alternatively, the laser projection device 310 and the ultrasonic pulse signal transmitting device 320 can start to operate as soon as the projection touch device is activated, which is also reasonable.

Abstract

Embodiments of the present invention disclose a projection touch method, apparatus and device. The method includes: receiving the reflected light of an invisible laser and determining the intensity of the reflected light, and receiving the reflected signal of an ultrasonic pulse signal and determining the intensity of the reflected signal; when the intensity of the reflected light changes, determining the touch position coordinates of a touch object; within a target time period, obtaining the touch track formed by the determined touch position coordinates, and determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the received reflected signal; and determining whether the distance is less than a preset threshold and, if so, responding to the touch action corresponding to the touch track. Since the touch position coordinates are determined from the intensity of the reflected light and the distance between the touch object and the projection area is determined by the ultrasonic pulse signal, the touch action can be determined quickly and accurately, greatly improving the accuracy of projection touch.

Description

Projection touch method, apparatus and device
This application claims priority to Chinese patent application No. 201610741260.3, filed on August 26, 2016 and entitled "Projection touch method, apparatus and device", the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to the field of projection technology, and in particular to a projection touch method, apparatus and device.
Background
At present, projection devices are widely used for work, study and entertainment. A projection device is generally connected to an electronic device such as a computer or a mobile phone, and the user can only switch the projected content through the mouse or keyboard of that electronic device. During projection, the user has to click the mouse or keyboard frequently to manipulate the projected content. Obviously, this kind of human-computer interaction is very cumbersome, and the user experience is poor.
The emergence of portable projection devices with touch screens has effectively alleviated the above problems: the user can manipulate the projected content directly on the touch screen of the projection device, in a manner similar to operating the touch screen of a mobile phone or tablet, with operations such as clicking and sliding. However, since the touch screen is generally small, controlling the projected content by clicking or sliding on it easily leads to deviations when selecting interface content, and the user experience is still unsatisfactory.
In recent years, research on manipulating projected content directly on the projected image (projection touch) has achieved certain results. Although this projection touch approach improves the user experience, it generally identifies the user's click or slide operations by camera capture, which cannot perform projection touch precisely, and the camera consumes a great deal of power.
Summary
Embodiments of the present invention disclose a projection touch method, apparatus and device to solve the problems of low accuracy and high power consumption in existing projection touch technology. The technical solutions are as follows:
In a first aspect, an embodiment of the present invention provides a projection touch method applied to a scanning projection device, the scanning projection device projecting an invisible laser and transmitting an ultrasonic pulse signal to a projection area, wherein the invisible laser region formed by the invisible laser covers the projection area, and the method includes:
receiving the reflected light of the invisible laser and determining the intensity of the reflected light, and receiving the reflected signal of the ultrasonic pulse signal and determining the intensity of the reflected signal;
when the intensity of the reflected light changes, determining the touch position coordinates of a touch object according to the intensity of the reflected light before and after the change;
within a target time period, obtaining the touch track formed by the determined touch position coordinates, and determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the received reflected signal of the ultrasonic pulse signal, wherein the target time period is a time period of a preset duration extending back from the current moment;
determining whether the distance is less than a preset threshold;
and if so, responding to the touch action corresponding to the touch track.
Optionally, determining the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change includes:
determining the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and determining the reflected light intensity information image after the change according to the intensity of the reflected light after the change;
performing differential processing on the reflected light intensity information image before the change and the reflected light intensity information image after the change to obtain a reflected light intensity information image of the touch object;
detecting the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
Optionally, obtaining the touch track formed by the determined touch position coordinates within the target time period includes:
obtaining the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period;
determining the touch track according to the moving path and the moving direction.
Optionally, determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the received reflected signal includes:
determining the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature;
determining the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal;
calculating the second propagation time sequence of the ultrasonic wave according to the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal;
determining the second depth information image of the projection area according to the second propagation time sequence and the propagation speed;
performing differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image;
detecting the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
Optionally, the invisible laser is an infrared laser.
In a second aspect, an embodiment of the present invention further provides a projection touch apparatus applied to a scanning projection device, the scanning projection device projecting an invisible laser and transmitting an ultrasonic pulse signal to a projection area, wherein the invisible laser region formed by the invisible laser covers the projection area, and the apparatus includes:
a receiving module, configured to receive the reflected light of the invisible laser and determine the intensity of the reflected light, and to receive the reflected signal of the ultrasonic pulse signal and determine the intensity of the reflected signal;
a first determining module, configured to determine the touch position coordinates of a touch object according to the intensity of the reflected light before and after the change when the intensity of the reflected light changes;
a second determining module, configured to obtain, within a target time period, the touch track formed by the determined touch position coordinates, wherein the target time period is a time period of a preset duration extending back from the current moment;
a third determining module, configured to determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal within the target time period and the time sequence and intensity of the received reflected signal of the ultrasonic pulse signal;
a judging module, configured to determine whether the distance is less than a preset threshold;
a response module, configured to respond to the touch action corresponding to the touch track when the distance is less than the preset threshold.
Optionally, the first determining module includes:
a first intensity information image determining unit, configured to determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and to determine the reflected light intensity information image after the change according to the intensity of the reflected light after the change;
a second intensity information image determining unit, configured to perform differential processing on the reflected light intensity information image before the change and the reflected light intensity information image after the change to obtain a reflected light intensity information image of the touch object;
a touch position coordinate determining unit, configured to detect the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
Optionally, the second determining module includes:
a path and direction determining unit, configured to obtain the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period;
a touch track determining unit, configured to determine the touch track according to the moving path and the moving direction.
Optionally, the third determining module includes:
a propagation speed determining unit, configured to determine the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature;
a reflection signal determining unit, configured to determine the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal;
a propagation time determining unit, configured to calculate the second propagation time sequence of the ultrasonic wave according to the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal;
a depth information determining unit, configured to determine the second depth information image of the projection area according to the second propagation time sequence and the propagation speed;
a differential processing unit, configured to perform differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image;
a distance determining unit, configured to detect the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
Optionally, the invisible laser is an infrared laser.
In a third aspect, an embodiment of the present invention further provides a projection touch device, including: a laser projection apparatus, an ultrasonic pulse signal transmitting apparatus, a reflected laser receiving apparatus, a reflected ultrasonic pulse signal receiving apparatus and a data processing apparatus, wherein:
the laser projection apparatus is configured to project a visible laser in a scanning manner to a projection area to form a projected image, and to project an invisible laser in a scanning manner to the projection area to form an invisible laser region, wherein the invisible laser region covers the projection area;
the ultrasonic pulse signal transmitting apparatus is configured to transmit an ultrasonic pulse signal to the projection area;
the reflected laser receiving apparatus is configured to receive the reflected light of the invisible laser, determine the intensity of the reflected light, and send the intensity of the reflected light to the data processing apparatus;
the reflected ultrasonic pulse signal receiving apparatus is configured to receive the reflected signal of the ultrasonic pulse signal, determine the intensity of the reflected signal, obtain the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the reflected signal, and send the obtained time sequence and the intensity of the reflected signal to the data processing apparatus;
the data processing apparatus is configured to receive the intensity of the reflected light sent by the reflected laser receiving apparatus and the time sequence and intensity of the reflected signal sent by the reflected ultrasonic pulse signal receiving apparatus; when the intensity of the reflected light changes, to determine the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change, obtain the touch track formed by the touch position coordinates determined within the target time period, determine the distance between the touch object and the projection area according to the time of transmitting the ultrasonic signal within the target time period and the time sequence and intensity of the received reflected signal, and determine whether the distance is less than a preset threshold; and if so, to respond to the touch action corresponding to the touch track, wherein the target time period is a time period of a preset duration extending back from the current moment.
Optionally, the data processing apparatus is specifically configured to determine the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and determine the reflected light intensity information image after the change according to the intensity of the reflected light after the change; perform differential processing on the two images to obtain a reflected light intensity information image of the touch object; and detect the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
Optionally, the data processing apparatus is specifically configured to obtain the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period, and to determine the touch track according to the moving path and the moving direction.
Optionally, the projection touch device further includes a temperature sensor, configured to measure the current operating temperature and send the current operating temperature to the data processing apparatus;
the data processing apparatus is specifically configured to receive the current operating temperature sent by the temperature sensor and determine the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature; determine the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal; calculate, according to the time of transmitting the ultrasonic pulse signal within the target time period and the time sequence of receiving the surface reflection signal, the first propagation time sequence of the ultrasonic wave for the projection area and the second propagation time sequence for the touch object; determine the first depth information image of the projection area according to the first propagation time sequence and the propagation speed; determine the second depth information image according to the second propagation time sequence and the propagation speed; perform differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image; and detect the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
Optionally, the invisible laser is an infrared laser.
In the solutions provided by the embodiments of the present invention, the reflected light of the invisible laser is first received and its intensity determined, and the reflected signal of the ultrasonic pulse signal is received; when the intensity of the reflected light changes, the touch position coordinates of the touch object are determined according to the intensity of the reflected light before and after the change; the touch track formed by the touch position coordinates determined within the target time period is then obtained, and the distance between the touch object and the projection area is determined; finally, it is determined whether the distance is less than a preset threshold, and if so, the touch action corresponding to the touch track is responded to. Since the touch position coordinates of the touch object are determined from the intensity of the reflected light and the distance between the touch object and the projection area is determined by the ultrasonic pulse signal, no camera is required: the touch action can be determined quickly and accurately while the power consumption of the projection device is effectively reduced, and precise touch is possible even when the projection area is only 10 cm from the projection device, greatly improving the accuracy of projection touch.
Brief Description of the Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
FIG. 1 is a flowchart of a first projection touch method provided by an embodiment of the present invention.
FIG. 2 is a schematic diagram of a projection touch apparatus provided by an embodiment of the present invention.
FIG. 3 is a schematic diagram of a projection touch device provided by an embodiment of the present invention.
FIG. 4 is a schematic diagram of difference images provided by an embodiment of the present invention.
FIG. 5 is a schematic diagram of projection touch provided by an embodiment of the present invention.
FIG. 6 is a schematic diagram of a depth image provided by an embodiment of the present invention.
FIG. 7 is a schematic diagram of touch actions provided by an embodiment of the present invention.
FIG. 8 is a schematic diagram of touch tracks provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
To improve the accuracy of projection touch and reduce the power consumption of projection devices, embodiments of the present invention provide a projection touch method, apparatus and device.
A projection touch method provided by an embodiment of the present invention is introduced first.
It should first be noted that the projection touch method provided by the present invention is applied to a scanning projection device. The scanning projection device projects a visible laser in a scanning manner to form a projected image on a physical plane, thereby forming a projection area; when an invisible laser is projected in a scanning manner onto this projection area, the physical plane also forms an invisible laser region, and the invisible laser region formed by the invisible laser covers the projection area. The visible and invisible lasers are combined by optical design so that each pixel of the projected image displayed in the projection area is superimposed with an invisible laser element in addition to its visible laser element; that is, each pixel of the projected image contains both a visible laser part and an invisible laser part. The visible laser may be visible laser light of the three colors R, G and B, or of any single color, which is not specifically limited herein.
As shown in FIG. 1, a projection touch method includes the following steps:
S101: receiving the reflected light of the invisible laser and determining the intensity of the reflected light, and receiving the reflected signal of the ultrasonic pulse signal and determining the intensity of the reflected signal.
When the projected invisible laser reaches the physical plane and is reflected by it, the reflected light of the invisible laser is formed. After receiving this reflected light, the scanning projection device can determine its intensity. Since each pixel of the projected image contains both a visible laser part and an invisible laser part, the reflected light corresponding to each pixel corresponds to the visible laser contained in that pixel; that is, the reflected light corresponds one-to-one with the pixels containing visible laser. The invisible laser is preferably an infrared laser.
It should be noted that this reflected light is the invisible laser reflected by the entire projection area on the physical plane, not the reflected light of a particular position. It should further be noted that the intensity of the reflected light may be determined in any existing manner, which is not specifically limited herein.
Similarly, after the ultrasonic pulse signal is transmitted to the projection area, a reflected signal is formed when the ultrasonic pulse signal meets the physical plane; the scanning projection device can then receive this reflected signal and determine its intensity. It should be noted that this reflected signal comprises all reflections of the ultrasonic pulse signal. For example, when the projection area is the back of a human hand, the ultrasonic pulse signal is reflected by the skin on the surface of the back of the hand, but it also passes through the skin and is reflected by blood vessels and other structures in the body; therefore, for a more accurate calculation, all reflections of the ultrasonic pulse signal need to be received. Of course, a reasonable receiving time can also be set based on experimental experience, and reflected signals arriving after that time are no longer received.
S102: when the intensity of the reflected light changes, determining the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change.
When the intensity of the reflected light changes, a touch object may have entered the projection area. Since the touch object is generally closer to the scanning projection device than the projection area is, the reflected light of the invisible laser projected onto the touch object is stronger than that of the invisible laser projected onto the projection area.
A specific way of determining the touch position coordinates of the touch object according to the intensity of the reflected light before and after the change may be:
determining the reflected light intensity information image before the change according to the intensity of the reflected light before the change, and determining the reflected light intensity information image after the change according to the intensity of the reflected light after the change;
performing differential processing on the reflected light intensity information image before the change and the reflected light intensity information image after the change to obtain a reflected light intensity information image of the touch object;
detecting the reflected light intensity information image of the touch object by a tip detection algorithm to determine the touch position coordinates of the touch object.
Specifically, the reflected light intensity information images before and after the change can be determined from the intensity of the reflected light before and after the change, respectively. Since the reflected light of the invisible laser projected onto the touch object is stronger than that projected onto the projection area, differential processing of the two images yields the reflected light intensity information image of the touch object. This differential processing compares the images before and after the change and may use, for example, an inter-frame difference algorithm or a background difference algorithm. As shown in FIG. 4, if the touch object is a finger, differential processing may yield the several typical reflected light intensity information images of touch objects shown in FIG. 4.
It should be noted that the reflected light intensity information images of touch objects shown in FIG. 4 are only some common examples that may be obtained in the present invention, not all the reflected light intensity information images of touch objects that may be obtained.
Since each pixel of the projected image contains both a visible laser part and an invisible laser part, once the reflected light intensity information image of the touch object is obtained, the region of the projected image blocked by the touch object can be obtained. Since a touch object (such as a finger or a stylus) generally operates with its tip, the touch position coordinates of the tip of a common strip-shaped or rod-shaped touch object can easily be found by simple image processing methods such as a tip detection algorithm.
As shown in FIG. 5, the physical plane on which the projection area lies is defined as the xy plane, so the above method determines the touch position coordinates (x, y) of the tip of the touch object (a finger). It should be noted that the direction of the z axis is the direction of the line connecting the physical plane of the projection area and the scanning projection device, i.e. the direction of the reflected light of the invisible laser. The determination of the coordinate z is explained in subsequent steps.
It should be noted that the touch position coordinates may also be coordinates in the reflected light intensity information image; since the coordinates in that image correspond one-to-one with coordinates in the projection area, determining the coordinates in the image also determines the position of the touch object in the projection area.
It should further be noted that the above image difference algorithms and tip detection algorithm are image processing methods commonly used in the art and are not described in detail here.
S103: within the target time period, obtaining the touch track formed by the determined touch position coordinates, and determining the distance between the touch object and the projection area according to the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the received reflected signal of the ultrasonic pulse signal.
The target time period is a time period of a preset duration extending back from the current moment. It can generally be set according to actual needs, for example to 150 milliseconds or 200 milliseconds, which is not specifically limited herein.
In practical applications the touch object is often close to the projection area, especially in short-range projection; for example, when the wrist projection device of FIG. 5 projects onto the back of the hand, the projection device is only a dozen or so centimeters from the projection area, and the touch object is even closer to it. Existing approaches such as camera capture cannot accurately determine the distance between the touch object and the projection area in this situation. Since ultrasonic pulse signals can measure short distances, to determine this distance more accurately it can be computed from the time of transmitting the ultrasonic pulse signal and the time of receiving the reflected signal.
Specifically, the touch track may be determined by:
obtaining the moving path and moving direction of the touch object according to the touch position coordinates determined within the target time period;
determining the touch track according to the moving path and the moving direction.
Specifically, several touch position coordinates (x, y) can be determined within the target time period, and the moving path and moving direction of the touch object can be obtained from them. It can be understood that the touch position coordinates at every two adjacent moments determine the moving path and moving direction of the touch object in the time period between those moments, so the moving paths and directions at all moments determined within the target time period determine a touch track in the xy plane.
Further, while the touch track is being determined, since the scanning projection device receives the reflected signal of the ultrasonic pulse signal in real time, the distance between the touch object and the projection area can be determined from the time of transmitting the ultrasonic pulse signal and the time sequence and intensity of the received reflected signal. The distance may be determined by the following steps:
determining the propagation speed of the ultrasonic pulse signal in the current period according to the current operating temperature;
determining the surface reflection signal of the surface of the projection area by an ultrasonic echo processing algorithm according to the intensity of the reflected signal of the ultrasonic pulse signal;
calculating the second propagation time sequence of the ultrasonic wave according to the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal;
determining the second depth information image of the projection area according to the second propagation time sequence and the propagation speed;
performing differential processing on the first depth information image of the projection area determined in the previous period and the second depth information image to obtain a third depth information image;
detecting the third depth information image by a tip detection algorithm to determine the distance between the touch object and the projection area.
Since the propagation speed of the ultrasonic pulse signal is affected by temperature, to ensure the accuracy of the data the propagation speed of the ultrasonic pulse signal in the current period can first be determined according to the current operating temperature. It should be noted that the correspondence between the propagation speed of ultrasonic signals and temperature is common knowledge and is not described in detail here.
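The temperature correction mentioned here can be sketched with the standard linear approximation for the speed of sound in air; this particular formula is an assumption on our part, since the text only says the speed depends on temperature and treats the relation as common knowledge.

```python
# Common linear approximation (an assumption, not from the patent):
# speed of sound in air v ~= 331.3 + 0.606 * T (m/s), T in degrees Celsius.

def sound_speed(temp_c):
    return 331.3 + 0.606 * temp_c
```

A temperature sensor supplies the current operating temperature, and the resulting speed is then used in the d = v·t/2 time-of-flight calculation for the current period.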
Further, since the distance between the touch object and the scanning projection device differs from the distance between the projection area and the scanning projection device, the two reflect the ultrasonic pulse signal at different moments, and their reflected signals are therefore received at different times. It should be noted that these distances refer to distances along the line determined by the transmitting position of the ultrasonic pulse signal and the touch object, i.e. the distance along the z axis in FIG. 5, not the distance along the normal of the plane of the projection area.
Since both the surface and the interior of the projection area reflect the ultrasonic pulse signal, the received reflected signal includes reflections from both. Obviously, reflections from the interior of the projection area cannot be used to calculate the distance between the touch object and the projection area. For example, when the projection area is the back of a human hand, the ultrasonic pulse signal is reflected by the skin on the surface of the back of the hand, but it also passes through the skin and is reflected by blood vessels and other structures, whose reflections cannot be used for the distance calculation. Therefore, the surface reflection signal of the surface of the projection area needs to be determined from the intensity of the reflected signal of the ultrasonic pulse signal by an ultrasonic echo processing algorithm.
It should be noted that if a touch object is present in the projection area in the current period, the reflection signal of the surface of the touch object can still be determined from the intensity of the surface reflection signal; for example, if the touch object is a finger, the reflection signal of the surface of the finger can be determined by the ultrasonic echo processing algorithm. In this case, the determined surface reflection signal of the projection area consists of the reflection signal of the surface of the touch object together with the surface reflection signal of the projection area excluding the part blocked by the touch object.
After determining which of the reflected signals received in the current period are surface reflection signals, the second propagation time sequence of the ultrasonic wave can be calculated from the time of transmitting the ultrasonic pulse signal and the time sequence of receiving the surface reflection signal.
需要说明的是,第二传播时间序列不是投影区域中某一个点对应的一个时间,而是当前周期整个投影区域对应的发射超声波脉冲信号的时间以及接收表面反射信号的时间的时间差,由于表面反射信号是整个投影区域的表面反射信号,而投影区域可能是斜面,也可能是凹凸不平的表面,所以接收到该表面反射信号的时间是不同的,是一系列不同的时间,也就是上述时间序列。
计算出当前周期超声波的第二传播时间序列后,就可以根据飞行时间算法(Time Of Flight,TOF)和超声波成像技术确定出投影区域的第二景深信息图像。
具体的,计算公式可以为:d2=v2t2/2,其中,v2为当前温度对应的当前周期超声波脉冲信号的传播速度,t2为第二传播时间序列。可以理解的是,由于t2是一系列不同的时间,所以计算得到的第二景深信息d2也是对应于整个投影区域的一系列不同的景深值信息。
为了方便计算以及后续步骤的进行,可以利用超声波成像技术根据该第二景深信息得出第二景深信息图像,例如图5中(2)所示,图中颜色越深表明与扫描式投影设备的距离越近,从图中可以看出,该第一景深信息图像为投影到一个斜面上对应的景深信息图像,例如腕式投影设备投影到手背上得到的第二景深信息图像。可以看出,该第二景深信息图像中右侧出现明显的触控物的景深信息,由于触控物与扫描式投影设备的距离相对于投影区域与扫描式投影设备的距离更近,所以触控物对应的第二景深信息图像中区域的颜色更深。需要强调的是,图5中(2)所示的第一景深信息图像只是本发明提供的一种可能的情况,并不能构成对本发明所提供的第二景深信息图像的限定。
同理的,可以根据上述方式确定上一周期的投影区域的第一景深信息图像,具体计算公式可以为:d1=v1t1/2,其中,v1为上一周期超声波脉冲信号的传播速度,t1为上一周期发射超声波脉冲信号的时间、接收表面反射信号的时间序列的时间差,即第一传播时间序列。可以理解的是,由于t1也是一系列不同的时间,所以计算得到的第一景深信息d1也是对应于整个投影区域的一系列不同的景深值信息。根据该第一景深信息可以确定第一景深信息图像,由于具体确定方式与第二景深信息图像的确定方式相同,所以在此不再进行赘述。
举例而言,图5中(1)所示为第一景深信息图像,从图中可以看出,该第一景深信息图像为与图5中(2)对应的、没有触控物时投影到一个斜面上对应的景深信息图像,例如腕式投影设备投影到手背上得到的。需要强调的是,图5中(1)所示的第一景深信息图像只是本发明提供的一种可能的情况,并不能构成对本发明所提供的第一景深信息图像的限定。
得到上述第一景深信息图像及第二景深信息图像后,将二者相减即通过差分处理算法便可以得到触控物与投影区域之间的距离对应的第三景深信息图像。
具体的,若得到的第三景深信息图像中的景深信息为0,说明当前周期内没有触控物进入投影区域,那么继续接收下一周期的反射信号,重复上述过程即可。若将二者通过差分处理得到的第三景深信息图像中有触控物的景深信息存在,那么得到的距离是整个触控物与投影区域之间的距离,而不是触控物上某一点与投影区域之间的距离,由于一般触控物(例如手指、触控笔等)在触控时都是使用尖端进行操作,所以此时可以通过尖端检测算法从第三景深信息图像中检测出触控物的尖端附近位置,从而得到触控物的尖端附近位置与投影区域之间的距离。
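上述差分处理与尖端检测的思路可以用如下简化的 Python 代码示意(以差分非零区域中的最小差值近似尖端与投影区域之间的距离;矩阵数值均为假设,并非本发明的具体实现):

```python
def touch_distance(prev_depth, curr_depth, eps=1e-6):
    """差分处理:第三景深信息 = 第一景深信息 - 第二景深信息。
    差分结果全为 0 表示当前周期没有触控物进入投影区域(返回 None);
    否则在非零区域内取最小差值,近似触控物尖端与投影区域之间的距离
    (按实际需要亦可取最大值或平均值)。"""
    diffs = [a - b
             for row_prev, row_curr in zip(prev_depth, curr_depth)
             for a, b in zip(row_prev, row_curr)
             if abs(a - b) > eps]
    return min(diffs) if diffs else None

prev = [[0.30, 0.30], [0.30, 0.30]]   # 第一景深信息图像(上一周期,无触控物)
curr = [[0.30, 0.30], [0.29, 0.28]]   # 第二景深信息图像(当前周期,手指进入)
d = touch_distance(prev, curr)        # 约 0.01 米,即尖端距投影面约 1 厘米
```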
需要说明的是,由于触控物的尖端附近位置与投影区域之间可能存在多个距离,且该多个距离可能不相等,所以可以根据实际投影情况以及操控方式,将这些距离中的最小值、最大值或所有距离的平均值作为触控物与投影区域之间的距离,在此不做具体限定。可以理解的是,确定了该距离后,也就确定了图5中触控物的坐标z。
进一步需要说明的是,在目标时间段内,可能有多个周期对应的第三景深信息图像中都存在触控物的景深信息,那么可以将该多个周期得到的触控物与投影区域之间的距离的平均值或最小值或最大值等作为最终的触控物与投影区域之间的距离,在此不做具体限定。
S104,判断所述距离是否小于预设的阈值,如果是,执行步骤S105;
为了避免用户误操作对投影触控的准确性造成不良影响,计算出触控物与投影区域之间的距离后,可以将该距离与该预设的阈值进行比较以判定触控动作是否有效,再根据比较结果进行触控操作。
其中,该预设的阈值可以根据实际操作需要及用户习惯进行设定,由于一般触控物本身是具有一定厚度的,所以可以进一步根据所使用的触控物进行设定,例如,可以设置为1厘米、1.5厘米等,在此不做具体限定。当然也可以针对不同的触控轨迹设定不同的阈值,以达到更加精准触控的目的。
S105,响应所述触控轨迹对应的触控动作。
当判断出该距离小于该预设的阈值时,说明该触控轨迹对应的触控动作是有效的,那么便可以响应该触控动作。该触控动作可能为点击、滑动、长按以及双击等触控动作,在此不做具体限定。
而当判断出该距离不小于该预设的阈值时,说明该触控轨迹对应的触控动作是无效的,很可能是用户的误操作,那么便不进行任何操作,继续执行步骤S101即可。
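步骤S104与S105的判定逻辑可以用如下 Python 代码示意(阈值取 0.01 米即 1 厘米,仅为文中示例取值,并非固定参数):

```python
def handle_touch(distance_m, threshold_m=0.01):
    """将触控物与投影区域之间的距离与预设的阈值比较:
    小于阈值则判定触控动作有效并予以响应(S105),
    否则视为可能的误操作,不进行任何操作,继续下一周期检测。"""
    if distance_m < threshold_m:
        return "respond"   # 响应触控轨迹对应的触控动作
    return "ignore"        # 忽略,继续执行步骤S101

# 对应图7中的例子:0.8 厘米小于阈值 1 厘米为有效动作,2.5 厘米为无效动作
```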
举例而言,如图7所示,图中701为腕式投影设备,702为投射可见激光及不可见激光的装置,703为反射光接收装置,704为触控物(手指),706为超声波脉冲信号发射及接收装置。可以看出,图7中(1)和(2)中触控物与投影区域的距离不同,在(1)中,触控物704的尖端即手指尖与投影区域之间的距离很近,通过计算得出为0.8厘米。而在(2)中,触控物704的尖端即手指尖与投影区域之间的距离较远,通过计算得出为2.5厘米,如果预设的阈值为1厘米,由于0.8厘米小于1厘米,而2.5厘米大于1厘米,所以(1)中触控物704对应的触控动作为有效动作,响应该触控动作,而(2)中触控物704对应的触控动作为无效动作,不进行任何操作。显然,按照手指点击的习惯动作,(2)中触控物704实际点击到投影区域上的位置应该是图中手背上的阴影部分,如果不对触控物704的尖端与投影区域之间的距离和预设的阈值进行判断,认为其为有效动作,直接将(2)中触控物704的触控位置坐标作为点击动作的位置,进行点击动作,那么执行的触控动作就是错误的。
可见,本方案中,首先接收不可见激光的反射光,确定反射光的强度,并接收超声波脉冲信号的反射信号,当反射光的强度发生变化时,根据发生变化前后反射光的强度,确定触控物的触控位置坐标,然后获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并确定触控物与所述投影区域之间的距离,最后判断该距离是否小于预设的阈值,如果是,则响应所述触控轨迹对应的触控动作。可见,由于采用反射光的强度确定触控物的触控位置坐标,通过超声波脉冲信号确定触控物与投影区域的距离,不需要摄像头拍摄,可以快速精确地确定触控动作,同时有效降低了投影设备的耗电量,即使在投影区域与投影设备距离仅为10厘米时也可以进行精准触控,大大提高了投影触控的准确度。
下面以点击触控动作以及滑动触控动作为例,对本发明所提供的一种投影触控方法进行详细说明。
实施例1
一种投影触控方法,应用于扫描式投影设备,可以包括以下步骤:
接收所述不可见激光的反射光,确定所述反射光的强度,并接收所述超声波脉冲信号的反射信号;
当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标;
获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并根据所述目标时间段内发射超声波脉冲信号的时间、接收反射信号的时间,确定所述触控物与所述投影区域之间的距离;
具体的,在目标时间段内所确定的触控位置坐标形成的触控轨迹为一个点,如图8(1)所示。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的最小值为该触控物与所述投影区域之间的距离。
判断所述距离是否小于预设的阈值;
如果是,响应所述点击触控动作。
实施例2
与实施例1的区别在于:
在目标时间段内所确定的触控位置坐标形成的触控轨迹为在设定区域内的一条直线,如图8(2)所示,该设定区域可以根据用户操作习惯进行设定,例如可以为以10个像素为直径的圆形区域,或者边长为15个像素的正方形区域等,在此不做具体限定。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的平均值为该触控物与所述投影区域之间的距离。
实施例3
与实施例1的区别在于:
在目标时间段内所确定的触控位置坐标形成的触控轨迹为在设定区域内的一条曲线,如图8(3)所示,该设定区域可以根据用户操作习惯进行设定,例如可以为以10个像素为直径的圆形区域,或者边长为15个像素的正方形区域等,在此不做具体限定。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的最大值为该触控物与所述投影区域之间的距离。
实施例4
一种投影触控方法,应用于扫描式投影设备,可以包括以下步骤:
接收所述不可见激光的反射光,确定所述反射光的强度,并接收所述超声波脉冲信号的反射信号;
当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标;
获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并根据所述目标时间段内发射超声波脉冲信号的时间、接收反射信号的时间,确定所述触控物与所述投影区域之间的距离;
具体的,在目标时间段内所确定的触控位置坐标形成的触控轨迹为一条直线,如图8(4)所示,且该直线上的触控位置坐标中y值的范围不超过预设值。需要说明的是,该预设值可以根据用户操作习惯进行设定,例如可以为100个像素、150个像素等,在此不做具体限定。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的最小值为该触控物与所述投影区域之间的距离。
判断所述距离是否小于预设的阈值;
如果是,响应所述滑动触控动作。
其中,以该触控轨迹的开始点的位置坐标与终止点的位置坐标的连线方向作为该滑动触控动作的滑动方向。
实施例5
与实施例4的区别在于:
在目标时间段内所确定的触控位置坐标形成的触控轨迹为一条折线,如图8(5)所示,且该折线上的触控位置坐标中y值的范围不超过预设值。需要说明的是,该预设值可以根据用户操作习惯进行设定,例如可以为100个像素、150个像素等,在此不做具体限定。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的最大值为该触控物与所述投影区域之间的距离。
实施例6
与实施例4的区别在于:
在目标时间段内所确定的触控位置坐标形成的触控轨迹为一条曲线,如图8(6)所示,且该曲线上的触控位置坐标中y值的范围不超过预设值。需要说明的是,该预设值可以根据用户操作习惯进行设定,例如可以为100个像素、150个像素等,在此不做具体限定。
进一步的,确定该触控物的尖端位置与投影区域之间的距离的平均值为该触控物与所述投影区域之间的距离。
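实施例1~6中按触控轨迹形状区分点击与滑动的思路,可以用如下简化的 Python 代码示意(设定区域与预设值取文中示例数值;分类规则为便于理解的粗略假设,并非本发明的完整判定逻辑):

```python
def classify_gesture(points, region_radius=5, max_y_range=100):
    """按实施例1~6的思路对触控轨迹粗分类:
    轨迹收敛在一个小的设定区域内(如直径10像素的圆形区域)判为点击,
    y 值范围不超过预设值(如100像素)的较长轨迹判为滑动,
    其余暂不识别。"""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_range, y_range = max(xs) - min(xs), max(ys) - min(ys)
    if x_range <= 2 * region_radius and y_range <= 2 * region_radius:
        return "click"   # 点击触控动作(图8(1)~(3))
    if y_range <= max_y_range:
        return "slide"   # 滑动触控动作(图8(4)~(6))
    return "unknown"

g1 = classify_gesture([(100, 100), (102, 101), (101, 99)])   # 小区域内的轨迹
g2 = classify_gesture([(0, 50), (80, 60), (200, 55)])        # y 范围受限的长轨迹
```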
相应于上述方法实施例,本发明实施例还提供了投影触控装置,下面对本发明实施例所提供的一种投影触控装置进行介绍。
首先需要说明的是,本发明提供的一种投影触控装置应用于扫描式投影设备,该扫描式投影设备投射不可见激光并发射超声波脉冲信号至投影区域,其中,该不可见激光形成的不可见激光区域覆盖投影区域。
如图2所示,一种投影触控装置,应用于扫描式投影设备,可以包括:
接收模块210,用于接收所述不可见激光的反射光,确定所述反射光的强度,并接收所述超声波脉冲信号的反射信号,确定所述反射信号的强度;
第一确定模块220,用于当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标;
第二确定模块230,用于在目标时间段内,获得所确定的触控位置坐标形成的触控轨迹;
其中,所述目标时间段为:从当前时刻开始向前预设时长的时间段。
第三确定模块240,用于根据所述目标时间段内发射超声波脉冲信号的时间、接收所述超声波脉冲信号的反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离;
判断模块250,用于判断所述距离是否小于预设的阈值;
响应模块260,用于在所述距离小于预设的阈值时响应所述触控轨迹对应的触控动作。
可见,本方案中,首先接收不可见激光的反射光,确定反射光的强度,并接收超声波脉冲信号的反射信号,当反射光的强度发生变化时,根据发生变化前后反射光的强度,确定触控物的触控位置坐标,然后获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并确定触控物与所述投影区域之间的距离,最后判断该距离是否小于预设的阈值,如果是,则响应所述触控轨迹对应的触控动作。可见,由于采用反射光的强度确定触控物的触控位置坐标,通过超声波脉冲信号确定触控物与投影区域的距离,不需要摄像头拍摄,可以快速精确地确定触控动作,同时有效降低了投影设备的耗电量,即使在投影区域与投影设备距离仅为10厘米时也可以进行精准触控,大大提高了投影触控的准确度。
优选的,所述不可见激光为红外激光。
具体的,所述第一确定模块220可以包括:
第一强度信息图像确定单元,用于根据发生变化前的所述反射光的强度确定发生变化前的反射光强度信息图像,根据发生变化后所述反射光的强度确定发生变化后的反射光强度信息图像;
第二强度信息图像确定单元,用于将所述发生变化前的反射光强度信息图像与所述发生变化后的反射光强度信息图像进行差分处理,得到触控物的反射光强度信息图像;
触控位置坐标确定单元,用于通过尖端检测算法对所述触控物的反射光强度信息图像进行检测,确定所述触控物的触控位置坐标。
具体的,所述第二确定模块230可以包括:
路径和方向确定单元,用于根据目标时间段内所确定的触控位置坐标,获得所述触控物的移动路径和移动方向;
触控轨迹确定单元,用于根据所述移动路径和所述移动方向,确定触控轨迹。
具体的,所述第三确定模块240可以包括:
传播速度确定单元,用于根据当前工作温度确定当前周期超声波脉冲信号的传播速度;
反射信号确定单元,用于根据所述超声波脉冲信号的反射信号的强度,通过超声回波处理算法确定所述投影区域表面的表面反射信号;
传播时间确定单元,用于根据发射所述超声波脉冲信号的时间、接收所述表面反射信号的时间序列,计算所述超声波的第二传播时间序列;
景深信息确定单元,用于根据所述第二传播时间序列及所述传播速度,确定所述投影区域的第二景深信息图像;
差分处理单元,用于将上一周期确定的所述投影区域的第一景深信息图像与所述第二景深信息图像进行差分处理,得到第三景深信息图像;
距离确定单元,用于通过尖端检测算法对所述第三景深信息图像进行检测,确定所述触控物与所述投影区域之间的距离。
本发明实施例还提供了投影触控设备,下面对本发明实施例所提供的一种投影触控设备进行介绍。
如图3所示,一种投影触控设备,包括:激光投射装置310、超声波脉冲信号发射装置320、反射激光接收装置330、反射超声波脉冲信号接收装置340和数据处理装置350;其中,
激光投射装置310,用于以扫描方式投射可见激光至投影区域形成投影画面,并以扫描方式投射不可见激光至所述投影区域形成不可见激光区域;
其中,所述不可见激光区域覆盖所述投影区域。
超声波脉冲信号发射装置320,用于发射超声波脉冲信号至所述投影区域;
反射激光接收装置330,用于接收所述不可见激光的反射光,确定所述反射光的强度,并将所述反射光的强度发送至所述数据处理装置350;
反射超声波脉冲信号接收装置340,用于接收所述超声波脉冲信号的反射信号,确定所述反射信号的强度,并获得发射所述超声波脉冲信号的时间和接收所述反射信号的时间序列,并将所获得的时间序列及所述反射信号的强度发送至所述数据处理装置350;
数据处理装置350,用于接收所述反射激光接收装置发送的所述反射光强度,接收所述反射超声波脉冲信号接收装置发送的所述时间序列及反射信号的强度,当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标,获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并根据所述目标时间段内发射超声波脉冲信号的时间、接收反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离,并判断所述距离是否小于预设的阈值,如果是,则响应所述触控轨迹对应的触控动作。
其中,所述目标时间段为:从当前时刻开始向前预设时长的时间段。
可见,本设备中,激光投射装置投射不可见激光至投影区域,超声波脉冲信号发射装置发射超声波脉冲信号至投影区域,反射激光接收装置接收不可见激光的反射光,反射超声波脉冲信号接收装置接收超声波脉冲信号的反射信号,数据处理装置在反射光的强度发生变化时,根据发生变化前后反射光的强度,确定触控物的触控位置坐标,然后获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并确定触控物与所述投影区域之间的距离,判断该距离是否小于预设的阈值,如果是,则响应所述触控轨迹对应的触控动作。可见,由于采用反射光的强度确定触控物的触控位置坐标,通过超声波脉冲信号确定触控物与投影区域的距离,不需要摄像头拍摄,可以快速精确地确定触控动作,同时有效降低了投影设备的耗电量,即使在投影区域与投影设备距离仅为10厘米时也可以进行精准触控,大大提高了投影触控的准确度。
优选的,所述不可见激光为红外激光。
具体的,所述数据处理装置350用于:
根据发生变化前的所述反射光的强度确定发生变化前的反射光强度信息图像,根据发生变化后所述反射光的强度确定发生变化后的反射光强度信息图像;将所述发生变化前的反射光强度信息图像与所述发生变化后的反射光强度信息图像进行差分处理,得到触控物的反射光强度信息图像;通过尖端检测算法对所述触控物的反射光强度信息图像进行检测,确定所述触控物的触控位置坐标。
进一步的,所述数据处理装置350用于:
根据目标时间段内所确定的触控位置坐标,获得所述触控物的移动路径和移动方向;根据所述移动路径和所述移动方向,确定触控轨迹。
更进一步的,所述投影触控设备还可以包括:
温度传感器,用于测量当前工作温度,并将所述当前工作温度发送至所述数据处理装置350;
相应的,所述数据处理装置350,具体用于接收所述温度传感器发送的所述当前工作温度,并根据当前工作温度确定当前周期超声波脉冲信号的传播速度;根据所述超声波脉冲信号的反射信号的强度,通过超声回波处理算法确定所述投影区域表面的表面反射信号;根据发射所述超声波脉冲信号的时间、接收所述表面反射信号的时间序列,计算所述超声波的第二传播时间序列;根据所述第二传播时间序列及所述传播速度,确定所述投影区域的第二景深信息图像;将上一周期确定的所述投影区域的第一景深信息图像与所述第二景深信息图像进行差分处理,得到第三景深信息图像;通过尖端检测算法对所述第三景深信息图像进行检测,确定所述触控物与所述投影区域之间的距离。
需要说明的是,数据处理装置350可以发送第一控制信号至激光投射装置310,以控制激光投射装置310投射可见激光至投影区域形成投影画面,并投射不可见激光至所述投影区域形成不可见激光区域。还可以发送第二控制信号至超声波脉冲信号发射装置320,以控制超声波脉冲信号发射装置320发射超声波脉冲信号至所述投影区域。当然,激光投射装置310及超声波脉冲信号发射装置320也可以在该投影触控设备启动的同时便开始动作,这都是合理的。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
本说明书中的各个实施例均采用相关的方式描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于装置实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。
本领域普通技术人员可以理解实现上述方法实施方式中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,所述的程序可以存储于计算机可读取存储介质中,这里所称的存储介质,如:ROM/RAM、磁碟、光盘等。
以上所述仅为本发明的较佳实施例而已,并非用于限定本发明的保护范围。凡在本发明的精神和原则之内所作的任何修改、等同替换、改进等,均包含在本发明的保护范围内。

Claims (15)

  1. 一种投影触控方法,应用于扫描式投影设备,其特征在于,所述扫描式投影设备投射不可见激光并发射超声波脉冲信号至投影区域,其中,所述不可见激光形成的不可见激光区域覆盖所述投影区域,所述方法包括:
    接收所述不可见激光的反射光,确定所述反射光的强度,并接收所述超声波脉冲信号的反射信号,确定所述反射信号的强度;
    当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标;
    在目标时间段内,获得所确定的触控位置坐标形成的触控轨迹,并根据发射超声波脉冲信号的时间、接收所述超声波脉冲信号的反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离,其中,所述目标时间段为:从当前时刻开始向前预设时长的时间段;
    判断所述距离是否小于预设的阈值;
    如果是,则响应所述触控轨迹对应的触控动作。
  2. 如权利要求1所述的方法,其特征在于,所述根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标,包括:
    根据发生变化前所述反射光的强度确定发生变化前的反射光强度信息图像,根据发生变化后所述反射光的强度确定发生变化后的反射光强度信息图像;
    将所述发生变化前的反射光强度信息图像与所述发生变化后的反射光强度信息图像进行差分处理,得到触控物的反射光强度信息图像;
    通过尖端检测算法对所述触控物的反射光强度信息图像进行检测,确定所述触控物的触控位置坐标。
  3. 如权利要求1所述的方法,其特征在于,所述在目标时间段内,获得所确定的触控位置坐标形成的触控轨迹,包括:
    根据目标时间段内所确定的触控位置坐标,获得所述触控物的移动路径和移动方向;
    根据所述移动路径和所述移动方向,确定触控轨迹。
  4. 如权利要求1所述的方法,其特征在于,所述根据发射超声波脉冲信号的时间、接收所述超声波脉冲信号的反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离,包括:
    根据当前工作温度确定当前周期超声波脉冲信号的传播速度;
    根据所述超声波脉冲信号的反射信号的强度,通过超声回波处理算法确定所述投影区域表面的表面反射信号;
    根据发射所述超声波脉冲信号的时间、接收所述表面反射信号的时间序列,计算所述超声波的第二传播时间序列;
    根据所述第二传播时间序列及所述传播速度,确定所述投影区域的第二景深信息图像;
    将上一周期确定的所述投影区域的第一景深信息图像与所述第二景深信息图像进行差分处理,得到第三景深信息图像;
    通过尖端检测算法对所述第三景深信息图像进行检测,确定所述触控物与所述投影区域之间的距离。
  5. 如权利要求1所述的方法,其特征在于,所述不可见激光为红外激光。
  6. 一种投影触控装置,应用于扫描式投影设备,其特征在于,所述扫描式投影设备投射不可见激光并发射超声波脉冲信号至投影区域,其中,所述不可见激光形成的不可见激光区域覆盖所述投影区域,所述装置包括:
    接收模块,用于接收所述不可见激光的反射光,确定所述反射光的强度,并接收所述超声波脉冲信号的反射信号,确定所述反射信号的强度;
    第一确定模块,用于当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标;
    第二确定模块,用于在目标时间段内,获得所确定的触控位置坐标形成的触控轨迹,其中,所述目标时间段为:从当前时刻开始向前预设时长的时间段;
    第三确定模块,用于根据所述目标时间段内发射超声波脉冲信号的时间、接收所述超声波脉冲信号的反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离;
    判断模块,用于判断所述距离是否小于预设的阈值;
    响应模块,用于在所述距离小于预设的阈值时响应所述触控轨迹对应的触控动作。
  7. 如权利要求6所述的装置,其特征在于,所述第一确定模块包括:
    第一强度信息图像确定单元,用于根据发生变化前的所述反射光的强度确定发生变化前的反射光强度信息图像,根据发生变化后所述反射光的强度确定发生变化后的反射光强度信息图像;
    第二强度信息图像确定单元,用于将所述发生变化前的反射光强度信息图像与所述发生变化后的反射光强度信息图像进行差分处理,得到触控物的反射光强度信息图像;
    触控位置坐标确定单元,用于通过尖端检测算法对所述触控物的反射光强度信息图像进行检测,确定所述触控物的触控位置坐标。
  8. 如权利要求6所述的装置,其特征在于,所述第二确定模块包括:
    路径和方向确定单元,用于根据目标时间段内所确定的触控位置坐标,获得所述触控物的移动路径和移动方向;
    触控轨迹确定单元,用于根据所述移动路径和所述移动方向,确定触控轨迹。
  9. 如权利要求6所述的装置,其特征在于,所述第三确定模块包括:
    传播速度确定单元,用于根据当前工作温度确定当前周期超声波脉冲信号的传播速度;
    反射信号确定单元,用于根据所述超声波脉冲信号的反射信号的强度,通过超声回波处理算法确定所述投影区域表面的表面反射信号;
    传播时间确定单元,用于根据发射所述超声波脉冲信号的时间、接收所述表面反射信号的时间序列,计算所述超声波的第二传播时间序列;
    景深信息确定单元,用于根据所述第二传播时间序列及所述传播速度,确定所述投影区域的第二景深信息图像;
    差分处理单元,用于将上一周期确定的所述投影区域的第一景深信息图像与所述第二景深信息图像进行差分处理,得到第三景深信息图像;
    距离确定单元,用于通过尖端检测算法对所述第三景深信息图像进行检测,确定所述触控物与所述投影区域之间的距离。
  10. 如权利要求6所述的装置,其特征在于,所述不可见激光为红外激光。
  11. 一种投影触控设备,其特征在于,包括:激光投射装置、超声波脉冲信号发射装置、反射激光接收装置、反射超声波脉冲信号接收装置和数据处理装置;其中,
    所述激光投射装置,用于以扫描方式投射可见激光至投影区域形成投影画面,并以扫描方式投射不可见激光至所述投影区域形成不可见激光区域,其中,所述不可见激光区域覆盖所述投影区域;
    所述超声波脉冲信号发射装置,用于发射超声波脉冲信号至所述投影区域;
    所述反射激光接收装置,用于接收所述不可见激光的反射光,确定所述反射光的强度,并将所述反射光的强度发送至所述数据处理装置;
    所述反射超声波脉冲信号接收装置,用于接收所述超声波脉冲信号的反射信号,确定所述反射信号的强度,并获得发射所述超声波脉冲信号的时间和接收所述反射信号的时间序列,并将所获得的时间序列及所述反射信号的强度发送至所述数据处理装置;
    所述数据处理装置,用于接收所述反射激光接收装置发送的所述反射光强度,接收所述反射超声波脉冲信号接收装置发送的所述时间序列及反射信号的强度,当所述反射光的强度发生变化时,根据发生变化前后所述反射光的强度,确定触控物的触控位置坐标,获得目标时间段内所确定的触控位置坐标形成的触控轨迹,并根据所述目标时间段内发射超声波脉冲信号的时间、接收反射信号的时间序列及强度,确定所述触控物与所述投影区域之间的距离,并判断所述距离是否小于预设的阈值,如果是,则响应所述触控轨迹对应的触控动作,其中,所述目标时间段为:从当前时刻开始向前预设时长的时间段。
  12. 如权利要求11所述的设备,其特征在于,
    所述数据处理装置,具体用于根据发生变化前的所述反射光的强度确定发生变化前的反射光强度信息图像,根据发生变化后所述反射光的强度确定发生变化后的反射光强度信息图像;将所述发生变化前的反射光强度信息图像与所述发生变化后的反射光强度信息图像进行差分处理,得到触控物的反射光强度信息图像;通过尖端检测算法对所述触控物的反射光强度信息图像进行检测,确定所述触控物的触控位置坐标。
  13. 如权利要求11所述的设备,其特征在于,
    所述数据处理装置,具体用于根据目标时间段内所确定的触控位置坐标,获得所述触控物的移动路径和移动方向;根据所述移动路径和所述移动方向,确定触控轨迹。
  14. 如权利要求11所述的设备,其特征在于,所述投影触控设备还包括:温度传感器;所述温度传感器,用于测量当前工作温度,并将所述当前工作温度发送至所述数据处理装置;
    所述数据处理装置,具体用于接收所述温度传感器发送的所述当前工作温度,并根据所述当前工作温度确定当前周期所述超声波脉冲信号的传播速度;根据所述超声波脉冲信号的反射信号的强度,通过超声回波处理算法确定所述投影区域表面的表面反射信号;根据所述目标时间段内发射所述超声波脉冲信号的时间、接收所述表面反射信号的时间序列,计算所述超声波针对所述投影区域的第一传播时间序列和针对所述触控物的第二传播时间序列;根据所述第一传播时间序列及所述传播速度,确定所述投影区域的第一景深信息图像;并根据所述第二传播时间序列及所述传播速度,确定所述触控物的第二景深信息图像;将所述第一景深信息图像与所述第二景深信息图像进行差分处理,得到第三景深信息图像;通过尖端检测算法对所述第三景深信息图像进行检测,确定所述触控物与所述投影区域之间的距离。
  15. 如权利要求11所述的设备,其特征在于,所述不可见激光为红外激光。
PCT/CN2017/085550 2016-08-26 2017-05-23 一种投影触控方法、装置及设备 WO2018036229A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610741260.3 2016-08-26
CN201610741260.3A CN106610757B (zh) 2016-08-26 2016-08-26 一种投影触控方法、装置及设备

Publications (1)

Publication Number Publication Date
WO2018036229A1 true WO2018036229A1 (zh) 2018-03-01

Family

ID=58615020

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/085550 WO2018036229A1 (zh) 2016-08-26 2017-05-23 一种投影触控方法、装置及设备

Country Status (2)

Country Link
CN (1) CN106610757B (zh)
WO (1) WO2018036229A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110688032A (zh) * 2019-09-25 2020-01-14 京东方科技集团股份有限公司 触控装置、触控方法及电子设备
CN113220167A (zh) * 2021-05-19 2021-08-06 京东方科技集团股份有限公司 显示模组及超声波触控检测方法
CN113325425A (zh) * 2021-06-25 2021-08-31 湖南友哲科技有限公司 检测试管有无的方法及试管检测装置
CN114756162A (zh) * 2021-01-05 2022-07-15 成都极米科技股份有限公司 触控系统及方法、电子设备及计算机可读存储介质

Families Citing this family (12)

Publication number Priority date Publication date Assignee Title
CN106610757B (zh) * 2016-08-26 2019-08-02 北京一数科技有限公司 一种投影触控方法、装置及设备
CN110622115A (zh) * 2017-05-12 2019-12-27 微软技术许可有限责任公司 触摸操作的表面
CN107315509B (zh) * 2017-06-05 2023-09-15 青岛胶南海尔洗衣机有限公司 一种非接触式操控装置、信号处理方法及其家用电器
CN108984042B (zh) * 2017-06-05 2023-09-26 青岛胶南海尔洗衣机有限公司 一种非接触式操控装置、信号处理方法及其家用电器
CN108090336B (zh) * 2017-12-19 2021-06-11 西安易朴通讯技术有限公司 一种应用在电子设备中的解锁方法及电子设备
CN108089772B (zh) * 2018-01-15 2021-04-20 潍坊歌尔电子有限公司 一种投影触控方法和装置
CN108089773B (zh) * 2018-01-23 2021-04-30 歌尔科技有限公司 一种基于景深投影的触控识别方法、装置及投影部件
CN108829294B (zh) * 2018-04-11 2021-08-27 卡耐基梅隆大学 一种投影触控方法、装置及投影触控设备
CN109298798B (zh) * 2018-09-21 2021-08-17 歌尔科技有限公司 触控板的操作控制方法、设备以及智能终端
CN112716117B (zh) * 2020-12-28 2023-07-14 维沃移动通信有限公司 智能手环及其控制方法
CN113293832A (zh) * 2021-05-12 2021-08-24 唐山惠米智能家居科技有限公司 一种智能坐便器的激光投影控制系统及方法
CN113349742A (zh) * 2021-07-07 2021-09-07 异象科技(北京)有限公司 一种进行生命体征监测的智能光影控制手环

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103324298A (zh) * 2013-06-05 2013-09-25 海信集团有限公司 超声波激光投影键盘和信息输入方法
CN105589607A (zh) * 2016-02-14 2016-05-18 京东方科技集团股份有限公司 触控系统、触控显示系统和触控交互方法
CN105874414A (zh) * 2014-01-21 2016-08-17 精工爱普生株式会社 位置检测装置、位置检测系统以及位置检测方法
CN106610757A (zh) * 2016-08-26 2017-05-03 北京数科技有限公司 一种投影触控方法、装置及设备

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US9569001B2 (en) * 2009-02-03 2017-02-14 Massachusetts Institute Of Technology Wearable gestural interface
CN205375438U (zh) * 2016-02-14 2016-07-06 京东方科技集团股份有限公司 触控系统和触控显示系统
CN105808022B (zh) * 2016-03-10 2018-12-07 海信(山东)空调有限公司 投影按键控制方法、投影按键装置及空调器

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN103324298A (zh) * 2013-06-05 2013-09-25 海信集团有限公司 超声波激光投影键盘和信息输入方法
CN105874414A (zh) * 2014-01-21 2016-08-17 精工爱普生株式会社 位置检测装置、位置检测系统以及位置检测方法
CN105589607A (zh) * 2016-02-14 2016-05-18 京东方科技集团股份有限公司 触控系统、触控显示系统和触控交互方法
CN106610757A (zh) * 2016-08-26 2017-05-03 北京数科技有限公司 一种投影触控方法、装置及设备

Cited By (8)

Publication number Priority date Publication date Assignee Title
CN110688032A (zh) * 2019-09-25 2020-01-14 京东方科技集团股份有限公司 触控装置、触控方法及电子设备
CN110688032B (zh) * 2019-09-25 2023-04-07 京东方科技集团股份有限公司 触控装置、触控方法及电子设备
CN114756162A (zh) * 2021-01-05 2022-07-15 成都极米科技股份有限公司 触控系统及方法、电子设备及计算机可读存储介质
CN114756162B (zh) * 2021-01-05 2023-09-05 成都极米科技股份有限公司 触控系统及方法、电子设备及计算机可读存储介质
CN113220167A (zh) * 2021-05-19 2021-08-06 京东方科技集团股份有限公司 显示模组及超声波触控检测方法
CN113220167B (zh) * 2021-05-19 2024-03-12 京东方科技集团股份有限公司 显示模组及超声波触控检测方法
CN113325425A (zh) * 2021-06-25 2021-08-31 湖南友哲科技有限公司 检测试管有无的方法及试管检测装置
CN113325425B (zh) * 2021-06-25 2024-02-27 湖南友哲科技有限公司 检测试管有无的方法及试管检测装置

Also Published As

Publication number Publication date
CN106610757A (zh) 2017-05-03
CN106610757B (zh) 2019-08-02

Similar Documents

Publication Publication Date Title
WO2018036229A1 (zh) 一种投影触控方法、装置及设备
JP6364505B2 (ja) レーダベースのジェスチャ認識
US9652043B2 (en) Recognizing commands with a depth sensor
EP2135155B1 (en) Touch screen system with hover and click input methods
US10592050B2 (en) Systems and methods for using hover information to predict touch locations and reduce or eliminate touchdown latency
US8169404B1 (en) Method and device for planary sensory detection
KR102335132B1 (ko) 하나의 단일 감지 시스템을 이용하는 멀티 모드 제스처 기반의 상호작용 시스템 및 방법
US8139029B2 (en) Method and device for three-dimensional sensing
US20140215407A1 (en) Method and system implementing user-centric gesture control
US20130194208A1 (en) Information terminal device, method of controlling information terminal device, and program
CA2481396A1 (en) Gesture recognition method and touch system incorporating the same
US20150193000A1 (en) Image-based interactive device and implementing method thereof
CN105260024A (zh) 一种在屏幕上模拟手势运动轨迹的方法及装置
US7847787B1 (en) Method and system for directing a control action
CN101714044B (zh) 一种基于摄像定位的触摸屏系统
TWI521413B (zh) 光學式觸控裝置
TWI454653B (zh) 三維絕對座標偵測系統、互動三維顯示系統以及辨識物體之三維座標的方法
EP3326052A1 (en) Apparatus and method for detecting gestures on a touchpad
US10203774B1 (en) Handheld device and control method thereof
US20160004385A1 (en) Input device
JP4053903B2 (ja) ポインティング方法、装置、およびプログラム
CN110032290A (zh) 用户界面
JP5692764B2 (ja) 対象物検出方法及びこれを用いた装置
TWI434205B (zh) 電子裝置及其相關控制方法
TWM615475U (zh) 無接觸式電梯控制系統

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17842654

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 110719)

122 Ep: pct application non-entry in european phase

Ref document number: 17842654

Country of ref document: EP

Kind code of ref document: A1