CN115562501A - Man-machine interaction method for rotary scanning display - Google Patents
- Publication number
- CN115562501A CN202211545375.7A CN202211545375A
- Authority
- CN
- China
- Prior art keywords
- sensor
- human body
- finger
- signal
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
- G05B19/0423—Input/output
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F9/00—Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
- G09F9/30—Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
- G09F9/33—Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements being semiconductor devices, e.g. diodes
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/25—Pc structure of the system
- G05B2219/25257—Microcontroller
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a man-machine interaction method for a rotary scanning display. The system comprises a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, wherein the motor driving unit is used for controlling the operation of the driving motor, the driving motor is used for driving the scanning display to rotate, and the human body sensor is used for sensing human body information data.
Description
Technical Field
The invention relates to the field of man-machine interaction, in particular to a man-machine interaction method of a rotary scanning display.
Background
The rotary LED display screen is a novel display that uses mechanical rotation and dynamic scanning to replace the traditional progressive scanning mode. It has the characteristics of low cost and a large viewing range and is a new development direction for LED display screens; it is essentially a dynamic scanning display technology combined with mechanical rotation, with a main control chip such as the AT89S52 and auxiliary components such as a motor module, a time module, a temperature module, a display module and a speed module.
the invention discloses an LED rotary scanning (POV) display technology for realizing picture display by rotating one or more rows of LED lamp bars, which can achieve very high brightness/definition and a nearly transparent display effect, but the conventional rotary display equipment generally does not have a man-machine interaction function, interaction needs to be realized by an external button/body sensing sensor and the like, and meanwhile, when human interaction is carried out, possible columns (such as a support frame and a support rod) around the rotary display equipment can be mistaken for the man-machine interaction equipment to be an interaction signal, so that the interaction information is changed.
Disclosure of Invention
The invention mainly aims to provide a man-machine interaction method for a rotary scanning display, which can effectively solve the problems noted in the background: existing rotary display devices generally have no man-machine interaction function and must realize interaction through external buttons, somatosensory sensors and the like. The invention adds a photoelectric or capacitive human body sensor to the rotating component of the display; when the equipment rotates, the sensor rotates with it and scans the surrounding space, so that the presence of a human body is detected, the position of the human body is obtained, and interaction is realized.
To achieve the above purpose, the invention adopts the following technical scheme:
a man-machine interaction method of a rotary scanning display comprises a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, wherein the motor driving unit is used for controlling the driving motor to operate, the driving motor is used for driving the scanning display to rotate, the human body sensor is used for sensing human body information data, the signal conditioning circuit is used for processing and conditioning acquired signals and eliminating interference factors in the acquired signals, the signal processing unit is used for processing the acquired signals to obtain position and posture information of a human body, the angle sensing unit is used for directly or indirectly obtaining touch position angle information, the control unit is used for receiving signals sent by the signal processing unit and controlling the display unit to make corresponding picture changes, the display unit is used for displaying corresponding pictures, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor comprises a transmitter and a receiver, the distributed capacitance sensor comprises a metal polar plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor, further calculate the position of an object in front of the display, and feed back the position information to the control unit, namely, the control unit can control the display unit to make corresponding picture changes to realize man-machine interaction;
the human body sensor further comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information.
In a further improvement of the present invention, the signal conditioning circuit comprises a driver, an amplifier, and one of a detector and an ADC. The transmitter is driven by a modulation signal from the driver, and the amplifier amplifies the received signal. The detector is sensitive only to the frequency of the modulation signal, thereby eliminating interference from static ambient light; alternatively, the ADC can overcome the ambient light effect by acquiring the difference in brightness received by the receiver when the transmitter is turned on and off.
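A minimal sketch of the ADC approach described above, assuming a hypothetical hardware interface (the `adc_read` and `set_transmitter` callables are placeholders): brightness is sampled with the transmitter off and on, and the difference cancels static ambient light.

```python
# On/off differential sampling sketch (assumed hardware API).

def read_proximity(adc_read, set_transmitter, threshold=50):
    """Return (is_near, signal) using on/off differential sampling.

    adc_read: returns receiver brightness as an integer ADC count.
    set_transmitter: switches the emitter on (True) or off (False).
    threshold: assumed ADC-count difference indicating a nearby object.
    """
    set_transmitter(False)
    ambient = adc_read()          # ambient light only
    set_transmitter(True)
    total = adc_read()            # ambient + reflected emitter light
    set_transmitter(False)
    signal = total - ambient      # reflected component, ambient cancelled
    return signal > threshold, signal
```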
The invention is further improved in that one human body sensor is arranged and installed at the edge of the rotating part of the lamp strip.
The invention is further improved in that the human body sensor array is arranged on the light bar.
A further improvement of the invention is that one human body sensor is arranged, using a linear imaging sensor in combination with a light source.
The invention is further improved in that the human body sensors are arranged in an array and installed in the middle of the rotating part of the light bar.
The invention is further improved in that the specific steps of the human body sensor sensing the human body information data are as follows: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius R is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: when a human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor gives a proximity signal, the moving track of the hand is recorded, and the end point position of the last point located within the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch.
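An illustrative sketch of the reference-circle logic just described, under an assumed data format (each sample is the sensor's current angle plus a measured distance to whatever it sees): in-circle samples form the trajectory, and the angle of the last in-circle sample is reported as the touch angle.

```python
# Reference-circle touch detection sketch (hypothetical sample format).

def track_touch(samples, radius_r):
    """samples: iterable of (angle_deg, distance) pairs produced while the sensor rotates.

    Returns (trajectory, touch_angle_deg): trajectory holds every sample whose distance
    is within the reference radius R; touch_angle_deg is the angle of the end point
    (last in-circle sample), or None if nothing entered the circle.
    """
    trajectory = [(angle, dist) for angle, dist in samples if dist <= radius_r]
    if not trajectory:
        return [], None
    touch_angle_deg = trajectory[-1][0]        # end point position inside the circle
    return trajectory, touch_angle_deg
```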
The invention is further improved in that the specific steps of the human body sensor sensing the human body information data are as follows: the imaging area of the linear imaging sensor is perpendicular to the rotation direction, and the light source is an invisible light source used for illuminating the imaging area. When an object such as a human hand appears in the imaging area, the hand reflects the light and is imaged at the corresponding position on the linear imaging sensor, so that the processing unit obtains the position of the hand along the sensor, and the coordinates of the hand are obtained by combining the current angle and provided to the control unit to realize interaction. The signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch.
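A sketch of how a linear-sensor line scan could be turned into touch coordinates, under assumed geometry (the bar sweeping a plane, so the along-bar position acts as a radius) and assumed names; it finds the bright spot produced by the reflecting hand, maps the pixel index to a position along the bar, and combines it with the current angle.

```python
# Linear imaging sensor touch sketch (assumed geometry and names).

import math

def ccd_touch(line_scan, bar_length, angle_deg, brightness_threshold=128):
    """line_scan: list of pixel brightness values along the linear imaging sensor."""
    peak = max(range(len(line_scan)), key=lambda i: line_scan[i])
    if line_scan[peak] < brightness_threshold:
        return None                                   # no hand in the imaging area
    along_bar = ((peak + 0.5) / len(line_scan)) * bar_length
    theta = math.radians(angle_deg)
    # assuming the bar sweeps a plane, the along-bar position acts as the radius
    return along_bar * math.cos(theta), along_bar * math.sin(theta)
```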
The invention is further improved in that the specific steps of the human body sensor sensing the human body information data further comprise an interaction recognition error correction strategy, which comprises the following steps: when a recognition failure is detected, the reference circle is entered again from a different direction and the step of sensing human body information data is repeated; if the recognition is still wrong, the intersection point of the two recorded hand movement tracks is taken as the final recognition point.
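A rough sketch of one possible reading of this error-correction step: when two entries from different directions still disagree, the point where the two recorded tracks come closest is used as an approximation of their intersection. The brute-force search and midpoint choice are assumptions, not the patent's stated procedure.

```python
# Approximate intersection of two recorded hand tracks (illustrative assumption).

def intersection_point(track_a, track_b):
    """track_a, track_b: lists of (x, y) samples of the hand's movement.

    Returns the midpoint of the closest pair of samples, used as an approximation
    of the tracks' intersection, or None if either track is empty.
    """
    best, best_d2 = None, float("inf")
    for ax, ay in track_a:
        for bx, by in track_b:
            d2 = (ax - bx) ** 2 + (ay - by) ** 2
            if d2 < best_d2:
                best_d2 = d2
                best = ((ax + bx) / 2.0, (ay + by) / 2.0)
    return best
```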
The invention is further improved in that the gesture sensing strategy comprises the following specific steps: in the first step, the average width of the user's fingers is extracted as the finger width base W0; in the second step, the data signal of the approaching hand is acquired by the proximity sensor to obtain the real width W of the approaching hand; in the third step, the extended-finger index n = W / W0 of the approaching hand is calculated and rounded to an integer value, the user's extended-finger index n representing the number of display frames the user interacts with.
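A minimal sketch of the extended-finger index as reconstructed above; the symbols W0 and W stand for the described finger-width base and measured width, and are naming assumptions introduced in this text.

```python
# Extended-finger index sketch: measured width divided by the per-user width base.

def extended_finger_index(measured_width_w, finger_width_base_w0):
    """Number of extended fingers, i.e. the number of display frames to interact with."""
    if finger_width_base_w0 <= 0:
        raise ValueError("finger width base must be positive")
    return int(round(measured_width_w / finger_width_base_w0))
```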
The invention is further improved in that the gesture sensing strategy further comprises a finger similarity judgment method: in the first step, the average width of the user's fingers is collected as the finger width base W0, together with the finger gap extreme value G0 and the finger length base L0; in the second step, the data signal of the "approaching object" is acquired by the proximity sensor to obtain the real width W, gap G and length L of the approaching object; in the third step, the similarity S between the approaching object and a finger is calculated and compared with a threshold value to judge whether a finger press has occurred; if finger interaction is confirmed, the display responds, otherwise it does not respond and gives an alarm.
The invention is further improved in that the similarity S is calculated as follows: one-finger similarity: S = I_W · I_L, where I_W indicates whether the measured width W falls within the range defined by the finger width base W0 (1 if so, 0 if not) and I_L indicates whether the measured length L falls within the range defined by the finger length base L0 (1 if so, 0 if not); n-finger similarity: S = I_W · I_L · I_G, where I_G indicates whether the measured gap G falls within the range defined by the finger gap extreme value G0 (1 if so, 0 if not). A similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
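A sketch of the similarity test as reconstructed above. Treating the similarity as a product of 0/1 indicators and using a plus/minus tolerance around each base value are assumptions; the original formula images are not reproduced in this text.

```python
# Finger similarity sketch (indicator-product form is an assumption).

def in_range(value, base, tol):
    """1 if value lies within [base - tol, base + tol], else 0."""
    return 1 if (base - tol) <= value <= (base + tol) else 0

def finger_similarity(W, L, G=None, *, W0, L0, G0=None, tol_w, tol_l, tol_g=None):
    """Return 1 for finger interaction, 0 otherwise.

    Single-finger case: check width and length only (G is None).
    n-finger case: additionally check the gap G against the gap extreme value G0.
    """
    s = in_range(W, W0, tol_w) * in_range(L, L0, tol_l)
    if G is not None:
        s *= in_range(G, G0, tol_g)
    return s
```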
Compared with the prior art, the invention has the following beneficial effects: a human body sensor, a signal conditioning circuit and a signal processing unit are added to the basic structure of the rotary display. The human body sensor may be a photoelectric pair tube, a photoelectric sensor array or a capacitive sensor; it is fixed to the light bar and rotates with it. When an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit. The signal processing unit is also connected with a zero point detection sensor or an angle sensor, so as to obtain the current angle of the sensor and further calculate the position of the object in front of the display; the position information is fed back to the control unit, which can then control the display unit to make the corresponding picture change, realizing human-computer interaction. A gesture recognition algorithm is also provided, which facilitates fast recognition of gestures and further enhances the realism of human-computer interaction.
Drawings
FIG. 1 is a schematic framework diagram of a method for human-computer interaction with a rotating scanning display according to the present invention.
FIG. 2 is a schematic structural diagram of an embodiment 1 of a method for human-computer interaction with a rotating scanning display according to the present invention.
Fig. 3 is a block diagram of the architecture of embodiment 2 of the present invention.
FIG. 4 is a schematic diagram of the architecture of embodiment 3 of a rotational scanning display human-machine interaction method according to the present invention.
FIG. 5 is a block diagram of the architecture of an embodiment 4 of the method for human-computer interaction with a rotating scanning display according to the present invention.
FIG. 6 is a schematic diagram of a photoelectric proximity sensor with a detector-based signal conditioning circuit according to the rotary scanning display human-computer interaction method of the present invention.
FIG. 7 is a schematic diagram of a photoelectric proximity sensor with an ADC-based signal conditioning circuit according to the rotary scanning display human-computer interaction method of the present invention.
FIG. 8 is a schematic diagram of a capacitive proximity sensor for a method of human-computer interaction with a rotating scanning display according to the present invention.
Detailed Description
To make the technical means, features, objectives and functions of the present invention easy to understand, the invention will be further illustrated below with reference to specific embodiments. It should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings and are used only for convenience in describing the invention and simplifying the description; they do not indicate or imply that the referred device or element must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the invention. Furthermore, the terms "first", "second" and "third" are used for descriptive purposes only and should not be construed as indicating or implying relative importance.
Example 1
As shown in fig. 1, 2, 6, 7 and 8, a human-computer interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the human body sensor is configured to sense human body information data, the signal conditioning circuit is configured to process and condition the acquired signal and eliminate interference factors therein, the signal processing unit is configured to process the acquired signal and obtain position and posture information of the human body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive the signal sent by the signal processing unit and control the display unit to make a corresponding picture change, the display unit is configured to display the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitive sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitive sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog to digital converter), wherein the transmitter is driven by a driver modulation signal, the amplifier amplifies a received signal, the detector is only sensitive to the frequency of the modulation signal, so that the interference generated by static ambient light is eliminated, and the ADC can overcome the influence of the ambient light by collecting the brightness difference received by the receiver when the transmitter is turned on/off;
wherein one human body sensor is arranged and installed at the edge of the rotating part of the light bar, and the specific steps of the human body sensor sensing human body information data are: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius R is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: when a human hand approaches the edge of the display and the human body sensor rotates to the position below the hand, the signal output by the sensor changes under the influence of the hand, the moving track of the hand is recorded, and the end point position of the last point located within the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch;
the specific steps of the human body sensor sensing human body information data further include an interaction recognition error correction strategy, which comprises the following steps: when a recognition failure occurs, the reference circle is entered again from a different direction and the step of sensing human body information data is repeated; if the recognition is still wrong, the intersection point of the two recorded hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: in the first step, the average width of the user's fingers is extracted as the finger width base W0; in the second step, the data signal of the approaching hand is acquired by the proximity sensor to obtain the real width W of the approaching hand; in the third step, the extended-finger index n = W / W0 is calculated and rounded to an integer value, the user's extended-finger index n representing the number of display frames the user interacts with;
the gesture sensing strategy further comprises a finger similarity judgment method: in the first step, the average width of the user's fingers is collected as the finger width base W0, together with the finger gap extreme value G0 and the finger length base L0; in the second step, the data signal of the "approaching object" is acquired by the proximity sensor to obtain the real width W, gap G and length L of the approaching object; in the third step, the similarity S between the approaching object and a finger is calculated and compared with a threshold value to judge whether a finger press has occurred; if finger interaction is confirmed, the display responds, otherwise it does not respond and gives an alarm;
wherein the similarity S is calculated as follows: one-finger similarity: S = I_W · I_L, where I_W indicates whether the measured width W falls within the range defined by the finger width base W0 (1 if so, 0 if not) and I_L indicates whether the measured length L falls within the range defined by the finger length base L0 (1 if so, 0 if not); n-finger similarity: S = I_W · I_L · I_G, where I_G indicates whether the measured gap G falls within the range defined by the finger gap extreme value G0 (1 if so, 0 if not). A similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
This embodiment realizes the following: the display is a cylindrical rotary scanning display, and the proximity sensor is placed at the edge of the rotating portion. When the display rotates, the proximity sensor rotates with the rotating structure; when a human hand approaches the edge of the display and the proximity sensor rotates to the position below the hand, the sensor gives a proximity signal. The signal is amplified by the conditioning circuit and then sent to the processing unit, which obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch.
Example 2
As shown in fig. 1, 2, 6, 7 and 8, a man-machine interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the human body sensor is configured to sense human body information data, the signal conditioning circuit is configured to process and condition the collected signal and eliminate interference factors therein, the signal processing unit is configured to process the collected signal and obtain position and posture information of the human body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive the signal sent by the signal processing unit and control the display unit to make a corresponding picture change, the display unit is configured to display the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitance sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the man-machine interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor, further calculate the position of an object in front of the display, and feed back the position information to the control unit, namely, the control unit can control the display unit to make corresponding picture changes to realize man-machine interaction;
the human body sensor also comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog to digital converter), wherein the transmitter is driven by a driver modulation signal, the amplifier amplifies a received signal, the detector is only sensitive to the frequency of the modulation signal, so that the interference generated by static ambient light is eliminated, and the ADC can overcome the influence of the ambient light by collecting the brightness difference received by the receiver when the transmitter is turned on/off;
wherein the human body sensors are arranged in an array on the light bar, and the specific steps of the human body sensor sensing human body information data are: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius R is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: when a human hand approaches the display and the human body sensor rotates to the position below the hand, the signal output by the sensor changes under the influence of the hand, the moving track of the hand is recorded, and the end point position of the last point located within the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch;
wherein the specific steps of the human body sensor sensing human body information data further include an interaction recognition error correction strategy, which comprises the following steps: when a recognition failure occurs, the reference circle is entered again from a different direction and the step of sensing human body information data is repeated; if the recognition is still wrong, the intersection point of the two recorded hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: in the first step, the average width of the user's fingers is extracted as the finger width base W0; in the second step, the data signal of the approaching hand is acquired by the proximity sensor to obtain the real width W of the approaching hand; in the third step, the extended-finger index n = W / W0 is calculated and rounded to an integer value, the user's extended-finger index n representing the number of display frames the user interacts with;
the gesture sensing strategy further comprises a finger similarity judgment method: in the first step, the average width of the user's fingers is collected as the finger width base W0, together with the finger gap extreme value G0 and the finger length base L0; in the second step, the data signal of the "approaching object" is acquired by the proximity sensor to obtain the real width W, gap G and length L of the approaching object; in the third step, the similarity S between the approaching object and a finger is calculated and compared with a threshold value to judge whether a finger press has occurred; if finger interaction is confirmed, the display responds, otherwise it does not respond and gives an alarm;
wherein the similarity S is calculated as follows: one-finger similarity: S = I_W · I_L, where I_W indicates whether the measured width W falls within the range defined by the finger width base W0 (1 if so, 0 if not) and I_L indicates whether the measured length L falls within the range defined by the finger length base L0 (1 if so, 0 if not); n-finger similarity: S = I_W · I_L · I_G, where I_G indicates whether the measured gap G falls within the range defined by the finger gap extreme value G0 (1 if so, 0 if not). A similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
This embodiment realizes the following: by installing the sensors in an array, both the touch angle information and the touch horizontal position can be obtained, the sensing area is extended to the whole display area, and the accuracy of the horizontal position information can be improved by interpolating the readings of adjacent sensors.
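One simple way to realize the interpolation mentioned above is a signal-weighted centroid of the adjacent sensor positions; the weighting scheme in this sketch is an assumption, not the patent's prescribed method.

```python
# Horizontal-position interpolation sketch (assumed weighting scheme).

def interpolated_position(sensor_positions, sensor_signals):
    """sensor_positions: positions of the array elements along the light bar.
    sensor_signals: proximity signal strength of each element for this revolution.
    Returns the weighted-average position, or None if nothing was detected."""
    total = float(sum(sensor_signals))
    if total == 0:
        return None
    return sum(p * s for p, s in zip(sensor_positions, sensor_signals)) / total
```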
Example 3
Embodiment 2 realizes human body sensing with a photoelectric sensor array; however, that approach requires a large number of sensors and has a complex structure and high cost. In this embodiment, one linear imaging sensor (such as a linear CCD) is therefore used in combination with a light source to realize the same function. The specific implementation is as follows: as shown in fig. 1, 4, 6, 7 and 8, a man-machine interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the human body sensor is configured to sense human body information data, the signal conditioning circuit is configured to process and condition the collected signal and eliminate interference factors therein, the signal processing unit is configured to process the collected signal and obtain position and posture information of the human body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive the signal sent by the signal processing unit and control the display unit to make a corresponding picture change, the display unit is configured to display the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitance sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor also comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog to digital converter), wherein the transmitter is driven by a driver modulation signal, the amplifier amplifies a received signal, the detector is only sensitive to the frequency of the modulation signal, so that the interference generated by static ambient light is eliminated, and the ADC can overcome the influence of the ambient light by collecting the brightness difference received by the receiver when the transmitter is turned on/off;
wherein one human body sensor is arranged, using a linear imaging sensor in combination with a light source, and the specific steps of the human body sensor sensing human body information data are: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius R is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: when a human hand approaches the display and the human body sensor rotates to the position below the hand, the signal output by the sensor changes under the influence of the hand, the moving track of the hand is recorded, and the end point position of the last point located within the reference circle is recorded; 3. when an object such as a hand appears in the imaging area, the hand reflects the light and is imaged on the linear imaging sensor at the position corresponding to the hand, so that the processing unit obtains the position of the hand along the sensor; the coordinates of the hand are obtained by combining the current angle and provided to the control unit to realize interaction. The signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch;
wherein the specific steps of the human body sensor sensing human body information data further include an interaction recognition error correction strategy, which comprises the following steps: when a recognition failure occurs, the reference circle is entered again from a different direction and the step of sensing human body information data is repeated; if the recognition is still wrong, the intersection point of the two recorded hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: in the first step, the average width of the user's fingers is extracted as the finger width base W0; in the second step, the data signal of the approaching hand is acquired by the proximity sensor to obtain the real width W of the approaching hand; in the third step, the extended-finger index n = W / W0 is calculated and rounded to an integer value, the user's extended-finger index n representing the number of display frames the user interacts with;
the gesture sensing strategy further comprises a finger similarity judgment method: in the first step, the average width of the user's fingers is collected as the finger width base W0, together with the finger gap extreme value G0 and the finger length base L0; in the second step, the data signal of the "approaching object" is acquired by the proximity sensor to obtain the real width W, gap G and length L of the approaching object; in the third step, the similarity S between the approaching object and a finger is calculated and compared with a threshold value to judge whether a finger press has occurred; if finger interaction is confirmed, the display responds, otherwise it does not respond and gives an alarm;
wherein the similarity S is calculated as follows: one-finger similarity: S = I_W · I_L, where I_W indicates whether the measured width W falls within the range defined by the finger width base W0 (1 if so, 0 if not) and I_L indicates whether the measured length L falls within the range defined by the finger length base L0 (1 if so, 0 if not); n-finger similarity: S = I_W · I_L · I_G, where I_G indicates whether the measured gap G falls within the range defined by the finger gap extreme value G0 (1 if so, 0 if not). A similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
This embodiment realizes the following: the imaging area of the linear CCD is perpendicular to the rotation direction, and the light source is an invisible light source used to illuminate the imaging area. When an object such as a hand appears in the imaging area, the hand reflects the light and is imaged on the CCD at the corresponding position; the processing unit obtains the position of the hand along the CCD and, combined with the current angle, obtains the coordinates of the hand, which are provided to the control unit to realize interaction.
Example 4
As shown in fig. 1, 5, 6, 7 and 8, a man-machine interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the human body sensor is configured to sense human body information data, the signal conditioning circuit is configured to process and condition the collected signal and eliminate interference factors therein, the signal processing unit is configured to process the collected signal and obtain position and posture information of the human body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive the signal sent by the signal processing unit and control the display unit to make a corresponding picture change, the display unit is configured to display the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitance sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog to digital converter), wherein the transmitter is driven by a driver modulation signal, the amplifier amplifies a received signal, the detector is only sensitive to the frequency of the modulation signal, so that the interference generated by static ambient light is eliminated, and the ADC can overcome the influence of the ambient light by collecting the brightness difference received by the receiver when the transmitter is turned on/off;
wherein the human body sensors are arranged in an array installed in the middle of the rotating part of the light bar, and the specific steps of the human body sensor sensing human body information data are: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: the proximity sensor array scans the display plane around the center together with the light bar, so that objects in the display plane are detected; the position of the human hand in the polar coordinates of the rotation plane is obtained from the angle and the center distance of the sensor element whose output signal changes, realizing interaction, and the end point position of the last point located within the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch;
wherein the specific steps of the human body sensor sensing human body information data further include an interaction recognition error correction strategy, which comprises the following steps: when a recognition failure occurs, the reference circle is entered again from a different direction and the step of sensing human body information data is repeated; if the recognition is still wrong, the intersection point of the two recorded hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: in the first step, the average width of the user's fingers is extracted as the finger width base W0; in the second step, the data signal of the approaching hand is acquired by the proximity sensor to obtain the real width W of the approaching hand; in the third step, the extended-finger index n = W / W0 is calculated and rounded to an integer value, the user's extended-finger index n representing the number of display frames the user interacts with;
the gesture sensing strategy further comprises a finger similarity judgment method: in the first step, the average width of the user's fingers is collected as the finger width base W0, together with the finger gap extreme value G0 and the finger length base L0; in the second step, the data signal of the "approaching object" is acquired by the proximity sensor to obtain the real width W, gap G and length L of the approaching object; in the third step, the similarity S between the approaching object and a finger is calculated and compared with a threshold value to judge whether a finger press has occurred; if finger interaction is confirmed, the display responds, otherwise it does not respond and gives an alarm;
wherein the similarity S is calculated as follows: one-finger similarity: S = I_W · I_L, where I_W indicates whether the measured width W falls within the range defined by the finger width base W0 (1 if so, 0 if not) and I_L indicates whether the measured length L falls within the range defined by the finger length base L0 (1 if so, 0 if not); n-finger similarity: S = I_W · I_L · I_G, where I_G indicates whether the measured gap G falls within the range defined by the finger gap extreme value G0 (1 if so, 0 if not). A similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
This embodiment realizes the following: from the angle and the center distance of the sensor element whose output signal changes, the position of the human hand in the polar coordinates of the rotation plane can be obtained, and interaction is realized.
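A short sketch of this position calculation under the stated geometry: the triggered array element's distance from the center plus the current rotation angle give the hand's polar coordinates in the rotation plane; the Cartesian conversion is shown for illustration only.

```python
# Polar position of the hand in the rotation plane (illustrative sketch).

import math

def hand_position(center_distance, angle_deg):
    """Return ((r, theta_rad), (x, y)) of the touch point in the display plane."""
    theta = math.radians(angle_deg)
    return (center_distance, theta), (center_distance * math.cos(theta),
                                      center_distance * math.sin(theta))
```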
The foregoing shows and describes the basic principles and main features of the present invention and its advantages. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the embodiments and the description only illustrate the principle of the invention, and various changes and modifications may be made without departing from the spirit and scope of the invention, all of which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.
Claims (12)
1. A man-machine interaction method for a rotary scanning display is characterized in that: the system comprises a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, wherein the motor driving unit is used for controlling the operation of the driving motor, the driving motor is used for driving the scanning display to rotate, the human body sensor is used for sensing human body information data, the signal conditioning circuit is used for processing and conditioning the acquired signal and eliminating interference factors therein, the signal processing unit is used for processing the acquired signal to obtain position and posture information of the human body, the angle sensing unit is used for directly or indirectly obtaining touch position angle information, the control unit is used for receiving the signal sent by the signal processing unit and controlling the display unit to make a corresponding picture change, the display unit is used for displaying the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor comprises a transmitter and a receiver, the distributed capacitance sensor comprises a metal polar plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the man-machine interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture sensing strategy, and the gesture sensing strategy is used for acquiring human body gesture information.
2. The method of claim 1, wherein: the signal conditioning circuit comprises a driver, an amplifier, and one of a detector and an ADC; the transmitter is driven by a modulation signal from the driver, the amplifier amplifies the received signal, the detector is sensitive only to the frequency of the modulation signal, thereby eliminating interference from static ambient light, and the ADC can overcome the ambient light effect by acquiring the difference in brightness received by the receiver when the transmitter is turned on and off.
3. The method of claim 2, wherein: one human body sensor is arranged and installed at the edge of the rotating part of the light bar.
4. The method of claim 2, wherein the method comprises the following steps: the human body sensor array is arranged on the lamp strip.
5. The method of claim 2, wherein: one human body sensor is arranged, using a linear imaging sensor in combination with a light source.
6. The method of claim 2, wherein the method comprises the following steps: the human body sensors are arranged in an array and are arranged in the middle of the rotating part of the lamp strip.
7. The method according to any one of claims 3, 4 or 6, wherein the step of sensing the human body information data by the human body sensor comprises: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius R is set; 2. the first entry of the touch object into the reference circle is taken as the initial signal: when a human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor gives a proximity signal, the moving track of the hand is recorded, and the end point position of the last point located within the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch.
8. The method of claim 5, wherein the specific steps of the human body sensor sensing the human body information data are as follows: the imaging area of the linear imaging sensor is perpendicular to the rotation direction, and the light source is an invisible light source used for illuminating the imaging area; when an object such as a human hand appears in the imaging area, the hand reflects the light and is imaged at the corresponding position on the linear imaging sensor, so that the processing unit obtains the position of the hand along the sensor, and the coordinates of the hand are obtained by combining the current angle and provided to the control unit to realize interaction; the signal is amplified by the conditioning circuit and then sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current angle of the light bar; after this information is sent to the control unit, the control unit responds to the touch.
9. A method of human-computer interaction for a rotating scanning display according to any of claims 3-6, characterized in that: in the specific steps of the human body sensor sensing the human body information data, the human body sensor further comprises an interactive identification error correction strategy, and the interactive identification error correction strategy comprises the following steps: for a proximity sensor which detects a fault, the reference circle is again entered from a different directionAnd (3) carrying out a step of sensing human body information data by a sensor, and if the human body information data is still identified wrongly, taking the intersection point position of the moving tracks of the two human hands as a final identification point.
10. The method of claim 9, wherein the method comprises the following steps: the gesture sensing strategy specifically comprises the following steps: the first step, extracting the average width of the user's finger as the base number of the finger width(ii) a Secondly, acquiring data signals close to the human hand through a proximity sensor to obtain the real width close to the human hand(ii) a Thirdly, calculating the extending hand index close to the hand of the humanRounding the calculated value to an integer value, and extending the hand index of the userRepresenting the number of displays that the user interacts with.
11. The method of claim 10, wherein the method comprises: the gesture sensing strategy also comprises a finger similarity judgment method: the first step is as follows: collecting the average width of the user's finger as the base number of the finger widthExtreme value of finger gapFinger length baseAnd secondly, acquiring a data signal of the 'approaching object' through the approaching sensor to obtain the real width of the approaching objectSize of gapAnd length(ii) a And thirdly, calculating the similarity S between the approaching object and the finger, comparing the similarity S with a threshold value, judging whether the finger is pressed, if so, displaying, otherwise, not displaying and alarming.
12. The method of claim 11, wherein the method comprises the steps of: the calculation formula of the similarity S is as follows: similarity of one finger:in whichTo representWhether or not to be atIn the range, the value is 1 if it is, 0 if it is not, andrepresentWhether or not to be atIn the range, if yes, the value is 1, if not, the value is 0; n finger similarity:in whichRepresentWhether or not to be atIf the similarity is 1, otherwise, 0 is obtained, and the similarity S is obtained, wherein if the similarity is 1, the finger interaction is represented, and if the similarity is 0, the finger interaction is not represented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211545375.7A CN115562501B (en) | 2022-12-05 | 2022-12-05 | Man-machine interaction method for rotary scanning display |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211545375.7A CN115562501B (en) | 2022-12-05 | 2022-12-05 | Man-machine interaction method for rotary scanning display |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115562501A true CN115562501A (en) | 2023-01-03 |
CN115562501B CN115562501B (en) | 2023-03-03 |
Family
ID=84770494
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211545375.7A Active CN115562501B (en) | 2022-12-05 | 2022-12-05 | Man-machine interaction method for rotary scanning display |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115562501B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103714322A (en) * | 2013-12-26 | 2014-04-09 | 四川虹欧显示器件有限公司 | Real-time gesture recognition method and device |
CN103809880A (en) * | 2014-02-24 | 2014-05-21 | 清华大学 | Man-machine interaction system and method |
US8854433B1 (en) * | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
CN105579929A (en) * | 2013-10-29 | 2016-05-11 | 英特尔公司 | Gesture based human computer interaction |
CN106325523A (en) * | 2016-09-07 | 2017-01-11 | 讯飞幻境(北京)科技有限公司 | Man-machine interaction display device and system |
CN106603945A (en) * | 2016-12-28 | 2017-04-26 | Tcl集团股份有限公司 | Mobile playing device and control method thereof |
CN107197223A (en) * | 2017-06-15 | 2017-09-22 | 北京有初科技有限公司 | The gestural control method of micro-projection device and projector equipment |
CN107272893A (en) * | 2017-06-05 | 2017-10-20 | 上海大学 | Man-machine interactive system and method based on gesture control non-touch screen |
CN208240332U (en) * | 2018-06-22 | 2018-12-14 | 南京达斯琪数字科技有限公司 | A kind of real time human-machine interaction holographic display system |
-
2022
- 2022-12-05 CN CN202211545375.7A patent/CN115562501B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8854433B1 (en) * | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
US20150062004A1 (en) * | 2012-02-03 | 2015-03-05 | Aquifi, Inc. | Method and System Enabling Natural User Interface Gestures with an Electronic System |
CN105579929A (en) * | 2013-10-29 | 2016-05-11 | 英特尔公司 | Gesture based human computer interaction |
CN103714322A (en) * | 2013-12-26 | 2014-04-09 | 四川虹欧显示器件有限公司 | Real-time gesture recognition method and device |
CN103809880A (en) * | 2014-02-24 | 2014-05-21 | 清华大学 | Man-machine interaction system and method |
CN106325523A (en) * | 2016-09-07 | 2017-01-11 | 讯飞幻境(北京)科技有限公司 | Man-machine interaction display device and system |
CN106603945A (en) * | 2016-12-28 | 2017-04-26 | Tcl集团股份有限公司 | Mobile playing device and control method thereof |
CN107272893A (en) * | 2017-06-05 | 2017-10-20 | 上海大学 | Man-machine interactive system and method based on gesture control non-touch screen |
CN107197223A (en) * | 2017-06-15 | 2017-09-22 | 北京有初科技有限公司 | The gestural control method of micro-projection device and projector equipment |
CN208240332U (en) * | 2018-06-22 | 2018-12-14 | 南京达斯琪数字科技有限公司 | A kind of real time human-machine interaction holographic display system |
Also Published As
Publication number | Publication date |
---|---|
CN115562501B (en) | 2023-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11775076B2 (en) | Motion detecting system having multiple sensors | |
US8619060B2 (en) | Multi-touch positioning method and multi-touch screen | |
TWI450159B (en) | Optical touch device, passive touch system and its input detection method | |
US20130194240A1 (en) | Optical Input Devices with Sensors | |
CN102754048A (en) | Imaging methods and systems for position detection | |
JP4915367B2 (en) | Display imaging apparatus and object detection method | |
JP2004118312A (en) | Input device, information device and control information generation method | |
US20130342491A1 (en) | Control Method, Control Device, Display Device And Electronic Device | |
US10884518B2 (en) | Gesture detection device for detecting hovering and click | |
TW201800901A (en) | Method and pixel array for detecting gesture | |
CN115562501B (en) | Man-machine interaction method for rotary scanning display | |
KR100942431B1 (en) | Complementary metal oxide semiconductor, source of light using the touch coordinates preception method and the touch screen system | |
CN107563259B (en) | Method for detecting action information, photosensitive array and image sensor | |
US9489077B2 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
CN101825971B (en) | Laser scanning input device | |
CN1674042A (en) | Contact screen information inputting positioning method based on image recognition | |
US20140210715A1 (en) | Gesture detection device for detecting hovering and click | |
CN104375631A (en) | Non-contact interaction method based on mobile terminal | |
JP6233941B1 (en) | Non-contact type three-dimensional touch panel, non-contact type three-dimensional touch panel system, non-contact type three-dimensional touch panel control method, program, and recording medium | |
CN109145711B (en) | Fingerprint sensor supporting waking up finger and navigation | |
KR20090037535A (en) | Method for processing input of touch screen | |
CN101833398A (en) | Position detecting device and method thereof | |
CN104866112A (en) | Non-contact interaction method based on mobile terminal | |
US11287897B2 (en) | Motion detecting system having multiple sensors | |
CN110502095B (en) | Three-dimensional display with gesture sensing function |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |