CN115562501A - Man-machine interaction method for rotary scanning display - Google Patents

Man-machine interaction method for rotary scanning display

Info

Publication number
CN115562501A
CN115562501A (application CN202211545375.7A)
Authority
CN
China
Prior art keywords
sensor
human body
finger
signal
unit
Prior art date
Legal status
Granted
Application number
CN202211545375.7A
Other languages
Chinese (zh)
Other versions
CN115562501B (en)
Inventor
周全
罗鸿飞
Current Assignee
Nanjing DseeLab Digital Technology Ltd
Original Assignee
Nanjing DseeLab Digital Technology Ltd
Priority date
Filing date
Publication date
Application filed by Nanjing DseeLab Digital Technology Ltd filed Critical Nanjing DseeLab Digital Technology Ltd
Priority to CN202211545375.7A priority Critical patent/CN115562501B/en
Publication of CN115562501A publication Critical patent/CN115562501A/en
Application granted granted Critical
Publication of CN115562501B publication Critical patent/CN115562501B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G05B19/0423 — Programme control other than numerical control using digital processors; input/output
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0304 — Detection arrangements using opto-electronic means
    • G06F3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G09F9/33 — Indicating arrangements for variable information in which the desired characters are formed by combining individual semiconductor elements, e.g. diodes
    • G05B2219/25257 — Microcontroller
    • G06F2203/011 — Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The invention discloses a man-machine interaction method for a rotary scanning display. The system comprises a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit. The motor driving unit controls the operation of the driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data.

Description

Man-machine interaction method for rotary scanning display
Technical Field
The invention relates to the field of man-machine interaction, and in particular to a man-machine interaction method for a rotary scanning display.
Background
The rotary LED display screen is a novel display that replaces traditional progressive scanning with mechanically rotated dynamic scanning. It offers low cost and a large viewing range and is a new development direction for LED displays. In essence it is a dynamic scanning display technology combined with mechanical rotation; a typical design uses an AT89S52 main control chip together with auxiliary components such as a motor module, a time module, a temperature module, a display module and a speed module.
the invention discloses an LED rotary scanning (POV) display technology for realizing picture display by rotating one or more rows of LED lamp bars, which can achieve very high brightness/definition and a nearly transparent display effect, but the conventional rotary display equipment generally does not have a man-machine interaction function, interaction needs to be realized by an external button/body sensing sensor and the like, and meanwhile, when human interaction is carried out, possible columns (such as a support frame and a support rod) around the rotary display equipment can be mistaken for the man-machine interaction equipment to be an interaction signal, so that the interaction information is changed.
Disclosure of Invention
The main aim of the invention is to provide a man-machine interaction method for a rotary scanning display that effectively solves the problems described in the background: existing rotary display devices generally lack a man-machine interaction function and must realize interaction through external buttons, somatosensory sensors and the like. The invention adds a photoelectric or capacitive human body sensor to the rotating component of the display; as the equipment rotates, the sensor rotates with it and scans the surrounding space, detecting whether a human body is present and obtaining its position, thereby realizing interaction.
To realize this purpose, the invention adopts the following technical scheme:
a man-machine interaction method of a rotary scanning display comprises a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, wherein the motor driving unit is used for controlling the driving motor to operate, the driving motor is used for driving the scanning display to rotate, the human body sensor is used for sensing human body information data, the signal conditioning circuit is used for processing and conditioning acquired signals and eliminating interference factors in the acquired signals, the signal processing unit is used for processing the acquired signals to obtain position and posture information of a human body, the angle sensing unit is used for directly or indirectly obtaining touch position angle information, the control unit is used for receiving signals sent by the signal processing unit and controlling the display unit to make corresponding picture changes, the display unit is used for displaying corresponding pictures, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor comprises a transmitter and a receiver, the distributed capacitance sensor comprises a metal polar plate and an oscillator, and the angle sensing unit is one of a zero point detection sensor and an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor, further calculate the position of an object in front of the display, and feed back the position information to the control unit, namely, the control unit can control the display unit to make corresponding picture changes to realize man-machine interaction;
the human body sensor further comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information.
In a further improvement of the invention, the signal conditioning circuit comprises a driver, an amplifier, and one of a detector and an ADC. The transmitter is driven by the driver's modulation signal, and the amplifier amplifies the received signal. The detector is sensitive only to the frequency of the modulation signal, which eliminates the interference caused by static ambient light; alternatively, the ADC overcomes the ambient-light effect by collecting the difference in brightness received by the receiver when the transmitter is turned on and off.
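The ADC variant of this ambient-light rejection can be sketched as a simple on/off difference. This is a hedged illustration with invented names and sample values: the receiver is sampled once with the transmitter off (ambient light only) and once with it on (ambient plus reflection), and the difference isolates the reflected component.

```python
def reflected_brightness(sample_tx_off: float, sample_tx_on: float) -> float:
    """Difference of the on/off ADC samples cancels static ambient light,
    leaving only the light reflected from the nearby object."""
    return sample_tx_on - sample_tx_off

def hand_present(sample_tx_off: float, sample_tx_on: float,
                 threshold: float = 0.1) -> bool:
    """Declare a proximity hit when the reflected component exceeds a
    threshold (the threshold value is an assumption of this sketch)."""
    return reflected_brightness(sample_tx_off, sample_tx_on) > threshold
```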
A further improvement of the invention is that a single human body sensor is installed at the edge of the rotating part of the light bar.
A further improvement of the invention is that the human body sensors are arranged as an array on the light bar.
A further improvement of the invention is that a single human body sensor is provided, in the form of a linear imaging sensor combined with a light source.
A further improvement of the invention is that the human body sensors are arranged as an array in the middle of the rotating part of the light bar.
A further improvement of the invention is that the specific steps by which the human body sensor senses human body information data are: 1. with the position $O$ of the human body sensor as the center, set a reference circle $C$ and set the entry reference radius $r$; 2. take the contact's initial entry into the reference circle $C$ as the initial signal: when a human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor gives a proximity signal; record the moving track of the hand, and record the end position $P$ of the last point inside the reference circle $C$; 3. the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the hand's touch position from the current light bar angle; once this information is sent to the control unit, the control unit responds to the touch.
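The entry/end bookkeeping of these steps can be sketched with the angle samples taken during one revolution. Names and data layout here are assumptions: each sample pairs the angle reported by the angle sensing unit with a boolean proximity flag, the first hit corresponds to the initial entry into the reference circle, and the last hit gives the end position used as the touch angle.

```python
def entry_and_end(samples):
    """samples: list of (angle_deg, proximity) pairs from one revolution.
    Return (entry_angle, end_angle) of the contact inside the reference
    circle, or None when nothing was detected."""
    hits = [angle for angle, near in samples if near]
    if not hits:
        return None
    # first hit = initial entry signal, last hit = end position inside C
    return hits[0], hits[-1]
```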
A further improvement of the invention is that, for the linear-imaging variant, the human body sensor senses human body information data as follows. The imaging area of the linear imaging sensor is perpendicular to the rotation direction, and the light source is an invisible light source that illuminates the imaging area. When an object such as a hand appears in the imaging area, it reflects the light and images at the corresponding position on the linear imaging sensor, from which the processing unit obtains the hand's position along the bar; combined with the current angle this gives the coordinates of the hand, which are provided to the control unit to realize interaction. The signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position from the current light bar angle; once this information is sent to the control unit, the control unit responds to the touch.
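The coordinate computation for the linear-imaging variant can be sketched as below. This is an assumption-laden illustration (pixel count, bar length and function names are invented): the pixel index where the hand images on the linear sensor gives a radial position along the light bar, and combining it with the current angle yields Cartesian coordinates for the control unit.

```python
import math

def hand_coordinates(pixel_index: int, pixels: int, bar_length_m: float,
                     angle_deg: float) -> tuple:
    """Map a linear-sensor pixel hit plus the current bar angle to the
    hand's (x, y) position in the rotation plane."""
    # radial distance from the rotation axis, linear in pixel index
    r = (pixel_index / (pixels - 1)) * bar_length_m
    a = math.radians(angle_deg)
    return (r * math.cos(a), r * math.sin(a))
```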
A further improvement of the invention is that the steps by which the human body sensor senses human body information data also include an interactive-recognition error-correction strategy: for a proximity detection recognized as faulty, re-enter the reference circle $C$ from a different direction and repeat the sensing step; if the recognition is still wrong, take the intersection of the two hand movement tracks as the final identification point.
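The track-intersection fallback above can be sketched by treating the two recorded hand tracks as line segments. This is a hedged illustration, not the patent's method in detail; the point representation and function name are assumptions.

```python
def segment_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 (first track) with line p3-p4 (second
    track); returns None when the tracks are parallel and no usable
    identification point exists."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        return None  # parallel tracks
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```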
A further improvement of the invention is that the gesture sensing strategy comprises the following steps. First, extract the user's average finger width as the finger-width base $w_0$. Second, collect the data signal of the approaching hand through the proximity sensor to obtain the real width $w$ of the approaching hand. Third, calculate the extended-finger index $n = w / w_0$ and round the calculated value to an integer; the index $n$ represents the number of display frames the user interacts with.
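The extended-finger index above reduces to one rounded division. A minimal sketch, with symbol names reconstructed since the original formula images are not recoverable:

```python
def hand_index(w: float, w0: float) -> int:
    """Extended-finger index n = round(w / w0): the measured width of the
    approaching hand divided by the user's base finger width."""
    return round(w / w0)
```

For instance, a measured width about twice the base width yields an index of 2.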
A further improvement of the invention is that the gesture sensing strategy also includes a finger-similarity judgment method. First: collect the user's average finger width as the finger-width base $w_0$, the finger-gap extreme value $g_0$, and the finger-length base $l_0$. Second: acquire the data signal of the approaching object through the proximity sensor to obtain the object's real width $w$, gap size $g$ and length $l$. Third: calculate the similarity $S$ between the approaching object and a finger and compare it with a threshold to judge whether a finger is pressed; if so, display; otherwise do not display and raise an alarm.
A further improvement of the invention is that the similarity $S$ is calculated as follows. Similarity of one finger: $S = \delta_w \cdot \delta_l$, where $\delta_w$ indicates whether $w$ lies in the range of $w_0$ (1 if it does, 0 if not) and $\delta_l$ indicates whether $l$ lies in the range of $l_0$ (1 if it does, 0 if not). Similarity of $n$ fingers: $S = \delta_w \cdot \delta_g \cdot \delta_l$, where $\delta_g$ indicates whether $g$ lies in the range of $g_0$ (1 if it does, 0 if not). A similarity $S$ of 1 indicates finger interaction; 0 indicates no finger interaction.
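The similarity test can be sketched as a product of indicator terms. The tolerance bands and the expected multi-finger width $n \cdot w_0 + (n-1) \cdot g_0$ are assumptions of this sketch (the original formula images are lost), but the structure follows the description: each delta is 1 when the measured quantity falls inside the range of its base value, and $S$ is their product.

```python
def indicator(value: float, base: float, tol: float) -> int:
    """1 when value lies within tol of base, else 0."""
    return 1 if abs(value - base) <= tol else 0

def similarity_one_finger(w, l, w0, l0, tol_w=0.3, tol_l=1.0) -> int:
    # S = delta_w * delta_l
    return indicator(w, w0, tol_w) * indicator(l, l0, tol_l)

def similarity_n_fingers(w, g, l, w0, g0, l0, n,
                         tol_w=0.3, tol_g=0.2, tol_l=1.0) -> int:
    # S = delta_w * delta_g * delta_l; the expected total width for
    # n fingers, n*w0 + (n-1)*g0, is an assumption of this sketch
    expected_w = n * w0 + (n - 1) * g0
    return (indicator(w, expected_w, n * tol_w)
            * indicator(g, g0, tol_g)
            * indicator(l, l0, tol_l))
```

An object with finger-like width and length scores 1 (display); a too-wide or too-short object scores 0 (no display, alarm).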
Compared with the prior art, the invention has the following beneficial effects. A human body sensor, a signal conditioning circuit and a signal processing unit are added to the basic structure of the rotary display. The human body sensor may be a photoelectric pair tube, a photoelectric sensor array or a capacitive sensor; it is fixed to the light bar and rotates with it. When an object such as a human hand is in front of the sensor, the sensor's output signal changes, is amplified and shaped by the signal conditioning circuit, and is output to the signal processing unit. The signal processing unit is also connected with a zero point detection sensor or an angle sensor, so it can obtain the current sensor angle and calculate the position of the object in front of the display; this position information is fed back to the control unit, which controls the display unit to make the corresponding picture changes, realizing human-computer interaction. A gesture recognition algorithm is also provided for fast gesture recognition, further enhancing the realism of human-computer interaction.
Drawings
FIG. 1 is a schematic diagram of the overall framework of the rotary scanning display man-machine interaction method of the present invention.
FIG. 2 is a schematic structural diagram of embodiment 1 of the method.
FIG. 3 is a schematic structural diagram of embodiment 2 of the method.
FIG. 4 is a schematic structural diagram of embodiment 3 of the method.
FIG. 5 is a schematic structural diagram of embodiment 4 of the method.
FIG. 6 is a schematic diagram of the detector-type photoelectric proximity sensor and its signal conditioning circuit.
FIG. 7 is a schematic diagram of the ADC-type photoelectric proximity sensor and its signal conditioning circuit.
FIG. 8 is a schematic diagram of the capacitive proximity sensor.
Detailed Description
To make the technical means, features, objects and functions of the invention easy to understand, it should be noted that terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner" and "outer" indicate orientations or positional relationships based on the drawings. They are used only for convenience and simplicity of description; they do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation, and are therefore not to be construed as limiting the invention. Furthermore, the terms "first", "second" and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The invention is further illustrated below with reference to specific embodiments.
Embodiment 1
As shown in fig. 1, 2, 6, 7 and 8, a human-computer interaction method for a rotary scanning display uses a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit. The motor driving unit controls the operation of the driving motor; the driving motor drives the scanning display to rotate; the human body sensor senses human body information data; the signal conditioning circuit processes and conditions the acquired signal and eliminates interference factors in it; the signal processing unit processes the acquired signal to obtain the position and posture information of a human body; the angle sensing unit directly or indirectly obtains the angle of the touch position; the control unit receives signals sent by the signal processing unit and controls the display unit to make the corresponding picture changes; and the display unit displays the corresponding picture. The human body sensor is a photoelectric sensor and/or a distributed capacitance sensor; the photoelectric sensor includes a transmitter and a receiver, and the distributed capacitance sensor includes a metal plate and an oscillator. The angle sensing unit is either a zero point detection sensor or an angle sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture induction strategy, and the gesture induction strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog to digital converter), wherein the transmitter is driven by a driver modulation signal, the amplifier amplifies a received signal, the detector is only sensitive to the frequency of the modulation signal, so that the interference generated by static ambient light is eliminated, and the ADC can overcome the influence of the ambient light by collecting the brightness difference received by the receiver when the transmitter is turned on/off;
In this embodiment a single human body sensor is installed at the edge of the rotating part of the light bar, and the specific steps by which it senses human body information data are: 1. with the position $O$ of the human body sensor as the center, set a reference circle $C$ and set the entry reference radius $r$; 2. take the contact's initial entry into the reference circle $C$ as the initial signal: when the hand approaches the edge of the display and the human body sensor rotates to the position below the hand, the sensor's output signal changes under the influence of the hand; record the moving track of the hand, and record the end position $P$ of the last point inside the reference circle $C$; 3. the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the hand's touch position from the current light bar angle; once this information is sent to the control unit, the control unit responds to the touch;
The steps by which the human body sensor senses human body information data also include an interactive-recognition error-correction strategy: for a proximity detection recognized as faulty, re-enter the reference circle $C$ from a different direction and repeat the sensing step; if the recognition is still wrong, take the intersection of the two hand movement tracks as the final identification point;
The gesture sensing strategy comprises the following steps: first, extract the user's average finger width as the finger-width base $w_0$; second, collect the data signal of the approaching hand through the proximity sensor to obtain the real width $w$; third, calculate the extended-finger index $n = w / w_0$ and round the calculated value to an integer; the index $n$ represents the number of display frames the user interacts with;
The gesture sensing strategy also includes a finger-similarity judgment method: first, collect the user's average finger width as the finger-width base $w_0$, the finger-gap extreme value $g_0$, and the finger-length base $l_0$; second, acquire the data signal of the approaching object through the proximity sensor to obtain the object's real width $w$, gap size $g$ and length $l$; third, calculate the similarity $S$ between the approaching object and a finger and compare it with a threshold to judge whether a finger is pressed; if finger interaction is confirmed, display; otherwise do not display and raise an alarm;
The similarity $S$ is calculated as follows. Similarity of one finger: $S = \delta_w \cdot \delta_l$, where $\delta_w$ indicates whether $w$ lies in the range of $w_0$ (1 if it does, 0 if not) and $\delta_l$ indicates whether $l$ lies in the range of $l_0$ (1 if it does, 0 if not). Similarity of $n$ fingers: $S = \delta_w \cdot \delta_g \cdot \delta_l$, where $\delta_g$ indicates whether $g$ lies in the range of $g_0$ (1 if it does, 0 if not). A similarity $S$ of 1 indicates finger interaction; 0 indicates no finger interaction.
This embodiment operates as follows: the display is a cylindrical rotary scanning display, and the proximity sensor is placed at the edge of the rotating portion. As the display rotates, the proximity sensor rotates with the structure. When a human hand approaches the edge of the display and the proximity sensor rotates below it, the sensor gives a proximity signal; the signal is amplified by the conditioning circuit and sent to the processing unit, which determines the angle of the touch position from the current light bar angle. Once this information is sent to the control unit, the control unit responds to the touch.
Embodiment 2
As shown in fig. 1, 2, 6, 7 and 8, a man-machine interaction method for a rotary scanning display uses a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit. The motor driving unit controls the operation of the driving motor; the driving motor drives the scanning display to rotate; the human body sensor senses human body information data; the signal conditioning circuit processes and conditions the collected signal and eliminates interference factors in it; the signal processing unit processes the collected signal to obtain the position and posture information of a human body; the angle sensing unit directly or indirectly obtains the angle of the touch position; the control unit receives signals sent by the signal processing unit and controls the display unit to make the corresponding picture changes; and the display unit displays the corresponding picture. The human body sensor is a photoelectric sensor and/or a distributed capacitance sensor; the photoelectric sensor includes a transmitter and a receiver, and the distributed capacitance sensor includes a metal plate and an oscillator. The angle sensing unit is either a zero point detection sensor or an angle sensor;
the man-machine interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor, further calculate the position of an object in front of the display, and feed back the position information to the control unit, namely, the control unit can control the display unit to make corresponding picture changes to realize man-machine interaction;
the human body sensor further comprises a gesture sensing strategy, and the gesture sensing strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog-to-digital converter): the driver drives the transmitter with a modulated signal, the amplifier amplifies the received signal, and the detector is sensitive only to the modulation frequency, which eliminates interference from static ambient light; alternatively, the ADC overcomes the influence of ambient light by sampling the brightness difference received by the receiver with the transmitter turned on versus off;
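The ADC-based ambient-light rejection described here can be sketched as a simple on/off difference; `ambient_rejected_level` is a hypothetical helper for illustration, not the patent's implementation:

```python
def ambient_rejected_level(samples_on, samples_off):
    """Average receiver reading with the transmitter on, minus the average
    with it off: a static ambient-light component cancels in the difference."""
    on = sum(samples_on) / len(samples_on)
    off = sum(samples_off) / len(samples_off)
    return on - off
```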
wherein the human body sensors are arranged in an array on the light bar, and the specific steps by which the human body sensors sense human body information data are as follows: 1. a reference circle C is set with the position O of the human body sensor as its center, and an entry reference radius r is set; 2. the first entry of the contact object into the reference circle C is taken as the initial signal: when the human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor output signal changes under the influence of the hand; the moving track of the hand is recorded, together with the end position P of the last point located inside the reference circle C; 3. the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current light-bar angle; after this information is sent to the control unit, the control unit responds to the touch;
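Steps 1 and 2 above (catching the first entry into the reference circle C and the last point still inside it) can be sketched as follows, assuming the hand track is already available as plane coordinates; the names are illustrative:

```python
import math

def entry_and_endpoint(track, center, r):
    """Return the first track sample that enters the reference circle
    (the initial signal) and the last sample still inside it (the end
    position P); (None, None) if the track never enters the circle."""
    inside = [p for p in track
              if math.hypot(p[0] - center[0], p[1] - center[1]) <= r]
    if not inside:
        return None, None
    return inside[0], inside[-1]
```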
wherein the specific steps by which the human body sensor senses human body information data further include an interaction-recognition error-correction strategy, which comprises the following step: when the recognition is wrong, the hand enters the reference circle C again from a different direction and the sensing step is repeated; if the recognition is still wrong, the intersection point of the two hand movement tracks is taken as the final recognition point;
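The fallback in this error-correction strategy, taking the intersection of the two hand tracks, can be sketched with a standard line-intersection computation; treating each approach track as a straight segment is an assumption made for illustration:

```python
def track_intersection(p1, p2, p3, p4):
    """Intersection of the lines through two straight approach tracks
    (p1->p2 and p3->p4); None if the tracks are parallel."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if abs(d) < 1e-12:
        return None  # parallel tracks: no unique intersection point
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / d
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```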
the gesture sensing strategy comprises the following specific steps: first, the user's average finger width is collected as the finger-width base W0; second, the proximity sensor acquires the data signal of the approaching hand to obtain its real width W; third, the extended-finger index N = W / W0 of the approaching hand is calculated and rounded to the nearest integer; the user's extended-finger index N determines the number of display frames of the interaction;
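The extended-finger index N = round(W / W0) can be sketched directly; note that Python's built-in `round` uses banker's rounding at exact .5 values, which may differ from the patent's intended rounding rule:

```python
def extended_finger_index(measured_width, base_finger_width):
    """N = W / W0 rounded to an integer: the number of extended fingers,
    which selects the number of display frames for the interaction."""
    return round(measured_width / base_finger_width)
```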
the gesture sensing strategy further comprises a finger-similarity judgment method: first, the user's average finger width is collected as the finger-width base W0, together with the finger-gap extreme value G0 and the finger-length base L0; second, the proximity sensor acquires the data signal of the "approaching object" to obtain its real width W, gap size G and length L; third, the similarity S between the approaching object and a finger is calculated and compared with a threshold to judge whether a finger press occurred; if finger interaction is confirmed, the display responds; otherwise nothing is displayed and an alarm is raised;
wherein the similarity S is calculated as follows: one-finger similarity: S = f(W) · f(L), where f(W) indicates whether W lies within the range of W0 (1 if yes, 0 if no) and f(L) indicates whether L lies within the range of L0 (1 if yes, 0 if no); n-finger similarity: S = f(W) · f(L) · f(G), where f(G) indicates whether G lies within the range of G0 (1 if yes, 0 if no); a similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
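The indicator-product similarity above can be sketched as follows; the tolerance ranges around W0, L0 and G0 are assumptions, since the patent leaves the exact bands to the threshold comparison:

```python
def in_range(value, low, high):
    """Indicator f(.): 1 if value lies inside [low, high], else 0."""
    return 1 if low <= value <= high else 0

def finger_similarity(width, length, gaps, w_range, l_range, g_range):
    """One-finger similarity S = f(W) * f(L); for n fingers, the gap
    indicator f(G) of each of the n-1 gaps is multiplied in as well."""
    s = in_range(width, *w_range) * in_range(length, *l_range)
    for g in gaps:
        s *= in_range(g, *g_range)
    return s
```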
The embodiment can realize that: by extending the installation to a sensor array, both the touch angle information and the horizontal touch position are obtained, the sensing area is expanded to the whole display area, and the accuracy of the horizontal position information can be further improved by interpolating over the readings of adjacent sensors.
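The adjacent-sensor interpolation mentioned here can be sketched as a weighted centroid over the array readings; this is a minimal illustration, not the patent's exact method:

```python
def interpolated_position(sensor_positions, readings):
    """Weighted centroid of adjacent sensor readings: refines the
    horizontal touch position beyond single-sensor resolution."""
    total = sum(readings)
    if total == 0:
        return None  # no sensor sees the hand
    return sum(p * w for p, w in zip(sensor_positions, readings)) / total
```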
Example 3
In embodiment 2, human body sensing is realized with a photoelectric sensor array, but that mode requires a large number of sensors and has a complex structure and high cost; the same function can therefore be achieved with a single linear imaging sensor (such as a linear CCD) combined with a light source. The specific implementation is as follows: as shown in fig. 1, 4, 6, 7 and 8, a man-machine interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the human body sensor is configured to sense human body information data, the signal conditioning circuit is configured to process and condition the collected signal and eliminate interference factors therein, the signal processing unit is configured to process the collected signal and obtain position and posture information of the human body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive the signal sent by the signal processing unit and control the display unit to make a corresponding frame change, the display unit is configured to display the corresponding frame, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitance sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a detection sensor and a zero-point sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture sensing strategy, and the gesture sensing strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog-to-digital converter): the driver drives the transmitter with a modulated signal, the amplifier amplifies the received signal, and the detector is sensitive only to the modulation frequency, which eliminates interference from static ambient light; alternatively, the ADC overcomes the influence of ambient light by sampling the brightness difference received by the receiver with the transmitter turned on versus off;
wherein a single human body sensor is provided: one linear imaging sensor combined with a light source. The specific steps by which the human body sensor senses human body information data are: 1. a reference circle C is set with the position O of the human body sensor as its center, and an entry reference radius r is set; 2. the first entry of the contact object into the reference circle C is taken as the initial signal: when the human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor output signal changes under the influence of the hand; the moving track of the hand is recorded, together with the end position P of the last point located inside the reference circle C; 3. when an object such as a hand appears in the imaging area, it reflects the light of the light source and is imaged on the linear imaging sensor at the position corresponding to the hand; the processing unit thus obtains the horizontal position of the hand and, combined with the current angle, the coordinates of the hand, which are provided to the control unit to realize interaction; the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current light-bar angle; after this information is sent to the control unit, the control unit responds to the touch;
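Combining the radial position imaged on the linear sensor with the current light-bar angle (step 3 above) amounts to a polar-to-Cartesian conversion; `mm_per_pixel` is an assumed calibration constant, not a value from the patent:

```python
import math

def hand_coordinates(pixel_index, mm_per_pixel, angle_deg):
    """Radial distance from the imaged pixel position plus the current
    light-bar angle gives the hand's coordinates in the rotation plane."""
    r = pixel_index * mm_per_pixel        # radial distance from the axis
    a = math.radians(angle_deg)           # current light-bar angle
    return (r * math.cos(a), r * math.sin(a))
```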
the specific steps by which the human body sensor senses human body information data further include an interaction-recognition error-correction strategy, which comprises the following step: when the recognition is wrong, the hand enters the reference circle C again from a different direction and the sensing step is repeated; if the recognition is still wrong, the intersection point of the two hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: first, the user's average finger width is collected as the finger-width base W0; second, the proximity sensor acquires the data signal of the approaching hand to obtain its real width W; third, the extended-finger index N = W / W0 of the approaching hand is calculated and rounded to the nearest integer; the user's extended-finger index N determines the number of display frames of the interaction;
the gesture sensing strategy further comprises a finger-similarity judgment method: first, the user's average finger width is collected as the finger-width base W0, together with the finger-gap extreme value G0 and the finger-length base L0; second, the proximity sensor acquires the data signal of the "approaching object" to obtain its real width W, gap size G and length L; third, the similarity S between the approaching object and a finger is calculated and compared with a threshold to judge whether a finger press occurred; if finger interaction is confirmed, the display responds; otherwise nothing is displayed and an alarm is raised;
wherein the similarity S is calculated as follows: one-finger similarity: S = f(W) · f(L), where f(W) indicates whether W lies within the range of W0 (1 if yes, 0 if no) and f(L) indicates whether L lies within the range of L0 (1 if yes, 0 if no); n-finger similarity: S = f(W) · f(L) · f(G), where f(G) indicates whether G lies within the range of G0 (1 if yes, 0 if no); a similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
The embodiment can realize that: the imaging area of the linear CCD is perpendicular to the rotation direction, and the light source is an invisible light source (for example an infrared source) used to illuminate the imaging area. When an object such as a hand appears in the imaging area, it reflects the light and is imaged on the CCD at the position corresponding to the hand; the processing unit obtains the horizontal position of the hand and, combined with the current angle, the coordinates of the hand, which are provided to the control unit to realize interaction.
Example 4
As shown in fig. 1, 5, 6, 7 and 8, a man-machine interaction method for a rotary scanning display includes a motor driving unit, a driving motor, a body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, where the motor driving unit is configured to control operation of the driving motor, the driving motor is configured to drive the scanning display to rotate, the body sensor is configured to sense body information data, the signal conditioning circuit is configured to process and condition a collected signal and eliminate interference factors therein, the signal processing unit is configured to process the collected signal and obtain position and posture information of a body, the angle sensing unit is configured to directly or indirectly obtain touch position angle information, the control unit is configured to receive a signal sent by the signal processing unit and control the display unit to make a corresponding frame change, the display unit is configured to display a corresponding frame, the body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor includes a transmitter and a receiver, the distributed capacitance sensor includes a metal plate and an oscillator, and the angle sensing unit is one of a detection sensor and a zero-point sensor;
the human-computer interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture sensing strategy, and the gesture sensing strategy is used for acquiring human body gesture information;
the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog-to-digital converter): the driver drives the transmitter with a modulated signal, the amplifier amplifies the received signal, and the detector is sensitive only to the modulation frequency, which eliminates interference from static ambient light; alternatively, the ADC overcomes the influence of ambient light by sampling the brightness difference received by the receiver with the transmitter turned on versus off;
wherein the human body sensors are arranged in an array mounted at the middle of the rotating part of the light bar, and the specific steps by which the human body sensors sense human body information data are: 1. a reference circle is set with the position of the human body sensor as its center, and an entry reference radius is set; 2. the first entry of the contact object into the reference circle is taken as the initial signal; the proximity sensor array scans the display plane around the center together with the light bar, realizing detection of objects within the display plane; likewise, the position of the hand in the polar coordinates of the rotation plane can be obtained from the angle and center distance of the sensor that outputs the signal, realizing interaction; the end position of the last point located inside the reference circle is recorded; 3. the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current light-bar angle; after this information is sent to the control unit, the control unit responds to the touch;
wherein the specific steps by which the human body sensor senses human body information data further include an interaction-recognition error-correction strategy, which comprises the following step: when the recognition is wrong, the hand enters the reference circle again from a different direction and the sensing step is repeated; if the recognition is still wrong, the intersection point of the two hand movement tracks is taken as the final recognition point;
the gesture sensing strategy comprises the following specific steps: first, the user's average finger width is collected as the finger-width base W0; second, the proximity sensor acquires the data signal of the approaching hand to obtain its real width W; third, the extended-finger index N = W / W0 of the approaching hand is calculated and rounded to the nearest integer; the user's extended-finger index N determines the number of display frames of the interaction;
the gesture sensing strategy further comprises a finger-similarity judgment method: first, the user's average finger width is collected as the finger-width base W0, together with the finger-gap extreme value G0 and the finger-length base L0; second, the proximity sensor acquires the data signal of the "approaching object" to obtain its real width W, gap size G and length L; third, the similarity S between the approaching object and a finger is calculated and compared with a threshold to judge whether a finger press occurred; if finger interaction is confirmed, the display responds; otherwise nothing is displayed and an alarm is raised;
wherein the similarity S is calculated as follows: one-finger similarity: S = f(W) · f(L), where f(W) indicates whether W lies within the range of W0 (1 if yes, 0 if no) and f(L) indicates whether L lies within the range of L0 (1 if yes, 0 if no); n-finger similarity: S = f(W) · f(L) · f(G), where f(G) indicates whether G lies within the range of G0 (1 if yes, 0 if no); a similarity S of 1 indicates finger interaction, and 0 indicates no finger interaction.
The embodiment can realize that: from the angle and the center distance of the sensor that outputs the signal, the position of the human hand in the polar coordinates of the rotation plane is obtained, realizing interaction.
The foregoing shows and describes the general principles, principal features and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above; the embodiments and the description merely illustrate the principle of the invention, and various changes and modifications may be made without departing from its spirit and scope, all of which fall within the scope of the claimed invention. The scope of the invention is defined by the appended claims and their equivalents.

Claims (12)

1. A man-machine interaction method for a rotary scanning display, characterized by comprising a motor driving unit, a driving motor, a human body sensor, a signal conditioning circuit, a signal processing unit, an angle sensing unit, a control unit and a display unit, wherein the motor driving unit is used for controlling the operation of the driving motor, the driving motor is used for driving the scanning display to rotate, the human body sensor is used for sensing human body information data, the signal conditioning circuit is used for processing and conditioning the acquired signal and eliminating interference factors therein, the signal processing unit is used for processing the acquired signal to obtain position and posture information of the human body, the angle sensing unit is used for directly or indirectly obtaining touch position angle information, the control unit is used for receiving the signal sent by the signal processing unit and controlling the display unit to make corresponding picture changes, the display unit is used for displaying the corresponding picture, the human body sensor is a photoelectric sensor and/or a distributed capacitance sensor, the photoelectric sensor comprises a transmitter and a receiver, the distributed capacitance sensor comprises a metal plate and an oscillator, and the angle sensing unit is one of a detection sensor and a zero-point sensor;
the man-machine interaction method comprises the following specific steps: the first step is as follows: the motor driving unit controls the operation of a driving motor, the driving motor drives the scanning display to rotate, and the human body sensor senses human body information data;
the second step is that: the human body sensor is fixed with the light bar and rotates along with the light bar, when an object such as a human hand is in front of the sensor, the output signal of the sensor changes, is amplified/shaped by the signal conditioning circuit and is output to the signal processing unit;
the third step: the signal processing unit is connected with the angle sensing unit so as to obtain the angle of the current sensor and further calculate the position of an object in front of the display, and the position information is fed back to the control unit, so that the display unit can be controlled to make corresponding picture changes, and man-machine interaction is realized;
the human body sensor further comprises a gesture sensing strategy, and the gesture sensing strategy is used for acquiring human body gesture information.
2. The method of claim 1, characterized in that: the signal conditioning circuit comprises a driver, an amplifier and one of a detector and an ADC (analog-to-digital converter): the driver drives the transmitter with a modulated signal, the amplifier amplifies the received signal, and the detector is sensitive only to the modulation frequency, which eliminates interference from static ambient light; alternatively, the ADC overcomes the influence of ambient light by sampling the brightness difference received by the receiver with the transmitter turned on versus off.
3. The method of claim 2, characterized in that: a single human body sensor is provided, installed at the edge of the rotating part of the light bar.
4. The method of claim 2, characterized in that: the human body sensors are arranged in an array on the light bar.
5. The method of claim 2, characterized in that: a single human body sensor is provided, using one linear imaging sensor combined with a light source.
6. The method of claim 2, characterized in that: the human body sensors are arranged in an array mounted at the middle of the rotating part of the light bar.
7. The method according to any one of claims 3, 4 and 6, characterized in that the specific steps by which the human body sensor senses human body information data are: 1. a reference circle C is set with the position O of the human body sensor as its center, and an entry reference radius r is set; 2. the first entry of the touch object into the reference circle C is taken as the initial signal: when the human hand approaches the display and the human body sensor rotates to the position below the hand, the sensor gives a proximity signal; the moving track of the hand is recorded, together with the end position P of the last point located inside the reference circle C; 3. the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current light-bar angle; after this information is sent to the control unit, the control unit responds to the touch.
8. The method of claim 5, characterized in that the specific steps by which the human body sensor senses human body information data are: the imaging area of the linear imaging sensor is perpendicular to the rotation direction, and the light source is an invisible light source (for example an infrared source) used to illuminate the imaging area; when an object such as a hand appears in the imaging area, it reflects the light and is imaged on the linear imaging sensor at the position corresponding to the hand; the processing unit obtains the horizontal position of the hand and, combined with the current angle, the coordinates of the hand, which are provided to the control unit to realize interaction; the signal is amplified by the conditioning circuit and sent to the information processing unit, which, together with the angle sensing unit, obtains the angle of the touch position of the hand from the current light-bar angle; after this information is sent to the control unit, the control unit responds to the touch.
9. The method according to any one of claims 3-6, characterized in that the specific steps by which the human body sensor senses human body information data further include an interaction-recognition error-correction strategy, which comprises the following step: when the recognition is wrong, the hand enters the reference circle C again from a different direction and the sensing step is repeated; if the recognition is still wrong, the intersection point of the two hand movement tracks is taken as the final recognition point.
10. The method of claim 9, characterized in that the gesture sensing strategy comprises the following specific steps: first, the user's average finger width is collected as the finger-width base W0; second, the proximity sensor acquires the data signal of the approaching hand to obtain its real width W; third, the extended-finger index N = W / W0 of the approaching hand is calculated and rounded to the nearest integer; the user's extended-finger index N determines the number of display frames of the interaction.
11. The method of claim 10, characterized in that the gesture sensing strategy further comprises a finger-similarity judgment method: first, the user's average finger width is collected as the finger-width base W0, together with the finger-gap extreme value G0 and the finger-length base L0; second, the proximity sensor acquires the data signal of the "approaching object" to obtain its real width W, gap size G and length L; third, the similarity S between the approaching object and a finger is calculated and compared with a threshold to judge whether a finger press occurred; if finger interaction is confirmed, the display responds; otherwise nothing is displayed and an alarm is raised.
12. The method of claim 11, wherein the calculation formula of the similarity S is as follows: single-finger similarity: S = I_W × I_L, where I_W indicates whether the measured width W lies within the range determined by the finger-width base W0 (taking the value 1 if it does and 0 if it does not), and I_L indicates whether the measured length L lies within the range determined by the finger-length base L0 (1 if it does, 0 if it does not); n-finger similarity: S = I_W × I_L × I_D, where I_D indicates whether the gap size D lies within the range determined by the finger-gap extreme value D0 (1 if it does, 0 otherwise); the similarity S is thus obtained, where S = 1 represents finger interaction and S = 0 represents no finger interaction.
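The similarity S is a product of 0/1 indicator terms. A minimal sketch under stated assumptions — the concrete tolerance ranges, the treatment of the gap as "at most the gap extreme", and all millimetre values are illustrative, since the patent only defines the indicators abstractly:

```python
def indicator(value: float, low: float, high: float) -> int:
    """1 if value lies in [low, high], else 0 (one indicator term of S)."""
    return 1 if low <= value <= high else 0

def similarity_one_finger(W, L, W_range, L_range) -> int:
    """Single-finger similarity S = I_W * I_L."""
    return indicator(W, *W_range) * indicator(L, *L_range)

def similarity_n_fingers(W, L, D, W_range, L_range, D_max) -> int:
    """n-finger similarity S = I_W * I_L * I_D.

    The gap D between fingers is checked against the gap extreme D_max;
    the width range would widen with the number of extended fingers.
    """
    i_d = 1 if D <= D_max else 0
    return similarity_one_finger(W, L, W_range, L_range) * i_d

# illustrative ranges derived from an 18 mm width base and 70 mm length base
s = similarity_n_fingers(W=38, L=72, D=4,
                         W_range=(30, 42), L_range=(55, 85), D_max=6)
print("finger interaction" if s == 1 else "reject and alarm")
```

Because each term is 0 or 1, S itself is 0 or 1, which matches the claim's reading of S = 1 as finger interaction and S = 0 as rejection.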
CN202211545375.7A 2022-12-05 2022-12-05 Man-machine interaction method for rotary scanning display Active CN115562501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211545375.7A CN115562501B (en) 2022-12-05 2022-12-05 Man-machine interaction method for rotary scanning display

Publications (2)

Publication Number Publication Date
CN115562501A true CN115562501A (en) 2023-01-03
CN115562501B CN115562501B (en) 2023-03-03

Family

ID=84770494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211545375.7A Active CN115562501B (en) 2022-12-05 2022-12-05 Man-machine interaction method for rotary scanning display

Country Status (1)

Country Link
CN (1) CN115562501B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8854433B1 (en) * 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
US20150062004A1 (en) * 2012-02-03 2015-03-05 Aquifi, Inc. Method and System Enabling Natural User Interface Gestures with an Electronic System
CN105579929A (en) * 2013-10-29 2016-05-11 英特尔公司 Gesture based human computer interaction
CN103714322A (en) * 2013-12-26 2014-04-09 四川虹欧显示器件有限公司 Real-time gesture recognition method and device
CN103809880A (en) * 2014-02-24 2014-05-21 清华大学 Man-machine interaction system and method
CN106325523A (en) * 2016-09-07 2017-01-11 讯飞幻境(北京)科技有限公司 Man-machine interaction display device and system
CN106603945A (en) * 2016-12-28 2017-04-26 Tcl集团股份有限公司 Mobile playing device and control method thereof
CN107272893A (en) * 2017-06-05 2017-10-20 上海大学 Man-machine interactive system and method based on gesture control non-touch screen
CN107197223A (en) * 2017-06-15 2017-09-22 北京有初科技有限公司 The gestural control method of micro-projection device and projector equipment
CN208240332U (en) * 2018-06-22 2018-12-14 南京达斯琪数字科技有限公司 A kind of real time human-machine interaction holographic display system

Also Published As

Publication number Publication date
CN115562501B (en) 2023-03-03

Similar Documents

Publication Publication Date Title
US10725554B2 (en) Motion detecting system
US8619060B2 (en) Multi-touch positioning method and multi-touch screen
TWI450159B (en) Optical touch device, passive touch system and its input detection method
US20130194240A1 (en) Optical Input Devices with Sensors
CN102754048A (en) Imaging methods and systems for position detection
JP4915367B2 (en) Display imaging apparatus and object detection method
CN102799318A (en) Human-machine interaction method and system based on binocular stereoscopic vision
JP2004118312A (en) Input device, information device and control information generation method
US10884518B2 (en) Gesture detection device for detecting hovering and click
TW201800901A (en) Method and pixel array for detecting gesture
CN115562501B (en) Man-machine interaction method for rotary scanning display
CN107563259B (en) Method for detecting action information, photosensitive array and image sensor
US9489077B2 (en) Optical touch panel system, optical sensing module, and operation method thereof
CN103076925B (en) Optical touch control system, optical sensing module and How It Works thereof
KR20090061213A (en) Complementary metal oxide semiconductor, source of light using the touch coordinates preception method and the touch screen system
CN101825971B (en) Laser scanning input device
US20140210715A1 (en) Gesture detection device for detecting hovering and click
CN104375631A (en) Non-contact interaction method based on mobile terminal
JP6233941B1 (en) Non-contact type three-dimensional touch panel, non-contact type three-dimensional touch panel system, non-contact type three-dimensional touch panel control method, program, and recording medium
CN109145711B (en) Fingerprint sensor supporting waking up finger and navigation
KR20090037535A (en) Method for processing input of touch screen
CN101833398A (en) Position detecting device and method thereof
CN104866112A (en) Non-contact interaction method based on mobile terminal
US11287897B2 (en) Motion detecting system having multiple sensors
CN110502095B (en) Three-dimensional display with gesture sensing function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant