CN112346644A - Interaction method based on laser induction, terminal equipment and readable storage medium - Google Patents

Info

Publication number
CN112346644A
CN112346644A (application CN202011307037.0A)
Authority
CN
China
Prior art keywords
laser
coordinate information
determining
gesture type
laser spot
Prior art date
Legal status
Pending
Application number
CN202011307037.0A
Other languages
Chinese (zh)
Inventor
张发
赵德民
罗阳志
任贵斌
司科研
任新伟
Current Assignee
Shenzhen TCL New Technology Co Ltd
Original Assignee
Shenzhen TCL New Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen TCL New Technology Co Ltd
Priority to CN202011307037.0A
Publication of CN112346644A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Abstract

The invention discloses a laser-sensing-based interaction method, a terminal device, and a readable storage medium. The interaction method comprises the following steps: acquiring a detected laser spot; determining a gesture type according to first coordinate information of the laser spot; and determining an interaction instruction according to the gesture type. The interaction modes of the terminal device are thereby enriched.

Description

Interaction method based on laser induction, terminal equipment and readable storage medium
Technical Field
The present invention relates to the technical field of intelligent devices, and in particular to a laser-sensing-based interaction method, a terminal device, and a readable storage medium.
Background
When using a terminal device, a user needs to interact with it. Take a smart television as an example: a user generally controls the smart television through a remote controller. However, interacting with a smart television through a remote controller depends heavily on that device, and the interaction modes a remote controller can realize are simple; for example, it supports only basic channel selection, volume adjustment, and the like. As terminal devices grow more functionally complex, the demand for rich human-computer interaction also rises. The single interaction mode of existing terminal devices is therefore a problem that urgently needs to be solved.
Disclosure of Invention
The main object of the present invention is to provide a laser-sensing-based interaction method, a terminal device, and a readable storage medium, aiming to solve the technical problem that the interaction mode of terminal devices is single.
To achieve the above object, the present invention provides a laser-sensing-based interaction method applied to a terminal device, the method comprising the following steps:
acquiring a detected laser spot;
determining a gesture type according to the first coordinate information of the laser spot;
and determining an interaction instruction according to the gesture type.
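The three steps above can be sketched in Python. The gesture vocabulary, thresholds, and function names below are illustrative assumptions rather than anything specified in the patent:

```python
# Hypothetical sketch of the acquire -> gesture -> instruction pipeline.
# Coordinates are (x, y) spot positions ordered by detection time.

def determine_gesture(first_coords):
    """Classify a gesture from a time-ordered list of (x, y) spot coordinates."""
    if len(first_coords) < 2:
        return "tap"
    (x0, y0), (x1, y1) = first_coords[0], first_coords[-1]
    if abs(x1 - x0) > abs(y1 - y0):
        return "swipe_right" if x1 > x0 else "swipe_left"
    # Panel y grows downward, so increasing y means a downward swipe.
    return "swipe_down" if y1 > y0 else "swipe_up"

def determine_instruction(gesture):
    """Map a gesture type to an interaction instruction (example mapping)."""
    return {
        "tap": "select",
        "swipe_left": "previous_channel",
        "swipe_right": "next_channel",
        "swipe_up": "volume_up",
        "swipe_down": "volume_down",
    }.get(gesture, "noop")

def interact(detected_spots):
    """Full pipeline: detected spot coordinates in, interaction instruction out."""
    return determine_instruction(determine_gesture(detected_spots))
```

For example, a spot trace moving mostly rightward would yield `next_channel` under this illustrative mapping.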
Preferably, the laser spot is formed by irradiating a laser beam emitted by a laser emitting device on a laser sensing layer of the terminal device, the first coordinate information is coordinate information of the laser spot in the laser sensing layer, and the step of determining the gesture type according to the first coordinate information of the laser spot includes:
determining second coordinate information corresponding to the first coordinate information according to a preset corresponding relationship and the first coordinate information, wherein the preset corresponding relationship is a corresponding relationship between the first coordinate information and the second coordinate information which are preset in the terminal equipment, and the second coordinate information is coordinate information of the laser spot in a display panel, which corresponds to the first coordinate information, in the preset corresponding relationship;
and determining the gesture type according to the second coordinate information.
Preferably, the step of determining the gesture type according to the second coordinate information comprises:
determining a movement parameter of the laser spot according to the second coordinate information and the detection time of the laser spot detected within a preset time interval, wherein the movement parameter comprises at least one of a movement distance, a movement direction and a movement track;
and determining the gesture type according to the movement parameters.
Preferably, the laser emitting device is an intelligent glove that includes at least two laser emitters, the laser beam emitted by each laser emitter irradiates the laser sensing layer to form one laser spot, and the step of determining the movement parameter of the laser spot according to the second coordinate information and the detection time of the laser spot detected within a preset time interval includes:
grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two groups of coordinate sets, wherein the laser spots corresponding to each second coordinate information in the same group of coordinate sets are formed by the same laser transmitter in the intelligent glove;
and determining the movement parameters of the laser spots corresponding to the coordinate sets according to the coordinate sets of each group.
Preferably, the step of grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two sets of coordinate sets includes:
determining the absolute value of the difference between pieces of second coordinate information that are adjacent in detection time;
and adding the second coordinate information of which the absolute value is smaller than a preset threshold value into the same group of coordinate sets corresponding to the same laser transmitter so as to respectively obtain at least two groups of coordinate sets corresponding to at least two laser transmitters.
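A minimal sketch of this grouping step, under the assumption that consecutive spots from the same emitter stay closer together than a chosen threshold while spots from different emitters stay farther apart. The threshold value and helper names are illustrative:

```python
# Hypothetical grouping of spot detections by laser emitter: a sample joins
# an existing group when its absolute coordinate difference from that
# group's most recent sample is below the threshold; otherwise it starts a
# new group (i.e. is attributed to another emitter).

def group_by_emitter(samples, threshold=50.0):
    """samples: iterable of (t, x, y) detections.
    Returns a list of coordinate groups, one per inferred emitter."""
    groups = []  # each group is a time-ordered list of (t, x, y)
    for t, x, y in sorted(samples):
        for g in groups:
            _, gx, gy = g[-1]
            if abs(x - gx) < threshold and abs(y - gy) < threshold:
                g.append((t, x, y))
                break
        else:
            groups.append([(t, x, y)])
    return groups
```

With two emitters whose spots are far apart, interleaved detections separate cleanly into two groups.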
Preferably, the movement parameters include the movement distance, the movement track and the movement direction, and the step of determining the gesture type according to the movement parameters includes:
when the moving distance of each laser spot is greater than a preset distance threshold and the moving direction of each laser spot is a preset moving direction, determining that the gesture type is a sliding gesture, or,
when the shape of the moving track is an arc, determining that the gesture type is a rotation gesture.
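These two rules can be illustrated as follows. The arc test used here (the trajectory's path length noticeably exceeding its straight-line chord) and all threshold values are assumptions, not values from the text:

```python
import math

# Hypothetical classifier for the two rules above: "slide" when the spot
# moves far enough along a roughly straight path, "rotation" when the
# trajectory bends like an arc.

def classify(trajectory, dist_threshold=100.0, arc_ratio=1.2):
    """trajectory: time-ordered list of (x, y) points for one laser spot."""
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    path = sum(math.hypot(bx - ax, by - ay)
               for (ax, ay), (bx, by) in zip(trajectory, trajectory[1:]))
    # An arc-like track is noticeably longer than its chord.
    if chord > 0 and path / chord > arc_ratio:
        return "rotation"
    if chord > dist_threshold:
        return "slide"
    return "unknown"
```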
Preferably, the step of determining the gesture type according to the second coordinate information comprises:
determining a target area of the laser spot in the display panel according to the second coordinate information;
and determining the gesture type according to a preset gesture type corresponding to the target area.
Preferably, the step of determining the gesture type according to the second coordinate information comprises:
determining the stay time of the laser spot on the display panel according to the second coordinate information of the laser spot detected within a preset time interval;
and determining the gesture type according to the stay time.
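A sketch of the dwell-time rule, assuming a simple two-way split between a short tap and a long press; the names and the threshold value are illustrative:

```python
# Hypothetical dwell-time gesture rule: the stay time of one spot is the
# span between its first and last detection within the sample window.

def gesture_from_dwell(detections, hold_threshold=1.0):
    """detections: time-ordered list of (t, x, y) for one spot."""
    if not detections:
        return "none"
    dwell = detections[-1][0] - detections[0][0]
    return "long_press" if dwell >= hold_threshold else "tap"
```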
In addition, to achieve the above object, the present invention further provides a terminal device, which includes a laser sensing layer, a memory, a processor, and a laser-sensing-based interaction program stored in the memory and executable on the processor; when the laser-sensing-based interaction program is executed by the processor, the steps of any of the above laser-sensing-based interaction methods are implemented.
In addition, to achieve the above object, the present invention further provides a computer readable storage medium, on which a laser sensing-based interaction program is stored, and the laser sensing-based interaction program, when executed by a processor, implements the steps of the laser sensing-based interaction method according to any one of the above.
According to the laser-sensing-based interaction method, terminal device, and readable storage medium provided by the embodiments of the present invention, the detected laser spot is acquired, the gesture type is determined from the first coordinate information of the spot, and the interaction instruction is determined from the gesture type. A user can emit a laser spot while interacting with the terminal device; the terminal device first acquires the spot, determines the gesture type from the spot's position coordinates, and then determines the corresponding interaction instruction from the gesture type, thereby enriching the interaction modes of the terminal device.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the laser-sensing-based interaction method of the present invention;
FIG. 3 is a schematic flow chart of a second embodiment of the laser-sensing-based interaction method of the present invention;
FIG. 4 is a schematic flow chart of a third embodiment of the laser-sensing-based interaction method of the present invention;
FIG. 5 is a schematic flow chart of a fourth embodiment of the laser-sensing-based interaction method of the present invention;
FIG. 6 is a schematic flow chart of a fifth embodiment of the laser-sensing-based interaction method of the present invention;
FIG. 7 is a schematic flow chart of a sixth embodiment of the laser-sensing-based interaction method of the present invention;
fig. 8 is a schematic diagram of a display device in the terminal device of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
As shown in fig. 1, fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
The terminal device in the embodiments of the present invention may be a smart television, a PC, a smartphone, a tablet computer, a portable computer, or the like.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU, a memory 1002, a communication bus 1003, and a laser sensing layer 1004. The communication bus 1003 is used to implement connection communication among these components. The memory 1002 may be a high-speed RAM memory or a non-volatile memory (e.g., a disk memory). The memory 1002 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration of the terminal device shown in fig. 1 does not constitute a limitation of the terminal device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1002, which is a kind of computer storage medium, may include therein an operating system and an interactive program of the terminal device.
In the terminal device shown in fig. 1, the processor 1001 may be configured to call an interactive program of the terminal device stored in the memory 1002, and perform the following operations:
acquiring a detected laser spot;
determining a gesture type according to the first coordinate information of the laser spot;
and determining an interaction instruction according to the gesture type.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
determining second coordinate information corresponding to the first coordinate information according to a preset corresponding relationship and the first coordinate information, wherein the preset corresponding relationship is a corresponding relationship between the first coordinate information and the second coordinate information which are preset in the terminal equipment, and the second coordinate information is coordinate information of the laser spot in a display panel, which corresponds to the first coordinate information, in the preset corresponding relationship;
and determining the gesture type according to the second coordinate information.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
determining a movement parameter of the laser spot according to the second coordinate information and the detection time of the laser spot detected within a preset time interval, wherein the movement parameter comprises at least one of a movement distance, a movement direction and a movement track;
and determining the gesture type according to the movement parameters.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two groups of coordinate sets, wherein the laser spots corresponding to each second coordinate information in the same group of coordinate sets are formed by the same laser transmitter in the intelligent glove;
and determining the movement parameters of the laser spots corresponding to the coordinate sets according to the coordinate sets of each group.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
determining the absolute value of the difference between pieces of second coordinate information that are adjacent in detection time;
and adding the second coordinate information of which the absolute value is smaller than a preset threshold value into the same group of coordinate sets corresponding to the same laser transmitter so as to respectively obtain at least two groups of coordinate sets corresponding to at least two laser transmitters.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
when the moving distance of each laser spot is greater than a preset distance threshold and the moving direction of each laser spot is a preset moving direction, determining that the gesture type is a sliding gesture, or,
when the shape of the moving track is an arc, determining that the gesture type is a rotation gesture.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
determining a target area of the laser spot in the display panel according to the second coordinate information;
and determining the gesture type according to a preset gesture type corresponding to the target area.
Further, the processor 1001 may call the interactive program of the terminal device stored in the memory 1002, and further perform the following operations:
determining the stay time of the laser spot on the display panel according to the second coordinate information of the laser spot detected within a preset time interval;
and determining the gesture type according to the stay time.
Referring to fig. 2, a first embodiment of the present invention provides an interaction method based on laser sensing, where the interaction method based on laser sensing is applied to a terminal device, and the interaction method based on laser sensing includes:
step S10, acquiring the detected laser spot;
the terminal device is a hardware device with a computer infrastructure, which can be described by a von neumann architecture, that is, the terminal device includes an arithmetic logic unit, a control circuit, a memory, and an input/output device, and a laser spot is a bright area formed by focusing a laser beam on a point.
In a practical application scenario, the laser-sensing-based interaction method of this embodiment may be applied to a terminal device that includes a display device for displaying content, such as a smart television, a smartphone, or a computer with a display panel. The terminal device includes a laser sensing layer that can detect laser spots emitted by a laser emitting device. The laser emitting device may be a laser remote controller: the user emits a laser beam with the remote controller, and the terminal device detects the resulting spot through the laser sensing layer, thereby realizing interaction. For ease of operation, the laser emitting device may also be a wearable device, such as a portable laser emitter worn on the user's wrist or limb. The display device may further include a display panel that, in addition to showing content such as video or pictures, displays the laser spot detected by the laser sensing layer so that the user knows the position of the spot emitted by the laser emitting device.
Specifically, after a user emits a laser beam through the laser emitting device, the laser sensing layer of the terminal device detects the laser spot. Because the spot detected by the laser sensing layer may be invisible light, and because the terminal device interacts with the user via the position of the spot on the display panel, the terminal device determines the second coordinate information of the spot in the display panel after detecting it. This makes the position of the emitted spot observable to the user and lets the terminal device determine the gesture type from the spot's position on the display panel. From the second coordinate information, the terminal device then determines the position of each laser spot in the display panel and the time interval during which it is displayed. Since different gestures produce different combinations of position and display interval, the gesture type can be determined from these two characteristics, and the interaction operation is then determined from the gesture type.
In addition, the laser sensing layer is a sensor capable of detecting laser light. It may consist of an M x N array of photoelectric sensors, such as photoresistors; each sensor is encoded with an X-axis and a Y-axis coordinate, so the array can report the coordinate information of a light spot. A wearable device is a device worn on the surface of the human body, such as a smart glove, a smart watch, or a smart ring. After the user puts on the wearable device, moving the limb changes the motion of the laser beam it emits, so the position of the laser spot formed on the laser sensing layer moves accordingly, from which the terminal device obtains the first coordinate information. A laser emitting device is any device capable of emitting a laser beam, for example a wearable device that includes a laser emitter.
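The M x N photosensor array described here can be sketched as a flat, row-major list of intensity readings; the decoding function and its threshold below are illustrative assumptions:

```python
# Hypothetical decoding of spot coordinates from an M x N photosensor grid,
# stored as a flat row-major list of intensity readings. Each sensor's
# encoded (x, y) coordinate is recovered from its index.

def detect_spots(readings, n_cols, on_threshold=0.5):
    """Return (x, y) grid coordinates of sensors that see laser light."""
    return [(i % n_cols, i // n_cols)   # x = column, y = row
            for i, v in enumerate(readings) if v > on_threshold]
```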
The display device in this embodiment is shown in fig. 8, where 10 is a laser spot, 20 is a display panel, and 30 is a laser sensing layer, and in one scene, a user emits a laser beam through a laser emitting device to form three laser spots 10 on the display device, where the sizes of the display panel 20 and the laser sensing layer 30 may be the same or different.
Step S20, determining a gesture type according to the first coordinate information of the laser spot;
the first coordinate information is coordinate information of the laser spot in a display device of the terminal device, and specifically, the first coordinate information may be coordinate information in a display panel or coordinate information in a laser sensing layer; the gesture type is an active state type of a hand of a human body, and in addition, the gesture type can also be extended to an active state type of a wrist or an arm, and the gesture type is, for example, a type of rotation, opening and closing, movement and the like of fingers.
The terminal device determines the first coordinate information of the laser spot after detecting it, and then determines the gesture type from that information. The underlying principle is statistical: the variation trend of the first coordinate information corresponding to a given gesture type is analyzed in advance and associated with that gesture type, so once the first coordinate information is obtained, the corresponding gesture type can be looked up. Alternatively, the first coordinate information may be converted into second coordinate information and the gesture type determined from the latter: for example, if the first coordinate information is the spot's coordinates in the laser sensing layer, the second coordinate information may be the spot's corresponding coordinates in the display panel, or the coordinates of the pixel (or group of pixels) in the display panel that corresponds to the spot. For example, suppose the user moves three laser emitters from left to right simultaneously, and this gesture is defined as a panning gesture. The terminal device then detects three pieces of first coordinate information whose coordinate points all trend from left to right at the same time, and can map this joint trend to the panning gesture.
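The three-emitter example can be expressed as a simple predicate; the per-spot trace representation and the function name are assumptions:

```python
# Hypothetical check for the panning example: all three first-coordinate
# traces must move from left to right (x strictly increasing end over start).

def is_pan_right(traces):
    """traces: list of per-spot coordinate lists [(x, y), ...]."""
    return len(traces) == 3 and all(t[-1][0] > t[0][0] for t in traces)
```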
In practical applications, when the gesture type is determined from coordinate information in the display panel, the underlying framework does not need to be redeveloped; therefore, to improve efficiency, the coordinate information in the display panel is preferably used as the first coordinate information for determining the gesture type.
And step S30, determining an interaction instruction according to the gesture type.
The interactive instruction is a computer instruction, and is used for implementing an interactive behavior of a user and controlling an operating state of the terminal device, such as a state of displaying content and a state of playing audio.
After determining the gesture type, the terminal device can determine the interaction instruction. To do so, a correspondence between gesture types and interaction instructions may be established in advance, and the instruction corresponding to a gesture type looked up in that correspondence. The correspondence can be set according to actual needs and is not limited here. For example, to better match users' human-computer interaction habits, a hand panning gesture may be mapped to an instruction that switches the state of the terminal device. The state switch may be a display-state switch, i.e., switching the currently displayed interface A to interface B, or a playback-state switch, such as skipping from the currently playing music to the next track.
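Such a correspondence can be a plain lookup table. The specific gesture-to-instruction entries below merely echo the examples in the text and are configurable assumptions:

```python
# Hypothetical pre-established correspondence between gesture types and
# interaction instructions; unknown gestures fall back to a no-op.

GESTURE_TO_INSTRUCTION = {
    "pan": "switch_interface",   # e.g. interface A -> interface B
    "rotation": "adjust_volume",
    "long_press": "open_menu",
}

def instruction_for(gesture):
    """Look up the interaction instruction for a recognized gesture type."""
    return GESTURE_TO_INSTRUCTION.get(gesture, "ignore")
```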
In this embodiment, the detected laser spot is acquired, the gesture type is determined from the first coordinate information of the spot, and the interaction instruction is determined from the gesture type. A user can emit a laser spot while interacting with the terminal device; the terminal device first acquires the spot, determines the gesture type from the spot's position coordinates, and then determines the corresponding interaction instruction, thereby enriching the interaction modes of the terminal device.
Referring to fig. 3, a second embodiment of the present invention provides a laser-sensing-based interaction method. Based on the first embodiment shown in fig. 2, the laser spot is formed by a laser beam emitted by a laser emitting device irradiating the laser sensing layer of the terminal device, the first coordinate information is the coordinate information of the laser spot in the laser sensing layer, and the step of determining the gesture type according to the first coordinate information of the laser spot includes:
step S21, determining second coordinate information corresponding to the first coordinate information according to a preset corresponding relationship and the first coordinate information, where the preset corresponding relationship is a corresponding relationship between the first coordinate information and the second coordinate information preset in the terminal device, and the second coordinate information is coordinate information of the laser spot in the display panel corresponding to the first coordinate information in the preset corresponding relationship;
in this embodiment, the first coordinate information is coordinate information of the laser spot in the laser sensing layer, and the second coordinate information is coordinate information of the laser spot in the display panel corresponding to the first coordinate information in the preset relationship.
The terminal device detects the laser spot through the laser sensing layer, obtaining the first coordinate information of the spot in that layer. Because the size of the laser spot is influenced by various factors, such as the laser mode, diffraction, and spherical aberration, the spot may cover more than one photoelectric sensor. Since each sensor corresponds to one coordinate point, the spot's coordinates do not directly correspond to the coordinates of a single sensor; in this case, the coordinates of one of the covered sensors may be taken as the first coordinate information, for example the sensor at the center of the covered area. After the first coordinate information is obtained, it needs to be converted into the second coordinate information.
And step S22, determining the gesture type according to the second coordinate information.
After the first coordinate information is obtained, it needs to be converted into the second coordinate information. The conversion follows the preset correspondence, i.e., the correspondence between first and second coordinate information preset in the terminal device. During conversion, the terminal device converts the detected first coordinate information into the coordinates of pixels in the display panel, i.e., the second coordinate information, through a Field Programmable Gate Array (FPGA). The terminal device can determine the number of pixels from the resolution of the current display panel, establish a coordinate system in units of pixels so that each pixel corresponds to one coordinate point, and map each piece of first coordinate information to the coordinates of one pixel or one group of pixels. Each piece of first coordinate information thus has corresponding second coordinate information, yielding the preset correspondence. Once the second coordinate information is obtained, the variation trend of the laser spot is determined from it, and the gesture type is determined in turn.
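A sketch of such a preset correspondence, assuming a simple linear scaling from sensor-grid coordinates onto panel pixels; the patent performs the mapping in an FPGA, and the grid and panel sizes here are made-up examples:

```python
# Hypothetical first-to-second coordinate conversion: sensor coordinates on
# a grid are scaled onto the display panel's pixel coordinate system.

def to_panel_coords(sensor_xy, grid=(64, 36), panel=(1920, 1080)):
    """Map (x, y) on a grid of `grid` sensors to pixel coordinates on a
    `panel`-sized display, using integer linear scaling."""
    sx, sy = sensor_xy
    gw, gh = grid
    pw, ph = panel
    return (sx * pw // gw, sy * ph // gh)
```

A real implementation would use a stored lookup table rather than a formula, but the effect is the same: every first coordinate has a defined second coordinate.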
In this embodiment, the first coordinate information of the laser spot in the laser sensing layer is determined, and the second coordinate information is determined according to the preset corresponding relationship and the first coordinate information, so that the gesture type can be determined according to the second coordinate information.
Referring to fig. 4, a third embodiment of the present invention provides an interaction method based on laser sensing, and based on the second embodiment shown in fig. 3, the step S22 includes:
step S221, determining a movement parameter of the laser spot according to the second coordinate information and the detection time of the laser spot detected within a preset time interval, wherein the movement parameter includes at least one of a movement distance, a movement direction and a movement track;
the preset time interval is a preset parameter which is stored in the terminal equipment and is used for describing the time length of a period of time, the detection time is the time when the terminal equipment detects the laser spot through the laser induction layer, and the movement parameter is a parameter used for describing the movement state of the laser spot on the display device; the moving parameter includes at least one of a moving distance, a moving direction, and a moving trajectory, where the moving distance is a distance between any two second coordinate information, for example, the moving distance may be calculated according to the second coordinate information of the start time and the second coordinate information of the end time in a preset time period, the moving direction is a direction in which the second coordinate information of the start time in the preset time period points to the second coordinate information of the end time, the moving direction may also be a direction in which the second coordinate information of the start time points to the second coordinate information of any other detection time, and the moving trajectory refers to a trajectory formed by connecting the second coordinate information of different detection times in the preset time period.
When the terminal device determines the gesture type, it infers the gesture type indirectly from the change trend of the laser spot. The implementation is to obtain the second coordinate information detected within the preset time interval and the detection time of the laser spot corresponding to each piece of second coordinate information. When only one laser spot exists at a given detection time, the terminal device obtains only one piece of second coordinate information at that time, and the movement parameter can be obtained by counting the second coordinate information at a plurality of detection times within the preset time interval.
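The three movement parameters defined above (start-to-end distance, pointing direction, and the connected track) can be sketched from timestamped second coordinates as follows. This is an illustrative sketch under the single-spot assumption; the function name and the angle convention are not from the patent.

```python
import math

# Illustrative sketch: compute the movement parameters of one laser spot
# from its (detection_time, (x, y)) samples inside the preset interval.

def movement_parameters(samples):
    """samples: list of (t, (x, y)) detections, one spot, any order.
    Returns (distance, direction_degrees, trajectory)."""
    samples = sorted(samples)                  # order by detection time
    (t0, p0), (t1, p1) = samples[0], samples[-1]
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    distance = math.hypot(dx, dy)              # start-to-end displacement
    direction = math.degrees(math.atan2(dy, dx))  # start-to-end pointing angle
    trajectory = [p for _, p in samples]       # polyline through all samples
    return distance, direction, trajectory
```

A spot moving from `(0, 0)` at time 0 to `(3, 4)` at time 1 yields a distance of 5.0 and a direction of about 53.13 degrees.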
Step S222, determining the gesture type according to the movement parameter.
The terminal device counts and records in advance the rule of the movement parameter corresponding to each gesture type, so that after the movement parameter is obtained, it can judge whether the movement parameter conforms to a certain rule and therefore corresponds to a certain gesture. For example, when the laser emitting device is a wearable smart glove, the laser emitter can be connected to a finger sleeve of the smart glove, so that the laser beam moves with the movement of the finger. If the user emits a laser beam through one laser emitter and uses the beam to draw a closed figure on the laser sensor, the gesture type can be regarded as a rotation gesture. In addition, the figure drawn by the user may not be exactly closed, that is, the movement track formed by the second coordinate information is not strictly a closed figure; in this case, it can be further judged whether the figure is closed within an error range, and when the distance between the second coordinate information at the start time and the second coordinate information at the end time is within the error range, the figure can still be determined to be a closed figure.
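The "closed within an error range" test above can be sketched as a simple start-to-end distance check. This is one plausible reading of the rule, not the patent's actual algorithm; the tolerance value and minimum point count are assumptions.

```python
import math

# Illustrative sketch: a trajectory is treated as a rotation gesture when
# its start and end points lie within a closure tolerance (the "error
# range" in the description). Threshold and length check are assumptions.

def is_rotation_gesture(trajectory, closure_tolerance):
    """trajectory: time-ordered list of (x, y) second coordinates."""
    if len(trajectory) < 3:        # too few points to form a figure
        return False
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    return math.hypot(x1 - x0, y1 - y0) <= closure_tolerance
```

A roughly square loop ending near its start, such as `[(0,0), (10,0), (10,10), (0,10), (1,1)]` with tolerance 2, is accepted, while a straight stroke is not.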
In this embodiment, the movement parameter of the laser spot is determined according to the second coordinate information of the laser spot detected within the preset time interval and the detection time, and the gesture type is determined according to the movement parameter, so that the gesture of the user can be flexibly determined according to the movement parameter.
Referring to fig. 5, a fourth embodiment of the present invention provides an interaction method based on laser sensing, and based on the third embodiment shown in fig. 4, the step S221 includes:
step S2211, grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two coordinate sets, wherein the laser spots corresponding to the second coordinate information in the same coordinate set are formed by the same laser emitter in the smart glove;
the coordinate set is a set of coordinates formed by second coordinate information corresponding to the same laser transmitter, and the set of coordinates is used for indicating the distribution state of the positions of the laser spots of one laser transmitter at different times.
When the terminal device interacts with the user, in order to enrich the interaction modes, this embodiment adopts multi-point interaction, which is realized between the terminal device and the user through the smart glove. Multi-point interaction means that the smart glove emits two or more laser beams at the same time, so that the number of laser spots formed by the beams simultaneously irradiating the surface of the laser sensing layer is two or more. In this case, at least two pieces of second coordinate information are detected by the terminal device within the preset time interval, and because the laser beams tend to move during that interval, the number of pieces of second coordinate information actually acquired is generally more than two. The terminal device therefore needs to determine which laser spot each piece of second coordinate information corresponds to, so as to determine the gesture type according to the movement parameters of the different laser spots.
In this embodiment, the laser emitting device is a smart glove that includes at least two laser emitters. Each laser emitter is connected to a finger sleeve of the smart glove, so that different fingers control different laser beams; as far as possible, every finger sleeve is connected to at least one laser emitter, and the laser beam emitted by each laser emitter irradiates the surface of the laser sensing layer to form one laser spot. In order to classify the coordinate information belonging to the same laser emitter and distinguish the coordinate information of each laser emitter in the case of multi-point interaction, the second coordinate information corresponding to each laser spot and the detection time are grouped to obtain at least two coordinate sets, each of which consists of the second coordinate information of the laser spots formed by the same laser emitter.
For example, when the same laser spot moves, the pieces of second coordinate information corresponding to every two adjacent detection times are also adjacent to each other. Based on this characteristic, the absolute value of the difference between the second coordinate information at adjacent detection times can be determined, and the second coordinate information whose absolute value is smaller than a preset threshold can be added to the same coordinate set corresponding to the same laser emitter, so as to obtain at least two coordinate sets corresponding to at least two laser emitters. Alternatively, the second coordinate information at different detection times can be connected, and whether the points belong to the same coordinate set can be determined according to the regularity of the connected curves: a curve formed by connecting the second coordinate information of the same laser spot is usually regular, whereas a curve formed by connecting the second coordinate information of different laser spots is usually irregular and abrupt, so an algorithm can be further designed to determine each coordinate set based on this characteristic.
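The threshold-based grouping described first can be sketched as a greedy assignment: each detection joins the set whose most recent point is within the preset threshold, otherwise it starts a new set. This is an illustrative sketch; the distance metric (per-axis absolute difference) and greedy strategy are assumptions.

```python
# Illustrative sketch: group (t, (x, y)) detections from several emitters
# into per-emitter coordinate sets, using the adjacency of consecutive
# detections of the same spot. Threshold semantics are an assumption.

def group_by_proximity(detections, threshold):
    """detections: list of (t, (x, y)), any order; threshold: max per-axis
    jump between consecutive detections of one spot. Returns list of groups."""
    groups = []  # each group: time-ordered detections of one presumed emitter
    for t, (x, y) in sorted(detections):
        for g in groups:
            _, (px, py) = g[-1]  # most recent point already in this group
            if abs(x - px) <= threshold and abs(y - py) <= threshold:
                g.append((t, (x, y)))
                break
        else:
            groups.append([(t, (x, y))])  # no nearby group: new emitter
    return groups
```

Two spots tracked over three frames, one near the origin and one near `(100, 100)`, come out as two groups of three detections each.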
Step S2212, determining the movement parameters of the laser spot corresponding to each group of the coordinate set according to each group of the coordinate set.
After the two or more coordinate sets are obtained, the movement parameter of the laser spot corresponding to each coordinate set is determined according to that set, and the gesture type of the multi-point interaction is determined according to the movement parameters of the laser spots.
When the gesture type is determined according to the movement parameter of each laser spot, the gesture types corresponding to different movement parameters can be set according to actual needs. For example, it can be judged whether the movement distance of each laser spot is greater than a preset distance threshold and whether its movement direction is a preset movement direction; when the movement distance of every laser spot is greater than the preset distance threshold and the movement direction of every laser spot is a preset movement direction, the gesture type is determined to be a sliding gesture. The preset distance threshold is a preset distance value for judging whether the gesture is a sliding gesture; it can be set, for example, to the average movement distance measured in advance when the user performs a sliding gesture. The preset movement direction is moving from left to right, from right to left, from top to bottom or from bottom to top. Alternatively, whether the gesture type is a rotation gesture can be judged from the movement track: when the movement track is arc-shaped, the gesture type can be determined to be a rotation gesture. Alternatively, a first distance between the two coordinate sets at the start detection time and a second distance between the two coordinate sets at the end detection time can be calculated, and the gesture type can be determined to be an opening gesture or a closing gesture: when the second distance is greater than the first distance, the gesture type is an opening gesture, and when the second distance is less than the first distance, the gesture type is a closing gesture.
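The two-spot opening/closing rule at the end of the paragraph can be sketched directly from the first and last points of each coordinate set. This is an illustrative sketch; the return labels and the equal-distance fallback are assumptions not stated in the description.

```python
import math

# Illustrative sketch: classify an opening or closing gesture from two
# per-emitter coordinate sets by comparing the inter-spot distance at the
# start and end of the preset interval.

def open_close_gesture(set_a, set_b):
    """set_a, set_b: time-ordered (x, y) lists for two laser spots."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    first = dist(set_a[0], set_b[0])    # separation at start detection time
    last = dist(set_a[-1], set_b[-1])   # separation at end detection time
    if last > first:
        return "open"    # spots moved apart: opening gesture
    if last < first:
        return "close"   # spots moved together: closing gesture
    return "none"        # unchanged separation (assumed fallback)
```

Two spots that start 10 pixels apart and end 20 pixels apart classify as "open"; the reverse classifies as "close".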
In this embodiment, the second coordinate information and the detection time corresponding to each laser spot are grouped to obtain at least two coordinate sets, and the movement parameter of the laser spot formed by each laser emitter is determined according to each coordinate set, so that the gesture type can be determined according to the movement parameter of each laser spot in the case of multi-point interaction, making the contactless interaction mode more flexible and richer.
Referring to fig. 6, a fifth embodiment of the present invention provides an interaction method based on laser sensing, and based on the second embodiment shown in fig. 3, the step S22 includes:
step S223, determining a target area of the laser spot in the display panel according to the second coordinate information;
the target area is a part of a display area which is preset in the display panel and corresponds to a specific gesture type, and the target area is an area at four boundaries of a display edge of the display panel.
When the terminal device determines the gesture type according to the second coordinate information, the gesture type can be further determined through the target area of the laser spot in the display panel. Specifically, an area formed by m × n pixel points at a boundary can be set as a target area, and the coordinate value of each pixel point in the target area is recorded; when determining the target area in which the second coordinate information is located, the terminal device judges whether the second coordinate information is the coordinate value of a pixel point in a certain target area. In order to prevent misoperation, only part of the display panel may be set as target areas, so that when the second coordinate information lies in an area outside the target areas, the terminal device does not determine a gesture type and therefore does not trigger an interaction instruction.
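A hit test for edge target areas like those described above can be sketched as a strip check against the panel resolution. The strip layout and margin parameter are assumptions for illustration; the patent only says the target areas sit at the four display boundaries.

```python
# Illustrative sketch: test whether a second coordinate falls in one of
# four edge strips of width `margin` pixels (an assumed target-area layout).

def in_edge_target_area(pixel, panel_res, margin):
    """pixel: (x, y) second coordinate; panel_res: (width, height)."""
    x, y = pixel
    w, h = panel_res
    # inside any of the left / top / right / bottom boundary strips
    return x < margin or y < margin or x >= w - margin or y >= h - margin
```

On a 1920 x 1080 panel with a 10-pixel margin, `(5, 500)` is inside a target area and `(960, 540)` is not, so a spot at the screen center would not trigger an edge gesture.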
Step S224, determining the gesture type according to the preset gesture type corresponding to the target area.
The preset gesture type is a preset gesture type associated with the target area, and may be set to a gesture type commonly used by the user. For example, the preset gesture type of a target area may be set to a sliding gesture, in which case the gesture type is determined to be the sliding gesture. When the terminal device determines that the gesture type is a sliding gesture, the interaction instruction may be determined to be a sliding instruction, and a page of the terminal device may be slid up and down or left and right according to the sliding instruction, so as to switch the content displayed by the display panel.
In addition, the target area may also correspond to a plurality of preset gesture types, and at this time, the terminal device may determine a preset gesture type as the required gesture type according to the number of the laser spots.
In this embodiment, the target area of the laser spot in the display panel is determined according to the second coordinate information, and the gesture type is determined according to the preset gesture type corresponding to the target area, so that the corresponding preset gesture type can be determined simply from the target area of the laser spot in the display panel, which further determines the gesture type and enriches the interaction modes.
Referring to fig. 7, a sixth embodiment of the present invention provides an interaction method based on laser sensing, where based on the second embodiment shown in fig. 3, the step S22 includes:
step S225, determining the stay time of the laser spot on the display panel according to the second coordinate information of the laser spot detected in a preset time interval;
the stay time is the time length of the laser spot in a specific stay state, and the stay state refers to the state that the second coordinate information of the laser spot is in a specific coordinate range.
When the terminal device determines the gesture type, it can do so according to the stay duration of the laser spot on the display panel. The stay duration of the laser spot needs to be determined first: the method is to obtain the second coordinate information of the laser spot detected within the preset time interval, judge whether the laser spot is currently in the stay state according to the second coordinate information, and, when the laser spot is in the stay state, record the duration for which it remains in that state as the stay duration.
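The stay-state check above can be sketched as finding the longest stretch during which the spot remains within a small radius of where the stretch began. This is an illustrative sketch; the specific coordinate range defining the stay state is an assumption.

```python
# Illustrative sketch: compute the longest stay duration of a laser spot
# from time-ordered (t, (x, y)) samples, where "staying" is assumed to
# mean remaining within `radius` of the stretch's first point.

def hover_duration(samples, radius):
    """samples: list of (t, (x, y)) sorted by time. Returns longest stay."""
    best = 0.0
    i = 0
    while i < len(samples):
        t0, (x0, y0) = samples[i]
        j = i
        # extend the stretch while points stay inside the radius
        while j + 1 < len(samples):
            t, (x, y) = samples[j + 1]
            if (x - x0) ** 2 + (y - y0) ** 2 > radius ** 2:
                break
            j += 1
        best = max(best, samples[j][0] - t0)
        i = j + 1 if j > i else i + 1
    return best
```

A spot that jitters within 2 pixels for 2 time units before jumping away reports a stay duration of 2, which the device could then compare against the preset stay duration.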
Step S226, determining the gesture type according to the stay time length.
After the stay duration is determined, the gesture type is determined according to it: when the stay duration is longer than a preset stay duration, the gesture type is determined to be a hovering gesture. In addition, when a plurality of preset gesture types exist, after the gesture type is determined to be the hovering gesture, a plurality of hovering gestures of different levels can be further distinguished according to the length of the hovering time.
After the gesture type is determined to be the hovering type, an interaction instruction corresponding to it may be further determined; for example, the interaction instruction may be a selection instruction or a trigger instruction for a secondary menu. When the interaction instruction is determined to be a selection instruction, a corresponding interface element in the display panel may be selected; the interface element is, for example, an icon of an application program, in which case the application program may be opened and a jump made to a page of the application program. When the interaction instruction is a trigger instruction for a secondary menu, a list of the secondary menu may be displayed, with the corresponding functions shown in the list.
In this embodiment, the stay duration of the laser spot on the display panel is determined according to the second coordinate information of the laser spot detected within the preset time interval, and the gesture type is determined according to the stay duration, thereby enriching the interaction modes of the terminal device.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a smart tv, a mobile phone, a computer, etc.) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An interaction method based on laser induction is characterized in that the interaction method based on laser induction is applied to terminal equipment, and comprises the following steps:
acquiring a detected laser spot;
determining a gesture type according to the first coordinate information of the laser spot;
and determining an interaction instruction according to the gesture type.
2. The interaction method based on laser sensing of claim 1, wherein the laser spot is formed by irradiating a laser sensing layer of the terminal device with a laser beam emitted by a laser emitting device, the first coordinate information is coordinate information of the laser spot in the laser sensing layer, and the step of determining the gesture type according to the first coordinate information of the laser spot comprises:
determining second coordinate information corresponding to the first coordinate information according to a preset corresponding relationship and the first coordinate information, wherein the preset corresponding relationship is a corresponding relationship between the first coordinate information and the second coordinate information which are preset in the terminal equipment, and the second coordinate information is coordinate information of the laser spot in a display panel, which corresponds to the first coordinate information, in the preset corresponding relationship;
and determining the gesture type according to the second coordinate information.
3. The laser-sensing-based interaction method of claim 2, wherein the step of determining the gesture type according to the second coordinate information comprises:
determining a movement parameter of the laser spot according to the second coordinate information and the detection time of the laser spot detected within a preset time interval, wherein the movement parameter comprises at least one of a movement distance, a movement direction and a movement track;
and determining the gesture type according to the movement parameters.
4. The laser sensing-based interaction method according to claim 3, wherein the laser emitting device is a smart glove, the smart glove comprises at least two laser emitters, the laser beam emitted by each laser emitter irradiates the laser sensing layer to form one laser spot, and the step of determining the movement parameter of the laser spot according to the second coordinate information of the detected laser spot and the detection time within a preset interval comprises:
grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two groups of coordinate sets, wherein the laser spots corresponding to each second coordinate information in the same group of coordinate sets are formed by the same laser transmitter in the intelligent glove;
and determining the movement parameters of the laser spots corresponding to the coordinate sets according to the coordinate sets of each group.
5. The laser induction based interaction method as claimed in claim 4, wherein the step of grouping according to the second coordinate information corresponding to each laser spot and the detection time to obtain at least two sets of coordinate sets comprises:
determining an absolute value of a difference between the second coordinate information adjacent to the detection time;
and adding the second coordinate information of which the absolute value is smaller than a preset threshold value into the same group of coordinate sets corresponding to the same laser transmitter so as to respectively obtain at least two groups of coordinate sets corresponding to at least two laser transmitters.
6. The laser-sensing-based interaction method according to claim 4, wherein the movement parameters include the movement distance, the movement track and the movement direction, and the step of determining the gesture type according to the movement parameters includes:
when the moving distance of each laser spot is greater than a preset distance threshold and the moving direction of each laser spot is a preset moving direction, determining that the gesture type is a sliding gesture, or,
when the shape of the moving track is an arc, determining that the gesture type is a rotation gesture.
7. The laser-sensing-based interaction method of claim 2, wherein the step of determining the gesture type according to the second coordinate information comprises:
determining a target area of the laser spot in the display panel according to the second coordinate information;
and determining the gesture type according to a preset gesture type corresponding to the target area.
8. The laser-sensing-based interaction method of claim 2, wherein the step of determining the gesture type according to the second coordinate information comprises:
determining the stay time of the laser spot on the display panel according to the second coordinate information of the laser spot detected within a preset time interval;
and determining the gesture type according to the stay time.
9. A terminal device, characterized in that the terminal device comprises a laser sensing layer, a memory, a processor and a laser sensing based interaction program stored on the memory and executable on the processor, wherein the laser sensing based interaction program, when executed by the processor, further implements the steps of the laser sensing based interaction method according to any one of claims 1 to 8.
10. A computer-readable storage medium, wherein a laser sensing-based interaction program is stored on the computer-readable storage medium, and when executed by a processor, the computer-readable storage medium implements the steps of the laser sensing-based interaction method according to any one of claims 1 to 8.
CN202011307037.0A 2020-11-19 2020-11-19 Interaction method based on laser induction, terminal equipment and readable storage medium Pending CN112346644A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011307037.0A CN112346644A (en) 2020-11-19 2020-11-19 Interaction method based on laser induction, terminal equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN112346644A true CN112346644A (en) 2021-02-09

Family

ID=74364379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011307037.0A Pending CN112346644A (en) 2020-11-19 2020-11-19 Interaction method based on laser induction, terminal equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112346644A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092432A (en) * 2011-11-08 2013-05-08 深圳市中科睿成智能科技有限公司 Trigger control method and system of man-machine interaction operating instruction and laser emission device
CN103702151A (en) * 2013-11-25 2014-04-02 何文林 Man-computer interaction remote control device and method for intelligent television and touch control screen
CN104166509A (en) * 2013-05-20 2014-11-26 华为技术有限公司 Non-contact screen interaction method and system
CN105807989A (en) * 2016-02-29 2016-07-27 深圳柔石科技有限公司 Gesture touch method and system
US20190179417A1 (en) * 2017-12-11 2019-06-13 Shenzhen Starfield Information Technologies Co., Ltd. 3D Interaction Method, Device, Computer Equipment and Storage Medium
CN111078018A (en) * 2019-12-31 2020-04-28 深圳Tcl新技术有限公司 Touch control method of display, terminal device and storage medium


Similar Documents

Publication Publication Date Title
US10606441B2 (en) Operation control device and operation control method
US9268400B2 (en) Controlling a graphical user interface
US8810509B2 (en) Interfacing with a computing application using a multi-digit sensor
US8866781B2 (en) Contactless gesture-based control method and apparatus
KR101541928B1 (en) visual feedback display
CN101278251B (en) Interactive large scale touch surface system
US20050057524A1 (en) Gesture recognition method and touch system incorporating the same
CN109254823B (en) Method for switching multi-level nested paging view elements, memory and terminal
US10963136B2 (en) Highlighting of objects on a display
CN111475097B (en) Handwriting selection method and device, computer equipment and storage medium
US20190324539A1 (en) Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments
CN104423870A (en) Control in graphical user interface, display method as well as method and device for operating control
EP2891949A1 (en) Information input device and information display method
AU2019440748B2 (en) Method, device and apparatus for controlling touch operation mode, and storage medium
US20220382377A1 (en) Systems and methods for controlling virtual widgets in a gesture-controlled device
US20230280837A1 (en) Interaction method, display device, and non-transitory storage medium
CN112346644A (en) Interaction method based on laser induction, terminal equipment and readable storage medium
CN103870146A (en) Information processing method and electronic equipment
EP4345583A1 (en) Gesture interaction method and system based on artificial reality
CN109858000A (en) Form processing method, device, system, storage medium and interactive intelligent tablet computer
CN113687722A (en) Page control method, device, equipment and storage medium of electronic equipment
CN113031817A (en) Multi-point touch gesture recognition method and false touch prevention method
Guesgen et al. Gestural control of household appliances for the physically impaired
CN111880717B (en) Method and device for suspending touch control remote control playing equipment
US20240134461A1 (en) Gesture interaction method and system based on artificial reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination