CN116301429A - Control method of intelligent terminal and intelligent terminal - Google Patents

Info

Publication number
CN116301429A
Authority
CN
China
Prior art keywords: touch, user, touch screen, gazing, operation object
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Application number
CN202310287206.6A
Other languages
Chinese (zh)
Inventor
杨文武
袁海江
Current Assignee (the listed assignees may be inaccurate)
HKC Co Ltd
Chongqing HKC Optoelectronics Technology Co Ltd
Original Assignee
HKC Co Ltd
Chongqing HKC Optoelectronics Technology Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion)
Filing date
Publication date
Application filed by HKC Co Ltd, Chongqing HKC Optoelectronics Technology Co Ltd filed Critical HKC Co Ltd
Priority to CN202310287206.6A
Publication of CN116301429A

Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING; G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements; G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form; G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means; G06F3/0416 Control or interface arrangements specially adapted for digitisers; G06F3/0418 Control or interface arrangements for error correction or compensation, e.g. based on parallax, calibration or alignment; G06F3/04186 Touch location disambiguation
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality; G06F3/013 Eye tracking input arrangements
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]; G06F3/0481 Interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance; G06F3/04817 Interaction using icons
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]; G06F3/0487 Interaction using specific features provided by the input device; G06F3/0488 Interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures; G06F3/04886 Interaction by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The scheme first obtains the gaze position of a user's eyes on a touch screen, confirms whether the obtained gaze position falls within the range of the user's touch position on the touch screen, and, if it does, responds to the user's touch by performing a control operation on the operation object corresponding to the gaze position. When a user wants to operate an object, the user first looks at the position of that object on the touch screen and then touches the corresponding position with a finger, so the gaze position of the user's eyes on the touch screen can be obtained; however, gazing at an operation object does not necessarily mean that the user intends to operate it, so the related operation is performed only when the gaze position is also touched by the user. Because the gaze position can be located to a very small area, controlling on the basis of the obtained gaze position accurately locates the operation object the user wants to operate and prevents misoperation.

Description

Control method of intelligent terminal and intelligent terminal
Technical Field
The application relates to the technical field of intelligent terminals, in particular to a control method of an intelligent terminal and the intelligent terminal.
Background
A touch screen, also called a touch-control screen, is a sensing display that can receive signals input by touch. By touching icons, interfaces or graphic buttons on the display, a user can send various operation instructions to an intelligent terminal, so that traditional mechanical keys can be replaced.
To improve the accuracy of touch position detection, the related art divides the touch screen into a plurality of touch partitions 110, as shown in fig. 1. Each touch partition 110 serves as an independent touch detection area whose touch screen signal must be led out through leads. If the touch partitions 110 are made smaller, the number of partitions, and therefore the number of leads of the whole touch screen, increases correspondingly; the lead design becomes complex, and an excessive number of leads reduces the light transmittance of the display screen and degrades the display effect. The touch partitions 110 are therefore generally made relatively large, and one touch partition 110 corresponds to several operation objects. During touch detection, the operation object the user intends to operate is estimated from where the finger click position 1101 lies within the touch partition 110, so estimation errors, and hence erroneous operations, easily occur.
Disclosure of Invention
The application provides a control method of an intelligent terminal, a control device of the intelligent terminal, electronic equipment, the intelligent terminal and a computer readable storage medium, so as to improve the accuracy of touch response of a touch screen.
According to an aspect of the embodiments of the present application, a control method of an intelligent terminal is disclosed, where the control method of the intelligent terminal includes:
acquiring the gazing position of eyes of a user on a touch screen;
when the fixation position of eyes of a user on a touch screen is obtained, the touch position of the user on the touch screen is obtained;
confirming whether the obtained gaze location falls within the touch location range;
and if the obtained gazing position falls within the touch position range, responding to the touch of the user to execute control operation on the operation object corresponding to the gazing position.
In an exemplary embodiment, the acquiring the gaze location of the user's eyes on the touch screen includes:
acquiring a focusing position of a pupil of a user on a touch screen;
displaying a pattern identifier at a position corresponding to the current focusing position of the pupil of the user on the touch screen;
acquiring movement information of a pupil of a user;
adjusting the position of the pattern mark on the touch screen based on the movement information of the pupil of the user;
and when the pupil of the user stops moving, taking the position marked by the current pattern as the gazing position of the eyes of the user on the touch screen.
In an exemplary embodiment, the control method further includes:
and if the obtained gazing position does not fall into the touch position range, not responding to the touch of the user.
In an exemplary embodiment, the control method further includes:
if the obtained gazing position does not fall into the touch position range, acquiring the offset of the touch position of the user on the touch screen relative to the gazing position;
comparing the offset with an offset threshold;
if the offset is below the offset threshold, shifting the touch position towards the direction of the gazing position by a preset value to obtain a corrected touch position, and executing control operation on an operation object corresponding to the corrected touch position in response to the touch of a user;
and if the offset is above the offset threshold, not responding to the touch of the user.
In an exemplary embodiment, the performing, in response to a touch of a user, a control operation on an operation object corresponding to the gaze location includes:
acquiring an operation object corresponding to the gazing position based on the mapping relation between the position coordinates of the touch screen and the operation object;
and executing corresponding control operation on the obtained operation object.
In an exemplary embodiment, the touch screen is divided into a plurality of touch partitions, the touch screen generates touch screen signals corresponding to the touch partitions when the touch partitions are touched, and the number of sub-pixels corresponding to the gaze locations is smaller than the number of sub-pixels corresponding to the touch partitions.
According to an aspect of the embodiments of the present application, a control device of an intelligent terminal is disclosed, where the control device of the intelligent terminal includes:
the eye movement detection module is used for acquiring the gazing position of eyes of a user on the touch screen;
the touch detection module is used for acquiring the touch position of the user on the touch screen when acquiring the gazing position of the eyes of the user on the touch screen;
a position comparison module for confirming whether the obtained gaze position falls within the touch position range;
and the operation control module is used for responding to the touch of the user to execute control operation on the operation object corresponding to the gazing position under the condition that the obtained gazing position falls within the touch position range.
According to an aspect of embodiments of the present application, an electronic device is disclosed, including one or more processors and a memory, where the memory is configured to store one or more computer programs, and when the one or more computer programs are executed by the one or more processors, cause the processors to implement the foregoing method for controlling a smart terminal.
According to an aspect of an embodiment of the application, an intelligent terminal is disclosed, the intelligent terminal comprises a terminal main body, a camera and electronic equipment, and the terminal main body is provided with a touch screen; the camera is arranged on the terminal main body and used for collecting images containing pupils of a user so as to obtain the gazing position of eyes of the user on the touch screen; the electronic device is as described above.
According to an aspect of embodiments of the present application, a computer-readable storage medium storing computer-readable instructions that, when executed by a processor of a computer, cause the computer to perform the foregoing method of controlling a smart terminal is disclosed.
The technical scheme provided by the embodiment of the application at least comprises the following beneficial effects:
In the technical scheme provided by the embodiments of the present application, the gaze position of the user's eyes on the touch screen is obtained first, it is then confirmed whether the obtained gaze position falls within the range of the user's touch position on the touch screen, and if it does, a control operation is performed on the operation object corresponding to the gaze position in response to the user's touch. When a user wants to operate an object while using the intelligent terminal, the user first looks at the position of that object on the touch screen and then touches the corresponding position with a finger, so the gaze position of the user's eyes on the touch screen can be obtained; however, gazing at an operation object does not necessarily mean that the user intends to operate it, so the related operation is performed only when the gaze position is also touched by the user. Because the gaze position can be located to a very small area, controlling on the basis of the obtained gaze position accurately locates the operation object the user wants to operate, solves the problem that a large touch partition makes the target operation object hard to locate precisely, improves the accuracy of the touch response of the touch screen, and prevents misoperation.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a touch partition shown in accordance with an exemplary embodiment.
FIG. 2 is a schematic diagram of a usage scenario, shown according to an example embodiment.
Fig. 3 is a logic block diagram of a control method of an intelligent terminal according to an exemplary embodiment.
Fig. 4 is a flowchart illustrating a control method of an intelligent terminal according to an exemplary embodiment.
Fig. 5 is a detailed flowchart of step S101 in the corresponding embodiment of fig. 4.
Fig. 6 is a schematic view of a gaze location determination scene shown in accordance with an exemplary embodiment.
Fig. 7 is a schematic view of a gaze location determination scene shown in accordance with another exemplary embodiment.
FIG. 8 is a schematic diagram of an implementation scenario illustrated according to an example embodiment.
FIG. 9 is a schematic diagram of another implementation scenario illustrated in accordance with an example embodiment.
Fig. 10 is a flowchart illustrating a control method of an intelligent terminal according to another exemplary embodiment.
Fig. 11 is a block diagram illustrating a control device of an intelligent terminal according to an exemplary embodiment.
Fig. 12 is a schematic diagram of an electronic device shown according to an example embodiment.
Fig. 13 is a schematic diagram of a smart terminal shown according to an example embodiment.
Fig. 14 is a schematic diagram of a smart terminal shown according to another exemplary embodiment.
Fig. 15 is a block diagram illustrating a computer system architecture for implementing an electronic device of an embodiment of the present application, according to an example embodiment.
The reference numerals are explained as follows:
11. a touch screen; 110. touching the subarea; 1101. a touch location; 110a, the touch position seen by the eye; 110b, the touch position actually clicked by the finger; 120. pattern identification; 130. a target operation object; 140. a touch location; 150. a gaze location; 12. a camera; 13. a watch case; 14. watchband; 200. a control device; 210. an eye movement detection module; 220. a touch detection module; 230. a position comparison module; 240. an operation control module; 300. an electronic device; 301. a processor; 302. a memory; 400. a computer system; 401. a CPU; 402. a ROM; 403. a storage section; 404. a RAM; 405. a bus; 406. an I/O interface; 407. an input section; 408. an output section; 409. a communication section; 410. a driver; 411. removable media.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present application. One skilled in the relevant art will recognize, however, that the aspects of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
A capacitor consists of two conductive plates separated by a non-conductive material. When the two plates are connected to a power source they are charged by it, and the amount of charge stored varies with the distance between the plates.
The touch screen of most intelligent terminals, such as mobile phones and tablet computers, is a capacitive screen: its bottom layer is a conductive plate and its upper layer is non-conductive glass. When a user's finger touches the screen, the human-body electric field makes the finger act as a second plate facing the bottom conductive plate, forming a coupling capacitance with it. The amount of charge on the conductive plate of the touch screen therefore changes, and the position touched by the user is estimated from the location of this change.
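As an illustration of this principle, the following Python sketch estimates a touch location from per-electrode capacitance changes. It is a minimal sketch only; the electrode grid, baseline values and detection threshold are hypothetical and are not taken from the patent.

```python
# Illustrative sketch: estimate a touch location from capacitance deltas
# measured on a hypothetical grid of sensing electrodes.

def estimate_touch_position(baseline, measured, threshold=5.0):
    """Return (row, col) of the electrode with the largest capacitance change,
    or None if no change exceeds the threshold (no touch detected)."""
    best_pos, best_delta = None, threshold
    for r, (base_row, meas_row) in enumerate(zip(baseline, measured)):
        for c, (base, meas) in enumerate(zip(base_row, meas_row)):
            delta = meas - base  # extra coupling capacitance added by the finger
            if delta > best_delta:
                best_pos, best_delta = (r, c), delta
    return best_pos

# Example: a 3x3 electrode grid where the finger raises the reading at (1, 2).
baseline = [[100.0] * 3 for _ in range(3)]
measured = [[100.0, 100.2, 100.1],
            [100.3, 101.0, 108.0],
            [100.1, 100.0, 100.2]]
print(estimate_touch_position(baseline, measured))  # (1, 2)
```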
In the related art, as shown in fig. 1, the touch area of a touch screen is divided into a plurality of touch partitions 110, and each touch partition 110 corresponds to several operation objects. When a user touches the touch area with a finger, the touch partition 110 containing the finger touch position 1101 is obtained first; then, according to where the finger touch position 1101 lies within the touch partition 110 (for example, in the middle, in the upper-left corner, or in the lower-right corner of the partition), and based on the positional correspondence between each operation object and the touch partition 110, the operation object the user wants to touch is estimated. Estimation errors easily occur, so the intelligent terminal may start the wrong application or jump to the wrong page, which causes great inconvenience to the user.
In particular, when the user's head is not facing the touch screen squarely, the picture of the touch screen reaches the user's eyes at an angle rather than perpendicularly. When the user clicks the touch screen, the finger appears, from the user's point of view, to be covering the intended target, but the actual touch point is off target. As shown in fig. 2, 110a denotes the touch position as seen by the user's eyes and 110b denotes the touch position actually clicked by the user's finger; because the angle between the user's eyes and the target is shallow, the actual finger landing point (touch position) 110b is lower, even though, as seen by the eyes, the finger does cover the target.
In order to reduce misoperation and improve accuracy of touch response of a touch screen, the application provides a control method of an intelligent terminal, a control device of the intelligent terminal, electronic equipment, the intelligent terminal and a computer readable storage medium.
Referring first to fig. 3, fig. 3 is a logic block diagram of a control method of an intelligent terminal according to an exemplary embodiment. As shown in fig. 3, an image containing the user's pupil is first acquired by a camera, the acquired image is analyzed by a processor, and the gaze position of the user's eyes on the touch screen is calculated. In parallel, the touch screen signal generated on the touch screen is detected, and the processor derives the user's touch position on the touch screen. Whether to respond to the user's touch is then determined by whether the gaze position falls within the touch position range: if it does, the control operation associated with the gaze position is performed.
When a user wants to operate an object while using the intelligent terminal, the user first looks at the position of that object on the touch screen and then touches the corresponding position with a finger, which is why the gaze position of the user's eyes on the touch screen is obtained. However, gazing at an operation object does not necessarily mean that the user intends to operate it, so the related operation is performed only when the gaze position is also touched by the user. Because the gaze position can be located to a very small area, controlling on the basis of the obtained gaze position accurately locates the operation object the user wants to operate and prevents misoperation.
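The control flow of fig. 3 can be summarized in the following Python sketch. The function names and data types are illustrative assumptions standing in for the camera analysis, touch-signal detection, range check and response steps; they are not an implementation taken from the patent.

```python
# Illustrative control loop corresponding to the logic of fig. 3.
# get_gaze_position, get_touch_position, gaze_in_touch_range and
# execute_operation are hypothetical helpers supplied by the caller.

def handle_touch_event(get_gaze_position, get_touch_position,
                       gaze_in_touch_range, execute_operation):
    gaze = get_gaze_position()    # computed from the camera image of the pupil
    touch = get_touch_position()  # computed from the touch-screen signal
    if gaze is None or touch is None:
        return False              # nothing to respond to
    if gaze_in_touch_range(gaze, touch):
        execute_operation(gaze)   # operate the object at the gaze position
        return True
    return False                  # gaze outside the touch range: ignore the touch
```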
Fig. 4 is a flowchart illustrating a control method of an intelligent terminal according to an exemplary embodiment. The execution subject of the method may be an MCU (Micro Controller Unit, i.e. a microcontroller) of the intelligent terminal or the like. The control method mainly includes steps S101 to S104, as follows.
Step S101, a gaze location of a user' S eyes on a touch screen is obtained.
In one embodiment of the present application, as shown in fig. 5, step S101 includes steps S1011 to S1015 as follows.
In step S1011, the focus position of the pupil of the user on the touch screen is acquired.
Eye tracking is the process of measuring a user's eye activity: images are acquired by an instrument, the position of the pupil is located by image processing, and the point the eyes are gazing at is computed by an appropriate algorithm, so that the computer knows where the user is looking.
In detail, an image containing the user's pupil is acquired first; the image is then processed to locate the position of the user's pupil in it; finally, the focus position of the user's pupil on the touch screen is determined from the mapping relationship between the image and the touch-screen coordinate system and from the position of the pupil in the image.
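A minimal sketch of this mapping is shown below, assuming a pre-calibrated linear mapping from pupil coordinates in the camera image to touch-screen coordinates; the calibration coefficients and coordinate values are hypothetical.

```python
# Map the pupil position found in the camera image to a focus position on
# the touch screen, assuming a simple calibrated linear mapping per axis.

def pupil_to_screen(pupil_xy, calib):
    """calib = (ax, bx, ay, by): screen_x = ax*px + bx, screen_y = ay*py + by."""
    px, py = pupil_xy
    ax, bx, ay, by = calib
    return (ax * px + bx, ay * py + by)

# Example with made-up calibration coefficients.
focus = pupil_to_screen((320, 240), calib=(2.0, -100.0, 2.0, -80.0))
print(focus)  # (540.0, 400.0)
```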
Step S1012, displaying the pattern identifier at a position on the touch screen corresponding to the current focusing position of the pupil of the user.
That is, in step S1012, a pattern identifier is generated based on the current focus position of the user's pupil and is displayed at the position on the touch screen corresponding to that focus position, i.e. the position whose projection, in the direction perpendicular to the touch screen, coincides with the current focus position.
The pattern identifier may be a square box whose boundary is the boundary of the current focus position, a circular box whose boundary is the boundary of the current focus position, an ellipse whose boundary is the boundary of the current focus position, or any shape other than the square, circle and ellipse listed here.
In step S1013, movement information of the pupil of the user is acquired.
The movement information of the user's pupil includes, but is not limited to, the direction and distance of rotation, for example how far the pupil turns to the right, to the left, upward or downward.
The movement information of the user's pupil can be obtained by acquiring images containing the pupil in real time and comparing the pupil's position in the current frame with its position in the previous frame.
Step S1014 adjusts the position of the pattern mark on the touch screen based on the movement information of the user' S pupil.
That is, in step S1014, the position of the pattern mark on the touch screen is adjusted in real time, that is, the focus position of the user 'S pupil on the touch screen is corrected in real time, according to the movement information of the user' S pupil obtained in step S1013.
Adjusting the position of the pattern identifier on the touch screen based on the movement information of the user's pupil may mean adjusting the identifier's position according to the movement direction of the pupil and a mapping relationship between the pupil's movement distance and the adjustment distance; alternatively, when the movement distance of the pupil exceeds a preset distance value, the pattern identifier is moved by a preset distance in the pupil's movement direction, the preset distance being a fixed, preconfigured value.
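The two adjustment strategies just described (a proportional mapping of the pupil movement, or a fixed preset step once the movement exceeds a threshold) might look like the following sketch. The gain, threshold and step values are illustrative assumptions.

```python
import math

# Two illustrative ways to move the on-screen pattern identifier in response
# to pupil movement (dx, dy measured between consecutive camera frames).

def adjust_proportional(marker_xy, pupil_delta, gain=3.0):
    """Shift the marker by a distance proportional to the pupil movement."""
    x, y = marker_xy
    dx, dy = pupil_delta
    return (x + gain * dx, y + gain * dy)

def adjust_fixed_step(marker_xy, pupil_delta, move_threshold=2.0, step=10.0):
    """Shift the marker by a preset distance in the movement direction,
    but only when the pupil movement exceeds a preset threshold."""
    x, y = marker_xy
    dx, dy = pupil_delta
    dist = math.hypot(dx, dy)
    if dist <= move_threshold:
        return (x, y)  # movement too small: keep the current position
    return (x + step * dx / dist, y + step * dy / dist)
```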
In step S1015, when the pupil of the user stops moving, the position identified by the current pattern is taken as the gaze position of the eyes of the user on the touch screen.
When the user's pupil stops moving, the focus position on the touch screen no longer needs to be adjusted, and the operation object corresponding to the current position of the pattern identifier is the user's target operation object; the position of the current pattern identifier is therefore taken as the gaze position of the user's eyes on the touch screen.
It should be noted that the pupil is considered to have stopped moving when its position no longer changes, or changes by less than a preset change threshold, within a preset time, for example within 2 seconds.
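A sketch of this "pupil has stopped moving" test is shown below, assuming pupil positions are sampled over a preset window (2 seconds in the example above); the change threshold and sample format are illustrative assumptions.

```python
import math

def pupil_stopped(samples, window_s=2.0, change_threshold=3.0):
    """Return True if the pupil position varies by less than change_threshold
    pixels over the last window_s seconds. samples: list of (t_s, x, y),
    newest last."""
    if not samples:
        return False
    t_end = samples[-1][0]
    recent = [(x, y) for t, x, y in samples if t_end - t <= window_s]
    xs = [x for x, _ in recent]
    ys = [y for _, y in recent]
    spread = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return spread < change_threshold
```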
Because the pattern identifier is displayed at the position on the touch screen corresponding to the current focus position of the user's pupil, the user can see where the intelligent terminal currently locates the focus position by looking at the identifier, and can then act according to the positional relationship between the identifier and the target operation object: for example, if the identifier is to the left of the target operation object, the user turns the eyeballs to the right; if the identifier is below the target operation object, the user turns the eyeballs upward. The position of the pattern identifier on the touch screen is adjusted based on the movement information of the user's pupil, and when the pupil stops moving, the position of the current identifier is taken as the gaze position of the user's eyes on the touch screen. This improves the accuracy of gaze-position determination, accurately locates the operation object the user wants to operate, and further prevents misoperation.
Fig. 6 is a schematic view of a gaze-position determination scene according to an exemplary embodiment. As shown in fig. 6, in one embodiment of the present application the pattern identifier 120 is a circular pattern and the user's target operation object 130 is a square identifier. In step S1012, the circular pattern 120 representing the focus position of the user's pupil on the touch screen is displayed, as shown in fig. 6 (a). In step S1013, it is detected that the user's pupil rotates to the right; accordingly, in step S1014, the circular pattern 120 is moved to the right on the touch screen, as shown in fig. 6 (b). At this point the circular pattern 120 is aligned with the square identifier 130 in the left-right direction but still offset in the up-down direction. The pupil is then detected rotating downward, and the circular pattern 120 is moved downward, as shown in fig. 6 (c). Now the position of the circular pattern 120 corresponds to the square identifier 130, i.e. the focus position correctly corresponds to the target operation object, and if the user's pupil stops moving, the position of the current pattern identifier can be taken as the gaze position of the user's eyes on the touch screen.
Fig. 7 is a schematic view of a gaze-position determination scene according to another exemplary embodiment. As shown in fig. 7, in one embodiment of the present application the pattern identifier 120 is a square pattern and the user's target operation object 130 is a square identifier. In step S1012, the square pattern 120 representing the focus position of the user's pupil on the touch screen is displayed, as shown in fig. 7 (a). In step S1013, it is detected that the user's pupil rotates to the right; accordingly, in step S1014, the square pattern 120 is moved to the right by a preset distance, as shown in fig. 7 (b). At this point the square pattern 120 is still separated from the square identifier 130 in the left-right direction. The pupil is then detected rotating to the right again, and the square pattern 120 is moved to the right by the preset distance once more, as shown in fig. 7 (c). Now the position of the square pattern 120 corresponds to the square identifier 130, i.e. the focus position correctly corresponds to the target operation object, and if the user's pupil stops moving, the position of the current pattern identifier can be taken as the gaze position of the user's eyes on the touch screen.
In one embodiment of the present application, the touch screen is divided into a plurality of touch zones 110 to improve the accuracy of touch location detection. In order to accurately determine the operation object that the user needs to operate, in step S101, the number of sub-pixels corresponding to the obtained gaze location should be smaller than the number of sub-pixels corresponding to the touch partition 110.
A sub-pixel is the smallest display unit of a display screen; sub-pixel size differs slightly between screens of different sizes. The sub-pixels of a Micro LED display screen can be smaller than 50 um.
In one embodiment of the present application, the gaze position of the user's eyes on the touch screen is a single gaze point. Further, the area corresponding to the gaze point is smaller than the area corresponding to the target operation object, which prevents the gaze position from overlapping the areas of operation objects other than the target and further avoids misoperation.
Further, the gaze position may be a single sub-pixel. Since the screen must display images normally, each sub-pixel is on the order of micrometres; a gaze position of one sub-pixel is therefore far smaller than the finger's touch position on the touch screen (generally on the order of millimetres), which further improves the accuracy of locating the target operation object.
For example, the touch partition 110 corresponds to 100 sub-pixels, the target operation object corresponds to 20 sub-pixels, and the gaze location is one sub-pixel.
Step S102, when the fixation position of the eyes of the user on the touch screen is obtained, the touch position of the user on the touch screen is obtained.
In step S102, the touch screen signal generated on the touch screen is detected first; the user's touch position on the touch screen is then obtained from the position at which the signal was generated.
In an embodiment in which the touch screen is divided into a plurality of touch partitions, the touch partition in which the touch screen signal is generated is detected first, and the user's touch position on the touch screen is then obtained from the position of the signal within that partition.
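For a partitioned touch screen, the touch position can be recovered by combining the partition that reported a signal with the signal's position inside that partition. The sketch below assumes the touch partitions form a regular grid of rectangles; the grid layout and dimensions are assumptions for illustration.

```python
# Recover an absolute touch position from (partition index, offset inside the
# partition), assuming the touch partitions form a regular grid of rectangles.

def touch_position(partition_index, offset_in_partition,
                   partitions_per_row, partition_w, partition_h):
    row, col = divmod(partition_index, partitions_per_row)
    ox, oy = offset_in_partition
    return (col * partition_w + ox, row * partition_h + oy)

# Example: partition 5 on a screen with 4 partitions per row of 180x200 px.
print(touch_position(5, (30.0, 50.0), 4, 180.0, 200.0))  # (210.0, 250.0)
```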
Step S103, it is confirmed whether the obtained gaze location falls within the touch location range.
If the obtained gaze position falls within the touch position range, the process proceeds to step S104a. In fig. 8, the elliptical area 140 represents the touch position, the circular area 150 represents the gaze position, and 110 represents one touch partition; the circular area 150 lies inside the elliptical area 140, i.e. the gaze position 150 falls within the range of the touch position 140.
Conversely, if the obtained gaze position falls outside the touch position range, the process proceeds to step S104b. In fig. 9, the elliptical area 140 represents the touch position, the circular area 150 represents the gaze position, and 110 represents one touch partition; the circular area 150 lies outside the elliptical area 140, i.e. the gaze position 150 falls outside the range of the touch position 140.
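The range check of step S103 can be illustrated as follows. The sketch models the touch position range as a circle around the reported touch point; the figures show an elliptical contact area, so the circular model and its radius are simplifying assumptions.

```python
import math

def gaze_in_touch_range(gaze_xy, touch_xy, touch_radius):
    """True if the gaze point lies inside the touched contact area,
    modelled here as a circle of radius touch_radius around the touch."""
    gx, gy = gaze_xy
    tx, ty = touch_xy
    return math.hypot(gx - tx, gy - ty) <= touch_radius

print(gaze_in_touch_range((105, 98), (100, 100), 12))  # True  -> respond (S104a)
print(gaze_in_touch_range((160, 40), (100, 100), 12))  # False -> ignore  (S104b)
```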
Step S104a, performing a control operation on an operation object corresponding to the gaze location in response to a touch of the user.
In step S104a, the operation object corresponding to the gaze position is, for example, closed, opened, exited or zoomed.
For example, if the operation object corresponding to the gaze position is a close identifier, step S104a closes the corresponding web page, APP page, or the like. If it is an exit identifier, step S104a exits the corresponding web page, APP page, or the like. If it is a zoom-in identifier, step S104a enlarges the corresponding content. If it is an APP icon, step S104a opens that APP.
In detail, step S104a includes: firstly, obtaining an operation object corresponding to a gazing position based on a mapping relation between position coordinates of a touch screen and the operation object; then, a corresponding control operation is performed on the obtained operation object.
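The mapping between position coordinates and operation objects can be sketched as a hit test against the bounding boxes of the displayed objects, followed by dispatch of the corresponding control operation. The object table, box coordinates and operation names below are illustrative assumptions.

```python
# Find the operation object whose on-screen bounding box contains the gaze
# position, then dispatch the corresponding control operation.

OBJECTS = {  # hypothetical mapping: object name -> (x0, y0, x1, y1)
    "close_button": (300, 0, 320, 20),
    "app_icon":     (40, 200, 104, 264),
}

def object_at(position, objects=OBJECTS):
    x, y = position
    for name, (x0, y0, x1, y1) in objects.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def execute_operation(position):
    target = object_at(position)
    if target == "close_button":
        print("close the current page")
    elif target == "app_icon":
        print("open the application")
    # ... other operations (exit, zoom, etc.)

execute_operation((310, 10))  # close the current page
```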
Step S104b, does not respond to the touch of the user.
That is, the smart terminal does not perform any operation with respect to the touch of the user.
Referring next to fig. 10, fig. 10 is a flowchart illustrating a control method of an intelligent terminal according to another exemplary embodiment. In the embodiment shown in fig. 10, the control method includes steps S201 to S207 as follows.
Step S201, a gaze location of a user' S eyes on a touch screen is acquired.
Step S202, a touch position of a user on a touch screen is obtained.
Step S203, confirms whether the obtained gaze location falls within the touch location range. If the obtained gaze location falls within the touch location range, step S204a is entered; otherwise, if the obtained gaze location is outside the touch location range, the process proceeds to step S204b.
In step S204a, a control operation is performed on an operation object corresponding to the gaze location in response to a touch by the user.
In step S204b, an offset of the touch position of the user on the touch screen relative to the gaze position is obtained.
In step S205, the offset is compared with the offset threshold.
In step S206, if the offset is below the offset threshold, the process proceeds to step S207a; otherwise, if the offset is above the offset threshold, the process proceeds to step S207b.
Wherein, the offset is below the offset threshold may mean that the offset is less than the offset threshold, and correspondingly, the offset is above the offset threshold, that is, the offset is greater than or equal to the offset threshold. The offset being below the offset threshold may also mean that the offset is less than or equal to the offset threshold, and correspondingly, the offset being above the offset threshold, i.e., the offset being greater than the offset threshold.
S207a, shifting the touch position toward the direction in which the gaze position is located by a preset value, obtaining a corrected touch position, and performing a control operation on an operation object corresponding to the corrected touch position in response to a touch of the user.
A small offset of the touch position relative to the gaze position is usually caused by a slight error in the user's operation. By correcting the touch position and performing the control operation on the object corresponding to the corrected position, the situation in which a slight touch error leaves the user's operation unanswered is avoided, which improves the user experience.
The preset value by which the touch position is shifted toward the gaze position can be obtained through repeated tests. For example, several close identifiers are displayed on the screen, a test user clicks each of them several times, the offset between the touch position and the gaze position is calculated for every click, the offsets are averaged, and the resulting empirical offset is used as the preset value for correcting the touch position.
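The correction branch of fig. 10 (steps S204a to S207b) might be sketched as follows. The offset threshold and preset correction value are illustrative and would in practice be tuned as described in the preceding paragraph; in_touch_range and execute_operation are hypothetical helpers.

```python
import math

def respond_with_correction(gaze_xy, touch_xy, in_touch_range, execute_operation,
                            offset_threshold=25.0, preset_shift=10.0):
    if in_touch_range(gaze_xy, touch_xy):
        execute_operation(gaze_xy)                 # step S204a
        return
    gx, gy = gaze_xy
    tx, ty = touch_xy
    offset = math.hypot(gx - tx, gy - ty)          # step S204b
    if offset == 0.0:
        execute_operation(touch_xy)                # gaze and touch coincide
        return
    if offset >= offset_threshold:                 # step S207b: likely a false touch
        return                                     # do not respond
    # Step S207a: shift the touch position toward the gaze position by a
    # preset value and operate the object at the corrected position.
    cx = tx + preset_shift * (gx - tx) / offset
    cy = ty + preset_shift * (gy - ty) / offset
    execute_operation((cx, cy))
```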
S207b, does not respond to the touch of the user.
That is, the intelligent terminal performs no operation in response to the user's touch. A large offset of the touch position relative to the gaze position is generally caused by an accidental touch; not responding to the touch in this case prevents misoperations such as the intelligent terminal opening the wrong application or jumping to the wrong page.
Fig. 11 is a block diagram illustrating a control device of an intelligent terminal according to an exemplary embodiment. The control device of the intelligent terminal can be applied to the control process of the intelligent terminal, and all or part of the steps of the control method of the intelligent terminal shown in any one of fig. 4, 5 and 10 are executed. The control device 200 of the intelligent terminal includes, but is not limited to: an eye movement detection module 210, a touch detection module 220, a position comparison module 230, and an operation control module 240.
The eye movement detection module 210 is configured to obtain a gaze position of eyes of a user on the touch screen.
The touch detection module 220 is configured to obtain a touch position of a user on the touch screen when a gaze position of the user's eyes on the touch screen is obtained.
The position comparison module 230 is used to confirm whether the obtained gaze position falls within the touch position range.
The operation control module 240 is configured to perform a control operation on an operation object corresponding to the gaze location in response to a touch of a user in a case where the obtained gaze location falls within the touch location range.
In one embodiment of the present application, the eye movement detection module 210 obtains a focusing position of a pupil of a user on a touch screen; displaying a pattern identifier at a position corresponding to the current focusing position of the pupil of the user on the touch screen; acquiring movement information of a pupil of a user; adjusting the position of the pattern mark on the touch screen based on the movement information of the pupil of the user; and when the pupil of the user stops moving, taking the position marked by the current pattern as the fixation position of eyes of the user on the touch screen.
In one embodiment of the present application, the operation control module 240 does not respond to the user's touch in the event that the obtained gaze location does not fall within the range of touch locations.
In one embodiment of the present application, the operation control module 240 obtains an offset of a touch position of a user on the touch screen relative to the gaze position in a case where the obtained gaze position does not fall within the touch position range; comparing the offset with the offset threshold; if the offset is below the offset threshold, shifting the touch position towards the direction of the gazing position by a preset value to obtain a corrected touch position, and executing control operation on an operation object corresponding to the corrected touch position in response to the touch of the user; if the offset is above the offset threshold, the user's touch is not responded to.
In one embodiment of the present application, the operation control module 240 obtains an operation object corresponding to a gaze location based on a mapping relationship between a position coordinate of the touch screen and the operation object; and executing corresponding control operation on the obtained operation object.
In one embodiment of the present application, the touch screen is divided into a plurality of touch partitions 110, and generates touch screen signals corresponding to the touch partitions when the touch partitions are touched, and the number of sub-pixels corresponding to the gaze locations is smaller than the number of sub-pixels corresponding to the touch partitions.
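The module structure of the control device 200 can be illustrated with the following sketch, in which each module is reduced to a callable and the device wires them together; the class and method names are assumptions for illustration, not the patent's implementation.

```python
# Illustrative composition of the four modules of the control device 200.

class ControlDevice:
    def __init__(self, eye_movement_detection, touch_detection,
                 position_comparison, operation_control):
        self.eye_movement_detection = eye_movement_detection  # module 210
        self.touch_detection = touch_detection                # module 220
        self.position_comparison = position_comparison        # module 230
        self.operation_control = operation_control            # module 240

    def on_touch(self):
        gaze = self.eye_movement_detection()   # gaze position on the screen
        touch = self.touch_detection()         # touch position on the screen
        if gaze is not None and touch is not None \
                and self.position_comparison(gaze, touch):
            self.operation_control(gaze)       # respond to the user's touch
```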
Fig. 12 is a schematic diagram of an electronic device shown according to an example embodiment.
Referring to fig. 12, the present embodiment provides an electronic device 300, where the electronic device 300 includes one or more processors 301 and a memory 302, and the memory 302 is configured to store one or more programs, and when the one or more programs are executed by the one or more processors 301, the processor 301 is caused to implement a control method of the intelligent terminal of the present application.
The specific manner in which the processor of the electronic device performs the operations in this embodiment has been described in detail in the embodiments related to the control method of the intelligent terminal, and will not be described in detail here.
In detail, the processor may include one or more processing units; for example, an electronic device may include multiple processors, such as two processors. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). Here, a processor may refer to one or more devices, circuits and/or processing cores for processing data (e.g. computer program instructions).
The memory may be, but is not limited to, a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer.
It will be appreciated that the memory may be stand alone and be coupled to the processor via a bus. The memory may also be integrated with the processor.
In one exemplary embodiment, the memory is used for storing computer-executable instructions corresponding to executing the software programs of the present application. The processor may implement various functions of the electronic device by running or executing software programs stored in the memory.
Fig. 13 is a schematic diagram of a smart terminal shown according to an example embodiment. The intelligent terminal may perform all or part of the steps of the control method of the intelligent terminal shown in any one of fig. 4, 5 and 10. The intelligent terminal includes, but is not limited to, a terminal body, a camera, and an electronic device as shown in fig. 12.
The terminal body includes some necessary components constituting the intelligent terminal, such as a housing, a communication module, a touch screen 11, and the like.
The camera 12 is disposed on the terminal body, specifically, on the upper portion of the terminal body, and is configured to collect an image including a pupil of a user, so as to obtain a fixation position of eyes of the user on the touch screen 11.
The electronic device is used as a main control module of the intelligent terminal and comprises one or more processors and a memory, wherein the memory is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the processors are enabled to realize the control method of the intelligent terminal.
The smart terminal may be a smartphone, as shown in fig. 13. In this case the terminal body is a mobile-phone body, including, but not limited to, a casing, a voice module (not shown) for voice processing, a communication module (not shown) for information interaction, and some necessary auxiliary functional modules (not shown). The touch screen 11 is disposed on the front side of the casing and is divided into a plurality of touch partitions 110. The main control module (the electronic device) is accommodated inside the casing.
The smart terminal may be a smart watch as shown in fig. 14. At this time, the terminal body is a wristwatch body including, but not limited to, a case 13, a band 14 connecting the case 13, a timing module (not shown) housed inside the case 13, and some necessary auxiliary functional modules (not shown). The touch screen 11 is provided on the front side of the case 13, and the touch screen 11 is divided into a plurality of touch areas 110. The main control module (electronic device) is housed inside the case 13. The camera 12 is provided on the front side of the case 13 and is provided adjacent to the touch screen 11.
It should be noted that the smart terminal is not limited to the smart phone and the smart watch, and may be any smart device having a camera and a touch screen, such as a tablet computer, an outdoor bulletin board, and the like.
Fig. 15 is a block diagram illustrating a computer system architecture for implementing an electronic device of an embodiment of the present application, according to an example embodiment.
It should be noted that the computer system shown in fig. 15 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 15, the computer system 400 includes a central processing unit 401 (Central Processing Unit, CPU) which can execute various appropriate actions and processes according to a program stored in a Read-Only Memory 402 (ROM) or a program loaded from a storage section 403 into a random access Memory 404 (Random Access Memory, RAM). In the random access memory 404, various programs and data required for device operation are also stored. The central processing unit 401, the read only memory 402, and the random access memory 404 are connected to each other via a bus 405. An Input/Output interface 406 (i.e., an I/O interface) is also connected to bus 405.
The following components are connected to the input/output interface 406: an input section 407 including a keyboard, a mouse and the like; an output section 408 including a cathode-ray tube (CRT) or liquid crystal display (LCD), a speaker and the like; a storage section 403 including a hard disk or the like; and a communication section 409 including a network interface card such as a local area network card or a modem. The communication section 409 performs communication processing via a network such as the Internet. The drive 410 is also connected to the input/output interface 406 as needed. A removable medium 411, such as a magnetic disk, an optical disk, a magneto-optical disk or a semiconductor memory, is mounted on the drive 410 as needed, so that a computer program read from it is installed into the storage section 403 as needed.
In particular, according to embodiments of the present application, the processes described in the various method flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 409 and/or installed from the removable medium 411. The computer program, when executed by the central processor 401, performs the various functions defined in the apparatus of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor apparatus, device, or means, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution apparatus, device, or apparatus. In the present application, however, a computer-readable signal medium may include a data signal that propagates in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution apparatus, device, or apparatus. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
Those of skill in the art will appreciate that in one or more of the examples described above, the functions described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium.
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of functional modules is illustrated, and in practical application, the above-described functional allocation may be implemented by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to implement all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into modules is only a division by logical function, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed.
It is to be understood that the present application is not limited to the precise constructions set forth above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the application is limited only by the appended claims.

Claims (10)

1. A control method of an intelligent terminal, the intelligent terminal having a touch screen, the control method comprising:
acquiring a gazing position of a user's eyes on the touch screen;
acquiring a touch position of the user on the touch screen while the gazing position of the user's eyes on the touch screen is acquired;
confirming whether the obtained gazing position falls within the touch position range;
and if the obtained gazing position falls within the touch position range, executing a control operation on the operation object corresponding to the gazing position in response to the touch of the user.
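Purely as an illustration of the claimed flow, and not as a limitation of the claims, the following Python sketch shows one way a gaze-gated touch response might be structured; the TouchRegion type and the find_object_at and execute_control callables are hypothetical placeholders, since the application does not prescribe any particular implementation.

from dataclasses import dataclass

@dataclass
class TouchRegion:
    # Rectangular coordinate range reported for one touch on the touch screen.
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def handle_touch(gaze_xy, touch_region, find_object_at, execute_control):
    """Respond to a touch only when the gazing position falls within the touch position range."""
    gaze_x, gaze_y = gaze_xy                      # gazing position acquired from eye tracking
    if touch_region.contains(gaze_x, gaze_y):     # does the gazing position fall within the touch range?
        target = find_object_at(gaze_x, gaze_y)   # operation object corresponding to the gazing position
        execute_control(target)                   # control operation executed in response to the touch
        return True
    return False                                  # otherwise the touch is not responded to (cf. claim 3)

Note that, in this sketch, the operation object is looked up at the gazing position rather than at the raw touch point, which is what allows an imprecise touch to still act on the intended object.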
2. The control method according to claim 1, wherein the acquiring the gazing position of the user's eyes on the touch screen comprises:
acquiring a focusing position of a pupil of the user on the touch screen;
displaying a pattern identifier on the touch screen at a position corresponding to the current focusing position of the user's pupil;
acquiring movement information of the user's pupil;
adjusting the position of the pattern identifier on the touch screen based on the movement information of the user's pupil;
and when the user's pupil stops moving, taking the current position of the pattern identifier as the gazing position of the user's eyes on the touch screen.
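As a non-limiting sketch of the pupil-tracked pattern identifier described in claim 2: the show_marker and move_marker callbacks and the form of the movement information (per-step pupil displacements) are assumptions made for illustration, not details given in the application.

def refine_gaze_position(initial_focus_xy, pupil_movements, show_marker, move_marker):
    # Display a pattern identifier at the pupil's current focusing position,
    # follow the pupil's movement, and report the identifier's final position
    # as the gazing position once the pupil stops moving.
    x, y = initial_focus_xy
    show_marker(x, y)
    for dx, dy in pupil_movements:        # movement information of the user's pupil
        x, y = x + dx, y + dy
        move_marker(x, y)                 # adjust the identifier's position on the touch screen
    return (x, y)                         # gazing position of the user's eyes on the touch screen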
3. The control method according to claim 1, characterized by further comprising:
and if the obtained gazing position does not fall within the touch position range, not responding to the touch of the user.
4. The control method according to claim 1, characterized by further comprising:
if the obtained gazing position does not fall within the touch position range, acquiring an offset of the touch position of the user on the touch screen relative to the gazing position;
comparing the offset with an offset threshold;
if the offset is below the offset threshold, shifting the touch position toward the gazing position by a preset value to obtain a corrected touch position, and executing a control operation on the operation object corresponding to the corrected touch position in response to the touch of the user;
and if the offset is above the offset threshold, not responding to the touch of the user.
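The offset handling of claim 4 could look roughly like the sketch below; the Euclidean distance metric and the capping of the shift at the gazing position are assumptions made only for illustration.

import math

def resolve_offset(gaze_xy, touch_xy, offset_threshold, preset_shift):
    # Returns a corrected touch position when the offset is small enough,
    # or None when the touch should not be responded to.
    gx, gy = gaze_xy
    tx, ty = touch_xy
    offset = math.hypot(gx - tx, gy - ty)      # offset of the touch position relative to the gazing position

    if offset >= offset_threshold:
        return None                            # offset not below the threshold: ignore the touch

    if offset == 0.0:
        return (tx, ty)                        # touch already coincides with the gazing position

    step = min(preset_shift, offset) / offset  # shift by the preset value without overshooting the gaze
    return (tx + (gx - tx) * step, ty + (gy - ty) * step)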
5. The control method according to claim 1, wherein the executing a control operation on the operation object corresponding to the gazing position in response to the touch of the user comprises:
acquiring the operation object corresponding to the gazing position based on a mapping relationship between position coordinates of the touch screen and operation objects;
and executing the corresponding control operation on the obtained operation object.
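The mapping between screen coordinates and operation objects in claim 5 might, for example, be held as a simple region-to-object table; the dictionary layout and the example coordinates below are assumptions for illustration only.

def object_at(gaze_xy, region_to_object):
    # region_to_object maps (x_min, y_min, x_max, y_max) tuples to operation objects.
    gx, gy = gaze_xy
    for (x_min, y_min, x_max, y_max), operation_object in region_to_object.items():
        if x_min <= gx <= x_max and y_min <= gy <= y_max:
            return operation_object            # operation object corresponding to the gazing position
    return None                                # no operation object mapped at that position

# Example with hypothetical coordinates: a "send" button occupying the lower-right corner.
buttons = {(900.0, 1800.0, 1080.0, 1920.0): "send_button"}
assert object_at((950.0, 1850.0), buttons) == "send_button"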
6. The control method according to any one of claims 1 to 5, characterized in that the touch screen is divided into a plurality of touch sections, the touch screen generates touch signals corresponding to the touch sections when the touch sections are touched, and the number of sub-pixels corresponding to the gazing position is smaller than the number of sub-pixels corresponding to a touch section.
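To illustrate the granularity difference in claim 6, one assumed (not claimed) way to divide the screen into coarse touch sections is a regular grid, while the gazing position is kept at pixel or sub-pixel precision:

def touch_section_index(touch_xy, screen_w, screen_h, cols, rows):
    # Index of the coarse touch section whose touch produces the touch signal;
    # the gazing position itself stays at (sub-)pixel resolution, finer than a section.
    tx, ty = touch_xy
    col = min(int(tx * cols / screen_w), cols - 1)
    row = min(int(ty * rows / screen_h), rows - 1)
    return row * cols + col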
7. A control device of an intelligent terminal, characterized by comprising:
the eye movement detection module is used for acquiring the gazing position of eyes of a user on the touch screen;
the touch detection module is used for acquiring the touch position of the user on the touch screen when acquiring the gazing position of the eyes of the user on the touch screen;
the position comparison module is used for confirming whether the obtained gazing position falls within the touch position range;
and the operation control module is used for executing a control operation on the operation object corresponding to the gazing position in response to the touch of the user when the obtained gazing position falls within the touch position range.
8. An electronic device, comprising:
one or more processors;
a memory for storing one or more computer programs that, when executed by the one or more processors, cause the processors to implement the control method of any of claims 1-6.
9. An intelligent terminal, characterized by comprising:
a terminal body having a touch screen;
a camera arranged on the terminal body and used for acquiring an image containing a pupil of the user, so as to acquire the gazing position of the user's eyes on the touch screen; and
an electronic device as claimed in claim 8.
10. A computer readable storage medium storing computer readable instructions which, when executed by a processor of a computer, cause the computer to perform the control method of any one of claims 1 to 6.
CN202310287206.6A 2023-03-22 2023-03-22 Control method of intelligent terminal and intelligent terminal Pending CN116301429A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310287206.6A CN116301429A (en) 2023-03-22 2023-03-22 Control method of intelligent terminal and intelligent terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310287206.6A CN116301429A (en) 2023-03-22 2023-03-22 Control method of intelligent terminal and intelligent terminal

Publications (1)

Publication Number Publication Date
CN116301429A true CN116301429A (en) 2023-06-23

Family

ID=86830304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310287206.6A Pending CN116301429A (en) 2023-03-22 2023-03-22 Control method of intelligent terminal and intelligent terminal

Country Status (1)

Country Link
CN (1) CN116301429A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116737051A (en) * 2023-08-16 2023-09-12 北京航空航天大学 Visual touch combination interaction method, device and equipment based on touch screen and readable medium
CN116737051B (en) * 2023-08-16 2023-11-24 北京航空航天大学 Visual touch combination interaction method, device and equipment based on touch screen and readable medium

Similar Documents

Publication Publication Date Title
US9904409B2 (en) Touch input processing method that adjusts touch sensitivity based on the state of a touch object and electronic device for supporting the same
CN107463329B (en) Detection method, device, storage medium and the mobile terminal of blank screen gesture
EP3309667A1 (en) Electronic device having plurality of fingerprint sensing modes and method for controlling the same
CN115334198A (en) Method for acquiring biometric information and electronic device supporting the same
CN108024009B (en) Electronic equipment and method thereof
US8482540B1 (en) Configuring a user interface for use with an overlay
WO2008157250A1 (en) Mode sensitive processing of touch data
KR20180089133A (en) Electric device and method for controlling display
CN106257398B (en) Electronic device and method for processing notification in electronic device
AU2011318746A1 (en) Method and apparatus for recognizing a gesture in a display
CN116301429A (en) Control method of intelligent terminal and intelligent terminal
TW201947361A (en) Terminal control method and, apparatus, and terminal
CN107765900A (en) Intelligent pen, control method, device, equipment and storage medium of intelligent pen
CN103905865A (en) Touchable intelligent television and method for achieving touching
CN104133578A (en) Touch screen panel display and touch key input system
CN112274918A (en) Information processing method and device, storage medium and electronic equipment
KR20160098700A (en) Apparatus for processing multi-touch input and method thereof
CN104049868A (en) Response method and electronic equipment
US20200057549A1 (en) Analysis device equipped with touch panel device, method for display control thereof, and program
CN110308821A (en) Touch-control response method and electronic equipment
CN211628203U (en) Portable laser projection keyboard
CN109101179A (en) Touch control method, mobile terminal and the computer readable storage medium of mobile terminal
CN116795273A (en) Interactive screen display method, device, medium and electronic equipment
JP6613854B2 (en) Information processing apparatus, information processing program, and electronic device
CN113220160A (en) Touch control method and system for touch display equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination