CN107092433B - Touch control method and device of touch control all-in-one machine - Google Patents

Info

Publication number: CN107092433B
Authority: CN (China)
Prior art keywords: touch, screen, display, display screen, coordinate
Legal status: Active (an assumption, not a legal conclusion)
Application number: CN201710235100.6A
Other languages: Chinese (zh)
Other versions: CN107092433A (en)
Inventors: 倪立亭, 薛婷婷
Current Assignee: Hisense Visual Technology Co Ltd
Original Assignee: Hisense Visual Technology Co Ltd
Application filed by Hisense Visual Technology Co Ltd
Priority to CN201710235100.6A
Publication of CN107092433A
Application granted
Publication of CN107092433B
Legal status: Active; anticipated expiration pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The embodiments of this application disclose a touch control method and device for a touch all-in-one machine. The method comprises: storing the touch point coordinates of a detected continuous touch action in a coordinate queue; and, if the horizontal and/or vertical coordinates of the touch points in the queue change monotonically in a consistent direction and the total change exceeds a threshold, sending a key instruction to the CPU, where the key instruction directs the display content of the display screen to perform the operation corresponding to that coordinate trend. In these embodiments, when the application shown on the display screen is in the full-screen display state, the MCU in the lower computer switches to a keyboard mode: the MCU converts the detected touch point coordinates into a key instruction, and the CPU in the upper computer executes the key response corresponding to that instruction. The application can therefore be controlled directly by sliding on the screen while it is displayed full-screen, which keeps the operation simple.

Description

Touch control method and device of touch control all-in-one machine
Technical Field
This application relates to the technical field of touch all-in-one machines, and in particular to a touch control method and device for a touch all-in-one machine.
Background
A touch all-in-one machine is a human-computer interaction device that integrates touch technology with computer technology, using a touch screen instead of a traditional mouse or keyboard for input. This makes human-computer interaction simpler, and such machines are widely used in settings such as conferences and teaching.
Fig. 1 is a structural block diagram of a touch all-in-one machine in the prior art. As shown in Fig. 1, the machine comprises an upper computer and a lower computer, which may be housed together or separately. The upper computer contains a central processing unit (CPU); the lower computer contains a micro control unit (MCU), a display screen, and a touch screen. The CPU of the upper computer is communicatively connected to the MCU of the lower computer, and within the lower computer the MCU is connected to the display screen and the touch screen respectively. In use, a user installs the desired application program on the upper computer; the CPU transmits the application's picture information to the MCU, and the MCU drives the display screen to show the corresponding picture. Meanwhile, the MCU samples the touch point coordinates on the touch screen at a preset interval and sends the detected coordinates to the CPU, which executes the corresponding touch response. For example, when the touch point coordinates are (x1, y1), the CPU performs the touch response of a click at (x1, y1).
An application's picture generally has two display states on the display screen: full-screen and non-full-screen. In the non-full-screen state, a touch control area is usually reserved on the display screen and labeled with instruction identifiers (for example, "previous page" or "next page"), so that the user can operate the application by tapping the corresponding area. Although this is convenient, the picture occupies less of the display screen and the touch control area interferes with it, so the display effect is poor. Users therefore usually switch the application's picture to the full-screen display state when using the touch all-in-one machine.
In the full-screen display state, the touch control area is usually hidden or removed. Because no touch control area is shown on the display screen, the user can operate the application only by long-pressing the touch screen to call up the hidden control area, or by exiting the full-screen state first, which makes the operation cumbersome.
Disclosure of Invention
This application provides a touch control method and device for a touch all-in-one machine, aiming to solve the problem that prior-art machines are inconvenient for touch operation on an application in the full-screen display state.
In a first aspect, an embodiment of the application provides a touch control method for a touch all-in-one machine in the full-screen display state, applied to the lower computer. The method comprises: storing the touch point coordinates of a detected continuous touch action in a coordinate queue; and, if the horizontal and/or vertical coordinates of the touch points in the queue change monotonically in a consistent direction and the total change exceeds a threshold, sending a key instruction to the CPU, the key instruction directing the display content of the display screen to perform the operation corresponding to that coordinate trend.
In a second aspect, an embodiment of the application provides a touch control method for a touch all-in-one machine, applied to the lower computer. The method comprises: if the application currently displayed on the display screen is in the full-screen display state, applying the control method of the first aspect; if it is in a non-full-screen display state, sending each detected touch point coordinate to the CPU, the coordinate directing the display content of the display screen to perform the corresponding operation.
In a third aspect, an embodiment of the application provides a touch control method for a touch all-in-one machine in the full-screen display state, applied to the upper computer. The method comprises: receiving a key instruction sent by the micro control unit (MCU), the instruction being generated by the MCU as follows: storing the touch point coordinates of a detected continuous touch action in a coordinate queue, and generating the key instruction if the horizontal and/or vertical coordinates of the touch points in the queue change monotonically in a consistent direction and the total change exceeds a threshold; and performing, on the display content of the display screen, the operation corresponding to that coordinate trend.
In a fourth aspect, an embodiment of the application provides a touch control method for a touch all-in-one machine, applied to the upper computer. The method comprises: if the application currently displayed on the display screen is in the full-screen display state, applying the control method of the third aspect; if it is in a non-full-screen display state, receiving the touch point coordinates sent by the MCU and performing, on the display content of the display screen, the operation corresponding to those coordinates.
In a fifth aspect, an embodiment of the application provides a lower computer comprising a micro control unit (MCU) and a touch screen, the MCU being communicatively connected to the touch screen; the MCU is configured to perform the method of the first and/or second aspect.
In a sixth aspect, an embodiment of the application provides an upper computer comprising a CPU and a memory storing instructions for execution by the CPU; the CPU is configured to perform the method of the third and/or fourth aspect.
In a seventh aspect, an embodiment of the application provides a touch all-in-one machine comprising the lower computer of the fifth aspect and the upper computer of the sixth aspect, with the CPU of the upper computer communicatively connected to the MCU of the lower computer.
With the technical scheme of these embodiments, when the display screen shows an application in the full-screen display state, the MCU in the lower computer switches to a keyboard mode, in which the touch screen behaves like a keyboard attached to the upper computer. The MCU is configured with a correspondence between touch-point coordinate changes and key instructions; it converts the detected touch point coordinates into key instructions accordingly, and the CPU in the upper computer executes the corresponding key response. The application can therefore be controlled directly by sliding on the screen while displayed full-screen, keeping the operation simple.
Drawings
To explain the technical solution of this application more clearly, the drawings used in the embodiments are briefly described below; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a block diagram of a touch control integrated machine in the prior art;
fig. 2 is a schematic view of a scenario provided in an embodiment of the present application;
fig. 3 is a schematic flow chart of a touch control method in a full-screen display state of a touch all-in-one machine according to an embodiment of the present disclosure;
Figs. 4A-4C are schematic diagrams of sliding-screen operations provided by an embodiment of the application;
fig. 5 is a schematic flow chart of a touch control method of a touch all-in-one machine according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a lower computer according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an upper computer provided in the embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the technical solutions of this application, the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the possible embodiments; all other embodiments that can be derived from them without creative effort fall within the scope of protection of this application.
Fig. 2 is a schematic view of a scenario provided in an embodiment of the application, showing an upper computer 201 and a lower computer 202. The upper computer 201 contains a CPU; the lower computer 202 contains a micro control unit (MCU), a display screen, and a touch screen. In use, a user installs the desired application, such as a PPT program, on the upper computer; the CPU transmits the PPT picture information to the MCU, and the MCU drives the display screen to show the corresponding PPT picture. Meanwhile, the MCU samples the touch point coordinates on the touch screen at a preset interval and sends the detected coordinates to the CPU, which executes the corresponding touch response. For example, when the user touches the "previous page" touch control area, the MCU sends the detected coordinates to the CPU, and the CPU executes a click response at that area to page the PPT up.
It should be noted that Fig. 2 is only a schematic scenario; the specific types and relationships of the upper computer 201 and lower computer 202 are not limited to the form shown. For example, the two may be separate or integrated; they may be connected by a signal line (for example, a USB cable) or wirelessly; and the upper computer 201 may be a PC, a notebook computer, a tablet computer, or the like. This embodiment places no specific limitation on these choices.
Take the PPT scenario as an example. When presenting with the touch all-in-one machine, the user usually switches the PPT picture to the full-screen display state to obtain a better display effect. However, because no touch control area is shown in that state, when the user needs to operate the PPT (for example, to page up or down), the only options are to long-press the touch screen to call up the hidden control area or to exit full-screen first, which is cumbersome.
To solve the above problem, an embodiment of the application provides a touch control method for a touch all-in-one machine in the full-screen display state. Fig. 3 is a schematic flow chart of this method; it is applicable to the touch all-in-one machine shown in Figs. 1 and 2 and, as shown in Fig. 3, mainly comprises the following steps.
Step S301: the touch point coordinates of a detected continuous touch action are stored in a coordinate queue.
In operation, the MCU in the lower computer samples the touch point coordinates on the touch screen at a preset detection period. For example, when the user touches the screen, the MCU detects the touch position, i.e., the touch point coordinates.
In this embodiment, the MCU stores the touch point coordinates of a detected continuous touch action in a coordinate queue, as follows: when the MCU detects a press-down action, it stores the corresponding coordinates in the queue, then appends each subsequently detected touch point in order until a lift action is detected, at which point the queue is complete. In other words, the queue holds the touch points detected between the press-down action and the lift action, and is treated as the touch coordinates of one continuous touch action.
For example, a finger swipe on the touch screen consists of a start point, a sliding trajectory, and an end point. The start point is where the user first contacts the screen, seen by the MCU in the lower computer as the press-down action; the end point is where the user leaves the screen, seen as the lift action; and the coordinates in the queue trace the swipe's trajectory on the touch screen.
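A minimal sketch of this queueing step follows. The event names ("press", "move", "lift") and the function name are illustrative assumptions; the patent describes the behavior only in prose.

```python
# Sketch of step S301: accumulate the touch points of one continuous
# touch action (press .. move .. lift) into a coordinate queue.
# Event names "press"/"move"/"lift" are hypothetical, not from the patent.

def collect_touch_queue(events):
    """events: iterable of (kind, x, y) tuples sampled at the MCU's
    detection period. Returns the coordinate queue for the first
    complete press-to-lift gesture, or [] if no gesture completes."""
    queue = []
    recording = False
    for kind, x, y in events:
        if kind == "press":
            queue = [(x, y)]       # start a new queue at the press-down point
            recording = True
        elif kind == "move" and recording:
            queue.append((x, y))   # subsequent touch points, stored in order
        elif kind == "lift" and recording:
            queue.append((x, y))   # end point: the queue is now complete
            return queue
    return []
```

A rightward swipe sampled at three points would thus yield a three-entry queue tracing the trajectory.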
In this embodiment, the touch screen of the lower computer may be an infrared, capacitive, resistive, or electromagnetic touch screen; this application places no specific limitation on the type.
Step S302: if the horizontal and/or vertical coordinates of the touch points in the coordinate queue change monotonically in a consistent direction and the total change exceeds a threshold, send a key instruction to the CPU.
When the MCU detects the lift action, the user has finished the swipe, and the key instruction received by the touch screen can then be computed from the coordinates in the queue. Specifically, if the horizontal and/or vertical coordinates change monotonically in a consistent direction and the total change exceeds a threshold, a key instruction is sent to the CPU.
To support multiple key instructions, different key instructions can be configured for different sliding paths. Because different paths produce different coordinate trends, the instruction the user intends to input can be determined from the trend of the horizontal and/or vertical coordinates of the touch points.
On this basis, the embodiment configures a correspondence between the coordinate trend and the key instruction, and determines the instruction from the trend of the horizontal and/or vertical coordinates in the queue. In addition, to keep the touch screen from triggering too sensitively, a threshold is set on the total change of the coordinates, and an instruction is issued only when the change exceeds it. The threshold should be neither too large nor too small: too large and the screen becomes hard to trigger; too small and it triggers so easily that misoperation results.
It can be understood that, after a key instruction has been determined, the touch point coordinates in the current queue must be cleared, i.e., the queue is emptied, so that the next key instruction can be determined.
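Steps S301-S302 can be sketched as follows for the horizontal axis only. The threshold value, the instruction names, and the function names are illustrative assumptions; the patent does not specify concrete values or identifiers.

```python
# Sketch of step S302: after the lift action, check whether the x
# coordinates in the queue change monotonically by more than a threshold;
# if so, emit a key instruction and empty the queue for the next gesture.
# THRESHOLD and the instruction names are hypothetical.

THRESHOLD = 100  # minimum total coordinate change, in touch-screen units

def detect_trend(values, threshold=THRESHOLD):
    """Return 'increasing'/'decreasing' if values change monotonically
    by more than threshold, else None."""
    if len(values) < 2:
        return None
    diffs = [b - a for a, b in zip(values, values[1:])]
    if all(d > 0 for d in diffs) and values[-1] - values[0] > threshold:
        return "increasing"
    if all(d < 0 for d in diffs) and values[0] - values[-1] > threshold:
        return "decreasing"
    return None

def key_instruction(queue):
    """Map one completed gesture to a key instruction, then clear the queue."""
    xs = [p[0] for p in queue]
    trend = detect_trend(xs)
    queue.clear()              # empty the queue for the next determination
    if trend == "increasing":
        return "PAGE_DOWN"     # left-to-right swipe
    if trend == "decreasing":
        return "PAGE_UP"       # right-to-left swipe
    return None                # below threshold or not monotonic: no key
```

A short swipe below the threshold produces no instruction, which models the anti-misoperation behavior described above.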
Step S303: the CPU performs, on the display content of the display screen, the operation corresponding to the trend of the horizontal and/or vertical coordinates.
After receiving the key instruction, the CPU in the upper computer performs the corresponding operation on the display content of the display screen. The display content can be understood as the application being displayed, such as PPT or Flash.
Taking the PPT scenario as an example: if the operation corresponding to the coordinate trend is page down, the PPT is paged down; if it is page up, the PPT is paged up; and if it is undo, the previous step is undone on the PPT.
According to the above technical scheme, when the application on the display screen is in the full-screen display state, the MCU in the lower computer switches to a keyboard mode in which the touch screen behaves like a keyboard attached to the upper computer. The MCU is configured with a correspondence between coordinate trends and key instructions, converts the detected touch point coordinates into key instructions accordingly, and the CPU in the upper computer executes the corresponding key response. The application can therefore be controlled directly by sliding on the screen while displayed full-screen, keeping the operation simple.
Moreover, because the key instruction is determined in the lower computer and the upper computer only receives it, the application programs installed on the upper computer need no modification, which is convenient in use.
To aid understanding, example correspondences between coordinate trends and key instructions are described below for the PPT scenario. In Figs. 4A-4C, the lower-left corner of the touch screen (in the orientation shown) is the coordinate origin O.
In one possible implementation, the correspondence is configured as follows: if the horizontal coordinates of the touch points in the queue increase monotonically and/or the vertical coordinates decrease monotonically, and the change exceeds the threshold, the key instruction received by the touch screen is page down.
Fig. 4A shows two swipes: a first sliding operation 401 and a second sliding operation 402. Operation 401 slides from the left side of the touch screen to the right, so the horizontal coordinates of the touch points increase monotonically; when it occurs, the MCU of the lower computer generates a page-down key instruction. Operation 402 slides from the top of the screen to the bottom, so the vertical coordinates decrease monotonically; it likewise produces a page-down instruction.
In another possible implementation, the correspondence is configured as follows: if the horizontal coordinates of the touch points in the queue decrease monotonically and/or the vertical coordinates increase monotonically, and the change exceeds the threshold, the key instruction received by the touch screen is page up.
Fig. 4B shows two swipes: a third sliding operation 403 and a fourth sliding operation 404. Operation 403 slides from the right side of the touch screen to the left, so the horizontal coordinates of the touch points decrease monotonically; when it occurs, the MCU of the lower computer generates a page-up key instruction. Operation 404 slides from the bottom of the screen to the top, so the vertical coordinates increase monotonically; it likewise produces a page-up instruction.
In another possible implementation, the correspondence is configured as follows: if the horizontal coordinates of the touch points in the queue first increase and then decrease (or first decrease and then increase), and/or the vertical coordinates do likewise, the key instruction received by the touch screen is undo.
Fig. 4C shows four swipes: a fifth sliding operation 405, a sixth sliding operation 406, a seventh sliding operation 407, and an eighth sliding operation 408. Operation 405 slides from left to right and back again, so the horizontal coordinates first increase and then decrease; operation 406 slides from right to left and back, so they first decrease and then increase; operation 407 slides from bottom to top and back, so the vertical coordinates first increase and then decrease; and operation 408 slides from top to bottom and back, so they first decrease and then increase. In each case, the MCU of the lower computer generates an undo key instruction.
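The eight swipe patterns of Figs. 4A-4C can be classified by the shape of the coordinate sequence, as sketched below. The command names and function names are illustrative assumptions, and the threshold check of step S302 is omitted here for brevity.

```python
# Sketch classifying the swipes of Figs. 4A-4C: monotonic runs map to
# page turns, and an increase-then-decrease (or decrease-then-increase)
# run maps to undo. Names are hypothetical; the patent describes the
# correspondences in prose only.

def classify(values):
    """Classify one axis: 'inc', 'dec', 'inc_dec', 'dec_inc', or None."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    signs = [1 if d > 0 else -1 for d in diffs if d != 0]
    if not signs:
        return None
    runs = [signs[0]]                 # collapse consecutive equal signs
    for s in signs[1:]:
        if s != runs[-1]:
            runs.append(s)
    return {(1,): "inc", (-1,): "dec",
            (1, -1): "inc_dec", (-1, 1): "dec_inc"}.get(tuple(runs))

def gesture_command(xs, ys):
    # Coordinate origin at the lower-left corner, as in Figs. 4A-4C.
    x, y = classify(xs), classify(ys)
    if x in ("inc_dec", "dec_inc") or y in ("inc_dec", "dec_inc"):
        return "UNDO"        # there-and-back swipe (operations 405-408)
    if x == "inc" or y == "dec":
        return "PAGE_DOWN"   # rightward (401) or downward (402) swipe
    if x == "dec" or y == "inc":
        return "PAGE_UP"     # leftward (403) or upward (404) swipe
    return None
```

Checking the there-and-back shape first keeps a curved undo swipe from being misread as a page turn.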
With this scheme, the MCU in the lower computer switches to the keyboard mode when the displayed application is in the full-screen display state. When the application is in a non-full-screen display state, however, the MCU must return to its multi-touch mode. The switch mainly comprises the following steps.
Step S501: judge whether the application currently displayed on the display screen is in the full-screen display state.
In this embodiment, the display state of the application can be detected by either the upper computer or the lower computer.
When the display state of the Application is detected by the upper computer, a specific Application needs to be installed in a system (e.g., a windows system) of the upper computer, and an Application Programming Interface (API) of the detection system is implemented by the Application, so as to determine the display state of the Application. When the display state of the application is detected by the lower computer, the MCU of the lower computer detects the application in the API of the upper computer in real time, judges whether a full-screen playing event exists or not and further determines the display state of the application.
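The full-screen check performed on the upper computer amounts to comparing the foreground window rectangle with the screen size. The sketch below is an illustration under stated assumptions, not the patent's implementation: on a Windows upper computer the inputs would come from `GetForegroundWindow()`/`GetWindowRect()` and `GetSystemMetrics(SM_CXSCREEN/SM_CYSCREEN)`; here the comparison itself is factored into a portable helper.

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative sketch only. On a Windows upper computer the window
 * size would be read via GetForegroundWindow()/GetWindowRect() and the
 * screen size via GetSystemMetrics(SM_CXSCREEN)/(SM_CYSCREEN); the
 * decision itself is this portable comparison. */
static bool is_fullscreen(int win_w, int win_h, int screen_w, int screen_h)
{
    /* full-screen display state: the foreground window covers the
     * entire screen */
    return win_w >= screen_w && win_h >= screen_h;
}
```

The dedicated application would poll this check (or subscribe to window-change notifications) and report the resulting display state so the MCU can pick its working mode.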
Step S502: if the application currently displayed on the display screen is in a full-screen display state, switching the MCU into the keyboard mode.
The keyboard mode referred to in this application is the control mode in the embodiment shown in fig. 3.
Step S503: if the application currently displayed on the display screen is in a non-full-screen display state, switching the MCU into the multi-touch mode.
The multi-touch mode includes: after a touch point coordinate is detected, sending the touch point coordinate to the processor CPU (Central Processing Unit), where the touch point coordinate is used to instruct the display content of the display screen to execute a corresponding operation conforming to the touch point coordinate.
For example, after receiving the touch point coordinate, the processor CPU executes a click touch response at the position corresponding to the touch point coordinate in the display content of the display screen.
By adopting the above technical solution, the MCU can be switched among different working modes according to the display state of the application.
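Steps S501-S503 amount to a small mode dispatch in the MCU firmware. The following sketch is hypothetical (all names are assumptions); it fixes only the decision structure: keyboard mode accumulates coordinates for conversion into key instructions, while multi-touch mode forwards raw coordinates to the CPU.

```c
#include <assert.h>

/* Hypothetical sketch of steps S501-S503 (names are assumptions). */
typedef enum { MODE_KEYBOARD, MODE_MULTITOUCH } mcu_mode_t;
typedef enum { QUEUE_FOR_GESTURE, FORWARD_TO_CPU } touch_action_t;

/* S501-S503: choose the working mode from the reported display state */
static mcu_mode_t select_mode(int app_is_fullscreen)
{
    return app_is_fullscreen ? MODE_KEYBOARD : MODE_MULTITOUCH;
}

/* What the MCU does with each detected touch point: in keyboard mode
 * the coordinate joins the coordinate queue for gesture/key-instruction
 * conversion; in multi-touch mode it is sent straight to the CPU. */
static touch_action_t handle_touch(mcu_mode_t mode)
{
    return (mode == MODE_KEYBOARD) ? QUEUE_FOR_GESTURE : FORWARD_TO_CPU;
}
```

In this structure, a change of display state only ever touches `select_mode`; the per-touch path stays a pure function of the current mode.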
On the basis of the above method embodiments, the present application further provides a lower computer. Fig. 6 is a schematic structural diagram of a lower computer provided in an embodiment of the present application. As shown in fig. 6, the lower computer 600 includes a micro control unit MCU601, a touch screen 602, a display screen 603, and a communication unit 604. The components communicate via one or more buses. Those skilled in the art will appreciate that the structure shown in the figure does not limit the present application: it may be a bus structure or a star structure, and the lower computer may include more or fewer components than those shown, combine certain components, or arrange the components differently.
The communication unit 604 is configured to establish a communication channel, so that the MCU601 can communicate with the touch screen 602 and the display screen 603.
The MCU601 includes a processor and a memory. The processor is the control center of the lower computer; it connects the various parts of the whole device using various interfaces and lines, and executes various functions of the device and/or processes data by running or executing the software programs and/or modules stored in the memory and calling the data stored in the memory. The processor may be composed of one or more Integrated Circuits (ICs), for example a single packaged IC, or multiple packaged ICs with the same or different functions connected together. For example, the processor may include only a Central Processing Unit (CPU). In the embodiments of the present application, the CPU may have a single arithmetic core or may include multiple arithmetic cores.
The memory is used to store the instructions executed by the processor, and may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
When executed by the processor, the executable instructions in the memory enable the lower computer to perform some or all of the steps in the method embodiments described above.
On the basis of the above method embodiments, the present application further provides an upper computer. Fig. 7 is a schematic structural diagram of an upper computer provided in an embodiment of the present application. As shown in fig. 7, the upper computer 700 may include a processor CPU701, a memory 702, and a communication unit 703. The components communicate via one or more buses. Those skilled in the art will appreciate that the structure shown in the figure does not limit the present application: it may be a bus structure or a star structure, and the upper computer may include more or fewer components than those shown, combine certain components, or arrange the components differently.
The communication unit 703 is configured to establish a communication channel so that the upper computer can communicate with other devices, receiving user data sent by other devices or sending user data to other devices.
The processor CPU701 is the control center of the upper computer; it connects the various parts of the whole device using various interfaces and lines, and executes various functions of the device and/or processes data by running or executing the software programs and/or modules stored in the memory 702 and calling the data stored in the memory. The processor may be composed of one or more Integrated Circuits (ICs), for example a single packaged IC, or multiple packaged ICs with the same or different functions connected together. For example, the processor CPU701 may include only a central processing unit. In the embodiments of the present application, the CPU may have a single arithmetic core or may include multiple arithmetic cores.
The memory 702 is used for storing instructions executed by the processor CPU701, and the memory 702 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
When executed by the processor CPU701, the execution instructions in the memory 702 enable the upper computer 700 to perform some or all of the steps in the method embodiments described above.
On the basis of the method embodiment, the embodiment of the application further provides a touch all-in-one machine which comprises a lower computer shown in fig. 6 and an upper computer shown in fig. 7, wherein a processor CPU in the upper computer is in communication connection with a micro control unit MCU in the lower computer.
In a specific implementation, the present application further provides a computer storage medium, where the computer storage medium may store a program; when executed, the program may perform some or all of the steps of the embodiments of the touch control method provided in the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Those skilled in the art will clearly understand that the techniques in the embodiments of the present application may be implemented by software plus a necessary general hardware platform. Based on such understanding, the technical solutions in the embodiments of the present application, in essence or with respect to the part contributing to the prior art, may be embodied in the form of a software product. The computer software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present application.
The same and similar parts among the various embodiments in this specification may be referred to each other. In particular, since the embodiment of the upper computer is substantially similar to the method embodiments, its description is brief; for relevant details, reference may be made to the description of the method embodiments.
The above-described embodiments of the present application do not limit the scope of the present application.

Claims (13)

1. A touch control method in a full-screen display state of a touch all-in-one machine is characterized by comprising the following steps:
the touch control all-in-one machine comprises a display screen,
when the application program is displayed on the display screen in a full-screen display state, the display screen does not display the touch control area,
touch point coordinates at which a coherent touch action is detected are stored in a coordinate queue,
if the continuous change trend of the transverse coordinates and/or the longitudinal coordinates of the touch points in the coordinate queue is consistent and the change value exceeds a threshold value, sending a key instruction to a CPU (central processing unit), wherein the key instruction is used for indicating the display content of the display screen to execute corresponding operation according with the change trend of the transverse coordinates and/or the longitudinal coordinates so as to change the display picture of the display screen; responding to the click operation, and enabling the display picture of the display screen not to change;
when the application program is displayed on the display screen in a non-full screen display state, responding to a click operation, and changing a display picture of the display screen;
and when the display state of the application program is detected to be switched, entering different touch working modes.
2. The method of claim 1, wherein storing touch point coordinates at which coherent touch actions are detected in a coordinate queue comprises:
if the pressing operation action is detected, storing the coordinates of the pressing operation action in a coordinate queue;
and detecting other touch point coordinates to continue to be stored in the coordinate queue until the lifting action is detected, and determining the coordinate queue as the touch coordinates of the coherent touch action.
3. The method of claim 1, wherein storing touch point coordinates at which coherent touch actions are detected in a coordinate queue further comprises:
and calculating data in the coordinate queue, and emptying the touch point coordinates stored in the coordinate queue.
4. The method according to any one of claims 1 to 3, wherein if it is detected that the horizontal coordinate and/or the vertical coordinate of the touch point in the coordinate queue continuously change in a consistent manner and the change value exceeds a threshold value, sending a key instruction to a processor CPU, comprising:
if it is detected that the horizontal coordinate of the touch point in the coordinate queue continuously increases and/or the longitudinal coordinate continuously decreases, and the change value exceeds the threshold value, sending a key instruction for turning the page down to the processor CPU;
and if it is detected that the horizontal coordinate of the touch point in the coordinate queue continuously decreases and/or the longitudinal coordinate continuously increases, and the change value exceeds the threshold value, sending a key instruction for turning the page up to the processor CPU (Central Processing Unit).
5. A touch control method of a touch control all-in-one machine is characterized by comprising the following steps:
if the application currently displayed on the display screen is in a full-screen display state, applying the control method of any one of claims 1 to 4;
if the application currently displayed on the display screen is in a non-full screen display state, sending the touch point coordinate to a CPU (central processing unit) after detecting the touch point coordinate, wherein the touch point coordinate is used for indicating the display content of the display screen to execute corresponding operation conforming to the touch point coordinate.
6. A touch control method in a full-screen display state of a touch all-in-one machine is characterized by comprising the following steps:
the touch control all-in-one machine comprises a display screen, an application program is displayed on the display screen in a full-screen display state, and a touch control area is not displayed on the display screen;
receiving a key instruction sent by a Micro Control Unit (MCU), wherein the key instruction is generated by the MCU in the following way: storing the touch point coordinates of the detected coherent touch actions in a coordinate queue; if the continuous variation trend of the transverse coordinates and/or the longitudinal coordinates of the touch points in the coordinate queue is consistent and the variation value exceeds a threshold value, generating a key instruction;
responding to the click operation, and enabling the display picture of the display screen not to change;
executing corresponding operation according with the change trend of the horizontal coordinate and/or the vertical coordinate in the display content of the display screen, so that the display picture of the display screen is changed;
and when the display state of the application program is detected to be switched, entering different touch working modes.
7. A touch control method of a touch control all-in-one machine is characterized by comprising the following steps:
if the application currently displayed on the display screen is in a full-screen display state, applying the control method of claim 6;
if the application program currently displayed on the display screen is in a non-full screen display state, receiving touch point coordinates sent by a Micro Control Unit (MCU);
executing corresponding operation according with the touch point coordinate in display content of the display screen;
when the application program is displayed on the display screen in a non-full-screen display state, responding to a click operation, and changing a display picture of the display screen;
and when the display state of the application program is detected to be switched, entering different touch working modes.
8. A lower computer, comprising: the touch screen comprises a Micro Control Unit (MCU) and a touch screen, wherein the MCU is in communication connection with the touch screen;
the MCU is configured to perform the method of any of claims 1-5.
9. An upper computer is characterized by comprising a processor CPU;
a memory for storing instructions for execution by the CPU;
wherein the CPU is configured to perform the method of claim 6 or 7.
10. An all-in-one touch control machine is characterized by comprising a lower computer and an upper computer, wherein the lower computer is in the claim 8, the upper computer is in the claim 9, and a processor CPU in the upper computer is in communication connection with a micro control unit MCU in the lower computer.
11. The utility model provides a touch control all-in-one machine which characterized in that, touch control all-in-one machine includes:
the display screen is used for displaying pictures;
the touch screen is used for receiving touch operation;
a Micro Control Unit (MCU) configured to:
when a full-screen picture of an application program is displayed on the display screen and the display screen does not display a touch control area, responding to screen sliding operation, wherein the display picture of the display screen changes, and responding to clicking operation, the display picture of the display screen does not change;
when the application program is displayed on the display screen in a non-full screen display state, responding to a click operation, and changing a display picture of the display screen;
and when the display state of the application program is detected to be switched, entering different touch working modes.
12. The touch-control all-in-one machine of claim 11,
the screen sliding operation is at least one of a sliding operation from a left side to a right side of the touch screen, a sliding operation from a right side to a left side of the touch screen, a sliding operation from an upper side to a lower side of the touch screen, and a sliding operation from a lower side to an upper side of the touch screen.
13. The touch-control all-in-one machine of claim 11,
the application program is PPT.
CN201710235100.6A 2017-04-12 2017-04-12 Touch control method and device of touch control all-in-one machine Active CN107092433B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710235100.6A CN107092433B (en) 2017-04-12 2017-04-12 Touch control method and device of touch control all-in-one machine


Publications (2)

Publication Number Publication Date
CN107092433A CN107092433A (en) 2017-08-25
CN107092433B true CN107092433B (en) 2021-03-02

Family

ID=59637635

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710235100.6A Active CN107092433B (en) 2017-04-12 2017-04-12 Touch control method and device of touch control all-in-one machine

Country Status (1)

Country Link
CN (1) CN107092433B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019100299A1 (en) * 2017-11-23 2019-05-31 深圳市柔宇科技有限公司 Head-mounted display apparatus and automatic calibration method of touch device thereof
CN109766028B (en) * 2019-03-18 2022-03-18 深圳市康冠商用科技有限公司 Touch control sub-management system and method for infrared touch screen
CN110806514A (en) * 2019-09-23 2020-02-18 芜湖宏景电子股份有限公司 Automatic touch key detection system and working method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104731499A (en) * 2015-03-13 2015-06-24 深圳市中兴移动通信有限公司 Method and device for starting background applications and mobile terminal
CN104915139A (en) * 2015-04-30 2015-09-16 努比亚技术有限公司 Automatic adjusting method and device for brightness of display interface
CN105511774A (en) * 2014-10-17 2016-04-20 深圳Tcl新技术有限公司 Method and device for displaying display terminal interface
JP2016154056A (en) * 2010-04-13 2016-08-25 グリッドマーク株式会社 Stream dot pattern


Also Published As

Publication number Publication date
CN107092433A (en) 2017-08-25

Similar Documents

Publication Publication Date Title
CN104364734B (en) Remote session control using multi-touch inputs
US9612675B2 (en) Emulating pressure sensitivity on multi-touch devices
CN109857303B (en) Interaction control method and device
CN102902469A (en) Gesture recognition method and touch system
EP3336679A1 (en) Method and terminal for preventing unintentional triggering of a touch key and storage medium
AU2016426314B2 (en) Interface display method and terminal
WO2015078353A1 (en) Touch screen control method and terminal equipment
US20120304199A1 (en) Information processing apparatus, information processing method, and computer program
US9213482B2 (en) Touch control device and method
KR20120114163A (en) Information processing apparatus, information processing method, and computer-readable storage medium
US20160004339A1 (en) Programmable display device and screen-operation processing program therefor
CN107092433B (en) Touch control method and device of touch control all-in-one machine
WO2016131274A1 (en) Method, device and terminal for controlling terminal display
CN104182144A (en) Mobile terminal interface browsing method and system
US11455071B2 (en) Layout method, device and equipment for window control bars
CN110262985B (en) Processing method and electronic equipment
EP3008556B1 (en) Disambiguation of indirect input
CN108815843B (en) Control method and device of virtual rocker
US10318131B2 (en) Method for scaling down effective display area of screen, and mobile terminal
CN103914228A (en) Mobile terminal and touch screen operating method thereof
CN116483246A (en) Input control method and device, electronic equipment and storage medium
KR20150111651A (en) Control method of favorites mode and device including touch screen performing the same
CN104484117B (en) Man-machine interaction method and device
US10133346B2 (en) Gaze based prediction device and method
CN106293304B (en) Interface operation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218

Applicant after: Hisense Video Technology Co., Ltd

Address before: 266555 Qingdao economic and Technological Development Zone, Shandong, Hong Kong Road, No. 218

Applicant before: HISENSE ELECTRIC Co.,Ltd.

GR01 Patent grant