CN110780798A - Electronic device - Google Patents

Electronic device

Info

Publication number
CN110780798A
CN110780798A (application CN201910370320.9A)
Authority
CN
China
Prior art keywords
touch
processor
signal
processing unit
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910370320.9A
Other languages
Chinese (zh)
Other versions
CN110780798B (en)
Inventor
林合祥
高定甲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asustek Computer Inc
Original Assignee
Asustek Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asustek Computer Inc filed Critical Asustek Computer Inc
Publication of CN110780798A publication Critical patent/CN110780798A/en
Application granted granted Critical
Publication of CN110780798B publication Critical patent/CN110780798B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface

Abstract

The invention discloses an electronic device comprising a processor and a plurality of touch displays. Each touch display is electrically connected to the processor and includes a touch detection unit and a touch processing unit. The touch detection unit transmits a touch signal to the touch processing unit or the processor. The touch processing unit is electrically connected to the touch detection unit and receives the touch signal. The electronic device reduces the burden on the processor and improves its performance.

Description

Electronic device
Technical Field
The present invention relates to an electronic device, and more particularly, to an electronic device with dual touch screens.
Background
As touch screens have become ubiquitous in mobile devices, notebook computers have also begun to adopt them as display devices to enhance the user experience and meet user requirements. A dual-touch-screen notebook typically replaces the physical keyboard and touch pad with a second touch screen, and therefore needs to switch between a touch screen function (touch mode) and a touch pad function (mouse mode).
Disclosure of Invention
The present invention is directed to an electronic device capable of reducing the load of a processor and improving the performance of the processor.
The invention provides an electronic device comprising a processor and a plurality of touch displays. Each touch display includes a touch detection unit and a touch processing unit. The touch processing unit is electrically connected to the touch detection unit and receives the touch signal. The touch detection unit transmits a touch signal to the touch processing unit or the processor.
The electronic device of the invention offloads part of the touch determination and the execution-mode switching to a touch chip (Touch IC), thereby reducing the burden on the processor and improving its performance.
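The two signal paths described above (detection unit to Touch IC, or detection unit directly to the processor) can be sketched as follows. This is a minimal illustrative model; the class and method names are invented for this sketch and do not come from the patent.

```python
from dataclasses import dataclass


@dataclass
class TouchSignal:
    positions: list  # (x, y) touch points detected on the panel


class TouchProcessingUnit:
    """Stand-in for the Touch IC: handles part of the touch judgment locally."""
    def __init__(self):
        self.received = []

    def receive(self, signal):
        self.received.append(signal)


class Processor:
    """Stand-in for the host processor."""
    def __init__(self):
        self.received = []

    def receive(self, signal):
        self.received.append(signal)


class TouchDetectionUnit:
    """Forwards each detected touch either to the Touch IC or to the processor."""
    def __init__(self, touch_ic, cpu):
        self.touch_ic, self.cpu = touch_ic, cpu

    def detect(self, positions, to_processor=False):
        signal = TouchSignal(positions=positions)
        (self.cpu if to_processor else self.touch_ic).receive(signal)
```

Routing most signals to the Touch IC is what lets the host processor skip per-event work, which is the performance claim the patent makes.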
Drawings
FIG. 1 is a schematic diagram of an electronic device according to some embodiments of the invention;
FIG. 2 is a schematic diagram of a touch display according to some embodiments of the invention;
FIG. 3 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 4A is a schematic diagram illustrating a touch operation according to some embodiments of the invention;
FIG. 4B is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 5 is a flowchart illustrating step S330 according to some embodiments of the invention;
FIG. 6 is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 7 is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 8 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 9A is a schematic diagram illustrating a touch operation according to some embodiments of the invention;
FIG. 9B is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 10 is a flowchart illustrating step S830 according to some embodiments of the invention;
FIG. 11 is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 12 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 13 is a schematic diagram of touch operations according to some embodiments of the invention;
FIG. 14 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 15 is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 16 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 17A is a schematic diagram illustrating a touch operation according to some embodiments of the invention;
FIG. 17B is a schematic diagram illustrating a touch operation according to some embodiments of the invention;
FIG. 18 is a schematic view of an electronic device according to some embodiments of the invention;
FIG. 19 is a flow chart of a control method according to some embodiments of the present invention;
FIG. 20 is a schematic diagram of an execution mode according to some embodiments of the invention;
FIG. 21 is a schematic diagram of touch operations according to some embodiments of the invention; and
FIG. 22 is a schematic view of an electronic device according to further embodiments of the present invention.
Detailed Description
The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. The elements and configurations of the specific examples are used in the following discussion to simplify the present invention. Any examples discussed are intended for illustrative purposes only and do not limit the scope or meaning of the invention or its illustrations in any way. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, which are repeated for simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or configurations discussed below.
Please refer to fig. 1 and fig. 2. Fig. 1 is a schematic diagram of an electronic device 100 according to some embodiments of the invention, and fig. 2 is a schematic diagram of touch displays 110A and 110B according to some embodiments of the invention. As shown in fig. 1, the electronic device 100 includes touch displays 110A and 110B and a processor 130. The processor 130 is electrically connected to the touch displays 110A and 110B. As shown in fig. 2, the touch displays 110A and 110B include touch detection units 111A and 111B and touch processing units 112A and 112B. The touch detection unit 111A is electrically connected to the touch processing unit 112A, and the touch detection unit 111B is electrically connected to the touch processing unit 112B.
In various embodiments of the present invention, the processor 130 may be an integrated circuit such as a microcontroller, a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a logic circuit, or other similar components or a combination thereof. The touch displays 110A and 110B may be implemented as touch displays with finger-signal detection or pressure sensing. In one embodiment, the touch processing units 112A and 112B are implemented by touch chips (Touch ICs).
Please refer to fig. 3. Fig. 3 is a flow chart illustrating a control method 300 according to some embodiments of the present invention. In an embodiment, the control method 300 shown in fig. 3 can be applied to the electronic device 100 of fig. 1, and the touch displays 110A and 110B and the processor 130 are used for switching the execution modes of the touch displays 110A and 110B according to the steps described in the following control method 300.
In an embodiment, the touch displays 110A and 110B have similar hardware; only the touch detection unit 111A and the touch processing unit 112A of the touch display 110A are described as examples, as the touch detection unit 111B and the touch processing unit 112B operate similarly and are not described again. The control method 300 executes step S310, in which the touch detection unit 111A detects a touch operation and generates a touch signal, and step S320, in which the touch detection unit 111A transmits the touch signal.
Referring to fig. 4A, in an embodiment, the current execution mode of the touch display 110A is preset to the touch screen mode; in this embodiment, the entire touch display 110A is in the touch screen mode. When the user touches the execution area A1 of the touch display 110A with eight fingers, the touch detection unit 111A of the touch display 110A generates a touch signal after detecting the touch operation T1 (i.e., the eight finger signals detected in the execution area A1) and transmits the touch signal to the touch processing unit 112A.
Next, the control method 300 executes step S330, in which the touch processing unit 112A transmits a first notification signal to the processor 130 according to the touch signal received from the touch detection unit 111A, and step S340, in which the processor 130 determines an execution mode according to the first notification signal. In an embodiment, please refer also to fig. 4B, which is a schematic diagram of an execution mode according to some embodiments of the invention. When the touch processing unit 112A receives the touch signal, it sends a first notification signal to the processor 130. In this case, the first notification signal notifies the processor 130 of the touch operation T1 detected by the touch detection unit 111A.
Please refer to fig. 5 in conjunction with the above description. Fig. 5 is a flowchart illustrating step S330 according to an embodiment of the present invention. In the above embodiment, the control method 300 executes step S331, in which the touch processing unit 112A determines whether the touch operation T1 is a gesture operation. In one embodiment, the gesture operation is a multi-finger touch (shown in fig. 4A); the control method 300 then executes step S332 to determine whether the touch operation T1 includes a plurality of touch positions TP corresponding to the multiple fingers and whether the touch positions TP are located in the execution area (first execution area) A1. When the touch positions TP are determined to be located in the execution area A1, the control method 300 executes step S333, in which the touch processing unit 112A transmits the corresponding first notification signals to the processor 130. Next, the control method 300 executes step S340, in which the processor 130 determines the execution mode according to the first notification signal.
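The gesture check of steps S331 and S332 (is this a multi-finger gesture, and do all of its touch points fall inside one execution area?) can be sketched as below. The eight-finger count and the rectangular region representation are illustrative assumptions for this sketch, not values fixed by the patent.

```python
# Hypothetical sketch of the multi-finger gesture check (steps S331/S332).
# The finger-count threshold and rectangle representation are illustrative.
GESTURE_FINGER_COUNT = 8


def in_region(pos, region):
    """region = (x_min, y_min, x_max, y_max); pos = (x, y)."""
    x, y = pos
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1


def is_region_gesture(touch_positions, region):
    """True when every touch point of the multi-finger gesture lies in the region."""
    if len(touch_positions) != GESTURE_FINGER_COUNT:
        return False
    return all(in_region(p, region) for p in touch_positions)
```

A gesture straddling the boundary fails the check, which matches the flow in fig. 5 where the touch positions must all lie in one execution area before the notification signal is sent.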
As shown in figs. 4A and 4B, when the touch positions TP are located in the execution area A1, the processor 130 determines and executes the execution mode corresponding to the execution area A1 (e.g., the processor 130 controls the touch display 110A to display the interface setting corresponding to the eight-finger touch operation T1 located in the execution area A1). In this embodiment, the application executed by the processor 130 controls the touch display 110A to display the interface I2. The interface I2 includes a first sub-interface I21 and a second sub-interface I22, wherein the first sub-interface I21 is a virtual keyboard and the second sub-interface I22 is a touch display screen. At this time, a touch signal received from the interface I2 is determined to be a touch screen operation.
In one embodiment, the ranges of the execution areas A1 and A2 shown in fig. 4A may be determined by the partition line L. For example, the closer the partition line L is to the edge E1, the larger the range of the execution area A1 and the smaller the range of the execution area A2; the closer the partition line L is to the edge E2, the larger the range of the execution area A2 and the smaller the range of the execution area A1.
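Splitting the panel by a partition line amounts to dividing one extent into two adjacent ranges. A minimal sketch, assuming a panel whose coordinate origin is at one edge (that coordinate convention is an assumption, not from the patent):

```python
def split_by_partition(panel_height, partition_y):
    """Split a panel of the given height into two stacked execution areas.

    Returns (area_a1, area_a2) as (start, end) ranges along the split axis,
    where A1 lies before the partition line and A2 after it. Moving the
    line toward one edge enlarges one area and shrinks the other.
    """
    if not 0 <= partition_y <= panel_height:
        raise ValueError("partition line outside panel")
    return (0, partition_y), (partition_y, panel_height)
```

For example, moving the partition from 70 down to 30 on a height-100 panel shrinks A1 and enlarges A2, mirroring the edge E1/E2 description above.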
As shown in fig. 6, in this embodiment, the execution mode of the second sub-interface I22 displayed on the touch display 110A is the touch screen mode. When the user touches the function icon at the position P3 of the second sub-interface I22, the function corresponding to that icon is triggered. In the touch screen mode, the touch processing unit 112A receives a touch signal from the second sub-interface I22 displayed by the touch display 110A; the touch signal includes the absolute position information of the touch operation, and the touch processing unit 112A triggers the function corresponding to the function icon according to the absolute position information (e.g., position P3).
As shown in fig. 7, in one embodiment, the user's touch operation drags the function icon at position P3 of the touch display 110A to position P6 of the touch display 110A. Since the execution mode is likewise the touch screen mode, the touch processing unit 112A moves the function icon from position P3 to position P6 according to the absolute position information (e.g., the displacement information from position P3 to position P6).
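In touch screen mode, both the tap in fig. 6 and the drag in fig. 7 operate on absolute coordinates. A hypothetical sketch of that handling; the icon name and bounding box below are invented for illustration:

```python
# Hypothetical absolute-position handling in touch screen mode.
# The icon name and bounding box are illustrative, not from the patent.
icons = {"browser": (10, 10, 42, 42)}  # name -> (x0, y0, x1, y1)


def hit_test(pos):
    """Return the icon under an absolute touch position, if any (cf. fig. 6)."""
    x, y = pos
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


def drag_icon(name, new_pos):
    """Re-center an icon's box on the drop position (cf. fig. 7)."""
    x0, y0, x1, y1 = icons[name]
    w, h = x1 - x0, y1 - y0
    cx, cy = new_pos
    icons[name] = (cx - w // 2, cy - h // 2, cx + w // 2, cy + h // 2)
```

The key property is that no coordinate translation happens: the touch position is used as-is, unlike the touch pad mode described later.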
Please refer to fig. 8. Fig. 8 is a flow chart illustrating a control method 800 according to some embodiments of the invention. In another embodiment, the control method 800 shown in fig. 8 can be applied to the electronic device 100 of fig. 1, with the touch displays 110A and 110B and the processor 130 performing the steps described below. The control method 800 executes step S810, in which the touch detection unit 111A detects a touch operation and generates a touch signal, and step S820, in which the touch detection unit 111A transmits the touch signal.
Referring to fig. 9A, fig. 9A is a schematic diagram illustrating a touch operation according to some embodiments of the invention. As shown in fig. 9A, the current execution mode of the touch display 110A is preset to the touch screen mode; in this embodiment, the entire touch display 110A is in the touch screen mode. When the user touches the execution area A2 of the touch display 110A with eight fingers, the touch detection unit 111A of the touch display 110A generates a touch signal after detecting the touch operation T1 (i.e., the eight finger signals detected in the execution area A2) and transmits the touch signal to the touch processing unit 112A.
Next, the control method 800 executes step S830, in which the touch processing unit 112A transmits a first notification signal to the processor 130 according to the touch signal, and step S840, in which the processor 130 determines an execution mode according to the first notification signal. In one embodiment, please refer also to fig. 9B, which is a schematic diagram of an execution mode according to some embodiments of the invention. When the touch processing unit 112A receives the touch signal, it sends a first notification signal to the processor 130. In this case, the first notification signal notifies the processor 130 that the touch detection unit 111A has detected the user's touch operation T1.
Referring to fig. 10, fig. 10 is a flowchart of step S830 according to an embodiment of the invention. In the above embodiment, the control method 800 executes step S831, in which the touch processing unit 112A determines whether the touch operation T1 is a gesture operation. In one embodiment, the gesture operation is a multi-finger touch (as shown in fig. 9A), and the control method 800 executes step S832 to determine whether the touch operation T1 includes a plurality of touch positions TP corresponding to the multiple fingers and whether the touch positions TP are located in the execution area (first execution area) A1. When the touch processing unit 112A determines that the touch positions TP are not all located in the execution area A1, the control method 800 further executes step S834 to determine whether the touch positions TP are all located in the execution area (second execution area) A2. When the touch positions TP are located in the execution area A2, the control method 800 executes step S835, in which the touch processing unit 112A transmits a corresponding first notification signal to the processor 130. Next, the control method 800 executes step S840, in which the processor 130 determines the execution mode according to the first notification signal.
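Steps S832 to S835 try the first execution area and fall back to the second. That cascade can be sketched as a region classifier; the region layout below is an illustrative assumption:

```python
def classify_gesture_region(touch_positions, regions):
    """Return the name of the first region containing every touch point.

    ``regions`` maps a region name to (x0, y0, x1, y1) and is tried in
    order, mirroring the S832 -> S834 fallback. Returns None when no
    single region contains all the points.
    """
    def contains(rect, pos):
        x0, y0, x1, y1 = rect
        x, y = pos
        return x0 <= x <= x1 and y0 <= y <= y1

    for name, rect in regions.items():
        if all(contains(rect, p) for p in touch_positions):
            return name
    return None
```

A gesture entirely within A1 and one entirely within A2 yield different results, which is what lets the processor pick a different interface per area in step S840.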
In one embodiment, when the processor 130 determines the execution mode, the control method 800 executes step S850, in which the processor 130 transmits a determination notification signal to the touch processing unit 112A. The touch processing unit 112A then executes the execution mode determined by the processor 130 according to the determination notification signal.
Referring to figs. 9A and 9B, when the touch operation T1 is located in the execution area A2, the processor 130 controls the touch display 110A to display the interface setting corresponding to the eight-finger touch operation T1 located in the execution area A2. In one embodiment, an application executed by the processor 130 controls the touch display 110A to display the interface I1. The interface I1 includes a first sub-interface I11 and a second sub-interface I12, wherein the first sub-interface I11 is a touch pad and the second sub-interface I12 is a virtual keyboard.
Meanwhile, the processor 130 sends a determination notification signal to the touch processing unit 112A, instructing it to switch the execution mode of the execution area corresponding to the first sub-interface I11 of the touch display 110A to the touch pad mode while keeping the execution area corresponding to the second sub-interface I12 in the touch screen mode. Thus, a touch signal received from the first sub-interface I11 is interpreted as a mouse operation, while a touch signal received from the second sub-interface I12 is interpreted as a normal touch operation.
In one embodiment, the ranges of the execution areas A1 and A2 shown in fig. 9A may be determined by the partition line L. For example, the closer the partition line L is to the edge E2, the larger the range of the execution area A1 and the smaller the range of the execution area A2; the closer the partition line L is to the edge E1, the larger the range of the execution area A2 and the smaller the range of the execution area A1.
In this embodiment, the processor 130 sends a determination notification signal to the touch processing unit 112A to instruct it to switch the execution mode of the entire touch display 110A to the touch pad mode. As shown in fig. 11, when the user's touch operation moves from position P1 to position P2 on the touch display 110A, the mouse cursor moves from position M1 to position M2 on the touch display 110B. In the touch pad mode, the touch processing unit 112A receives a touch signal from the touch display 110A, converts the position information of the touch operation on the touch display 110A (e.g., positions P1, P2) into relative position information on the touch display 110B (e.g., positions M1, M2), and notifies the processor 130 to control the touch display 110B to display the mouse cursor moving from position M1 to position M2 according to that relative position information.
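One plausible way to realize the P1/P2-to-M1/M2 conversion is to scale the touch-pad displacement by the ratio of the two display resolutions and clamp the cursor to the screen. The patent does not specify the mapping, so the scheme below is a sketch under that assumption:

```python
def map_touch_to_cursor(p_start, p_end, cursor, pad_size, screen_size):
    """Convert a touch-pad stroke on display 110A into a cursor position on 110B.

    The displacement on the pad is scaled by the pad-to-screen size ratio
    and applied to the current cursor position, clamped to the screen
    bounds. The displacement-based (relative) scheme is an illustrative
    assumption, not the patent's stated method.
    """
    dx = (p_end[0] - p_start[0]) * screen_size[0] / pad_size[0]
    dy = (p_end[1] - p_start[1]) * screen_size[1] / pad_size[1]
    nx = min(max(cursor[0] + dx, 0), screen_size[0])
    ny = min(max(cursor[1] + dy, 0), screen_size[1])
    return (nx, ny)
```

Relative mapping is what distinguishes touch pad mode from the absolute positioning of touch screen mode: the same stroke moves the cursor from wherever it currently is, rather than jumping to a fixed location.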
Please refer to fig. 12. Fig. 12 is a flow chart illustrating a control method 1200 according to some embodiments of the invention. In another embodiment, the control method 1200 shown in fig. 12 can be applied to the electronic device 100 of fig. 1, and the touch displays 110A and 110B and the processor 130 are used for switching the execution modes of the touch displays 110A and 110B according to the following steps of the control method 1200. In this embodiment, the steps S1210 to S1220 are the same as the steps S310 to S320, and are not described herein again.
In light of the above, the control method 1200 further executes step S1230, in which the touch processing unit 112A receives the touch signal, and step S1240, in which the touch processing unit 112A determines the execution mode according to the touch signal. In this embodiment, the touch processing unit 112A determines whether to switch the execution mode according to the touch signal. Referring to fig. 13, in this embodiment, the current execution mode of the touch display 110A is preset to the touch screen mode. When the user touches any area of the touch display 110A with five fingers, the touch detection unit 111A detects the touch operation T3 (i.e., detects five finger signals), generates a touch signal, and transmits it to the touch processing unit 112A. The touch processing unit 112A then switches the execution mode of the touch display 110A from the touch screen mode to the touch pad mode according to the touch signal.
In one embodiment, when the touch processing unit 112A does not receive another touch signal within a certain period after receiving the touch signal, the touch processing unit 112A switches the execution mode again (in this embodiment, from the touch pad mode back to the original touch screen mode). In this embodiment, the execution modes include the touch screen mode and the touch pad mode.
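The five-finger toggle of fig. 13 combined with the idle-timeout reversion above might be structured as a small state machine. The trigger count matches the embodiment, but the timeout length and state names are illustrative assumptions:

```python
# Hypothetical mode controller: five-finger toggle plus idle-timeout reversion.
# The timeout value and mode names are illustrative assumptions.
TOGGLE_FINGER_COUNT = 5
IDLE_TIMEOUT = 3.0  # seconds; the patent does not specify the period


class ModeController:
    def __init__(self):
        self.mode = "touch_screen"
        self.last_touch_time = None

    def on_touch(self, finger_count, now):
        """Switch to touch pad mode on a five-finger touch (fig. 13)."""
        self.last_touch_time = now
        if finger_count == TOGGLE_FINGER_COUNT and self.mode == "touch_screen":
            self.mode = "touch_pad"

    def on_tick(self, now):
        """Revert to touch screen mode after a period with no touch signals."""
        if (self.mode == "touch_pad" and self.last_touch_time is not None
                and now - self.last_touch_time >= IDLE_TIMEOUT):
            self.mode = "touch_screen"
```

Because both the toggle and the reversion run inside the touch processing unit, the processor never has to handle these events, consistent with the offloading theme of the patent.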
Referring to fig. 14, fig. 14 is a flowchart illustrating a control method 1400 according to some embodiments of the invention. In another embodiment, the control method 1400 shown in fig. 14 can be applied to the electronic device 100 of fig. 1, with the touch displays 110A and 110B and the processor 130 determining the execution modes of the touch displays 110A and 110B according to the following steps of the control method 1400. In this embodiment, the control method 1400 executes step S1410, in which the touch detection unit 111A detects a touch operation and generates a touch signal, and step S1420, in which the touch detection unit 111A transmits the touch signal to the processor 130.
In step S1430, the processor 130 determines an execution mode according to the touch signal received from the touch detection unit 111A and generates a determination notification signal to be transmitted to the touch processing unit 112A. Next, the control method 1400 further executes step S1440 to execute the execution mode by the touch processing unit 112A according to the determination notification signal.
Referring to fig. 4A, the partition line L is used to determine the ranges of the execution areas A1 and A2 of the touch display 110B, and the execution mode of a specific range can be switched after the ranges are determined. In one embodiment, the processor 130 can determine whether to execute at least one of the system mode and the application mode according to a touch signal received by the touch detection unit 111A (e.g., from a virtual key) located on the touch display 110A or 110B. When the processor 130 determines that the execution mode of the execution area A1 is the application mode and the execution mode of the execution area A2 is the system mode, it sends a corresponding determination notification signal to the touch processing unit 112A. The touch processing unit 112A then sets the execution mode of the execution area A1 to the application mode and the execution mode of the execution area A2 to the system mode according to the determination notification signal.
In this embodiment, the touch information of the execution area A1, which runs in the application mode, is returned to the specific application program executed by the processor, and the operating system executed by the processor does not receive it. The touch information of the execution area A2, which runs in the system mode, is sent back to the operating system executed by the processor; the operating system receives the user's touch information and decides according to it whether further processing is needed.
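The per-area routing described above (application mode vs. system mode) can be sketched as a dispatcher. The area rectangles and receiver objects below are illustrative assumptions:

```python
# Hypothetical dispatcher routing touch events by execution-area mode.
# Area rectangles and receiver objects are illustrative assumptions.
def make_router(areas):
    """areas: list of (rect, mode, receiver); rect = (x0, y0, x1, y1).

    In 'application' mode events go only to the application's receiver,
    bypassing the OS; in 'system' mode they go to the operating system's
    receiver, mirroring the A1/A2 split described above.
    """
    def route(pos):
        x, y = pos
        for (x0, y0, x1, y1), mode, receiver in areas:
            if x0 <= x <= x1 and y0 <= y <= y1:
                receiver.append((mode, pos))
                return mode
        return None
    return route
```

Routing application-mode touches past the operating system is one concrete way the described split would spare the OS from handling events it does not need.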
In another embodiment, the detection signal received by at least one motion sensing unit (e.g., the motion sensing unit 1810 in fig. 18) located at the periphery of the touch display 110A or 110B may be utilized to determine whether to execute at least one of the system mode or the application mode. For example, the motion sensing unit may be a physical button or a magnetic sensor.
Referring to fig. 15, in an embodiment, the processor 130 may determine an execution mode (e.g., the touch pad mode) and an execution area for that mode according to a touch signal received from the touch display 110A or 110B. The processor 130 then transmits a determination notification signal containing the execution-area setting information of the determined execution mode to the touch processing unit 112A, and the touch processing unit 112A adjusts the range and execution mode of the execution area in the touch display 110A according to that setting information. As shown in fig. 15, the range of the execution area A3 is defined by the coordinates (X1, Y1) corresponding to the position P7 and the coordinates (X2, Y2) corresponding to the position P8 of the touch display 110A (i.e., the execution-area setting information). The touch processing unit 112A adjusts the execution mode of the execution area A3 in the touch display 110A to the touch pad mode according to the determination notification signal.
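Defining an execution area by two corner coordinates, as in fig. 15, amounts to normalizing a rectangle from two points. A sketch, assuming the corners may arrive in either order:

```python
def area_from_corners(p7, p8):
    """Build an execution-area rectangle from two corner positions.

    Normalizes the corners so the result is (x_min, y_min, x_max, y_max)
    regardless of which corner is given first; this mirrors how (X1, Y1)
    and (X2, Y2) in fig. 15 could define area A3. The normalization step
    is an assumption for robustness, not stated in the patent.
    """
    (x1, y1), (x2, y2) = p7, p8
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

The resulting rectangle is exactly the "execution area setting information" the determination notification signal would need to carry for the Touch IC to apply the touch pad mode to that region.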
Referring to fig. 16, fig. 16 is a flowchart illustrating a control method 1600 according to some embodiments of the invention. In another embodiment, the control method 1600 shown in fig. 16 can be applied to the electronic device 100 of fig. 1, and the touch displays 110A and 110B and the processor 130 are used for switching the execution modes of the touch displays 110A and 110B according to the following steps of the control method 1600. In this embodiment, the steps S1610 to S1620 are the same as the steps S310 to S320, and are not described herein again. In view of the above, the control method 1600 further executes step S1630 to determine the execution mode by the processor 130 according to the touch signal received from the touch detection unit 111A.
As mentioned above, referring to fig. 17A, in this embodiment, when the user performs the touch operation T21 on the touch display 110A (i.e., the user slides the finger from the position P9 of the touch display 110A to the position P10 of the touch display 110A), the touch detection unit 111A of the touch display 110A generates a touch signal according to the detected touch operation T21, and transmits the touch signal to the processor 130.
As shown in figs. 17A and 17B, the processor 130 determines whether the touch operation T21 matches a gesture operation according to the touch signal. In this embodiment, the gesture operation may be implemented as a sliding touch, but the invention is not limited thereto. When the touch operation T21 matches the sliding touch, the processor 130 adjusts the current execution mode to another execution mode; otherwise, it maintains the current execution mode. In this embodiment, after the processor 130 determines that the touch operation T21 is a sliding touch, it controls the touch display 110A to move the interface I3 displayed in the execution area A1 to the execution area A2 (as shown in fig. 17B).
Referring to fig. 18, the electronic device 1800 includes touch displays 110A and 110B, a processor 130, and a motion sensing unit 1810. The processor 130 is electrically connected to the touch displays 110A and 110B and the motion sensing unit 1810. The motion sensing unit 1810 may be implemented as at least one of a magnetic sensor, a light sensor, a pressure sensor, or a physical key.
Referring to fig. 19, fig. 19 is a flow chart illustrating a control method 1900 according to some embodiments of the invention. In another embodiment, the control method 1900 shown in fig. 19 can be applied to the electronic device 1800 shown in fig. 18, and the touch displays 110A and 110B, the processor 130 and the plurality of motion sensing units 1810 are used for switching the execution modes of the touch displays 110A or 110B according to the following steps of the control method 1900.
In one embodiment, the control method 1900 executes step S1910, in which the motion sensing unit 1810 detects an object O (e.g., a physical keyboard) and generates a detection signal, and step S1920, in which the motion sensing unit 1810 transmits the detection signal to the processor 130. Referring to fig. 20, the electronic device 1800 is provided with a plurality of motion sensing units 1810A to 1810F. If an object O approaches the motion sensing units 1810C to 1810F, those units transmit detection signals to the processor 130. Next, the control method 1900 executes step S1930, in which the processor 130 determines an execution mode according to the detection signals. In this embodiment, after receiving the detection signals from the motion sensing units 1810C to 1810F, the processor 130 determines the position of the object O according to them (e.g., the object O is located in the execution area A4 of the touch display 110B) and controls the touch display 110A to continue displaying as a touch display screen in the execution area A5.
Referring to fig. 21, when the motion sensing units 1810A to 1810D detect that an object O is approaching (step S1910), they transmit corresponding detection signals to the processor 130 (step S1920). The control method 1900 then executes step S1930: the processor 130 determines the position of the object O from the detection signals received from the motion sensing units 1810A to 1810D (e.g., the object O is located in the execution area A5 of the touch display 110B), determines the execution mode accordingly (e.g., controls the touch display 110A to generate the corresponding interface I4 in the execution area A4), and transmits a determination notification signal to the touch processing unit 112A. The touch processing unit 112A then adjusts the execution area of the virtual touchpad corresponding to the interface I4 on the touch display 110A to the touchpad mode according to the determination notification signal.
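The notification path of fig. 21 can be sketched as follows. The class names and the dictionary format of the determination notification signal are invented; the patent only says that the processor sends a determination notification signal carrying execution-area settings to the touch processing unit.

```python
class TouchProcessingUnit:
    """Stands in for touch processing unit 112A (names assumed)."""
    def __init__(self):
        self.touchpad_region = None  # no virtual touchpad configured yet

    def on_determination_notification(self, signal):
        # Adjust the execution area of the virtual touchpad to touchpad mode
        # according to the determination notification signal.
        if signal["mode"] == "touchpad":
            self.touchpad_region = signal["region"]

class Processor:
    """Stands in for processor 130 issuing the determination notification."""
    def notify(self, tpu, region):
        tpu.on_determination_notification({"mode": "touchpad", "region": region})
```

In this sketch, after the processor decides the object sits in area A5, calling `Processor().notify(tpu, "A4")` leaves the touch processing unit running a virtual touchpad in area A4.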
Referring to fig. 22, the main difference between the electronic device in this embodiment and the electronic device 100 in the foregoing embodiment is that the electronic device in fig. 22 includes touch displays 110A and 110B with different sizes or resolutions, and an input element O' (e.g., a physical keyboard). In this embodiment, the execution modes of the touch displays 110A and 110B can be switched or adjusted according to the above-mentioned control methods. In addition, the execution modes of the touch displays 110A and 110B in this embodiment are substantially the same as those mentioned in the previous embodiments.
In view of the above, the embodiments of the present invention offload part of the touch determination and execution-mode switching to a touch chip (Touch IC), thereby reducing the burden on the processor and improving its performance.
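The offloading idea can be illustrated with a simple routing rule: touch signals the Touch IC can decide locally stay in the chip, and only mode-level decisions reach the processor. The routing rule and signal format below are assumptions, not taken from the patent.

```python
def route_touch_signal(signal, touch_ic_handles):
    """Send the touch signal to the Touch IC when it can decide locally;
    otherwise forward it to the processor."""
    if signal["kind"] in touch_ic_handles:
        return ("touch_ic", signal)   # handled without burdening the processor
    return ("processor", signal)      # e.g. gesture-level mode switches

# Example: taps and pointer moves inside the virtual touchpad stay in the
# Touch IC, while sliding touches that switch execution modes go upstream.
dest, _ = route_touch_signal({"kind": "tap"}, touch_ic_handles={"tap", "move"})
```

The design benefit stated in the patent follows directly: the more signal kinds the Touch IC handles, the fewer interrupts the processor services.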
Additionally, the above illustration includes exemplary steps in sequential order, but the steps need not be performed in the order shown. It is within the contemplation of the present disclosure to perform these steps in a different order. Steps may be added, substituted, changed in order, and/or omitted as appropriate within the spirit and scope of embodiments of the present disclosure.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (13)

1. An electronic device, comprising:
a processor; and
a plurality of touch displays electrically connected to the processor, the plurality of touch displays comprising:
a touch detection unit for detecting a touch operation and generating a touch signal; and
a touch processing unit, electrically connected to the touch detection unit, for receiving the touch signal;
wherein the touch detection unit transmits the touch signal to the touch processing unit or the processor.
2. The electronic device as claimed in claim 1, wherein when the touch signal is transmitted to the touch processing unit, the touch processing unit transmits a first notification signal to the processor according to the touch signal, and the processor determines an execution mode according to the first notification signal.
3. The electronic device of claim 2, wherein the processor determines and executes the execution mode based on the first notification signal.
4. The electronic device as claimed in claim 2, wherein when the processor determines the execution mode, a determination notification signal is generated and transmitted to the touch processing unit, and the touch processing unit executes the execution mode according to the determination notification signal.
5. The electronic device as claimed in claim 1, wherein when the touch signal is transmitted to the touch processing unit, the touch processing unit determines and switches an execution mode according to the touch signal.
6. The electronic device as claimed in claim 5, wherein when the touch processing unit does not receive another touch signal within a predetermined period after receiving the touch signal, the touch processing unit switches the execution mode again.
7. The electronic device as claimed in claim 2, wherein the touch processing unit determines whether the touch operation includes a plurality of touch positions and whether the plurality of touch positions are all located in a first execution area or all located in a second execution area; when the plurality of touch positions are all located in the first execution area or all located in the second execution area, the touch processing unit transmits the corresponding first notification signal to the processor according to the touch signal; the processor executes the interface setting corresponding to the first execution area when the touch operation is located in the first execution area, and executes the interface setting corresponding to the second execution area when the touch operation is located in the second execution area.
8. The electronic device of claim 1, further comprising:
a motion sensing unit, electrically connected to the processor, for sensing an object and transmitting a plurality of corresponding detection signals to the processor;
wherein the processor determines an execution mode according to the detection signals.
9. The electronic device as claimed in claim 1, wherein when the touch signal is transmitted to the processor, the processor determines whether the touch operation is a gesture operation according to the touch signal; if yes, the processor switches from a current execution mode to another execution mode, and if not, the current execution mode is maintained.
10. The electronic device as claimed in claim 1, wherein when the touch signal is transmitted to the processor, the processor determines and executes an execution mode according to the touch signal.
11. The electronic device as claimed in claim 1, wherein when the touch signal is transmitted to the processor and the processor determines the execution mode according to the touch signal, the processor transmits a determination notification signal to the touch processing unit, and the touch processing unit executes the execution mode according to the determination notification signal.
12. The electronic device as claimed in claim 11, wherein the determination notification signal comprises execution area setting information, and the touch processing unit adjusts an area and an execution mode of an execution area in the touch displays according to the execution area setting information.
13. The electronic device of claim 9, wherein the gesture operation comprises one of a swipe touch or a multi-finger touch.
CN201910370320.9A 2018-07-27 2019-05-06 Electronic device Active CN110780798B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862703900P 2018-07-27 2018-07-27
US62/703,900 2018-07-27

Publications (2)

Publication Number Publication Date
CN110780798A true CN110780798A (en) 2020-02-11
CN110780798B CN110780798B (en) 2022-05-24

Family

ID=69382992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910370320.9A Active CN110780798B (en) 2018-07-27 2019-05-06 Electronic device

Country Status (2)

Country Link
CN (1) CN110780798B (en)
TW (1) TWI742366B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI772853B (en) * 2020-08-06 2022-08-01 仁寶電腦工業股份有限公司 Electronic device and method for operating touch control interface


Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US20120081317A1 (en) * 2010-10-01 2012-04-05 Imerj LLC Method and system for performing copy-paste operations on a device via user gestures
US20110296333A1 (en) * 2010-05-25 2011-12-01 Bateman Steven S User interaction gestures with virtual keyboard
CN203241885U (en) * 2013-05-20 2013-10-16 汤中亮 Double-screen notebook computer
CN206363196U (en) * 2016-11-04 2017-07-28 李绍宏 Double-screen notebook

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
CN101751112A (en) * 2008-12-12 2010-06-23 宏达国际电子股份有限公司 Portable electronic device and method for awakening same from sleep mode by using touch screen
CN101770290A (en) * 2009-01-06 2010-07-07 欧缔科技股份有限公司 Method for converting touch control plate into touch control mode or digital key and/or hot key input mode
CN103914646A (en) * 2013-01-08 2014-07-09 三星电子株式会社 Touch event processing method and portable device implementing the same
US20140191994A1 (en) * 2013-01-08 2014-07-10 Samsung Electronics Co., Ltd. Touch event processing method and portable device implementing the same
CN104571909A (en) * 2013-10-23 2015-04-29 纬创资通股份有限公司 Input device, input mode switching method thereof and computer device
CN104679362A (en) * 2013-11-27 2015-06-03 纬创资通股份有限公司 Touch device and control method thereof
CN105320448A (en) * 2014-08-04 2016-02-10 南京瀚宇彩欣科技有限责任公司 Controller
CN104536597A (en) * 2014-12-22 2015-04-22 合肥联宝信息技术有限公司 Method and device for achieving multi-point touch of notebook computer
CN105892915A (en) * 2016-03-30 2016-08-24 联想(北京)有限公司 Information processing method and electronic device

Also Published As

Publication number Publication date
TWI742366B (en) 2021-10-11
CN110780798B (en) 2022-05-24
TW202008126A (en) 2020-02-16

Similar Documents

Publication Publication Date Title
RU2582854C2 (en) Method and device for fast access to device functions
KR102120930B1 (en) User input method of portable device and the portable device enabling the method
US9990062B2 (en) Apparatus and method for proximity based input
US7884806B2 (en) Proximity sensor device and method with keyboard emulation
US9354780B2 (en) Gesture-based selection and movement of objects
US20140306897A1 (en) Virtual keyboard swipe gestures for cursor movement
US20100328236A1 (en) Method for Controlling a Computer System and Related Computer System
JP2012247960A (en) Information processing apparatus, information processing method, and computer program
TW201015394A (en) Method of tracking touch inputs and related touch input tracking device and computer system
EP2473909A1 (en) Methods for mapping gestures to graphical user interface commands
JP2011134273A (en) Information processor, information processing method, and program
US20140298275A1 (en) Method for recognizing input gestures
CN110780798B (en) Electronic device
WO2012111227A1 (en) Touch input device, electronic apparatus, and input method
JP2006085218A (en) Touch panel operating device
US10338692B1 (en) Dual touchpad system
WO2016208099A1 (en) Information processing device, input control method for controlling input upon information processing device, and program for causing information processing device to execute input control method
TWI405105B (en) Signal handling method of compound touch panel
US10782872B2 (en) Electronic device with touch processing unit
KR20150102363A (en) Apparatus for controlling user interface based on multi-touches, and Method thereof
US20200319793A1 (en) Information processing device, information processing method, and program
US20130063347A1 (en) Method of processing signal of portable computer and portable computer using the method
KR20140086805A (en) Electronic apparatus, method for controlling the same and computer-readable recording medium
TW201528114A (en) Electronic device and touch system, touch method thereof
KR20140070264A (en) Method and apparatus for sliding objects across a touch-screen display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant