CN111104079A - Control method, electronic device and non-transitory computer readable recording medium device


Info

Publication number
CN111104079A
CN111104079A (application number CN201811267704.XA)
Authority
CN
China
Prior art keywords
touch
screen
data
user interface
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811267704.XA
Other languages
Chinese (zh)
Inventor
叶俊材
林宏益
吕孟儒
曾建智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asustek Computer Inc
Original Assignee
Asustek Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asustek Computer Inc filed Critical Asustek Computer Inc
Priority to CN201811267704.XA
Publication of CN111104079A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423: Controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1431: Controlling a plurality of local displays using a single graphics controller
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1647: Including at least an additional display
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

The invention provides a control method suitable for an electronic device comprising a first screen and a second screen. The control method comprises the following steps: receiving touch data generated by the second screen in response to a touch behavior; determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command; when the touch behavior belongs to the touchpad operation command, triggering a corresponding touchpad operation according to the touch data; and when the touch behavior belongs to the touchscreen operation command, triggering a corresponding touchscreen operation according to the touch data.

Description

Control method, electronic device and non-transitory computer readable recording medium device
Technical Field
The invention relates to a control method, an electronic device and a non-transitory computer readable recording medium device.
Background
In recent years, dual-screen output has been widely adopted in electronic devices because it provides a better user experience. When the electronic device is a notebook computer, the conventional approach is to use one screen as an extension of the other. However, this approach cannot provide multiple application modes for different usage situations. In addition, inefficient interaction between the two screens gives the conventional notebook computer a lengthy touch-data transmission flow, and the resulting low touch response efficiency degrades overall performance.
Disclosure of Invention
The embodiment of the invention provides a control method suitable for an electronic device comprising a first screen and a second screen. The control method comprises the following steps: receiving touch data generated by the second screen in response to a touch behavior; determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command; when the touch behavior belongs to the touchpad operation command, triggering a corresponding touchpad operation according to the touch data; and when the touch behavior belongs to the touchscreen operation command, triggering a corresponding touchscreen operation according to the touch data.
The present disclosure further provides an electronic device, comprising: a first screen for displaying a first screen image; a second screen for generating touch data in response to a touch behavior; and a processor for receiving the touch data and determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command. When the touch behavior belongs to the touchpad operation command, the processor triggers a corresponding touchpad operation according to the touch data; when the touch behavior belongs to the touchscreen operation command, the processor triggers a corresponding touchscreen operation according to the touch data.
A non-transitory computer readable recording medium device storing at least one program is also provided. When an electronic device including a first screen and a second screen loads and executes the at least one program, the at least one program causes the electronic device to execute the following steps: receiving touch data generated by the second screen in response to a touch behavior; determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command; when the touch behavior belongs to the touchpad operation command, triggering a corresponding touchpad operation according to the touch data; and when the touch behavior belongs to the touchscreen operation command, triggering a corresponding touchscreen operation according to the touch data.
Other features and embodiments of the present invention will be described in detail below with reference to the drawings.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a block schematic diagram of an embodiment of an electronic device;
FIG. 2 is a flow chart illustrating an embodiment of a control method;
FIG. 3 is a schematic flow chart of another embodiment of a control method;
FIG. 4 is a schematic diagram illustrating an embodiment of an electronic device operating in a normal notebook mode;
FIG. 5 is a schematic diagram illustrating an embodiment of an electronic device operating in a notebook multitasking mode;
FIG. 6 is a schematic diagram illustrating an embodiment of an electronic device operating in a panorama mode;
FIG. 7 is a schematic diagram illustrating an embodiment of an electronic device operating in a panorama multitasking mode;
FIG. 8 is a schematic diagram illustrating an embodiment of an electronic device operating in a book mode.
Detailed Description
Referring to fig. 1 to 8, the control method according to any embodiment of the present invention can be implemented in an electronic device 100, so that the electronic device 100 can determine whether a user's touch behavior belongs to a touchpad operation command or a touchscreen operation command and trigger the corresponding touchpad or touchscreen operation accordingly. The electronic device 100 includes a first body 101, a second body 102 and a processor 130. The first body 101 includes a first screen 110, and the second body 102 includes a second screen 120. The first body 101 and the second body 102 may be connected to each other by a hinge. The processor 130 is coupled to the first screen 110 and the second screen 120. In some embodiments, the processor 130 may be integrated in the first body 101 or the second body 102. In this embodiment, the processor 130 is integrated in the first body 101, but the disclosure is not limited thereto.
In an embodiment, the first screen 110 may be a display screen and the second screen 120 a touch display screen through which the user performs various touch behaviors. The disclosure is not limited thereto; the first screen 110 and the second screen 120 may both be touch display screens, so that the user can perform touch behaviors on either screen.
The first screen 110 is used for displaying a first screen image I1. Here, the first screen image I1 is displayed full screen on the first screen 110, as shown in fig. 4. In some embodiments, the first screen image I1 may include a user interface and its interface items, such as a desktop, a folder, an icon of an application program, a system taskbar W1, an image displayed during execution of an application program, an image displayed for a touch operation or a key input, or a combination thereof, but the invention is not limited thereto.
The second screen 120 is used for the user to perform various touch behaviors, and it generates corresponding touch data in response to those behaviors. In an embodiment, the second screen 120 may include a touch data retrieving module 121 and a transmission control module 122. The touch data retrieving module 121 retrieves the corresponding touch data D1 according to the user's touch behavior, and the transmission control module 122 transmits the touch data D1 to the processor 130. For example, the touch data D1 may include coordinate information and/or force information of the touch point, so that in subsequent operations the processor 130 can determine the position and force of the user's touch and act accordingly.
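For illustration, the touch data D1 described above can be pictured as a stream of reports. The following is a minimal Python sketch, assuming field names that the patent does not specify:

```python
from dataclasses import dataclass

@dataclass
class TouchReport:
    """One packet of hypothetical touch data D1 for a single touch point."""
    x: float           # coordinate of the touch point on the second screen
    y: float
    force: float       # contact force, if the panel reports it
    timestamp_ms: int  # lets later stages derive sliding time and click gaps
```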
In some embodiments, the transmission control module 122 may transmit the touch data to the processor 130 through various wired or wireless communication interfaces, such as an I2C interface, a Universal Serial Bus (USB) interface, Wireless USB (WUSB), or Bluetooth.
Referring to fig. 1 to 2, the processor 130 is configured to receive the touch data D1 (step S10) and determine, according to the touch data D1, whether the user's touch behavior belongs to a touchpad operation command or a touchscreen operation command (step S20). When the processor 130 determines that the touch behavior belongs to the touchpad operation command, the processor 130 triggers a corresponding touchpad operation according to the touch data D1 (step S30). When the processor 130 determines that the touch behavior belongs to the touchscreen operation command, the processor 130 triggers a corresponding touchscreen operation according to the touch data D1 (step S40).
In one embodiment of step S10, the processor 130 may include a driver module 131, through which the processor 130 receives the touch data D1. The processor 130 may further include an extractor module 132, through which it extracts input features of the touch data D1 for subsequent use. In some embodiments, the extracted input features may include the input position, the number of packets, the sliding distance, the sliding time, the inter-click time, or a combination thereof, but the invention is not limited thereto; the input features may be any parameters suitable for assisting the subsequent determination.
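A minimal sketch of how such input features could be derived from one stroke, assuming a packet is an (x, y, timestamp_ms) tuple; only the feature kinds come from the description, everything else is assumed:

```python
import math

def extract_features(packets):
    """Derive input features from one stroke: a time-ordered list of
    (x, y, timestamp_ms) touch packets. Inter-click time would be
    measured between the end of one stroke and the start of the next."""
    if not packets:
        return None
    sliding_distance = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1, _), (x2, y2, _) in zip(packets, packets[1:])
    )
    return {
        "input_position": (packets[0][0], packets[0][1]),  # where it began
        "packet_count": len(packets),
        "sliding_distance": sliding_distance,
        "sliding_time_ms": packets[-1][2] - packets[0][2],
    }
```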
In an embodiment of step S20, the processor 130 may further include a user interface setting module 133 and a determining module 134. The user interface setting module 133 generates user interface layout information D2 of the second screen 120. The determining module 134 determines whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the user interface layout information D2 and the touch data D1.
In some embodiments, the user interface layout information D2 may describe how the second screen 120 is divided into configuration areas: which areas serve as a virtual touchpad area, which as a virtual keyboard area, and which as an extended display area. The determining module 134 can therefore determine whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command according to which configuration area the input position in the touch data D1 falls in. In some embodiments, when the input position in the touch data D1 is within the virtual touchpad area or the virtual keyboard area, the determining module 134 determines that the touch behavior belongs to a touchpad operation command; when the input position is within the extended display area, it determines that the touch behavior belongs to a touchscreen operation command.
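A minimal sketch of this region-based decision, assuming the layout information D2 lists each configuration area as a tagged rectangle (the representation is an assumption; the decision rule follows the description):

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]   # left, top, right, bottom
LayoutInfo = List[Tuple[str, Rect]]        # (role, bounds) per configuration area

def classify_touch(layout: LayoutInfo, x: float, y: float) -> str:
    """Virtual touchpad/keyboard areas yield a touchpad operation command;
    the extended display area yields a touchscreen operation command."""
    for role, (left, top, right, bottom) in layout:
        if left <= x <= right and top <= y <= bottom:
            if role in ("virtual_touchpad", "virtual_keyboard"):
                return "touchpad_command"
            if role == "extended_display":
                return "touchscreen_command"
    return "unclassified"

# Example layout: keyboard on top, touchpad below.
layout = [("virtual_keyboard", (0, 0, 1920, 540)),
          ("virtual_touchpad", (0, 540, 1920, 1080))]
assert classify_touch(layout, 960, 800) == "touchpad_command"
```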
In an embodiment of step S30, the processor 130 may further include a touchpad data processing module 135. When the determining module 134 determines that the touch behavior belongs to the touchpad operation command, it may directly output the touch data D1, or the input features extracted from the touch data D1 by the extractor module 132, to the touchpad data processing module 135, which outputs them to the operating system (OS) to perform the corresponding touchpad operation. Here, a touchpad operation performs touch positioning with relative coordinates.
In some embodiments, when the input position in the touch data D1 falls within the virtual touchpad area, the touchpad operation command may include a track operation command, a click input command, and the like, but the disclosure is not limited thereto. A track operation command and its corresponding touchpad operation may be, for example, a one-finger sliding command that moves the cursor displayed on the first screen 110 or the second screen 120, or a two-finger sliding command that scrolls or zooms the image displayed on the first screen 110 or the second screen 120. A click input command and its corresponding touchpad operation may be, for example, a single-click or double-click command that performs an operation such as selecting or launching. When the input position in the touch data D1 falls within the virtual keyboard area, the touchpad operation command and its corresponding touchpad operation may include outputting, in the image displayed on the first screen 110 or the second screen 120, the character or key function corresponding to the touched key, but the invention is not limited thereto.
In some embodiments, the OS may execute the corresponding touchpad operation with its built-in driver (inbox driver), such as the Windows Precision Touchpad driver.
In an embodiment of step S40, the processor 130 may further include a touchscreen data processing module 136. When the determining module 134 determines that the touch behavior belongs to the touchscreen operation command, it may output the touch data D1, or the input features extracted from the touch data D1 by the extractor module 132, to the touchscreen data processing module 136, which outputs them to the OS to perform the corresponding touchscreen operation. Here, a touchscreen operation performs touch positioning with absolute coordinates.
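The relative/absolute distinction between steps S30 and S40 can be made concrete with a short sketch; the sensitivity and resolution parameters are assumptions:

```python
def touchpad_delta(prev, cur, sensitivity=1.0):
    """Touchpad operation (step S30): relative positioning; each new
    packet moves the cursor by a scaled offset from the previous one."""
    return (sensitivity * (cur[0] - prev[0]),
            sensitivity * (cur[1] - prev[1]))

def touchscreen_point(x, y, panel_size, display_size):
    """Touchscreen operation (step S40): absolute positioning; the touch
    point maps one-to-one onto the display's coordinate space."""
    return (x * display_size[0] / panel_size[0],
            y * display_size[1] / panel_size[1])
```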
In some embodiments, the touchscreen operation command and the corresponding touchscreen operation may include a click operation command, such as a single-click or double-click command, to perform a screen operation such as selecting or launching a program; a sliding operation command to slide the screen; or a zooming operation command to zoom the screen, but the disclosure is not limited thereto.
Fig. 3 is a flowchart illustrating another embodiment of the control method. Referring to fig. 1 to 3, in another embodiment of step S20, the processor 130 may further include a control module 137. In addition to determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the user interface layout information D2 and the touch data D1, the determining module 134 further determines whether the touch behavior belongs to a user interface control command according to the same information. When the determining module 134 determines that the touch behavior belongs to the user interface control command, it generates a command code Cmd corresponding to the touch data D1 and outputs it to the control module 137, so that the control module 137 can perform a corresponding control operation according to the command code Cmd (step S50).
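A sketch of this extended decision of fig. 3, with callbacks standing in for the control module 137, the touchpad data processing module 135 and the touchscreen data processing module 136; the command-code encoding is invented for illustration:

```python
def route(touch_class, touch_data, on_ui_control, on_touchpad, on_touchscreen):
    """Extended step S20: a user interface control command is checked first
    and forwarded to the control module as a command code Cmd (step S50);
    otherwise the touchpad/touchscreen split of steps S30/S40 applies."""
    if touch_class == "ui_control":
        cmd = {"code": "UI_CONTROL", "data": touch_data}  # command code Cmd
        on_ui_control(cmd)           # -> control module 137 (step S50)
    elif touch_class == "touchpad_command":
        on_touchpad(touch_data)      # -> touchpad data processing module 135
    else:
        on_touchscreen(touch_data)   # -> touchscreen data processing module 136
```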
In an embodiment of step S50, when the control module 137 determines that the command code Cmd received from the determining module 134 corresponds to controlling an application APP in the first screen image I1 displayed on the first screen 110, the control module 137 controls the corresponding application APP.
In some embodiments, if the user wants to perform a gesture operation during the execution of the application APP, the application APP requests the gesture operation, and the control module 137 generates corresponding gesture data D3 to the touchpad data processing module 135, which triggers the corresponding gesture operation. The gesture operation may be, for example, a two-finger zoom of an object, but the disclosure is not limited thereto.
In another embodiment of step S50, the processor 130 may further include a graphics processing module 138. When the control module 137 determines that the command code Cmd received from the determining module 134 corresponds to adjusting the user interface configuration of the second screen 120, the control module 137 generates interface setting data D4 and outputs it to the graphics processing module 138 and the user interface setting module 133, respectively. The graphics processing module 138 updates the user interface configuration on the second screen 120 according to the interface setting data D4, and the user interface setting module 133 generates new user interface layout information D2 for the determining module 134 according to the interface setting data D4, so that the determining module 134 knows the current user interface configuration of the second screen 120.
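A sketch of that update loop, assuming the interface setting data D4 simply carries the new list of configuration areas; the data shapes are assumptions:

```python
class UserInterfaceSetting:
    """Stand-in for the user interface setting module 133: regenerates
    layout information D2 from interface setting data D4."""
    def __init__(self, layout_d2=None):
        self.layout_d2 = layout_d2 or []

    def apply(self, setting_d4):
        self.layout_d2 = list(setting_d4["areas"])
        return self.layout_d2

def handle_ui_command(cmd, ui_setting, update_second_screen):
    """Stand-in for the control module 137 handling a command code that
    adjusts the second screen's user interface: D4 goes both to the
    graphics path and to the user interface setting module."""
    setting_d4 = {"areas": cmd["data"]}   # interface setting data D4
    update_second_screen(setting_d4)      # graphics processing module 138
    return ui_setting.apply(setting_d4)   # fresh layout info D2 for module 134
```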
In a further embodiment of step S50, when the control module 137 determines that the command code Cmd received from the determining module 134 corresponds to both controlling an application APP in the first screen image I1 displayed on the first screen 110 and adjusting the user interface configuration of the second screen 120, the control module 137 controls the corresponding application APP and generates the interface setting data D4.
In some implementations, the graphics processing module 138 may be a graphics processing unit (GPU).
In some embodiments, the driver module 131, the extractor module 132, the user interface setting module 133, the determining module 134, the touchpad data processing module 135 and the touchscreen data processing module 136 may be included in the same processing driver module U1. The control module 137 may be a computer control application.
Referring to fig. 4 and 5, in some embodiments, the user interface configuration on the second screen 120 may differ according to the application mode of the electronic device 100. In some implementations, the user interface configuration on the second screen 120 can include a virtual keyboard area A1, a virtual touchpad area A2, an extended display area A3, or a combination thereof. The virtual keyboard area A1 can display a keyboard image containing a plurality of keys, so that the user knows which symbol or function each touch position in the virtual keyboard area A1 inputs. The virtual touchpad area A2 can display an image representing a touchpad, so that the user knows that touchpad operations can be performed there. The extended display area A3 serves as an extended display of the first screen 110 for the user to use freely.
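The area combinations described here and in the mode embodiments below can be summarized in a single table; the mode keys are illustrative and the geometry is omitted:

```python
# Which configuration areas each application mode places on the second
# screen, per the embodiments below; names are illustrative only.
MODE_LAYOUTS = {
    "normal_notebook":    ["virtual_keyboard", "virtual_touchpad"],  # fig. 4
    "notebook_multitask": ["virtual_keyboard", "extended_display"],  # fig. 5
    "panorama":           ["extended_display"],                      # fig. 6
    "panorama_multitask": ["extended_display"],                      # fig. 7
    "book":               ["extended_display"],                      # fig. 8
}
```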
Referring to fig. 1 to 5, in some embodiments, the processor 130 may further include a virtual display module 139 configured to generate a second screen image I2. When the extended display area A3 is configured on the user interface of the second screen 120, the graphics processing module 138 outputs the second screen image I2 received from the virtual display module 139 to the extended display area A3 for display.
In some implementations, the virtual display module 139 includes a virtual display driver module 1391 and a virtual screen 1392. The virtual display driver module 1391 generates the second screen image I2 and creates the virtual screen 1392, so that the OS treats the virtual screen 1392 as a real screen.
In some embodiments, the application modes of the electronic device 100 may include a normal notebook mode, a notebook multitasking mode, a panorama mode, a panorama multitasking mode, and a book mode, but the disclosure is not limited thereto.
Referring to fig. 1 to 4, in an embodiment of the normal notebook mode, the user interface configuration of the second screen 120 of the electronic device 100 may include the virtual keyboard area A1 and the virtual touchpad area A2. In this mode, the user can touch the keys displayed in the virtual keyboard area A1 so that the electronic device 100 outputs the corresponding characters or key functions on the first screen 110, and can control the cursor displayed on the first screen 110, execute an application program, or scroll and zoom by clicking or sliding on the virtual touchpad area A2. The user thus operates the electronic device 100 in substantially the same way as a conventional notebook computer.
In the normal notebook mode, the determining module 134 of the electronic device 100 determines in step S20 whether the touch behavior belongs to the user interface control command according to the touch data D1. If it does, step S50 is executed. If it does not, the determining module 134 determines whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the user interface layout information D2 and the touch data D1, and performs step S30 or step S40 according to the result. Referring to fig. 4, in this embodiment the user interface configuration of the second screen 120 only includes the virtual keyboard area A1 and the virtual touchpad area A2, so in step S20 the determining module 134 determines that a touch behavior performed in either area belongs to the touchpad operation command, selects step S30, and in step S30 triggers the corresponding touchpad operation according to whether the input position in the touch data D1 falls within the virtual keyboard area A1 or the virtual touchpad area A2.
In some embodiments, when operating in the normal notebook mode, the virtual keyboard area A1 may be located in the upper part of the second screen 120 and the virtual touchpad area A2 in the lower part, but the disclosure is not limited thereto: the user may swap the positions of the virtual keyboard area A1 and the virtual touchpad area A2 through the interface adjustment settings. The user can also adjust the proportions of the virtual keyboard area A1 and the virtual touchpad area A2 within the second screen 120 through the interface adjustment settings. In addition, in the normal notebook mode, the graphics processing module 138 can output the image data of the virtual keyboard area A1 and the virtual touchpad area A2 to the second screen 120 in direct mode (Direct Mode).
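A sketch of how such a user-adjustable split could be computed, assuming a simple horizontal division of the second screen; the 0.5 default and the rectangle convention are assumptions:

```python
def split_second_screen(width, height, keyboard_ratio=0.5, keyboard_on_top=True):
    """Return (keyboard_rect, touchpad_rect) as (left, top, right, bottom),
    honoring the adjustable ratio and the swap option described above."""
    cut = height * keyboard_ratio
    upper = (0.0, 0.0, float(width), cut)
    lower = (0.0, cut, float(width), float(height))
    return (upper, lower) if keyboard_on_top else (lower, upper)
```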
FIG. 5 is a schematic diagram illustrating an embodiment of the electronic device operating in the notebook multitasking mode. Referring to fig. 1 to 5, in an embodiment of the notebook multitasking mode, the user interface configuration of the second screen 120 of the electronic device 100 may include the virtual keyboard area A1 and the extended display area A3; this combination is taken as an example here, and the virtual keyboard area A1 may be replaced by the virtual touchpad area A2. Here, the extended display area A3 may be located in the upper part of the second screen 120 and the virtual keyboard area A1 in the lower part. The user may then treat the extended display area A3 as an extended screen of the first screen 110 and thus handle different tasks on the first screen 110 and the extended display area A3 of the second screen 120 simultaneously. In addition, the user can adjust the proportions of the extended display area A3 and the virtual keyboard area A1 within the second screen 120 through the interface adjustment settings.
In the notebook multitasking mode, the determining module 134 of the electronic device 100 determines in step S20 whether the touch behavior belongs to the user interface control command according to the touch data D1. If it does, step S50 is executed. If it does not, the determining module 134 determines whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the user interface layout information D2 and the touch data D1, and performs step S30 or step S40 according to the result.
Referring to fig. 5, in this embodiment the user interface configuration of the second screen 120 includes the virtual keyboard area A1 and the extended display area A3. In step S20, the determining module 134 determines that a touch behavior performed in the virtual keyboard area A1 belongs to the touchpad operation command and performs step S30 accordingly, while a touch behavior performed in the extended display area A3 belongs to the touchscreen operation command, for which step S40 is performed.
In the notebook multitasking mode, the graphics processing module 138 of the electronic device 100 integrates the second screen image I2 displayed in the extended display area A3 with the keyboard image displayed in the virtual keyboard area A1, and outputs the integrated image to the second screen 120 for display.
Fig. 6 is a schematic diagram of an embodiment of the electronic device operating in the panorama mode. Referring to fig. 1 to 6, in an embodiment of the panorama mode, the user interface configuration of the second screen 120 of the electronic device 100 only includes the extended display area A3. The user can then operate the entire second screen 120 as an extended screen of the first screen 110, and the first screen 110 and the second screen 120 can jointly display a single task, enlarging its display area. Here, the system taskbar W1 is displayed on the second screen 120. In the panorama mode, the determining module 134 of the electronic device 100 directly determines in step S20 that the touch behavior belongs to the touchscreen operation command and then performs step S40. Keyboard input can be performed through an external keyboard K1.
Fig. 7 is a schematic diagram of an embodiment of the electronic device operating in the panorama multitasking mode. Referring to fig. 1 to 7, in an embodiment of the panorama multitasking mode, the user interface configuration of the second screen 120 of the electronic device 100 is substantially the same as in the panorama mode, but the first screen 110 and the second screen 120 display different tasks, so that the user can handle different tasks simultaneously. In the panorama multitasking mode, the determining module 134 of the electronic device 100 directly determines in step S20 that the touch behavior belongs to the touchscreen operation command and then performs step S40. Keyboard input can be performed through an external keyboard K1.
Fig. 8 is a schematic diagram illustrating an embodiment of the electronic device operating in the book mode. Referring to fig. 1 to 8, in an embodiment of the book mode, the user interface configuration of the second screen 120 of the electronic device 100 is substantially the same as in the panorama multitasking mode, but the user uses the electronic device 100 rotated horizontally by 90 degrees, and the images displayed on the first screen 110 and the second screen 120 are adjusted according to the rotation direction of the electronic device 100. Fig. 8 illustrates a rotation of 90 degrees to the left, but the disclosure is not limited thereto. In the book mode, the determining module 134 of the electronic device 100 directly determines in step S20 that the touch behavior belongs to the touchscreen operation command and then performs step S40. Keyboard input can be performed through an external keyboard K1.
In some embodiments, the control method of any embodiment of the present invention can be implemented by a non-transitory computer readable recording medium device storing at least one program, such that when the electronic device 100 loads and executes the at least one program, the at least one program causes the electronic device 100 to execute the control method of any of the above embodiments. In one embodiment, the non-transitory computer readable recording medium device may be a storage device inside the electronic device 100. In some embodiments, the storage device may be implemented by one or more storage elements, each of which may be, but is not limited to, a non-volatile memory such as a read-only memory (ROM) or a flash memory, or a volatile memory such as a random access memory (RAM). In another embodiment, the non-transitory computer readable recording medium device may be a remote storage device whose content is transmitted to the electronic device 100 in a wired or wireless manner. In yet another embodiment, it may be a storage device external to the electronic device 100, whose program code is accessed through a reader or connector of the electronic device 100.
In some embodiments, the processor 130 may be implemented by a system-on-chip (SoC), a central processing unit (CPU), a microcontroller (MCU), an application-specific integrated circuit (ASIC), or the like. In addition, the first screen 110 and/or the second screen 120 can be a capacitive touch display screen, a resistive touch display screen, or another touch display screen made with suitable touch sensing elements.
In summary, in the control method, the electronic device and the non-transitory computer readable recording medium device according to the embodiments of the present disclosure, the determining module determines whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data, and the operating system then executes the corresponding touchpad or touchscreen operation, which simplifies the transmission flow of the touch data and improves the transmission rate.
The above embodiments and/or implementations only illustrate the preferred embodiments and/or implementations of the present technology and are not intended to limit it in any way. Those skilled in the art may make modifications or changes toward other equivalent embodiments without departing from the scope of the technical means disclosed herein, and such embodiments should be construed as substantially the same as the present technology.

Claims (17)

1. A control method, applied to an electronic device comprising a first screen and a second screen, the control method comprising:
receiving touch data generated by the second screen in response to a touch behavior;
determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command;
when the touch behavior belongs to the touchpad operation command, triggering a corresponding touchpad operation according to the touch data; and
when the touch behavior belongs to the touchscreen operation command, triggering a corresponding touchscreen operation according to the touch data.
2. The control method of claim 1, wherein the step of determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data comprises determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data and user interface layout information.
3. The control method of claim 2, wherein the step of determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data comprises:
determining whether the touch behavior belongs to a user interface control command according to the touch data and the user interface layout information;
when the touch behavior is determined to belong to the user interface control command, generating a command code corresponding to the touch data to a control module; and
when the touch behavior is determined not to belong to the user interface control command, determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data and the user interface layout information;
wherein the step of triggering the corresponding touchpad operation according to the touch data comprises outputting the touch data to a touchpad data processing module; and
the step of triggering the corresponding touchscreen operation according to the touch data comprises outputting the touch data to a touchscreen data processing module.
4. The control method of claim 3, wherein when the control module determines that the command code corresponds to controlling an application program displayed on the first screen, the control module controls the application program.
5. The control method of claim 3, wherein when the control module determines that the command code corresponds to adjusting a user interface configuration of the second screen, the control module generates interface setting data for updating the user interface configuration of the second screen and the user interface layout information.
6. The control method of claim 5, wherein the user interface configuration on the second screen comprises an extended display area, a virtual keyboard area, a virtual touchpad area, or a combination thereof.
7. The control method of claim 6, wherein the first screen is used for displaying a first screen image, and the control method further comprises:
generating a second screen image by a virtual display module; and
when the user interface configuration comprises the extended display area, outputting the second screen image to the extended display area of the user interface configuration.
8. The control method of claim 4, further comprising:
generating gesture data to the touchpad data processing module when the application program requests a gesture operation.
9. An electronic device, comprising:
a first screen for displaying a first screen image;
a second screen for generating touch data in response to a touch behavior; and
a processor for receiving the touch data and determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command, wherein when the touch behavior belongs to the touchpad operation command, the processor triggers a corresponding touchpad operation according to the touch data, and when the touch behavior belongs to the touchscreen operation command, the processor triggers a corresponding touchscreen operation according to the touch data.
10. The electronic device of claim 9, wherein the processor comprises:
a user interface setting module for generating user interface layout information; and
a determining module for determining whether the touch behavior belongs to the touchpad operation command or the touchscreen operation command according to the touch data and the user interface layout information.
11. The electronic device of claim 10, wherein the determining module determines whether the touch behavior belongs to a user interface control command, the touchpad operation command or the touchscreen operation command according to the touch data and the user interface layout information; when the touch behavior belongs to the user interface control command, the determining module generates a command code corresponding to the touch data to a control module of the processor; when the touch behavior belongs to the touchpad operation command, the determining module outputs the touch data to a touchpad data processing module of the processor; and when the touch behavior belongs to the touchscreen operation command, the determining module outputs the touch data to a touchscreen data processing module of the processor.
12. The electronic device of claim 11, wherein when the control module determines that the command code corresponds to controlling an application program displayed on the first screen, the control module controls the application program.
13. The electronic device of claim 11, wherein when the control module determines that the command code corresponds to adjusting a user interface configuration of the second screen, the control module generates interface setting data; the processor further comprises a graphics processing module that updates the user interface configuration of the second screen according to the interface setting data, and the user interface setting module updates the user interface layout information according to the interface setting data.
14. The electronic device of claim 13, wherein the user interface configuration on the second screen comprises an extended display area, a virtual keyboard area, a virtual touchpad area, or a combination thereof.
15. The electronic device of claim 14, wherein the processor further comprises a virtual display module for generating a second screen image, and when the user interface configuration of the second screen comprises the extended display area, the graphics processing module outputs the second screen image to the extended display area of the user interface configuration.
16. The electronic device of claim 12, wherein the control module further generates gesture data to the touchpad data processing module when the application program requests a gesture operation.
17. A non-transitory computer readable recording medium device storing a program, wherein when an electronic device including a first screen and a second screen loads and executes the program, the program causes the electronic device to perform the steps of:
receiving touch data generated by the second screen in response to a touch behavior;
determining, according to the touch data, whether the touch behavior belongs to a touchpad operation command or a touchscreen operation command;
when the touch behavior belongs to the touchpad operation command, triggering a corresponding touchpad operation according to the touch data; and
when the touch behavior belongs to the touchscreen operation command, triggering a corresponding touchscreen operation according to the touch data.
CN201811267704.XA 2018-10-29 2018-10-29 Control method, electronic device and non-transitory computer readable recording medium device Pending CN111104079A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811267704.XA CN111104079A (en) 2018-10-29 2018-10-29 Control method, electronic device and non-transitory computer readable recording medium device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811267704.XA CN111104079A (en) 2018-10-29 2018-10-29 Control method, electronic device and non-transitory computer readable recording medium device

Publications (1)

Publication Number Publication Date
CN111104079A 2020-05-05

Family

ID=70419657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811267704.XA Pending CN111104079A (en) 2018-10-29 2018-10-29 Control method, electronic device and non-transitory computer readable recording medium device

Country Status (1)

Country Link
CN: CN111104079A

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1734392A (en) * 2004-08-10 2006-02-15 株式会社东芝 Electronic apparatus having universal human interface
US20110047459A1 (en) * 2007-10-08 2011-02-24 Willem Morkel Van Der Westhuizen User interface
US20110072345A1 (en) * 2009-09-18 2011-03-24 Lg Electronics Inc. Mobile terminal and operating method thereof
US20110239157A1 (en) * 2010-03-24 2011-09-29 Acer Incorporated Multi-Display Electric Devices and Operation Methods Thereof
CN102830892A (en) * 2011-06-16 2012-12-19 宏碁股份有限公司 Touch control method and electronic device
US20140143676A1 (en) * 2011-01-05 2014-05-22 Razer (Asia-Pacific) Pte Ltd. Systems and Methods for Managing, Selecting, and Updating Visual Interface Content Using Display-Enabled Keyboards, Keypads, and/or Other User Input Devices
US20170052698A1 (en) * 2011-02-10 2017-02-23 Samsung Electronics Co., Ltd. Portable device comprising a touch-screen display, and method for controlling same
US20180059717A1 (en) * 2014-02-06 2018-03-01 Samsung Electronics Co., Ltd. Electronic device and method for controlling displays



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination