CN111078056B - Electronic device and signal processing method - Google Patents


Info

Publication number
CN111078056B
CN111078056B
Authority
CN
China
Prior art keywords
display panel
touch point
point information
sensing signal
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911403266.XA
Other languages
Chinese (zh)
Other versions
CN111078056A (en)
Inventor
李华桥 (Li Huaqiao)
李静 (Li Jing)
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201911403266.XA
Publication of CN111078056A
Application granted
Publication of CN111078056B

Classifications

    • G06F3/0416 — Digitisers, e.g. for touch screens or touch pads: control or interface arrangements specially adapted for digitisers
    • G06F1/1641 — Constructional details of portable computers: details related to the display arrangement, the display being formed by a plurality of foldable display components
    • G06F1/1643 — Constructional details of portable computers: details related to the display arrangement, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F3/0414 — Digitisers characterised by the transducing means: using force sensing means to determine a position
    • G06F3/044 — Digitisers characterised by the transducing means: by capacitive means

Abstract

An electronic apparatus includes a first display panel, a second display panel, and a first processing device. The first display panel is provided with at least one first force-sensitive element, and the first force-sensitive element is used for sensing the touch operation of an operating body on the first display panel and generating a first sensing signal; the second display panel is provided with at least one second force sensitive element, and the second force sensitive element is used for sensing the touch operation of the operating body on the second display panel and generating a second sensing signal; the first processing device is used for receiving the first induction signal and the second induction signal and obtaining input information based on the first induction signal and the second induction signal. The present disclosure also provides a signal processing method.

Description

Electronic device and signal processing method
Technical Field
The present disclosure relates to an electronic device and a signal processing method.
Background
With the rapid development of science and technology, electronic devices such as mobile phones and notebook computers play an increasingly large role in people's work and life. In order to extend the functions of electronic devices and give users a better experience, the forms of electronic devices have become increasingly diverse. The dual-screen notebook computer is a form of notebook computer that has emerged in recent years; each of its two bodies is provided with a display screen, which offers the user a larger operating space and a better visual effect. However, in the related art, when a dual-screen notebook computer is unfolded to 180 degrees, the gap between the touch areas of the two screens is too large for interactive operations, such as cross-screen drag operations, to be performed between the two screens, resulting in a poor user experience.
Disclosure of Invention
One aspect of the present disclosure provides an electronic device including: a first display panel provided with at least one first force-sensitive element, the first force-sensitive element being used for sensing a touch operation of an operating body on the first display panel and generating a first sensing signal; a second display panel provided with at least one second force-sensitive element, the second force-sensitive element being used for sensing a touch operation of the operating body on the second display panel and generating a second sensing signal; and a first processing device used for receiving the first sensing signal and the second sensing signal and obtaining input information based on the first sensing signal and the second sensing signal.
Optionally, the first processing device is further configured to: determine whether the first sensing signal and the second sensing signal satisfy a specific cross-screen input condition; and generate cross-screen input information in a case where the first sensing signal and the second sensing signal satisfy the cross-screen input condition, wherein the cross-screen input condition includes that the time interval between the first sensing signal and the second sensing signal is less than a specific time length.
Optionally, the first display panel and the second display panel each include a display area and a bezel area; the first display panel and the second display panel are configured to acquire touch point information of their respective display areas, the touch point information including position information of a touch point; the electronic device further includes a second processing device, and the second processing device is configured to execute a corresponding cross-screen input operation based on the cross-screen input information and the touch point information of the display areas.
Optionally, the first processing device is further configured to generate first input information about the first display panel based on the first sensing signal, or generate second input information about the second display panel based on the second sensing signal; the second processing device is further used for executing a single-screen input operation on the first display panel based on the first input information or executing a single-screen input operation on the second display panel based on the second input information; or the second processing device is further used for executing the single-screen input operation related to the first display panel based on the touch point information of the first display panel and executing the single-screen input operation related to the second display panel based on the touch point information of the second display panel.
Optionally, the second processing device is configured to: in response to receiving the cross-screen input information, performing a sliding operation from the first display panel to the second display panel when touch point information of the first display panel satisfies a first predetermined condition, wherein the first predetermined condition includes: the touch point information of the first display panel represents that the operation body slides on the first display panel from an initial position to a direction close to the second display panel; and/or the touch point information of the first display panel represents that the operation body slides from an initial position to an edge area of the first display panel on the first display panel.
Optionally, the first display panel has a first display area and a first frame area, and a plurality of first force-sensitive elements are disposed on the first display panel; the second display panel has a second display area and a second frame area, and a plurality of second force-sensitive elements are disposed on the second display panel; the first processing device is further configured to: obtain touch point information on the first display panel based on first sensing signals of the plurality of first force-sensitive elements, wherein the touch point information on the first display panel includes touch point information of the first frame area; and obtain touch point information on the second display panel based on second sensing signals of the plurality of second force-sensitive elements, wherein the touch point information on the second display panel includes touch point information of the second frame area.
Optionally, the second processing device is further configured to: obtaining touch point information of the first display area and the second display area; and executing corresponding cross-screen input operation based on the touch point information of the first display area, the first frame area, the second frame area and the second display area.
Optionally, the second processing device is further configured to: performing a sliding operation between the first display panel and the second display panel based on a trajectory of touch points of the first display area, the first bezel area, the second bezel area, and the second display area.
Optionally, the second processing device is further configured to: obtaining touch point information of the first display area and the second display area from the first processing device; or obtaining touch point information of the first display area and the second display area based on touch signals from the first display panel and the second display panel; or comparing the touch point information from the first processing device with touch point information obtained based on touch signals of the first display panel and the second display panel, and if the comparison result satisfies a specific condition, correcting the touch point information from the first processing device using the touch point information obtained based on the touch signals of the first display panel and the second display panel.
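The correction option above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the disclosure only requires that the comparison result satisfy "a specific condition", so the Euclidean-distance rule and the tolerance value used here are assumptions, and the function and parameter names are hypothetical.

```python
# Sketch of the touch-point correction option: a touch point reported by the
# first processing device (derived from force sensing) is compared with the
# touch point derived from the panel's own touch signals, and is replaced
# when the deviation exceeds a tolerance. The distance rule and the default
# tolerance are illustrative assumptions.
import math

def corrected_touch_point(force_point, capacitive_point, tolerance=5.0):
    """Return the touch point to use, preferring the panel-derived reading
    when it disagrees with the force-derived reading by more than tolerance."""
    dx = force_point[0] - capacitive_point[0]
    dy = force_point[1] - capacitive_point[1]
    if math.hypot(dx, dy) > tolerance:
        return capacitive_point  # comparison satisfied the condition: correct
    return force_point  # readings agree: keep the first processing device's value

print(corrected_touch_point((100.0, 50.0), (101.0, 50.0)))  # -> (100.0, 50.0)
print(corrected_touch_point((100.0, 50.0), (120.0, 50.0)))  # -> (120.0, 50.0)
```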
Another aspect of the present disclosure provides a signal processing method, including: receiving a first sensing signal from at least one first force-sensitive element; receiving a second sensing signal from at least one second force-sensitive element; and obtaining input information based on the first sensing signal and the second sensing signal, wherein the at least one first force-sensitive element is disposed on a first display panel and is used for sensing a touch operation of an operating body on the first display panel and generating the first sensing signal, and the at least one second force-sensitive element is disposed on a second display panel and is used for sensing a touch operation of the operating body on the second display panel and generating the second sensing signal.
Another aspect of the present disclosure provides a computer system comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically shows an application scenario of an electronic device according to an embodiment of the present disclosure;
FIG. 2 schematically illustrates a composition diagram of an electronic device according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a schematic diagram of a cross-screen operation according to an embodiment of the disclosure;
fig. 4 schematically shows a schematic view of a first display panel and a second display panel according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a schematic diagram of a cross-screen operation according to another embodiment of the present disclosure;
fig. 6 schematically shows a flow chart of a signal processing method according to an embodiment of the present disclosure; and
fig. 7 schematically shows a block diagram of a computer system suitable for implementing a signal processing method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
An embodiment of the present disclosure provides an electronic apparatus including a first display panel, a second display panel, and a first processing device. The first display panel is provided with at least one first force-sensitive element, and the first force-sensitive element is used for sensing the touch operation of an operating body on the first display panel and generating a first sensing signal. The second display panel is provided with at least one second force-sensitive element, and the second force-sensitive element is used for sensing the touch operation of the operating body on the second display panel and generating a second sensing signal. The first processing device is used for receiving the first induction signal and the second induction signal and obtaining input information based on the first induction signal and the second induction signal.
Fig. 1 schematically shows an application scenario of an electronic device according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the electronic device according to the embodiment of the disclosure may be, for example, a dual-screen notebook computer 100. The dual-screen notebook computer 100 has a first body 110 and a second body 120 that are rotatable relative to each other; the first body 110 is provided with a first display screen 111, the second body 120 is provided with a second display screen 121, and both display screens may be used for displaying images.
The first display screen 111 and the second display screen 121 may each be a touch panel, and a user may perform touch input by touching the display panel. For the dual-screen notebook computer 100, one important function is dual-screen interaction, such as dragging icons or files between the two screens.
A capacitive screen is one way to implement touch input. The first display screen 111 and the second display screen 121 may both be capacitive screens, and the dual-screen notebook computer may use a central processing unit (CPU) to receive and process the signals of the two capacitive screens and execute corresponding input operations according to the processing results. However, with this approach, cross-screen dragging between the two display screens requires the distance between their touch areas to be small; otherwise, the CPU cannot recognize a cross-screen drag operation from one screen to the other. Through designs such as the hinge shaft, the gap between the two bodies can be made very small. Even so, each display screen has an edge area in which touch operations cannot be recognized, and the edge areas of the two display screens together still have a certain width, so that a certain distance remains between the touch areas of the two display screens and dual-screen interactive operation cannot be guaranteed.
In the electronic device of the embodiment of the disclosure, force-sensitive elements are provided on each of the two display screens, and a processing device analyzes the sensing signals of the force-sensitive elements on the two display panels to obtain input information between the two display panels. For example, whether the operating body performs a cross-screen operation can be determined according to the sequence and interval time of the sensing signals on the two display panels; if it does, a cross-screen input event is generated. This at least partially solves the problem in the related art that dual-screen interactive operation cannot be realized because of the large gap between the touch areas of the two display screens: a cross-screen input event is obtained by analyzing the forces applied to the two screens, thereby achieving the technical effect of dual-screen interactive operation.
Fig. 2 schematically shows a composition diagram of an electronic device according to an embodiment of the disclosure.
As shown in fig. 2, the electronic device 200 includes a first display panel 210, a second display panel 220, and a first processing device 230.
The first display panel 210 is provided with at least one first force-sensitive element 240, and the first force-sensitive element 240 is used for sensing a touch operation of an operating body on the first display panel 210 and generating a first sensing signal.
For example, the first force sensitive element 240 may be an elastic wave sensor. When the operating body performs a touch operation on the first display panel 210, an elastic vibration wave is formed on the first display panel 210, and the elastic wave sensor can capture a vibration signal generated by the first display panel 210. The first sensing signal may be, for example, a vibration signal including waveform information of a vibration wave.
The second display panel 220 is provided with at least one second force sensitive element 250, and the second force sensitive element 250 is used for sensing a touch operation of an operating body on the second display panel 220 and generating a second sensing signal.
For example, the second force sensitive element 250 may be an elastic wave sensor. When the operating body performs a touch operation on the second display panel 220, an elastic vibration wave is formed on the second display panel 220, and the elastic wave sensor can capture a vibration signal generated by the second display panel 220. The second sensing signal may be, for example, a vibration signal including waveform information of a vibration wave.
The first processing device 230 is configured to receive the first sensing signal and the second sensing signal, and obtain the input information based on the first sensing signal and the second sensing signal. The first processing device 230 may be, for example, a processing chip.
According to an embodiment of the present disclosure, obtaining the input information based on the first sensing signal and the second sensing signal may include: determining whether the first sensing signal and the second sensing signal satisfy a specific cross-screen input condition, and generating cross-screen input information under the condition that the first sensing signal and the second sensing signal satisfy the cross-screen input condition. The cross-screen input condition comprises that the time interval between the first sensing signal and the second sensing signal is less than a specific time length.
For example, the first processing device 230 may determine the moving direction of the operating body between the two display panels and the interval time according to the sequence and time interval in which the first sensing signal and the second sensing signal are received. If the first sensing signal is received first and the second sensing signal is received 0.2 s later, it may be considered that the operating body has moved from the first display panel 210 to the second display panel 220, with an interval of 0.2 s. If the specific time length is 1 s, then since 0.2 s is less than 1 s, a cross-screen event from the first display panel 210 to the second display panel 220 may be generated. The electronic device can then execute a corresponding cross-screen input operation according to the cross-screen event.
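The timing check described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the firmware of the first processing device 230: the function name, the panel identifiers, and the 1.0 s default threshold are illustrative (the disclosure only requires "a specific time length").

```python
# Sketch of the cross-screen input condition: two force sensing signals, one
# per panel, form a cross-screen event when their time interval is below a
# threshold; the order of arrival gives the movement direction.
CROSS_SCREEN_THRESHOLD_S = 1.0  # the "specific time length" (assumed value)

def detect_cross_screen(first_signal_time, second_signal_time,
                        threshold=CROSS_SCREEN_THRESHOLD_S):
    """Return (source_panel, target_panel) if the two sensing signals
    satisfy the cross-screen input condition, otherwise None."""
    interval = abs(second_signal_time - first_signal_time)
    if interval >= threshold:
        return None  # too far apart in time: treat as two single-screen inputs
    if first_signal_time < second_signal_time:
        return ("panel1", "panel2")  # operating body moved panel 1 -> panel 2
    return ("panel2", "panel1")

# The 0.2 s example from the text:
print(detect_cross_screen(10.0, 10.2))  # -> ('panel1', 'panel2')
```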
According to the embodiment of the disclosure, the electronic device can obtain input information between the two display panels by analyzing the sensing signals of the force-sensitive elements on the two display panels; for example, a cross-screen event between the two display panels can be detected. This at least partially solves the problem in the related art that dual-screen interactive operation cannot be realized because of the large gap between the touch areas of the two display screens, so that a cross-screen input event is obtained by analyzing the forces applied to the two screens and the technical effect of dual-screen interactive operation is realized.
According to an embodiment of the present disclosure, the first display panel and the second display panel each include a display area and a bezel area, and the first display panel and the second display panel are configured to acquire touch point information of the respective display areas, the touch point information including coordinate position information of a touch point.
The electronic device further comprises second processing means, which may be, for example, a CPU. The first processing device may transmit the cross-screen input information to the second processing device, and the first display panel and the second display panel may transmit respective touch point information to the second processing device.
For example, the first display panel and the second display panel may both be capacitive touch screens, the display area may refer to a touchable area on the first display panel and the second display panel, when the operating body operates in the touchable area, the touchable area may generate a corresponding capacitive signal, and the CPU may determine a specific coordinate position of the touch point according to the capacitive signal. The bezel area may refer to a non-touch area on the first display panel and the second display panel.
The second processing means may perform a corresponding cross-screen input operation based on the cross-screen input information and the touch point information of the display area. The second processing device may first determine whether the touch point of the display area satisfies a specific condition, and perform the cross-screen operation based on the cross-screen input information and the touch point information on the two display panels only when the touch point of the display area satisfies the specific condition.
For example, in a case where the touch point information of the first display panel satisfies a first predetermined condition, a sliding operation from the first display panel to the second display panel is performed in response to receiving the cross-screen input information. The first predetermined condition includes: the touch point information of the first display panel represents that the operating body slides on the first display panel from an initial position in a direction approaching the second display panel; and/or the touch point information of the first display panel represents that the operating body slides on the first display panel from an initial position to the edge of the display area of the first display panel. The sliding operation includes a drag operation.
FIG. 3 schematically illustrates a schematic diagram of a cross-screen operation according to an embodiment of the disclosure.
As shown in fig. 3, when the touch point information represents that the operating body drags the icon 201 on the first display panel 210 from position A, far from the second display panel, to position B in a direction approaching the second display panel, and the cross-screen input information sent by the first processing device and the information of position C on the second display panel are received, it is considered that the user intends to perform a cross-screen drag operation from position B to position C, and the icon 201 can be controlled to move from position B to position C.
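The first predetermined condition can be sketched as a check on the touch trajectory. This is an illustrative sketch only: it assumes the second panel lies in the +x direction from the first panel, and the panel width, edge-margin value, and function name are hypothetical (the disclosure does not specify coordinates or dimensions).

```python
# Sketch of the first predetermined condition: the trajectory on the first
# display panel must show the operating body sliding from its initial
# position toward the second display panel (assumed here to be in the +x
# direction) and/or into the edge area adjacent to it.
PANEL_WIDTH = 1920   # hypothetical width of the first panel, in pixels
EDGE_MARGIN = 40     # hypothetical width of the edge region near the hinge

def satisfies_first_condition(trajectory, panel_width=PANEL_WIDTH,
                              edge_margin=EDGE_MARGIN):
    """trajectory: list of (x, y) touch points on the first display panel,
    ordered in time. True if the slide moves toward the second panel or
    ends in the edge area adjacent to it."""
    if len(trajectory) < 2:
        return False  # a single point is not a slide
    start_x, end_x = trajectory[0][0], trajectory[-1][0]
    moves_toward_second = end_x > start_x
    ends_in_edge_area = end_x >= panel_width - edge_margin
    return moves_toward_second or ends_in_edge_area

# A drag from position A toward position B near the hinge, as in fig. 3:
print(satisfies_first_condition([(500, 300), (1200, 310), (1900, 320)]))  # -> True
```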
According to an embodiment of the present disclosure, the second processing device is further configured to perform a single screen input operation with respect to the first display panel based on the touch point information of the first display panel, and perform a single screen input operation with respect to the second display panel based on the touch point information of the second display panel.
For example, when the operation body performs an input operation only on the first display panel, such as a click, press, or slide, the CPU may perform the corresponding single-screen input operation on the first display panel according to the touch point information.
According to an embodiment of the present disclosure, the first processing device is further configured to generate first input information regarding the first display panel based on the first sensing signal or generate second input information regarding the second display panel based on the second sensing signal.
The second processing device is also configured to perform a single-screen input operation with respect to the first display panel based on the first input information, or to perform a single-screen input operation with respect to the second display panel based on the second input information.
For example, the first processing device matched with the elastic wave sensors may calculate operation information of the operation body on the first display panel and the second display panel from waveform information of the vibration waves on the two panels. It may, for example, obtain the degree of pressing of the operation body at the operation point and the orientation of the touch point, and further analyze whether the operation performed by the operation body is a click operation or a slide operation; in the case of a slide operation, the first processing device may further analyze the sliding direction according to the change in the orientation of the touch point. In the case where the first force-sensitive elements and the second force-sensitive elements are plural, the first processing device may further calculate the specific coordinate position of a touch point on the first display panel by combining the sensing signals of the plurality of first force-sensitive elements, and calculate the specific coordinate position of a touch point on the second display panel by combining the sensing signals of the plurality of second force-sensitive elements.
The first processing device may transmit the obtained operation information to the CPU, and the CPU may perform a single-screen input operation such as clicking, pressing, sliding, or the like on the first display panel or the second display panel according to the operation information.
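The disclosure does not specify how the first processing device converts the vibration waveforms of several force-sensitive elements into a coordinate. One simple stand-in, shown purely for illustration, is an amplitude-weighted centroid over the known sensor positions:

```python
def locate_touch(sensors):
    """Estimate a touch position from several force-sensitive elements.
    `sensors` is a list of ((x, y), amplitude) pairs, where (x, y) is a
    sensor's known position on the panel and `amplitude` is the strength of
    the vibration signal it sensed. This weighted centroid is an assumed
    method for illustration, not the patent's algorithm."""
    total = sum(amp for _, amp in sensors)
    if total == 0:
        return None  # no vibration sensed, no touch to locate
    x = sum(px * amp for (px, _), amp in sensors) / total
    y = sum(py * amp for (_, py), amp in sensors) / total
    return (x, y)
```

With four corner sensors reporting equal amplitudes, the estimate lands at the panel center; a sensor closer to the touch point reports a larger amplitude and pulls the estimate toward itself.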
Fig. 4 schematically illustrates a schematic view of a first display panel and a second display panel according to an embodiment of the present disclosure.
As shown in fig. 4, according to an embodiment of the disclosure, the first display panel 210 has a first display area 211 and a first bezel area 212, and the first display panel 210 has a plurality of first force-sensitive elements 240 disposed thereon.
The first processing device is further configured to obtain touch point information on the first display panel 210 based on the first sensing signals of the plurality of first force-sensitive elements 240, wherein the touch point information on the first display panel includes the touch point information of the first bezel area 212.
The first processing device may calculate the coordinate information of a touch point on the first display panel 210 by combining the waveform information of the vibration waves sensed by the plurality of first force-sensitive elements 240. Since an elastic wave sensor senses the vibration signal generated when a touch is made, and the first display area 211 and the first bezel area 212 are integrated, vibration signals from both the first display area 211 and the first bezel area 212 can be transmitted to the plurality of first force-sensitive elements 240. The first processing device can therefore calculate the specific position of a touch point on the first bezel area 212 from its vibration signal, so that the first bezel area 212 also serves as a touch-enabled area.
Similarly, the second display panel 220 has a second display area 221 and a second bezel area 222, and the second display panel 220 is provided with a plurality of second force-sensitive elements 250. The first processing device is further configured to obtain touch point information on the second display panel 220 based on the second sensing signals of the plurality of second force-sensitive elements 250, wherein the touch point information on the second display panel 220 includes touch point information of the second bezel area 222. The second bezel area 222 is thus also a touch-enabled area.
After the first processing device obtains the touch point information on the first and second bezel areas 212 and 222, the touch point information on the first and second bezel areas 212 and 222 may be sent to the second processing device.
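Once a coordinate has been estimated from the vibration signals, deciding whether it falls in the display area or the bezel area is a simple containment test. The rectangle convention below (display area given as left, top, right, bottom, with the bezel being everything outside it) is an assumption for illustration:

```python
def classify_region(point, display_rect):
    """Classify an elastic-wave-derived touch point as belonging to the
    display area or the bezel area of a panel. `display_rect` is the
    display area as (left, top, right, bottom); any point outside it but
    still on the panel is treated as bezel. Illustrative only."""
    x, y = point
    left, top, right, bottom = display_rect
    if left <= x <= right and top <= y <= bottom:
        return "display"
    return "bezel"
```

This is the step that lets the bezel areas 212 and 222, which carry no capacitive layer, still be reported as touch-enabled regions.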
According to an embodiment of the disclosure, the second processing device is further configured to: obtaining touch point information of a first display area and a second display area; and performing a corresponding cross-screen input operation based on the touch point information of the first display area 211, the first bezel area 212, the second bezel area 222, and the second display area 221.
For example, the second processing device may obtain touch point information on the first display area 211 and the second display area 221 according to the capacitance signal, and perform a corresponding cross-screen input operation in combination with the touch point information on the first bezel area 212 and the second bezel area 222 obtained from the first processing device.
By combining the touch point information of the first display area 211, the first bezel area 212, the second bezel area 222, and the second display area 221, the electronic device of the embodiment of the disclosure can recognize a touch operation of the operation body in essentially any area of the two display panels. Moreover, because the gap between the first body and the second body can be made small by designing an intermediate structure such as a rotating shaft, the distance between the first bezel area 212 and the second bezel area 222 can also be made small, so that the first display panel and the second display panel together behave as one whole display panel.
According to an embodiment of the disclosure, the second processing device is further configured to: the sliding operation between the first display panel 210 and the second display panel 220 is performed based on the trajectories of the touch points of the first display region 211, the first bezel region 212, the second bezel region 222, and the second display region 221.
FIG. 5 schematically shows a schematic diagram of a cross-screen operation according to another embodiment of the present disclosure.
As shown in fig. 5, the track of the touch point can be known through the touch point information of the first display area 211, the first bezel area 212, the second bezel area 222, and the second display area 221, and then a corresponding sliding or dragging operation can be performed according to the track of the touch point.
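One way to recognize such a trajectory, sketched here under assumed region names, is to check that the regions visited by the touch points appear in the order first display area → first bezel → second bezel → second display area:

```python
# Assumed region labels, ordered from the first panel to the second.
REGION_ORDER = ["first_display", "first_bezel", "second_bezel", "second_display"]

def is_cross_screen_slide(track):
    """Judge whether a touch trajectory slides from the first panel to the
    second. `track` is a list of (region, (x, y)) samples. The check —
    regions visited in non-decreasing order, starting in the first display
    area and ending in the second — is an illustrative heuristic."""
    visited = []
    for region, _point in track:
        if not visited or visited[-1] != region:
            visited.append(region)  # collapse consecutive duplicates
    idx = [REGION_ORDER.index(r) for r in visited if r in REGION_ORDER]
    return (len(idx) >= 2 and idx == sorted(idx)
            and idx[0] == 0 and idx[-1] == len(REGION_ORDER) - 1)
```

A track whose samples pass through all four regions in order is treated as one continuous cross-screen slide, even though its points come from two different sensing systems.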
According to an embodiment of the present disclosure, the second processing device may further obtain touch point information of the first display area and the second display area based on touch signals from the first display panel and the second display panel.
As described above, the second processing device may obtain touch point information of the first display area and the second display area from the capacitance signals of the first display panel and the second display panel.
According to an embodiment of the present disclosure, the second processing device may further obtain touch point information of the first display area and the second display area from the first processing device.
The first processing device can obtain the touch point information on the first display panel 210 according to the sensing signals of the plurality of first force-sensitive elements 240, where the touch point information on the first display panel 210 includes not only the touch point information of the first bezel area 212 but also that of the first display area 211. Similarly, the first processing device can obtain the touch point information of the second display area 221 according to the sensing signals of the plurality of second force-sensitive elements 250. Accordingly, the second processing device can obtain the touch point information of the first display area 211 and the second display area 221 from the first processing device.
According to an embodiment of the present disclosure, the second processing device may further compare the touch point information from the first processing device with touch point information obtained based on touch signals of the first display panel and the second display panel, and correct the touch point information from the first processing device using the touch point information obtained based on the touch signals of the first display panel and the second display panel in a case where a comparison result satisfies a specific condition.
For example, the comparison result satisfying the specific condition may mean that the difference between the touch point information from the first processing device and the touch point information obtained based on the touch signals of the first display panel and the second display panel is greater than a predetermined threshold, that is, the two differ significantly.
Because a capacitance signal directly reflects the position of a touch point, the touch point information of the first and second display areas obtained from the capacitance signal is more accurate. The touch point information of the first processing device can therefore be corrected using the capacitance-derived touch point information of the first and second display areas, improving the accuracy of the touch point information obtained by the first processing device and, by extension, the accuracy of the touch point information of the bezel areas.
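The correction step can be sketched as follows; the distance threshold and the use of Euclidean distance are illustrative assumptions (the disclosure only requires that the difference exceed a predetermined threshold):

```python
import math

CORRECTION_THRESHOLD = 10.0  # assumed predetermined threshold, in pixels

def correct_touch_point(force_pt, cap_pt):
    """Compare a touch point from the first processing device (force
    sensors) with the capacitance-derived touch point. If they differ by
    more than the threshold, adopt the more accurate capacitance-derived
    point; otherwise keep the force-derived point."""
    if cap_pt is None:
        return force_pt  # bezel touch: no capacitance reading to compare
    if math.dist(force_pt, cap_pt) > CORRECTION_THRESHOLD:
        return cap_pt
    return force_pt
```

When the operation point lies in a bezel area there is no capacitance reading, so the force-derived point is used as-is; in the display areas, large disagreements are resolved in favor of the capacitance signal.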
Another aspect of the embodiments of the present disclosure also provides a signal processing method.
Fig. 6 schematically shows a flow chart of a signal processing method according to an embodiment of the present disclosure.
As shown in fig. 6, the signal processing method includes operations S310 to S330.
Receiving a first sensing signal from at least one first force sensitive element in operation S310;
receiving a second sensing signal from the at least one second force sensitive element in operation S320; and
in operation S330, input information is obtained based on the first sensing signal and the second sensing signal,
the touch control system comprises a first display panel, at least one first force-sensitive element, at least one second display panel and at least one second force-sensitive element, wherein the first force-sensitive element is arranged on the first display panel, the first force-sensitive element is used for sensing touch operation of an operating body on the first display panel and generating a first sensing signal, the second force-sensitive element is arranged on the second display panel, and the second force-sensitive element is used for sensing touch operation of the operating body on the second display panel and generating a second sensing signal.
According to an embodiment of the present disclosure, obtaining the input information based on the first sensing signal and the second sensing signal may include: determining whether the first sensing signal and the second sensing signal meet a specific cross-screen input condition; and generating cross-screen input information under the condition that the first induction signal and the second induction signal meet a cross-screen input condition, wherein the cross-screen input condition comprises that the time interval between the first induction signal and the second induction signal is less than a specific time.
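The timing test at the heart of operation S330 can be sketched as follows. The names, data shapes, and the 100 ms window are assumptions for illustration, since the disclosure leaves the "specific time length" unspecified:

```python
from dataclasses import dataclass

CROSS_SCREEN_WINDOW_S = 0.1  # assumed value of the "specific time length"

@dataclass
class SensingSignal:
    panel: str        # "first" or "second"
    timestamp: float  # arrival time, seconds

def detect_cross_screen(first: SensingSignal, second: SensingSignal):
    """Generate cross-screen input information when the first and second
    sensing signals arrive within the specific time interval; otherwise
    return None and let each signal be handled as single-screen input."""
    if abs(first.timestamp - second.timestamp) < CROSS_SCREEN_WINDOW_S:
        return {"type": "cross_screen", "panels": (first.panel, second.panel)}
    return None
```

Two signals 30 ms apart satisfy the condition and yield cross-screen input information; signals half a second apart do not.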
According to an embodiment of the present disclosure, the first display panel and the second display panel each include a display area and a bezel area; the first display panel and the second display panel are configured to acquire touch point information of the respective display areas, the touch point information including position information of a touch point.
The signal processing method further includes: and executing corresponding cross-screen input operation based on the cross-screen input information and the touch point information of the display area.
According to an embodiment of the present disclosure, the signal processing method further includes: generating first input information on the first display panel based on the first sensing signal or generating second input information on the second display panel based on the second sensing signal; performing a single-screen input operation with respect to the first display panel based on the first input information or performing a single-screen input operation with respect to the second display panel based on the second input information; or performing a single screen input operation with respect to the first display panel based on the touch point information of the first display panel, and performing a single screen input operation with respect to the second display panel based on the touch point information of the second display panel.
According to an embodiment of the present disclosure, performing a corresponding cross-screen input operation based on cross-screen input information and touch point information of a display area includes: and in response to receiving the cross-screen input information, executing a sliding operation from the first display panel to the second display panel under the condition that the touch point information of the first display panel meets a first preset condition. Wherein the first predetermined condition comprises: the touch point information of the first display panel represents that the operation body slides from the initial position to the direction close to the second display panel on the first display panel; and/or the touch point information of the first display panel represents that the operation body slides from the initial position to the edge of the display area of the first display panel on the first display panel.
According to an embodiment of the present disclosure, the first display panel has a first display area and a first bezel area, and a plurality of first force-sensitive elements are disposed on the first display panel; the second display panel has a second display area and a second bezel area, and a plurality of second force-sensitive elements are disposed on the second display panel.
The signal processing method further includes: obtaining touch point information on the first display panel based on the first sensing signals of the first force-sensitive elements, wherein the touch point information on the first display panel comprises touch point information of the first frame area; and obtaining touch point information on the second display panel based on the second sensing signals of the plurality of second force sensing elements, wherein the touch point information on the second display panel comprises touch point information of the second frame area.
According to an embodiment of the present disclosure, the signal processing method further includes: obtaining touch point information of a first display area and a second display area; and executing corresponding cross-screen input operation based on the touch point information of the first display area, the first frame area, the second frame area and the second display area.
According to an embodiment of the present disclosure, the signal processing method further includes: and performing a sliding operation between the first display panel and the second display panel based on the trajectories of the touch points of the first display area, the first bezel area, the second bezel area, and the second display area.
According to an embodiment of the present disclosure, the signal processing method further includes: obtaining touch point information of the first display area and the second display area from the first processing device; obtaining touch point information of the first display area and the second display area based on touch signals from the first display panel and the second display panel; the touch point information from the first processing device is compared with touch point information obtained based on touch signals of the first display panel and the second display panel, and if the comparison result satisfies a specific condition, the touch point information from the first processing device is corrected using the touch point information obtained based on the touch signals of the first display panel and the second display panel.
For the signal processing method according to the embodiment of the disclosure, reference may be made to fig. 1 to fig. 5 and the corresponding description above, which is not repeated here.
The signal processing method according to the embodiments of the present disclosure may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system in a package, or an Application Specific Integrated Circuit (ASIC); it may also be implemented by hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or implemented in any one of the three implementations of software, hardware, and firmware, or in a suitable combination of any of them. Alternatively, the signal processing method according to an embodiment of the present disclosure may be at least partially implemented as computer program modules which, when executed, perform the corresponding functions.
FIG. 7 schematically illustrates a block diagram of a computer system suitable for implementing the above-described method according to an embodiment of the present disclosure. The computer system illustrated in FIG. 7 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 7, computer system 500 includes a processor 510, a computer-readable storage medium 520, and a signal receiver 530. The computer system 500 may perform a method according to an embodiment of the disclosure.
In particular, processor 510 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 510 may also include on-board memory for caching purposes. Processor 510 may be a single processing unit or a plurality of processing units for performing different actions of a method flow according to embodiments of the disclosure.
Computer-readable storage media 520, for example, may be non-volatile computer-readable storage media, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 520 may include a computer program 521, which computer program 521 may include code/computer-executable instructions that, when executed by the processor 510, cause the processor 510 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 521 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in the computer program 521 may include one or more program modules, including, for example, module 521A, module 521B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, and when these program modules are executed by the processor 510, the processor 510 may execute the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, the processor 510 may interact with the signal receiver 530 to perform a signal processing method according to an embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least part of the signal processing method may be implemented as the computer program modules described with reference to fig. 7, which, when executed by the processor 510, may implement the respective operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure may be combined and/or coupled in various ways, even if such combinations or couplings are not expressly recited in the present disclosure. In particular, various combinations and/or couplings of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit and teaching of the present disclosure. All such combinations and/or couplings fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (9)

1. An electronic device, comprising:
a first display panel, wherein at least one first force-sensitive element is disposed on the first display panel, the first force-sensitive element being configured to sense a touch operation of an operation body on the first display panel and generate a first sensing signal;
the second display panel is provided with at least one second force sensitive element, and the second force sensitive element is used for sensing the touch operation of an operating body on the second display panel and generating a second sensing signal; and
the first processing device is used for receiving the first induction signal and the second induction signal and obtaining input information based on the first induction signal and the second induction signal;
the first processing device is further configured to:
determining whether the first sensing signal and the second sensing signal satisfy a specific cross-screen input condition; and
generating cross-screen input information in case that the first sensing signal and the second sensing signal satisfy the cross-screen input condition,
wherein the cross-screen input condition comprises that a time interval between the first sensing signal and the second sensing signal is less than a specific time length.
2. The apparatus of claim 1, wherein:
the first display panel and the second display panel both comprise a display area and a frame area;
the first display panel and the second display panel are configured to acquire touch point information of respective display areas, the touch point information including position information of a touch point;
the electronic equipment further comprises a second processing device, and the second processing device is used for executing corresponding cross-screen input operation based on the cross-screen input information and the touch point information of the display area.
3. The apparatus of claim 2, wherein:
the first processing device is further used for generating first input information about the first display panel based on the first sensing signal or generating second input information about the second display panel based on the second sensing signal;
the second processing device is further used for executing a single-screen input operation on the first display panel based on the first input information or executing a single-screen input operation on the second display panel based on the second input information; or
The second processing device is further configured to perform a single-screen input operation with respect to the first display panel based on the touch point information of the first display panel, and perform a single-screen input operation with respect to the second display panel based on the touch point information of the second display panel.
4. The apparatus of claim 2, wherein the second processing device is configured to:
performing a sliding operation from the first display panel to the second display panel in response to receiving the cross-screen input information in a case where touch point information of the first display panel satisfies a first predetermined condition,
wherein the first predetermined condition comprises:
the touch point information of the first display panel represents that the operation body slides on the first display panel from an initial position to a direction close to the second display panel; and/or
The touch point information of the first display panel represents that the operation body slides from an initial position to the edge of the display area of the first display panel on the first display panel.
5. The apparatus of claim 1, wherein:
the first display panel is provided with a first display area and a first frame area, and a plurality of first force-sensitive elements are arranged on the first display panel;
the second display panel is provided with a second display area and a second frame area, and a plurality of second force-sensitive elements are arranged on the second display panel;
the first processing device is further configured to:
obtaining touch point information on the first display panel based on first sensing signals of the first force-sensitive elements, wherein the touch point information on the first display panel comprises touch point information of the first frame area;
and obtaining touch point information on the second display panel based on second sensing signals of the plurality of second force sensitive elements, wherein the touch point information on the second display panel comprises touch point information of the second frame area.
6. The apparatus of claim 5, wherein the second processing means is further for:
obtaining touch point information of the first display area and the second display area;
and executing corresponding cross-screen input operation based on the touch point information of the first display area, the first frame area, the second frame area and the second display area.
7. The apparatus of claim 6, wherein the second processing device is further to:
performing a sliding operation between the first display panel and the second display panel based on a trajectory of touch points of the first display area, the first bezel area, the second bezel area, and the second display area.
8. The apparatus of claim 6, wherein the second processing device is further to:
obtaining touch point information of the first display area and the second display area from the first processing device; or
Obtaining touch point information of the first display area and the second display area based on touch signals from the first display panel and the second display panel; or
The touch point information from the first processing device is compared with touch point information obtained based on touch signals of the first display panel and the second display panel, and if the comparison result satisfies a specific condition, the touch point information from the first processing device is corrected using the touch point information obtained based on the touch signals of the first display panel and the second display panel.
9. A signal processing method, comprising:
receiving a first sensing signal from at least one first force sensitive element;
receiving a second sensing signal from the at least one second force sensitive element; and
deriving input information based on the first sensing signal and the second sensing signal,
the touch control system comprises at least one first force-sensitive element, at least one second force-sensitive element and at least one first display panel, wherein the at least one first force-sensitive element is arranged on the first display panel and is used for sensing touch operation of an operating body on the first display panel and generating a first sensing signal, the at least one second force-sensitive element is arranged on the second display panel and is used for sensing touch operation of the operating body on the second display panel and generating a second sensing signal;
the signal processing method further includes:
determining whether the first sensing signal and the second sensing signal satisfy a specific cross-screen input condition; and
generating cross-screen input information in case that the first sensing signal and the second sensing signal satisfy the cross-screen input condition,
wherein the cross-screen input condition comprises that a time interval between the first sensing signal and the second sensing signal is less than a specific time length.
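The cross-screen decision in claim 9 reduces to a timing check. The sketch below assumes a 0.5 s window; the claim itself only requires "a specific time length", so both the function names and the interval value are illustrative:

```python
# Illustrative sketch of claim 9's cross-screen input condition: two
# sensing signals count as one cross-screen input when the interval
# between them is less than a specified duration.
def is_cross_screen_input(t_first: float, t_second: float,
                          max_interval: float = 0.5) -> bool:
    """Return True when the time interval (in seconds) between the first
    and second sensing signals is less than the specific time length."""
    return abs(t_second - t_first) < max_interval


def handle_signals(t_first: float, t_second: float) -> str:
    # Generate cross-screen input information only when the condition holds.
    if is_cross_screen_input(t_first, t_second):
        return "cross-screen input"
    return "independent inputs"
```

With the assumed 0.5 s window, signals arriving 0.2 s apart would be merged into cross-screen input information, while signals 1 s apart would be treated as independent touches on each panel.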
CN201911403266.XA 2019-12-30 2019-12-30 Electronic device and signal processing method Active CN111078056B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911403266.XA CN111078056B (en) 2019-12-30 2019-12-30 Electronic device and signal processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911403266.XA CN111078056B (en) 2019-12-30 2019-12-30 Electronic device and signal processing method

Publications (2)

Publication Number Publication Date
CN111078056A CN111078056A (en) 2020-04-28
CN111078056B true CN111078056B (en) 2021-07-16

Family

ID=70320175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911403266.XA Active CN111078056B (en) 2019-12-30 2019-12-30 Electronic device and signal processing method

Country Status (1)

Country Link
CN (1) CN111078056B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112130799A (en) * 2020-09-24 2020-12-25 Lenovo (Beijing) Co., Ltd. Control method and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106293390A (en) * 2016-07-28 2017-01-04 Nubia Technology Co., Ltd. Partial picture extraction apparatus and method based on a dual-screen terminal
CN106354374A (en) * 2016-09-30 2017-01-25 Vivo Mobile Communication Co., Ltd. Icon moving method and mobile terminal
CN108920075A (en) * 2018-06-26 2018-11-30 Nubia Technology Co., Ltd. Dual-screen mobile terminal control method, mobile terminal and computer-readable storage medium
CN109428969A (en) * 2017-08-25 2019-03-05 ZTE Corporation Edge touch control method and device for a dual-screen terminal, and computer-readable storage medium
CN110308821A (en) * 2019-06-28 2019-10-08 Lenovo (Beijing) Co., Ltd. Touch-control response method and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5853016B2 (en) * 2010-03-24 2016-02-09 ネオノード インコーポレイテッド Lens array for light-based touch screen
US20130005469A1 (en) * 2011-06-30 2013-01-03 Imerj LLC Dual screen game module
CN103164067B (en) * 2011-12-19 2016-04-27 联想(北京)有限公司 Judge the method and the electronic equipment that touch input
US10585637B2 (en) * 2017-03-27 2020-03-10 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device


Also Published As

Publication number Publication date
CN111078056A (en) 2020-04-28

Similar Documents

Publication Publication Date Title
US10031604B2 (en) Control method of virtual touchpad and terminal performing the same
US8847904B2 (en) Gesture recognition method and touch system incorporating the same
US8525776B2 (en) Techniques for controlling operation of a device with a virtual touchscreen
US9285880B2 (en) Touch panel device and method of controlling a touch panel device
US8446389B2 (en) Techniques for creating a virtual touchscreen
US8413075B2 (en) Gesture movies
US9304656B2 (en) Systems and method for object selection on presence sensitive devices
US9678606B2 (en) Method and device for determining a touch gesture
US20110069018A1 (en) Double Touch Inputs
US20050270278A1 (en) Image display apparatus, multi display system, coordinate information output method, and program for implementing the method
US8743065B2 (en) Method of identifying a multi-touch rotation gesture and device using the same
US20130207905A1 (en) Input Lock For Touch-Screen Device
EP2776911A1 (en) User interface indirect interaction
KR20100078234A (en) Apparatus and method for inputing control signal using dual touch sensor
US20140015785A1 (en) Electronic device
US10168895B2 (en) Input control on a touch-sensitive surface
US10007770B2 (en) Temporary secure access via input object remaining in place
CN104750299A (en) Multi-touch touch screen and its junction area touch sensing method
CN116507995A (en) Touch screen display with virtual track pad
CN111078056B (en) Electronic device and signal processing method
KR101442438B1 (en) Single touch process to achieve dual touch experience field
US20100271300A1 (en) Multi-Touch Pad Control Method
KR101422447B1 (en) Method and apparatus for changing page of e-book using pressure modeling
US20200042049A1 (en) Secondary Gesture Input Mechanism for Touchscreen Devices
CN109901779B (en) Display method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant