WO2021190326A1 - Touch display device, touch display method, and storage medium - Google Patents

Touch display device, touch display method, and storage medium

Info

Publication number
WO2021190326A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
touch
display unit
display area
unit
Prior art date
Application number
PCT/CN2021/080401
Other languages
English (en)
French (fr)
Inventor
王胜辉
杨成
张俊杰
蔡建松
蔡来收
刘康健
王立
Original Assignee
高创(苏州)电子有限公司
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 高创(苏州)电子有限公司 and 京东方科技集团股份有限公司
Publication of WO2021190326A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0412 Digitisers structurally integrated in a display
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers

Definitions

  • One or more embodiments of this specification relate to the field of touch control technology, and in particular to a touch display device, a touch display method, and a computer-readable storage medium.
  • the touch unit is used to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit;
  • the processing unit is configured to determine a target display unit corresponding to the touch operation from the first display unit and the second display unit according to the touch signal, and send the touch signal to the target display unit corresponding to the touch operation;
  • the first display unit and/or the second display unit as the target display unit executes a corresponding operation according to the touch signal, and displays the operation result.
  • the processing unit is used to divide the total display area corresponding to the touch display device into a first display area and a second display area; wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit;
  • the touch signal includes: coordinates of the touch operation in the total display area;
  • the processing unit is configured to determine the target display area to which the coordinates belong, and use the display unit corresponding to the target display area as the target display unit; wherein the target display area is at least one of the first display area and the second display area.
  • the processing unit is configured to pre-configure the correspondence between the coordinates in the total display area and the coordinates in the first display area and between the coordinates in the total display area and the coordinates in the second display area.
  • the processing unit is further configured to determine the coordinates of the touch operation in the target display area according to the correspondence and the coordinates of the touch operation in the total display area, and to replace the coordinates in the touch signal with the coordinates in the target display area.
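The determination-and-replacement flow described in the claims above can be sketched as follows. The area bounds, names, and data structure here are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of the processing unit's routing logic: determine the
# target display area from the total-area coordinates, then replace them with
# coordinates local to that area before forwarding the touch signal.

def make_processing_unit(areas):
    """areas: dict mapping display-unit name -> (x0, y0, x1, y1) bounds
    expressed in total-display-area coordinates (assumed layout)."""
    def route(touch_signal):
        x, y = touch_signal["coords"]
        for unit, (x0, y0, x1, y1) in areas.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                # Replace total-area coordinates with area-local coordinates.
                touch_signal["coords"] = (x - x0, y - y0)
                touch_signal["target"] = unit
                return touch_signal
        return None  # touch fell outside every display area
    return route

route = make_processing_unit({
    "first_display": (0, 0, 160, 100),    # hypothetical first display area
    "second_display": (165, 0, 255, 80),  # hypothetical second display area
})
signal = route({"coords": (200, 40)})
# signal["target"] == "second_display", signal["coords"] == (35, 40)
```

A touch that falls in the gap between the two areas is simply dropped here; the patent does not specify that case, so this is one plausible choice.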
  • the first display unit includes: a first display board connected to the processing unit, a second display board, a switch connecting the first display board and the second display board, and a display screen connected to the first display board and the second display board; wherein the first display board corresponds to the first display system and the second display board corresponds to the second display system; wherein,
  • in response to the current operating system of the first display unit being the second display system, the switch receives the touch signal from the first display board and outputs it to the second display board.
  • the second display unit includes: a display board connected to the processing unit and a display screen connected to the display board.
  • the first display unit and the second display unit are placed side by side and share the touch unit.
  • the touch control unit includes: an outer frame, and a sensor emitting element and a receiving and sensing element mounted on the outer frame; wherein,
  • the outer frame surrounds the peripheral edge of the geometric figure obtained by splicing the first display unit and the second display unit that are placed side by side;
  • the sensor transmitting element and the receiving sensing element detect a user's touch operation in the area surrounded by the outer frame, and generate the touch signal.
  • the sensor emitting element is an infrared emitting element; and the receiving and sensing element is an infrared sensing element.
  • the first display unit and the second display unit are electronic whiteboards.
  • the touch display method described in the embodiment of this specification includes: after detecting a user's touch operation, generating a touch signal; according to the touch signal, from the first display unit and the second display unit included in the touch display device Determining the target display unit corresponding to the touch operation; and sending the touch signal to the target display unit corresponding to the touch operation, and the target display unit performs the corresponding operation according to the touch signal, And display the operation result.
  • the above method may further include: dividing the total display area corresponding to the touch display device into a first display area and a second display area; wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit; wherein,
  • the touch signal includes: coordinates of the touch operation in the total display area;
  • determining the target display unit corresponding to the touch operation from the first display unit and the second display unit included in the touch display device according to the touch signal includes: determining the target display area to which the coordinates belong, and using the display unit corresponding to the target display area as the target display unit; wherein the target display area is at least one of the first display area and the second display area.
  • the above method may further include: pre-configuring the correspondence between the coordinates in the total display area and the coordinates in the first display area, and between the coordinates in the total display area and the coordinates in the second display area; determining the coordinates of the touch operation in the target display area according to the correspondence and the coordinates of the touch operation in the total display area; and replacing the coordinates in the touch signal with the coordinates in the target display area.
  • the embodiment of this specification also proposes a non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer instructions, and the computer instructions are used to make the computer execute the above touch display method.
  • FIG. 1 is a schematic diagram of the internal structure of the touch display device according to one or more embodiments of this specification;
  • FIG. 2 is a schematic diagram of the positional relationship between the touch unit and the first display unit and the second display unit in some embodiments of this specification;
  • FIG. 3 is a schematic structural diagram of a touch display device according to some embodiments of this specification.
  • FIG. 4 is a schematic diagram of the structure of the touch display device according to some other embodiments of the specification.
  • FIG. 5 is a schematic flowchart of the touch display method according to one or more embodiments of this specification.
  • the touch display device may include: a first display unit 101, a second display unit 102, a touch unit 104, and a processing unit 103 connected to the first display unit, the second display unit, and the touch unit.
  • the above-mentioned touch unit is used to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit.
  • the processing unit determines the target display unit corresponding to the touch operation from the first display unit and the second display unit according to the touch signal (that is, according to the touch signal, the first display unit or the second display unit is determined as the target display unit corresponding to the touch operation), and sends the touch signal to the target display unit corresponding to the touch operation.
  • after receiving the touch signal, the first display unit or the second display unit performs a corresponding operation according to the touch signal and displays the operation result.
  • the above-mentioned first display unit and second display unit may be electronic whiteboards.
  • the touch display device described in this specification may also be called a multi-screen interactive electronic whiteboard or a multi-screen interactive tablet.
  • the multi-screen interactive electronic whiteboard can realize multiple functions such as multimedia display and touch operation, and can support multiple screens, so that it can be widely used in intelligent multimedia teaching or multimedia conferences.
  • the processing unit 103 may pre-divide the total display area corresponding to the touch display device into a first display area and a second display area; wherein the first display area corresponds to the first display unit 101 and the second display area corresponds to the second display unit 102.
  • the aforementioned touch signal may include: the coordinates of the user's touch operation in the aforementioned total display area.
  • the processing unit 103 determines the target display area to which the coordinates in the touch signal belong, and uses the display unit corresponding to the target display area as the target display unit; wherein the target display area is one of the first display area and the second display area.
  • when the touch display device supports multi-touch, and in particular when a touch operation on the first display unit 101 and a touch operation on the second display unit 102 occur at the same time, the touch signal may include multiple coordinates of the user's touch operations in the total display area, where the target display area to which some of the coordinates belong is the first display area, and the target display area to which the remaining coordinates belong is the second display area.
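In the multi-touch case just described, grouping the coordinates by the area they fall in might look like the following minimal sketch; the area bounds are assumed example values, not figures from the patent.

```python
# Sketch of splitting a multi-touch signal between the two display areas.
# AREA_A / AREA_B bounds are hypothetical (x0, y0, x1, y1) rectangles.

AREA_A = (0, 0, 160, 100)    # first display area (assumed bounds)
AREA_B = (165, 0, 255, 80)   # second display area (assumed bounds)

def in_area(pt, area):
    x, y = pt
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def split_touches(points):
    """Group total-area touch coordinates by the display area they fall in,
    so each group can be forwarded to its own display unit."""
    groups = {"A": [], "B": []}
    for pt in points:
        if in_area(pt, AREA_A):
            groups["A"].append(pt)
        elif in_area(pt, AREA_B):
            groups["B"].append(pt)
    return groups

g = split_touches([(10, 20), (200, 50), (150, 90)])
# g["A"] == [(10, 20), (150, 90)], g["B"] == [(200, 50)]
```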
  • the above-mentioned first display unit 101 and second display unit 102 may be placed side by side and share one touch unit 104; that is, the two display units share one touch unit.
  • the aforementioned touch unit 104 may include: an outer frame, and a sensor emitting element and a receiving sensing element mounted on the outer frame.
  • the above-mentioned outer frame may surround the peripheral edge of the geometric figure obtained by splicing the first display unit 101 and the second display unit 102 placed side by side.
  • FIG. 2 shows the positional relationship between the touch unit 104 and the first display unit 101 and the second display unit 102 in some embodiments of this specification.
  • the first display unit 101 and the second display unit 102 placed side by side are spliced together to form a rectangle.
  • the outer frame of the above-mentioned touch unit 104 is also a rectangle, and surrounds the outer edge of the rectangle formed by the side-by-side first display unit 101 and second display unit 102.
  • the area covered by the rectangle can be regarded as the total display area corresponding to the touch display device.
  • the above-mentioned total display area can be divided into two display areas: display area A and display area B (as shown by the dotted line in FIG. 2), where the first display unit 101 corresponds to display area A and the second display unit 102 corresponds to display area B.
  • the division of the above-mentioned display area is related to the size of the screen of the display unit and the arrangement between the display units. That is, the processing unit 103 may divide the total display area into a first display area and a second display area according to the size of the display screens of the two display units and the splicing manner in advance.
  • each divided display area generally matches the size and shape of the screen of its display unit, so that a one-to-one correspondence between display areas and display units is realized.
  • the sensor emitting element and the receiving sensing element installed on the outer frame can detect the user's touch operation in the area surrounded by the outer frame, and generate a touch signal.
  • the above-mentioned sensor emitting element may be an infrared emitting element; the above-mentioned receiving and sensing element may be an infrared sensing element.
  • the touch unit 104 can detect the user's touch operation and determine its position within the total display area, that is, its coordinates. The touch unit 104 then generates a touch signal containing the coordinates of the touch operation in the total display area and feeds the touch signal back to the processing unit 103. The processing unit 103 determines the target display area to which the coordinates belong and uses the display unit corresponding to the target display area as the target display unit, where the target display area is one of the first display area and the second display area.
  • the processing unit 103 may also pre-configure the correspondence between the coordinates in the total display area and the coordinates in the first display area, and between the coordinates in the total display area and the coordinates in the second display area. In this way, after receiving the touch signal, the processing unit 103 may further determine the coordinates of the touch operation in the target display area according to the correspondence and the coordinates of the touch operation in the total display area, and replace the coordinates in the touch signal with the coordinates in the target display area.
  • the first display unit 101 or the second display unit 102 may perform corresponding operations according to the coordinates in the touch signal and its current operation mode, and display the operation result in the target display area.
  • the above-mentioned operation mode may be a touch mode or a drawing mode, and so on.
  • the touch mode usually means that the first display unit 101 or the second display unit 102 normally displays a graphical user interface, and the user can use a variety of touch methods, such as single-click, double-click, long-press, drag, or slide, to operate on objects such as the graphical user interface itself or the icons displayed on it.
  • the drawing mode usually means that the first display unit 101 or the second display unit 102 is currently displaying a whiteboard or background board, and the user can write or draw on the whiteboard by touching.
  • the first display unit 101 or the second display unit 102 will further display the content written or drawn by the user on the basis of the whiteboard or the background board.
  • the operation mode of the above-mentioned first display unit 101 or the second display unit 102 may also be switched by a user's touch operation.
  • control buttons for the various operation modes of a display unit may be displayed at a designated position in the above-mentioned total display area, or a designated operation may trigger the display of these control buttons. The user can then select the current operation mode of the display unit by clicking these buttons. The interaction process of switching the operation mode is the same as that of an ordinary touch operation, and will not be repeated here.
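The mode-dependent handling described above can be sketched as a small dispatch; the mode names and the handler structure are illustrative assumptions, not specified by the patent.

```python
# Minimal sketch of dispatching a touch to a display unit according to its
# current operation mode (touch mode vs. drawing mode).

class DisplayUnit:
    def __init__(self):
        self.mode = "touch"   # "touch" or "drawing" (assumed mode names)
        self.strokes = []     # content written/drawn in drawing mode
        self.clicks = []      # GUI interactions recorded in touch mode

    def handle(self, coords):
        if self.mode == "drawing":
            # In drawing mode the touch adds content on the whiteboard.
            self.strokes.append(coords)
        else:
            # In touch mode the touch operates on the GUI (click, drag, ...).
            self.clicks.append(coords)

unit = DisplayUnit()
unit.handle((10, 20))     # handled as a GUI operation
unit.mode = "drawing"     # e.g. switched via an on-screen mode button
unit.handle((30, 40))     # handled as a whiteboard stroke
# unit.clicks == [(10, 20)], unit.strokes == [(30, 40)]
```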
  • each display unit usually displays only within its corresponding display area.
  • the processing unit 103 may be connected to the first display unit 101 or the second display unit 102 through a serial interface (serial port for short), a USB interface, or a combination of the two.
  • the above-mentioned processing unit 103 may be connected to all the display units through a serial port or a USB interface; it may also be connected by a serial port for one display unit and a USB interface for another display unit.
  • the first display unit 101 is connected to the aforementioned processing unit 103 through a USB interface
  • the second display unit 102 is connected to the aforementioned processing unit 103 through a serial port.
  • first display unit 101 and second display unit 102 may use a single operating system, for example, may include an OPS board supporting the Windows system or an Android board supporting the Android system.
  • first display unit 101 and second display unit 102 may also adopt dual operating systems, for example, both include an OPS board supporting the Windows system and an Android board supporting the Android system.
  • each of the aforementioned display units also includes a display screen.
  • the above-mentioned first display unit 101 may generally adopt a dual operating system
  • the above-mentioned second display unit 102 may generally adopt a single operating system. Therefore, the user's needs can be met and the cost of the touch display device can be saved.
  • the first display unit may include: a first display board connected to the processing unit 103, a second display board, a switch connecting the first display board and the second display board, and a display screen connected to the first display board and the second display board.
  • the first display board corresponds to the first display system
  • the second display board corresponds to the second display system
  • when the current operating system is the second display system, the switch receives the touch signal from the first display board and outputs it to the second display board.
  • the second display unit may include: a display board connected to the processing unit 103 and a display screen connected to the display board.
  • the above-mentioned first display board may be an Android board and the first operating system the Android system; the second display board may be an OPS board and the second operating system the Windows operating system.
  • the reverse is also possible.
  • the switch in the above-mentioned first display unit 101 determines its own working state according to the current operating system of the first display unit 101: when the current operating system is the one corresponding to the first board, it does not work; when the current operating system is the one corresponding to the second board, it receives the touch signal from the first board and outputs it to the second board.
  • when the first display unit 101 uses the Android system as its first operating system, the Android board is connected to the processing unit 103. In this case, the switch is installed on the Android board, with its output connected to the OPS board.
  • when the current operating system of the first display unit 101 is Android, the switch does not work; and when the current operating system of the first display unit 101 is Windows, the switch receives the touch signal from the Android board and outputs it to the OPS board.
  • conversely, when the first display unit 101 uses the Windows system as its first operating system, the OPS board is connected to the processing unit 103.
  • the switch will be installed on the OPS board and connect the output to the Android board.
  • when the current operating system of the first display unit 101 is Windows, the switch does not work; and when the current operating system is Android, the switch receives the touch signal from the OPS board and outputs it to the Android board.
  • the above-mentioned switch may be a USB switch (USB Switch).
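The switch behavior described in the preceding bullets can be sketched as follows: when the current operating system is the one whose board is wired to the processing unit, the switch is idle; otherwise it relays the touch signal to the other board. The function and board names are illustrative assumptions.

```python
# Sketch of the USB switch logic for a dual-system display unit whose Android
# board is wired to the processing unit (the default case described above).

def usb_switch(current_os, signal, wired_os="android"):
    """Return (board that processes the signal, signal).

    The touch signal always arrives at the board wired to the processing
    unit; the switch only acts when the *other* system is active."""
    if current_os == wired_os:
        # Switch does not work: the wired (Android) board handles the signal.
        return "android_board", signal
    # Switch relays the signal from the Android board to the OPS board.
    return "ops_board", signal

dest, _ = usb_switch("android", {"coords": (5, 5)})   # dest == "android_board"
dest, _ = usb_switch("windows", {"coords": (5, 5)})   # dest == "ops_board"
```

The mirrored configuration (OPS board wired to the processing unit) is the same sketch with `wired_os="windows"` and the board names swapped.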
  • the total display area formed by splicing two display units is divided into at least two display areas, and the divided display areas are placed in one-to-one correspondence with the two display units.
  • a multi-screen touch display device is realized.
  • the processing unit 103 can determine the target display area to which the user's operation belongs according to the position of the user's touch operation, and thereby determine the display unit corresponding to the touch operation, so that the touch signal generated by the touch operation is sent to the corresponding display unit, which executes the corresponding operation and gives the corresponding response. This realizes multi-screen interactive touch operation across the multiple screens.
  • the above-mentioned touch display device can also support multiple systems, and the configuration is more flexible, so as to meet the application requirements of users to a greater extent.
  • the above method is not only applicable to a dual-screen touch display device, but also can be generally applied to a multi-screen touch display device.
  • one or more display units may be further added.
  • the multiple display units mentioned above share one touch unit 104 and one processing unit 103.
  • the above-mentioned processing unit 103 can determine the total display area and divide it according to the sizes and arrangement of the display screens of the multiple display units, so that the divided display areas are in one-to-one correspondence with the display units.
  • the processing unit 103 can determine the display unit corresponding to the user's touch operation according to the coordinate information in the touch signal, so as to send the touch signal to the corresponding display unit for touch operation processing.
  • FIG. 3 shows a schematic structural diagram of the touch display device according to an embodiment of the specification.
  • the touch display device may include: a touch unit 301, a processing unit 302, a main display unit 303 connected to the processing unit 302 through a USB interface, and a secondary display unit 304 connected to the processing unit 302 through a serial port.
  • the above-mentioned main display unit 303 and auxiliary display unit 304 both use a single operating system; for example, each may include an OPS board supporting the Windows system or an Android board supporting the Android system.
  • both the primary display unit 303 and the secondary display unit 304 can use Android boards, as shown in FIG. 3.
  • the main display unit 303 and the auxiliary display unit 304 may be spliced side by side, and the outer frame of the touch unit 301 may cover the outer side of the main display unit 303 and the auxiliary display unit 304,
  • the area surrounded by the outer frame of the touch unit 301 is divided into two display areas according to the size of the display screens of the main display unit 303 and the auxiliary display unit 304 and the splicing method (as shown by the two dashed boxes in the figure), Among them, the display area A corresponds to the main display unit 303; the display area B corresponds to the sub display unit 304.
  • the correspondence between the coordinates in the area surrounded by the outer frame of the touch unit 301 and the coordinates in the display areas corresponding to the main display unit 303 and the sub display unit 304 can be established according to the sizes and positions of the display screens of the main display unit 303 and the sub display unit 304.
  • the output coordinate range of the total display area occupied by the main display unit 303 can be defined as X axis: 0~160, Y axis: 0~100; and the output coordinate range of the total display area occupied by the auxiliary display unit 304 as X axis: 165~255, Y axis: 0~80.
  • the above-mentioned coordinate range can be determined according to the size of the display screens of the main display unit and the auxiliary display unit.
  • the display screens of the main display unit and the auxiliary display unit can be spliced together to determine the size of the total display; the total range of X-axis and Y-axis coordinates output by the touch unit 301 is determined from that size, and the output coordinate ranges of the main display unit and the auxiliary display unit are then determined in proportion to their respective screen sizes.
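The proportional split just described can be sketched with the example ranges above (main: X 0~160; auxiliary: X 165~255). The total 0~255 output span, the gap width, and the relative screen widths are assumed example values.

```python
# Sketch of deriving per-unit output coordinate ranges from the physical
# screen widths, in proportion to the touch unit's total output span.

TOTAL_SPAN = 255  # full X range reported by the touch unit (assumed)

def split_x_ranges(widths, gap=5):
    """Split the X output span among side-by-side screens in proportion to
    their physical widths, leaving `gap` output units between screens."""
    usable = TOTAL_SPAN - gap * (len(widths) - 1)
    ranges, x = [], 0
    for w in widths:
        span = round(usable * w / sum(widths))
        ranges.append((x, x + span))
        x += span + gap
    return ranges

# e.g. a main screen 16 units wide next to a 9-unit-wide auxiliary screen:
print(split_x_ranges([16, 9]))
# [(0, 160), (165, 255)]
```

With these relative widths the sketch reproduces the example ranges given above; the Y ranges would be derived the same way from the screen heights.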
  • the above-mentioned method is not only applicable to dual-screen touch display devices, but can also be universally applied to multi-screen interactive touch display devices.
  • the corresponding relationship between the coordinates in the area surrounded by the outer frame of the touch unit 301 and the coordinates in the display area corresponding to the main display unit and the sub display unit can be established.
  • although in this example the processing unit 302 is connected to the main display unit 303 through a USB interface and to the secondary display unit 304 through a serial port, in actual applications the processing unit 302 may also use a USB interface to connect to the secondary display unit 304, or a serial port to connect to the main display unit 303, or other inter-device connection interfaces, without going beyond the scope of this specification.
  • this example provides a dual-screen interactive touch display device, which can include two independent screens.
  • the user can control the above two independent screens individually or simultaneously through an infrared touch frame, as if operating a single device with two screens.
  • this example also has the characteristics of easy implementation and low cost.
  • the associated operation of the above-mentioned main display unit 303 and the auxiliary display unit 304 can also be realized through the hardware and software associated design of the above-mentioned main display unit 303 and the auxiliary display unit 304.
  • FIG. 4 shows a schematic structural diagram of a dual-screen interactive touch display device according to another embodiment of this specification.
  • the dual-screen interactive touch display device may include: a touch unit 401, a processing unit 402, a main display unit 403 connected to the aforementioned processing unit 402 via a USB interface, and a secondary display unit 404 connected to the aforementioned processing unit 402 via a serial port.
  • the above-mentioned main display unit 403 supports dual operating systems, including an Android board supporting the Android system and an OPS board supporting the Windows system. Assuming that the Android operating system is the default operating system, the Android board is connected to the aforementioned USB interface. In order to switch touch data between the two systems, the main display unit 403 also includes a USB switch connecting the Android board and the OPS board; the USB switch is installed on the Android board, with its output connected to the OPS board. When the current operating system is Android, the switch does not work; when the current operating system is Windows, the switch receives touch data from the Android board and outputs it to the OPS board.
  • the aforementioned secondary display unit 404 supports a single operating system, for example, it may include an OPS board that supports the Windows system or an Android board that supports the Android system. For cost considerations, the aforementioned secondary display unit 404 may use an Android board.
  • the main display unit 403 and the auxiliary display unit 404 may be spliced side by side, and the outer frame of the touch unit 401 may cover the outer side of the main display unit 403 and the auxiliary display unit 404,
  • the area surrounded by the outer frame of the touch unit 401 is divided into two display areas (as shown by the two dashed boxes in the figure) according to the size of the display screens of the main display unit 403 and the auxiliary display unit 404 and the splicing method.
  • the display area A corresponds to the main display unit 403; the display area B corresponds to the sub display unit 404.
  • the method for dividing the display area and the method for mapping the coordinates between the display areas can refer to the example shown in FIG. 3 above, and the description is not repeated here.
  • although the processing unit 402 is connected to the secondary display unit 404 through a serial port, in practical applications the processing unit 402 may also use a USB interface, or other inter-device connection interfaces, to connect to the secondary display unit 404, without going beyond the scope of this specification.
  • although the main display unit 403 supports dual operating systems while the secondary display unit 404 supports a single operating system,
  • in practical applications the secondary display unit 404 may also support dual operating systems.
  • in that case, the aforementioned secondary display unit 404 may likewise include a switch connecting its own Android board and OPS board, so as to switch the touch signal between the two operating systems it supports.
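The switching behavior described in these bullets can be sketched as a small routing rule. This is an illustrative sketch only, not the device's firmware; the board names and the `route_touch` function are assumptions for illustration:

```python
# Sketch of the USB switch's routing rule: touch data always arrives at the
# board wired to the processing unit's USB interface (here the Android board)
# and is forwarded onward only when Windows is the active operating system.
def route_touch(current_os: str) -> str:
    """Return the board that should consume the incoming touch data."""
    if current_os == "android":
        # Switch inactive: the data stays on the Android board.
        return "android_board"
    elif current_os == "windows":
        # Switch active: data is taken from the Android board and
        # output to the OPS board.
        return "ops_board"
    raise ValueError(f"unsupported operating system: {current_os}")
```

The same rule applies symmetrically when the OPS board is the default board, with the roles of the two boards exchanged.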
  • this example provides a dual-screen interactive touch display device, which may include two independent screens.
  • through a single touch unit, the user can control the two independent screens individually or simultaneously, as if operating one screen containing two pictures.
  • the above-mentioned main display unit 403 supports not only the Android system but also the Windows system, which improves the compatibility of the dual-screen interactive touch display device and can meet users' various needs.
  • the dual-screen interactive touch display device is also easy to implement and low in cost.
  • the processing unit may also preconfigure the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area.
  • the above correspondence may be a coordinate mapping relationship, usually a preconfigured mathematical expression representing the mapping. From this expression, the coordinates of the user's touch operation in the target display area can be calculated from its coordinates within the area enclosed by the outer frame of the touch unit.
  • the above coordinate mapping relationship can be determined at the same time the display areas are divided and configured.
  • for example, the output coordinate range of each display device is divided according to the sizes and arrangement positions of the display screens of the multiple display devices.
  • after that division, the coordinate mapping can be determined directly, i.e. the mapping from coordinates in the range enclosed by the infrared touch frame to coordinates within each display device.
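Using the figures quoted in the worked example that follows (total X range 0-255, Y range 0-100, split after X=160), the partition-plus-mapping step might look like the sketch below; the region table and function name are illustrative assumptions, not part of the specification:

```python
# Sketch of determining the coordinate mapping at partition time: each
# region records its slice of the total area plus the X offset that
# converts a global coordinate into that region's local coordinate.
def partition_total_area(split_x: int, total_x: int, total_y: int):
    """Divide the total area at split_x; the offsets double as the mapping."""
    return [
        {"x_range": (0, split_x),           "y_range": (0, total_y), "offset": 0},
        {"x_range": (split_x + 1, total_x), "y_range": (0, total_y), "offset": split_x},
    ]

# Values from the example: total X 0..255, Y 0..100, split after X=160.
regions = partition_total_area(split_x=160, total_x=255, total_y=100)
```

Subtracting a region's `offset` from a global X coordinate yields the coordinate local to that region, which is exactly the "X-axis coordinate minus 160" rule the example establishes for display area 2.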
  • the following uses a specific example to illustrate the operation of the processing unit in detail. Assume a planar rectangular coordinate system represents the area enclosed by the outer frame of the touch unit, i.e. the total display area: X-axis coordinates range from 0 to 255 and Y-axis coordinates range from 0 to 100.
  • the above area is divided into two display areas: the X-axis coordinate range of display area 1 is 0-160, and its Y-axis coordinate range is 0-100.
  • this display area corresponds to the first display unit;
  • the X-axis coordinate range of display area 2 is 161-255, its Y-axis coordinate range is 0-100, and this display area corresponds to the second display unit.
  • a mapping from coordinates in the total display area to coordinates of the second display unit can then be established: subtract 160 from the X-axis coordinate and keep the Y-axis coordinate unchanged.
  • when the position of the user's touch operation in the total display area is detected as (50, 50), the processing unit may first determine that the position belongs to display area 1, i.e. corresponds to the first display unit. The processing unit then uses the correspondence (e.g., a mapping expression) between the coordinates of positions in the total display area (those in display area 1) and the coordinates of positions in the display area of the first display unit to obtain the position of the touch operation in that display area, e.g. (50, 50), and routes the touch signal through the output port connected to the first display unit. After receiving the touch signal, the first display unit performs the corresponding operation according to its current operation mode and outputs the corresponding response.
  • in another case, when the position of the user's touch operation in the total display area is detected as (200, 50), the processing unit may first determine that the position belongs to display area 2, i.e. corresponds to the second display unit. The processing unit then uses the correspondence (e.g., a mapping expression) between the coordinates of positions in the total display area (those in display area 2) and the coordinates of positions in the display area of the second display unit to obtain the position of the touch operation in that display area, e.g. (40, 50), and routes the touch signal through the output port connected to the second display unit. After receiving the touch signal, the second display unit performs the corresponding operation according to its current operation mode and outputs the corresponding response.
  • alternatively, the processing unit may directly send the above coordinates (200, 50) to the corresponding second display unit, and the second display unit determines the corresponding position on its own screen according to its own configuration.
  • the processing unit can determine the display area to which the user's operation belongs according to the position of the touch operation, and hence the output port through which the touch signal is output, so that the touch signal is sent through the corresponding output port to the corresponding display unit, which performs the corresponding operation and gives the corresponding response, realizing the touch operation of the multi-screen interactive touch display device.
  • the screens can be operated individually or simultaneously, and a hardware basis is even provided for associated operations between multiple screens.
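The dispatch procedure of this worked example — locate the region, remap the coordinate, forward the signal — can be condensed into one function. This is a minimal sketch assuming exactly the two regions of the example; the unit names and `dispatch` function are illustrative, not the patent's terminology:

```python
# The two display areas from the example: area 1 covers X 0..160 and area 2
# covers X 161..255 (Y 0..100 for both); area 2's local X is the global X
# minus 160, as established by the mapping above.
REGIONS = [
    ("display_unit_1", 0, 160, 0),
    ("display_unit_2", 161, 255, 160),
]

def dispatch(x: int, y: int):
    """Map a touch at (x, y) in the total area to (unit, local coordinates)."""
    for name, x_min, x_max, dx in REGIONS:
        if x_min <= x <= x_max and 0 <= y <= 100:
            return name, (x - dx, y)
    raise ValueError("touch outside the total display area")

# dispatch(50, 50)  -> ("display_unit_1", (50, 50))
# dispatch(200, 50) -> ("display_unit_2", (40, 50))
```

The two calls in the comments reproduce the example's two cases: (50, 50) stays unchanged on the first unit, while (200, 50) maps to (40, 50) on the second.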
  • the embodiments of this specification also provide a touch display method, which can be implemented by a touch display device.
  • FIG. 5 is a flowchart of the touch display method according to an embodiment of this specification. As shown in FIG. 5, the method may include:
  • Step 502: after detecting the user's touch operation, generate a touch signal.
  • the touch signal may include the coordinates of the user's touch operation within the total display area corresponding to the touch display device.
  • Step 504: determine, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device.
  • Step 506: send the touch signal to the target display unit corresponding to the touch operation.
  • after receiving the touch signal, the target display unit performs a corresponding operation according to the touch signal and displays the operation result.
  • the above method may further include: dividing the total display area corresponding to the touch display device into a first display area and a second display area, wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit; wherein
  • determining the target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device according to the touch signal may include: determining the target display area to which the coordinates belong, and taking the display unit corresponding to the target display area as the target display unit, wherein the target display area is one of the first display area and the second display area.
  • the above method may further include: preconfiguring the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area.
  • before step 506, the method may further perform: determining, according to the correspondence and the coordinates of the touch operation in the total display area, the coordinates of the touch operation in the target display area, and replacing the coordinates in the touch signal with the coordinates in the target display area.
  • the processing unit can determine the display unit corresponding to the user's operation according to the position of the touch operation, and the corresponding display unit performs the corresponding operation and gives the corresponding response, realizing multi-screen interactive touch operation.
  • this correspondence can be achieved by dividing the total display area corresponding to the touch display device into two display areas and associating the two display areas one-to-one with the two display units. It can be seen that, through the touch display method of this specification, the multiple screens of the touch display device can be operated individually or simultaneously, and a hardware and software foundation is even provided for associated operations between multiple screens.
  • the embodiments of this specification also provide a non-transitory computer-readable storage medium storing computer instructions, the computer instructions being used to cause a computer to execute the above touch display method.
  • the computer-readable medium of this embodiment includes permanent and non-permanent, removable and non-removable media, in which information storage can be implemented by any method or technology.
  • the information may be computer-readable instructions, data structures, program modules, or other data.
  • examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices.

Abstract

A touch display device, method and storage medium. The device includes: a first display unit (101), a second display unit (102), a touch unit (104), and a processing unit (103) connecting the first display unit (101), the second display unit (102) and the touch unit (104). The touch unit (104) is configured to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit (103). The processing unit (103) is configured to determine, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit (101) and the second display unit (102), and to send the touch signal to the target display unit corresponding to the touch operation. After receiving the touch signal, the first display unit (101) and/or the second display unit (102) serving as the target display unit performs a corresponding operation according to the touch signal and displays the operation result.

Description

Touch display device, touch display method and storage medium
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 202010212406.1, filed in China on March 24, 2020, the entire contents of which are incorporated herein by reference.
Technical field
One or more embodiments of this specification relate to the field of touch control technology, and in particular to a touch display device, a touch display method, and a computer-readable storage medium.
Background
At present, most interactive electronic whiteboards, also called interactive flat panels, have a single screen; that is, they can perform display or touch operations on only one screen and cannot support multiple screens.
With the widespread application and continuous development of multimedia teaching, multimedia conferencing and related fields, there is a growing demand for interactive electronic whiteboards or interactive flat panels that can support multiple screens, where the screens can be operated individually, simultaneously, or even in association with one another.
Summary
The touch display device described in the embodiments of this specification includes:
a first display unit, a second display unit, a touch unit, and a processing unit connecting the first display unit, the second display unit and the touch unit; wherein
the touch unit is configured to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit;
the processing unit is configured to determine, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit and the second display unit, and to send the touch signal to the target display unit corresponding to the touch operation; and
after receiving the touch signal, the first display unit and/or the second display unit serving as the target display unit performs a corresponding operation according to the touch signal and displays the operation result.
The processing unit is configured to divide, in advance, the total display area corresponding to the touch display device into a first display area and a second display area, wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit;
the touch signal includes: the coordinates of the touch operation within the total display area;
the processing unit is configured to determine the target display area to which the coordinates belong, and to take the display unit corresponding to the target display area as the target display unit, wherein the target display area is at least one of the first display area and the second display area.
The processing unit is configured to preconfigure the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area;
the processing unit is further configured to determine, according to the correspondence and the coordinates of the touch operation in the total display area, the coordinates of the touch operation in the target display area, and to replace the coordinates in the touch signal with the coordinates in the target display area.
The first display unit includes: a first display board and a second display board connected to the processing unit, a changeover switch connecting the first display board and the second display board, and a display screen connected to the first display board and the second display board; wherein the first display board corresponds to a first display system and the second display board corresponds to a second display system; wherein
in response to the current operating system of the first display unit being the second display system, the changeover switch receives the touch signal from the first display board and outputs it to the second display board.
The second display unit includes: a display board connected to the processing unit and a display screen connected to the display board.
The first display unit and the second display unit are placed side by side and share the touch unit.
The touch unit includes: an outer frame, and sensor emitting elements and receiving sensing elements mounted on the outer frame; wherein
the outer frame surrounds the peripheral edge of the geometric figure obtained by splicing the first display unit and the second display unit placed side by side;
the sensor emitting elements and receiving sensing elements detect the user's touch operation within the area enclosed by the outer frame and generate the touch signal.
The sensor emitting elements are infrared emitting elements, and the receiving sensing elements are infrared sensing elements.
The first display unit and the second display unit are electronic whiteboards.
The touch display method described in the embodiments of this specification includes: after detecting a user's touch operation, generating a touch signal; determining, according to the touch signal, a target display unit corresponding to the touch operation from among a first display unit and a second display unit included in a touch display device; and sending the touch signal to the target display unit corresponding to the touch operation, whereby the target display unit performs a corresponding operation according to the touch signal and displays the operation result.
The above method may further include: dividing the total display area corresponding to the touch display device into a first display area and a second display area, wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit; wherein
the touch signal includes: the coordinates of the touch operation within the total display area;
determining, according to the touch signal, the target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device includes: determining the target display area to which the coordinates belong, and taking the display unit corresponding to the target display area as the target display unit, wherein the target display area is at least one of the first display area and the second display area.
The above method may further include: preconfiguring the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area; determining, according to the correspondence and the coordinates of the touch operation in the total display area, the coordinates of the touch operation in the target display area; and replacing the coordinates in the touch signal with the coordinates in the target display area.
The embodiments of this specification further provide a non-transitory computer-readable storage medium storing computer instructions, the computer instructions being used to cause a computer to execute the above touch display method.
Brief description of the drawings
To describe the technical solutions in one or more embodiments of this specification or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below are merely one or more embodiments of this specification, and a person of ordinary skill in the art may obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of the internal structure of the touch display device according to one or more embodiments of this specification;
FIG. 2 is a schematic diagram of the positional relationship between the touch unit and the first and second display units in some embodiments of this specification;
FIG. 3 is a schematic structural diagram of the touch display device according to some embodiments of this specification;
FIG. 4 is a schematic structural diagram of the touch display device according to other embodiments of this specification;
FIG. 5 is a schematic flowchart of the touch display method according to one or more embodiments of this specification.
Detailed description
To make the objectives, technical solutions and advantages of the present disclosure clearer, the present disclosure is described in further detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that, unless otherwise defined, technical or scientific terms used in one or more embodiments of this specification shall have the ordinary meanings understood by persons of ordinary skill in the field to which the present disclosure belongs. The terms "first", "second" and similar words used in one or more embodiments of this specification do not denote any order, quantity or importance, but are merely used to distinguish different components. Words such as "include" or "comprise" mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. Words such as "connect" or "connected" are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right" and the like are only used to express relative positional relationships; when the absolute position of the described object changes, the relative positional relationship may change accordingly.
The embodiments of this specification provide a touch display device. As shown in FIG. 1, the touch display device may include: a first display unit 101, a second display unit 102, a touch unit 104, and a processing unit 103 connecting the first display unit, the second display unit and the touch unit.
In the embodiments of this specification, the touch unit is configured to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit.
The processing unit determines, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit and the second display unit (that is, according to the touch signal, the first display unit or the second display unit is determined as the target display unit corresponding to the touch operation), and sends the touch signal to the target display unit corresponding to the touch operation.
After receiving the touch signal, the first display unit or the second display unit performs a corresponding operation according to the touch signal and displays the operation result.
It should be noted that, in the embodiments of this specification, the first display unit and the second display unit may be electronic whiteboards. In this case, the touch display device described in this specification may also be called a multi-screen interactive electronic whiteboard or a multi-screen interactive flat panel. Unlike a traditional electronic whiteboard, the multi-screen interactive electronic whiteboard can realize multiple functions such as multimedia display and touch operation, and can support multiple screens, so it can be widely used in intelligent multimedia teaching or multimedia conferencing.
In the embodiments of this specification, the processing unit 103 may divide, in advance, the total display area corresponding to the touch display device into a first display area and a second display area, wherein the first display area corresponds to the first display unit 101 and the second display area corresponds to the second display unit 102.
In this case, the touch signal may include: the coordinates of the user's touch operation within the total display area. The processing unit 103 determines the target display area to which the coordinates in the touch signal belong, and takes the display unit corresponding to the target display area as the target display unit, wherein the target display area is one of the first display area and the second display area. In an embodiment, when the touch display device supports multi-touch, and in particular when a touch operation on the first display unit 101 and a touch operation on the second display unit 102 occur simultaneously, the touch signal may include multiple coordinates of the user's touch operations within the total display area, some of which belong to the first display area while the rest belong to the second display area.
In the embodiments of this specification, the first display unit 101 and the second display unit 102 may be placed side by side and share a single touch unit 104; that is, the two display units share one touch unit. In this case, the touch unit 104 may include: an outer frame, and sensor emitting elements and receiving sensing elements mounted on the outer frame. Specifically, the outer frame may surround the peripheral edge of the geometric figure obtained by splicing the side-by-side first display unit 101 and second display unit 102.
FIG. 2 shows the positional relationship between the touch unit 104 and the first display unit 101 and second display unit 102 in some embodiments of this specification. As shown in FIG. 2, the side-by-side first display unit 101 and second display unit 102 are spliced together to form a rectangle; accordingly, the outer frame of the touch unit 104 is also a rectangle and surrounds the peripheral edge of the rectangle formed by the two units. The area covered by this rectangle can be regarded as the total display area corresponding to the touch display device. In the embodiments of this specification, the total display area may be divided into two display areas: display area A and display area B (shown by the dashed lines in FIG. 2), where the first display unit 101 corresponds to display area A and the second display unit 102 corresponds to display area B. The division of the display areas is related to the screen sizes of the display units and their arrangement. That is, the processing unit 103 may divide the total display area into the first display area and the second display area in advance according to the sizes of the display screens of the two display units and the way they are spliced. Typically, each divided display area is essentially the same size and shape as the screen of the corresponding display unit, thereby achieving a one-to-one correspondence between display areas and display units.
In the embodiments of this specification, the sensor emitting elements and receiving sensing elements mounted on the outer frame can detect the user's touch operation within the area enclosed by the outer frame and generate a touch signal.
In addition, in some embodiments of this specification, the sensor emitting elements may be infrared emitting elements and the receiving sensing elements may be infrared sensing elements. There may be multiple infrared emitting elements and infrared sensing elements, which together form an infrared sensing net within the area enclosed by the outer frame, i.e. the total display area, so that the user's touch operation within the area can be detected.
Thus, when the user performs a touch operation within the total display area corresponding to the touch display device, the touch unit 104 can detect the touch operation and determine its position information, i.e. coordinates, within the total display area. The touch unit 104 then generates a touch signal including the coordinates of the touch operation within the total display area. Next, the touch unit 104 feeds the touch signal back to the processing unit 103, which determines the target display area to which the coordinates of the touch operation belong and takes the display unit corresponding to that target display area as the target display unit, wherein the target display area is one of the first display area and the second display area.
In other embodiments of this specification, the processing unit 103 may further preconfigure the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area. In this way, after receiving a touch signal, the processing unit 103 may further determine, according to the correspondence and the coordinates of the touch operation in the total display area, the coordinates of the touch operation in the target display area, and replace the coordinates in the touch signal with the coordinates in the target display area.
After receiving the touch signal, the first display unit 101 or the second display unit 102 may perform a corresponding operation according to the coordinates in the touch signal and its own current operation mode, and display the operation result within the target display area.
In the embodiments of this specification, the operation mode may be a touch mode, a drawing mode, or the like. The touch mode generally means that the first display unit 101 or the second display unit 102 normally displays a graphical user interface, and the user can operate the graphical user interface or objects such as icons displayed on it through various touch methods, for example single-click, double-click, long-press, drag or slide. The drawing mode generally means that the first display unit 101 or the second display unit 102 currently displays a whiteboard or a background board, on which the user can write or draw by touch; the first display unit 101 or the second display unit 102 then further displays the content written or drawn by the user on top of the whiteboard or background board.
In the embodiments of this specification, the operation mode of the first display unit 101 or the second display unit 102 may also be switched through the user's touch operations. For example, in some embodiments of this application, control buttons for the various operation modes of a display device are displayed at a designated position within the total display area, or the display of such control buttons is triggered by a designated operation; the user can then click these buttons to select the current operation mode for the display device. In this case, the interaction for switching the operation mode is the same as that of an ordinary touch operation, and is not repeated here.
It should be noted that, in the embodiments of this specification, each display unit can generally display only within its corresponding display area.
In the embodiments of this specification, the processing unit 103 may be connected to the first display unit 101 or the second display unit 102 through a serial interface (serial port for short), a USB interface, or a combination thereof. For example, the processing unit 103 may connect to all display units through serial ports or through USB interfaces; it may also use a serial port for one display unit and a USB interface for another, e.g. the first display unit 101 is connected to the processing unit 103 through a USB interface while the second display unit 102 is connected through a serial port.
In some embodiments of this specification, the first display unit 101 and the second display unit 102 may adopt a single operating system; for example, each may include an OPS board supporting the Windows system or an Android board supporting the Android system.
In other embodiments of this specification, the first display unit 101 and the second display unit 102 may also adopt dual operating systems, for example including both an OPS board supporting the Windows system and an Android board supporting the Android system.
It should be understood that, in addition to the Android board and/or the OPS board, the display device should also include a display screen.
In particular, in some embodiments of this specification, the first display unit 101 may typically adopt dual operating systems while the second display unit 102 typically adopts a single operating system, which can both meet the user's needs and save the cost of the touch display device.
In this case, the first display unit may include: a first display board and a second display board connected to the processing unit 103, a changeover switch connecting the first display board and the second display board, and a display screen connected to the first display board and the second display board, wherein the first display board corresponds to a first display system and the second display board corresponds to a second display system. Moreover, when the current operating system of the first display unit is the second display system, the changeover switch receives the touch signal from the first display board and outputs it to the second display board.
Here, the second display unit may include: a display board connected to the processing unit 103 and a display screen connected to the display board.
In the embodiments of this specification, the first board may be an Android board, the first operating system may be the Android system, the second board may be an OPS board, and the second operating system may be the Windows operating system. Of course, the reverse is also possible.
When the first display unit 101 adopts dual operating systems, in the embodiments of this specification, the changeover switch in the first display unit 101 determines its own working state according to the current operating system of the first display unit 101: for example, when the current operating system is the one corresponding to the first board, the switch does not work; when the current operating system is the one corresponding to the second board, the switch receives the touch signal from the first board and outputs it to the second board.
For example, when the first display unit 101 uses the Android system as its first operating system, the Android board is connected to the processing unit 103. In this case, the changeover switch is installed on the Android board with its output connected to the OPS board. When the current operating system of the first display unit 101 is the Android system, the switch does not work; when the current operating system is the Windows system, the switch receives the touch signal from the Android board and outputs it to the OPS board.
Conversely, when the first display unit 101 uses the Windows system as its first operating system, the OPS board is connected to the processing unit 103. In this case, the changeover switch is installed on the OPS board with its output connected to the Android board. When the current operating system of the first display unit 101 is the Windows system, the switch does not work; when the current operating system is the Android system, the switch receives the touch signal from the OPS board and outputs it to the Android board.
In the embodiments of this specification, the changeover switch may be a USB switch.
It can be seen that, in the embodiments of this specification, by dividing the total display area formed by splicing two display units into at least two display areas and associating the two divided display areas with the two display units, a multi-screen touch display device can be realized. Thus, when the user performs a touch operation, the processing unit 103 can determine, according to the position of the touch operation, the target display area to which the operation belongs and hence the display unit corresponding to the touch operation, so that the touch signal generated by the touch operation can be sent to the corresponding display unit, which performs the corresponding operation and gives the corresponding response, realizing multi-screen interactive touch operation. The multiple screens can be operated individually or simultaneously, and a hardware basis is even provided for associated operations between the screens. Moreover, since multiple display units can share the same touch unit and processing unit, the implementation cost of the touch display device is low. Furthermore, the touch display device can support multiple systems and be configured more flexibly, thereby satisfying users' application needs to a greater extent.
It should be noted that the above method is applicable not only to dual-screen touch display devices but also to multi-screen touch display devices in general. For example, in addition to the first display unit 101 and the second display unit 102, one or more display units may be added. The multiple display units together share one touch unit 104 and one processing unit 103. The processing unit 103 may determine the total display area and divide it according to the sizes and arrangement of the display screens of the multiple display units, so that the divided display areas correspond one-to-one to the display units. After receiving a touch signal, the processing unit 103 can determine, according to the coordinate information in the touch signal, the display unit corresponding to the user's touch operation, and send the touch signal to the corresponding display unit for touch processing.
The technical solutions of the embodiments of this specification are described in detail below with reference to the drawings and specific examples.
FIG. 3 shows a schematic structural diagram of the touch display device according to an embodiment of this specification. As shown in FIG. 3, the touch display device may include: a touch unit 301, a processing unit 302, a main display unit 303 connected to the processing unit 302 through a USB interface, and a secondary display unit 304 connected to the processing unit 302 through a serial port.
Both the main display unit 303 and the secondary display unit 304 use a single operating system; for example, each may include an OPS board supporting the Windows system or an Android board supporting the Android system. For cost reasons, both may use Android boards, as shown in FIG. 3.
In the example shown in FIG. 3, the main display unit 303 and the secondary display unit 304 may be spliced side by side, the outer frame of the touch unit 301 may cover the outer edges of the main display unit 303 and the secondary display unit 304, and the area enclosed by the outer frame of the touch unit 301 is divided into two display areas (shown by the two dashed boxes in the figure) according to the sizes of the display screens of the two units and the splicing method, where display area A corresponds to the main display unit 303 and display area B corresponds to the secondary display unit 304.
Furthermore, the correspondence between coordinates within the area enclosed by the outer frame of the touch unit 301 and coordinates within the display areas corresponding to the main display unit 303 and the secondary display unit 304 can be established according to the sizes and positions of their display screens.
Specifically, assuming that the X-axis coordinates output by the touch unit 301 range from 0 to 255 and the Y-axis coordinates range from 0 to 100, the output coordinate range of the total display area occupied by the main display unit 303 can be delimited as X-axis 0-160 and Y-axis 0-100, and the output coordinate range of the total display area occupied by the secondary display unit 304 as X-axis 165-255 and Y-axis 0-80.
In practical applications, the above coordinate ranges may be determined according to the sizes of the display screens of the main and secondary display units. For example, the screens of the main and secondary display units may first be spliced together to determine the size of the total screen, which is then divided to determine the X-axis and Y-axis coordinate ranges output by the touch unit 301; the output coordinate ranges of the main display unit and the secondary display unit are then determined according to the proportion of each screen in the total screen. It should be noted that this method is applicable not only to dual-screen touch display devices but also to multi-screen interactive touch display devices in general. Furthermore, based on this method, the correspondence between coordinates within the area enclosed by the outer frame of the touch unit 301 and coordinates within the display areas corresponding to the main and secondary display units can also be established.
It should be noted that although in the example shown in FIG. 3 the processing unit 302 is connected to the main display unit 303 through a USB interface and to the secondary display unit 304 through a serial port, in practical applications the processing unit 302 may also use a USB interface to connect to the secondary display unit 304, or a serial port to connect to the main display unit 303, or other inter-device connection interfaces, without going beyond the scope of this specification.
It can be seen that this example provides a dual-screen interactive touch display device which may include two independent screens. Through one infrared touch frame, the user can control the two independent screens individually or simultaneously, as if operating one screen containing two pictures. Moreover, this example is easy to implement and low in cost.
In addition, by designing associated hardware and software for the main display unit 303 and the secondary display unit 304, associated operations between the two units can also be realized.
FIG. 4 shows a schematic structural diagram of the dual-screen interactive touch display device according to another embodiment of this specification. As shown in FIG. 4, the dual-screen interactive touch display device may include: a touch unit 401, a processing unit 402, a main display unit 403 connected to the processing unit 402 through a USB interface, and a secondary display unit 404 connected to the processing unit 402 through a serial port.
The main display unit 403 supports dual operating systems and includes an Android board supporting the Android system and an OPS board supporting the Windows system. Assuming that the Android operating system is the default, the Android board is connected to the USB interface. To switch touch data between the two systems, the main display unit 403 also includes a USB switch connecting the Android board and the OPS board; the switch is installed on the Android board with its output connected to the OPS board. When the current operating system is Android, the switch does not work; when the current operating system is Windows, the switch receives the touch data from the Android board and outputs it to the OPS board.
The secondary display unit 404 supports a single operating system; for example, it may include an OPS board supporting the Windows system or an Android board supporting the Android system. For cost reasons, the secondary display unit 404 may use an Android board.
In the example shown in FIG. 4, the main display unit 403 and the secondary display unit 404 may be spliced side by side, the outer frame of the touch unit 401 may cover the outer edges of the two units, and the area enclosed by the outer frame of the touch unit 401 is divided into two display areas (shown by the two dashed boxes in the figure) according to the sizes of the display screens of the two units and the splicing method, where display area A corresponds to the main display unit 403 and display area B corresponds to the secondary display unit 404.
In this example, for the method of dividing the display areas and mapping coordinates between them, reference may be made to the example shown in FIG. 3 above, and the description is not repeated here.
It should be noted that although in the example shown in FIG. 4 the processing unit 402 is connected to the secondary display unit 404 through a serial port, in practical applications the processing unit 402 may also use a USB interface or other inter-device connection interfaces to connect to the secondary display unit 404, without going beyond the scope of this specification.
It should also be noted that although in the example shown in FIG. 4 the main display unit 403 supports dual operating systems while the secondary display unit 404 supports a single operating system, in practical applications the secondary display unit 404 may also support dual operating systems. In that case, the secondary display unit 404 may likewise include a switch connecting its own Android board and OPS board, so as to switch the touch signal between the two operating systems it supports.
It can be seen that this example provides a dual-screen interactive touch display device which may include two independent screens. Through one touch unit, the user can control the two independent screens individually or simultaneously, as if operating one screen containing two pictures. Moreover, in this example the main display unit 403 supports not only the Android system but also the Windows system, which improves the compatibility of the dual-screen interactive touch display device and can meet various user needs; the device provided by this example is also easy to implement and low in cost.
As mentioned above, the processing unit may also preconfigure the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area. It should be noted that this correspondence may be a coordinate mapping relationship, usually a preconfigured mathematical expression representing the mapping. From this expression, the coordinates of the user's touch operation within the target display area can be calculated from its coordinates within the area enclosed by the outer frame of the touch unit. Usually, the coordinate mapping can be determined at the same time the display areas are divided and configured. For example, when the output coordinate range of each display device is divided according to the screen sizes and arrangement positions of the multiple display devices, the coordinate mapping can be determined directly after the division, i.e. a mapping is established from coordinates within the range enclosed by the infrared touch frame to coordinates within each display device.
A specific example is used below to describe the operation of the processing unit in detail. Assume a planar rectangular coordinate system represents the area enclosed by the outer frame of the touch unit, i.e. the total display area: X-axis coordinates range from 0 to 255 and Y-axis coordinates from 0 to 100. The area is divided into two display areas: display area 1 has an X-axis range of 0-160 and a Y-axis range of 0-100 and corresponds to the first display unit; display area 2 has an X-axis range of 161-255 and a Y-axis range of 0-100 and corresponds to the second display unit. A mapping from coordinates in the total display area to coordinates of the second display unit can then be established: subtract 160 from the X-axis coordinate and keep the Y-axis coordinate unchanged.
In this case, when the position of the user's touch operation in the total display area is detected as (50, 50), the processing unit may first determine that the position belongs to display area 1, i.e. corresponds to the first display unit. The processing unit then uses the correspondence (e.g., a mapping expression) between the coordinates of positions in the total display area (those in display area 1) and the coordinates of positions in the display area of the first display unit to obtain the position of the touch operation in that display area, e.g. (50, 50), and routes the touch signal through the output port connected to the first display unit. After receiving the touch signal, the first display unit performs the corresponding operation according to its current operation mode and outputs the corresponding response.
In another case, when the position of the user's touch operation in the total display area is detected as (200, 50), the processing unit may first determine that the position belongs to display area 2, i.e. corresponds to the second display unit. The processing unit then uses the correspondence (e.g., a mapping expression) between the coordinates of positions in the total display area (those in display area 2) and the coordinates of positions in the display area of the second display unit to obtain the position of the touch operation in that display area, e.g. (40, 50), and routes the touch signal through the output port connected to the second display unit. After receiving the touch signal, the second display unit performs the corresponding operation according to its current operation mode and outputs the corresponding response.
Of course, in other embodiments of this specification, the processing unit may also directly send the coordinates (200, 50) to the corresponding second display unit, and the second display unit determines the corresponding position on its own screen according to its own configuration.
It can be seen that, in the embodiments of this specification, by dividing the total display area corresponding to the touch display device into at least two display areas, the at least two display areas can be associated with at least two display units, thereby realizing a multi-screen interactive touch display device. Thus, when the user performs a touch operation within the total display area, the processing unit can determine the display area to which the operation belongs according to the position of the touch operation, and hence the output port through which the touch signal is output, so that the touch signal is sent through the corresponding output port to the corresponding display unit, which performs the corresponding operation and gives the corresponding response. In this way, the touch operation of the multi-screen interactive touch display device is realized: the screens can be operated individually or simultaneously, and a hardware basis is even provided for associated operations between multiple screens.
Corresponding to the above touch display device, the embodiments of this specification also provide a touch display method, which can be implemented by a touch display device.
FIG. 5 is a flowchart of the touch display method according to an embodiment of this specification. As shown in FIG. 5, the method may include:
Step 502: after detecting a user's touch operation, generate a touch signal.
In the embodiments of this specification, the touch signal may include the coordinates of the user's touch operation within the total display area corresponding to the touch display device.
Step 504: determine, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device.
Step 506: send the touch signal to the target display unit corresponding to the touch operation.
After receiving the touch signal, the target display unit performs a corresponding operation according to the touch signal and displays the operation result.
In the embodiments of this specification, the method may further include: dividing the total display area corresponding to the touch display device into a first display area and a second display area, wherein the first display area corresponds to the first display unit and the second display area corresponds to the second display unit; wherein
determining, in step 504, the target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device according to the touch signal may include: determining the target display area to which the coordinates belong, and taking the display unit corresponding to the target display area as the target display unit, wherein the target display area is one of the first display area and the second display area.
In the embodiments of this specification, the method may further include: preconfiguring the correspondence between coordinates in the total display area and coordinates in the first display area, and the correspondence between coordinates in the total display area and coordinates in the second display area.
In this case, before step 506, the method may further perform: determining, according to the correspondence and the coordinates of the touch operation in the total display area, the coordinates of the touch operation in the target display area, and replacing the coordinates in the touch signal with the coordinates in the target display area.
It can be seen that, in the embodiments of this specification, when the user performs a touch operation within the total display area, the processing unit can determine the display unit corresponding to the operation according to the position of the touch operation, and the corresponding display unit performs the corresponding operation and gives the corresponding response, realizing multi-screen interactive touch operation. Specifically, this correspondence can be achieved by dividing the total display area corresponding to the touch display device into two display areas and associating the two display areas one-to-one with the two display units. Through the touch display method of this specification, the multiple screens of the touch display device can be operated individually or simultaneously, and a hardware and software foundation is even provided for associated operations between multiple screens.
The specific embodiments of this specification have been described above. Other embodiments fall within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve the desired results. In some implementations, multitasking and parallel processing are also possible or may be advantageous.
For convenience of description, the above device is described with its functions divided into various modules. Of course, when implementing one or more embodiments of this specification, the functions of the modules may be implemented in one or more pieces of software and/or hardware.
Corresponding to the above touch display device and touch display method, the embodiments of this specification also provide a non-transitory computer-readable storage medium storing computer instructions, the computer instructions being used to cause a computer to execute the above touch display method.
The computer-readable medium of this embodiment includes permanent and non-permanent, removable and non-removable media, in which information storage can be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices.
Those of ordinary skill in the art should understand that the discussion of any of the above embodiments is merely exemplary and is not intended to imply that the scope of the present disclosure (including the claims) is limited to these examples. Under the idea of the present disclosure, the technical features of the above embodiments or of different embodiments may also be combined, the steps may be implemented in any order, and there are many other variations of the different aspects of one or more embodiments of this specification as described above, which are not provided in detail for the sake of brevity.
In addition, to simplify the description and discussion, and in order not to obscure one or more embodiments of this specification, well-known power/ground connections to integrated circuit (IC) chips and other components may or may not be shown in the provided drawings. Furthermore, devices may be shown in block diagram form in order to avoid obscuring one or more embodiments of this specification, and this also takes into account the fact that details of the implementation of these block-diagram devices are highly dependent on the platform on which one or more embodiments of this specification are to be implemented (i.e., these details should be fully within the understanding of those skilled in the art). Where specific details (e.g., circuits) are set forth to describe exemplary embodiments of the present disclosure, it will be apparent to those skilled in the art that one or more embodiments of this specification may be practiced without these specific details or with variations of them. Accordingly, these descriptions should be regarded as illustrative rather than restrictive.
Although the present disclosure has been described in conjunction with its specific embodiments, many substitutions, modifications and variations of these embodiments will be apparent to those of ordinary skill in the art from the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
One or more embodiments of this specification are intended to cover all such substitutions, modifications and variations that fall within the broad scope of the appended claims. Therefore, any omissions, modifications, equivalent substitutions, improvements, etc. made within the spirit and principles of one or more embodiments of this specification shall be included within the protection scope of the present disclosure.

Claims (13)

  1. A touch display device, comprising:
    a first display unit, a second display unit, a touch unit, and a processing unit connecting the first display unit, the second display unit and the touch unit; wherein
    the touch unit is configured to detect a user's touch operation, generate a touch signal, and send the generated touch signal to the processing unit;
    the processing unit is configured to determine, according to the touch signal, a target display unit corresponding to the touch operation from among the first display unit and the second display unit, and to send the touch signal to the target display unit corresponding to the touch operation; and
    after receiving the touch signal, the first display unit and/or the second display unit serving as the target display unit performs a corresponding operation according to the touch signal and displays an operation result.
  2. The touch display device according to claim 1, wherein
    the processing unit is configured to divide, in advance, a total display area corresponding to the touch display device into a first display area and a second display area; wherein the first display area corresponds to the first display unit, and the second display area corresponds to the second display unit;
    the touch signal comprises: coordinates of the touch operation within the total display area;
    the processing unit is configured to determine a target display area to which the coordinates belong, and to take the display unit corresponding to the target display area as the target display unit; wherein the target display area is at least one of the first display area and the second display area.
  3. The touch display device according to claim 2, wherein
    the processing unit is configured to preconfigure a correspondence between coordinates in the total display area and coordinates in the first display area, and a correspondence between coordinates in the total display area and coordinates in the second display area;
    the processing unit is further configured to determine, according to the correspondence and the coordinates of the touch operation in the total display area, coordinates of the touch operation in the target display area, and to replace the coordinates in the touch signal with the coordinates in the target display area.
  4. The touch display device according to claim 1, wherein the first display unit comprises: a first display board and a second display board connected to the processing unit, a changeover switch connecting the first display board and the second display board, and a display screen connected to the first display board and the second display board; wherein the first display board corresponds to a first display system and the second display board corresponds to a second display system; wherein
    in response to the current operating system of the first display unit being the second display system, the changeover switch receives the touch signal from the first display board and outputs it to the second display board.
  5. The touch display device according to claim 1, wherein the second display unit comprises: a display board connected to the processing unit and a display screen connected to the display board.
  6. The touch display device according to claim 1, wherein the first display unit and the second display unit are placed side by side and share the touch unit.
  7. The touch display device according to claim 6, wherein the touch unit comprises: an outer frame, and sensor emitting elements and receiving sensing elements mounted on the outer frame; wherein
    the outer frame surrounds the peripheral edge of the geometric figure obtained by splicing the first display unit and the second display unit placed side by side;
    the sensor emitting elements and the receiving sensing elements are configured to detect the user's touch operation within the area enclosed by the outer frame and generate the touch signal.
  8. The touch display device according to claim 7, wherein the sensor emitting elements are infrared emitting elements, and the receiving sensing elements are infrared sensing elements.
  9. The touch display device according to claim 1, wherein the first display unit and the second display unit are electronic whiteboards.
  10. A touch display method, comprising:
    after detecting a user's touch operation, generating a touch signal;
    determining, according to the touch signal, a target display unit corresponding to the touch operation from among a first display unit and a second display unit included in a touch display device; and
    sending the touch signal to the target display unit corresponding to the touch operation, the target display unit performing a corresponding operation according to the touch signal and displaying an operation result.
  11. The touch display method according to claim 10, further comprising: dividing a total display area corresponding to the touch display device into a first display area and a second display area; wherein the first display area corresponds to the first display unit, and the second display area corresponds to the second display unit; wherein
    the touch signal comprises: coordinates of the touch operation within the total display area;
    determining, according to the touch signal, the target display unit corresponding to the touch operation from among the first display unit and the second display unit included in the touch display device comprises: determining a target display area to which the coordinates belong, and taking the display unit corresponding to the target display area as the target display unit; wherein the target display area is at least one of the first display area and the second display area.
  12. The touch display method according to claim 11, further comprising:
    preconfiguring a correspondence between coordinates in the total display area and coordinates in the first display area, and a correspondence between coordinates in the total display area and coordinates in the second display area;
    determining, according to the correspondence and the coordinates of the touch operation in the total display area, coordinates of the touch operation in the target display area, and replacing the coordinates in the touch signal with the coordinates in the target display area.
  13. A non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer instructions, and the computer instructions are used to cause a computer to execute the touch display method according to any one of claims 10 to 12.
PCT/CN2021/080401 2020-03-24 2021-03-12 Touch display device, touch display method and storage medium WO2021190326A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010212406.1 2020-03-24
CN202010212406.1A CN113448451B (zh) 2020-03-24 Touch display device, touch display method and storage medium

Publications (1)

Publication Number Publication Date
WO2021190326A1 true WO2021190326A1 (zh) 2021-09-30

Family

ID=77807426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/080401 WO2021190326A1 (zh) 2020-03-24 2021-03-12 Touch display device, touch display method and storage medium

Country Status (2)

Country Link
CN (1) CN113448451B (zh)
WO (1) WO2021190326A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116198525A (zh) * 2023-02-21 2023-06-02 广州小鹏汽车科技有限公司 Vehicle-mounted system control method, vehicle and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482790A (zh) * 2008-01-09 2009-07-15 宏达国际电子股份有限公司 Electronic device capable of transferring objects between two display elements and control method thereof
CN102035934A (zh) * 2009-09-30 2011-04-27 李晓 Dual-screen portable electronic device and management method
CN103176644A (zh) * 2011-12-26 2013-06-26 英业达股份有限公司 Electronic device and operation mode switching method
CN106775397A (zh) * 2016-12-12 2017-05-31 华中师范大学 PCB board and multi-touch method using the PCB board
CN108174027A (zh) * 2017-12-26 2018-06-15 努比亚技术有限公司 Dual-screen control device, dual-screen mobile terminal and dual-screen display method
US20190129596A1 (en) * 2017-11-02 2019-05-02 Dell Products L. P. Defining a zone to perform an action in a dual-screen tablet

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108469945A (zh) * 2018-03-26 2018-08-31 广州视源电子科技股份有限公司 Display terminal and control method, device and storage medium thereof
CN108572766A (zh) * 2018-04-24 2018-09-25 京东方科技集团股份有限公司 Touch display device and touch detection method
CN110716661B (zh) * 2019-09-04 2023-07-18 广州创知科技有限公司 Spliced smart interactive panel


Also Published As

Publication number Publication date
CN113448451A (zh) 2021-09-28
CN113448451B (zh) 2024-01-23

Similar Documents

Publication Publication Date Title
KR102213212B1 Multi-window control method and electronic device supporting the same
KR102037470B1 Display apparatus, display system and control method thereof
US10235120B2 (en) Display apparatus, display system, and controlling method thereof
CN107589864B Multi-touch display panel and control method and system thereof
CN103729065A System and method for mapping touch operations to physical keys
US20150242038A1 (en) Filter module to direct audio feedback to a plurality of touch monitors
US20100241957A1 (en) System with ddi providing touch icon image summing
KR20140074141A Method for displaying an application execution window in a terminal, and terminal therefor
US20160328149A1 (en) Display screen-based physical button simulation method and electronic device
CN104571683A (zh) 触摸感测系统及其驱动方法
US20150370786A1 (en) Device and method for automatic translation
US9304784B2 (en) Positionally informative remote display selection interface
WO2021190326A1 Touch display device, touch display method and storage medium
TWI423094B Optical touch device and operating method thereof
US20190121594A1 (en) Display device, storage medium storing control program, and control method
WO2017022031A1 Information terminal device
CN109718554A Real-time rendering method and device, and terminal
CN101995987A Large-screen system with multi-point touch
CN108399058A Signal display control method and device
CN105988643A Information processing method and electronic device
WO2022218152A1 Window switching method, storage medium and related device
CN115016671A Touch display control device, touch display device, equipment and method
CN104820489A System and method for managing low-latency direct control feedback
CN210377414U Touch display device
CN103677430B Touch device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21775700

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21775700

Country of ref document: EP

Kind code of ref document: A1