WO2020091815A1 - Display monitors with touch events-based on-screen display menus - Google Patents

Display monitors with touch events-based on-screen display menus Download PDF

Info

Publication number
WO2020091815A1
WO2020091815A1 PCT/US2018/059095
Authority
WO
WIPO (PCT)
Prior art keywords
touch
osd
controller
command
display
Prior art date
Application number
PCT/US2018/059095
Other languages
French (fr)
Inventor
Tao-Sheng CHU
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2018/059095 priority Critical patent/WO2020091815A1/en
Publication of WO2020091815A1 publication Critical patent/WO2020091815A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1601Constructional details related to the housing of computer displays, e.g. of CRT monitors, of flat displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
    • G06F3/04897Special input arrangements or commands for improving display capability

Definitions

  • the display monitors can be desktop displays, digital photo frames, and the like. Further, the display monitors may be externally connected to computing units such as central processing units (CPUs). To provide a visual experience, the display monitors may be designed to have an on-screen display (OSD) menu including display characteristic parameters such as brightness, hue, contrast, and the like.
  • OSD on-screen display
  • FIG. 1A is a block diagram of an example display monitor, including an on-screen display (OSD) controller to display an OSD menu in an OSD menu area;
  • FIG. 1B is a block diagram of the example display monitor of FIG. 1A, depicting additional features
  • FIG. 2 is a block diagram of an example display monitor, including an OSD controller to control options displayed by an OSD menu in an OSD menu area;
  • FIG. 3 is a block diagram of an example computing unit, externally connected to an example display monitor, to provide commands to display an OSD menu in an OSD menu area of the display monitor;
  • FIG. 4A is a block diagram of an example electronic device to display and control an OSD menu in an OSD menu area based on touch events;
  • FIG. 4B is a sequence diagram illustrating an example data flow between components of the example electronic device of FIG. 4A;
  • FIG. 5 is an example flow diagram illustrating displaying and controlling an OSD menu in an OSD menu area based on touch events.
  • a display monitor may include a screen to display graphical data thereon, and a frame to support the screen.
  • Example display monitor may be a desktop monitor, a digital photo frame, or any other stand-alone device.
  • the display monitor may provide an on-screen display (OSD) function.
  • the display monitor with OSD function may include multiple user-operable buttons on the frame or housing of the display monitor. By operating the buttons, the user can open an OSD menu and adjust the display characteristic parameters.
  • functions of the display monitor may be increased, which results in an increased number and complexity of the buttons on the display monitor. Further, the buttons may occupy a hardware space of the display monitor. Furthermore, the buttons may have a noise issue caused by mechanical operation and also result in an additional cost.
  • Examples described herein may provide a display monitor externally connected to a computing unit.
  • the display monitor may include a touch display panel, a touch controller connected to the touch display panel, and an OSD controller connected to the touch display panel.
  • the touch controller may transmit a first touch event received from the touch display panel to a touch controller application (e.g., a touch controller driver) running on the computing unit and execute a first command received from the computing unit to provide an OSD menu area on the touch display panel.
  • the OSD controller may execute a second command received from the computing unit to display an OSD menu in the OSD menu area.
  • FIG. 1A is a block diagram of an example display monitor 100, including an OSD controller 106 to display an OSD menu in an OSD menu area.
  • Example display monitor 100 may be a desktop monitor, a digital photo frame, an electronic paper, or the like.
  • Display monitor 100 may include a touch display panel 102.
  • touch display panel 102 may include a display layer and a touch layer installed or stacked on the display layer. The touch layer may receive touch events and the display layer may be used to output data.
  • display monitor 100 may include a touch controller 104 connected to touch display panel 102 to transmit a first touch event from touch display panel 102 to a touch controller application.
  • the first touch event may include a predefined gesture or a pattern.
  • Example touch controller application may be running on a computing unit that is externally connected to display monitor 100.
  • touch controller 104 may receive a first command from the computing unit based on the first touch event and execute the first command to provide the OSD menu area on touch display panel 102.
  • display monitor 100 may include OSD controller 106 connected to touch display panel 102 to receive a second command from the computing unit based on the first touch event and execute the second command to display the OSD menu in the OSD menu area.
  • touch controller 104 and OSD controller 106 are to simultaneously receive the first command and the second command, respectively, from the computing unit.
  • FIG. 1B is a block diagram of example display monitor 100 of FIG. 1A.
  • display monitor 100 may include an external port 110 to connect display monitor 100 to a computing unit 108.
  • external port 110 may be used to connect a video source to display monitor 100, and may also carry audio, USB, commands, and other forms of data.
  • touch controller 104 may detect a second touch event that occurred within the OSD menu area. Further, OSD controller 106 may trigger a parameter command according to the second touch event output by touch controller 104 and control a display state of touch display panel 102 based on the parameter command. In this example, touch controller 104 may transmit the second touch event to computing unit 108. Further, OSD controller 106 may receive the second touch event from computing unit 108. In other examples, OSD controller 106 may receive the second touch event directly from touch controller 104.
  • the parameter command may control a brightness parameter, a contrast parameter, a volume parameter, an image position control parameter, a color parameter, a display frequency control parameter, or the like.
  • FIG. 2 is a block diagram of an example display monitor 200, including an OSD controller 208 to control options displayed by an OSD menu 212 in an OSD menu area 210.
  • display monitor 200 may include a touch display panel 204 to receive touch events.
  • display monitor 200 may include a touch controller 206 to transmit a first touch event from touch display panel 204 to a touch controller application 214 running on a computing unit 202.
  • computing unit 202 may be externally connected to display monitor 200.
  • the first touch event may include a predetermined pattern.
  • the predetermined pattern can be two continuous touches on a surface of touch display panel 204, a pattern such as a circle drawn on touch display panel 204, or the like.
  • touch controller 206 may provide OSD menu area 210 on touch display panel 204 via executing a first command received from computing unit 202.
  • touch display panel 204 may have OSD menu area 210 thereon, corresponding to OSD menu 212.
  • OSD menu 212 displayed on touch display panel 204 may be executed as an object.
  • display monitor 200 may include OSD controller 208.
  • a user may use touch display panel 204 to drag OSD menu 212 to any position on touch display panel 204.
  • touch controller 206 may receive a second touch event that occurred within OSD menu area 210 and transmit a touch location corresponding to the second touch event to computing unit 202.
  • OSD controller 208 may receive the touch location from computing unit 202 and trigger a corresponding parameter command to control options displayed by OSD menu 212. In one example, OSD controller 208 may determine whether the touch location corresponds to an OSD close command. When the touch location corresponds to the OSD close command, OSD controller 208 may generate the parameter command to close OSD menu 212. Further, OSD controller 208 may notify closing of OSD menu 212 to touch controller 206 via computing unit 202. In one example, when the touch location does not correspond to the OSD close command, OSD controller 208 may generate the parameter command corresponding to the touch location to control a display state of touch display panel 204.
  • the parameter command may be a command to control a brightness parameter, a contrast parameter, a volume parameter, an image position control parameter, a color parameter, a display frequency control parameter, or the like.
  • the components of display monitor 100 or 200 may be implemented in hardware, machine-readable instructions, or a combination thereof.
  • touch controller 104 or 206 and OSD controller 106 or 208 may be implemented as engines or modules comprising any combination of hardware and programming to implement the functionalities described herein.
  • each of touch controller 104 or 206 and OSD controller 106 or 208 may be a combination of circuitry and executable instructions, such as a processor coupled to memory having a control program set of instructions thereon to execute the control program.
  • the execution of the control program of touch controller 104 or 206 may cause operation of corresponding display monitor 100 or 200 to detect touch events and execution of the control program of OSD controller 106 or 208 may cause operation of corresponding display monitor 100 or 200 to display the OSD menu and control the display state of display monitor 100 or 200 in accordance with the examples described herein with respect to FIGs. 1A-2.
  • Display monitor 100 or 200 may include computer-readable storage medium including (e.g., encoded with) instructions executable by a processor to implement functionalities described herein in relation to FIGs. 1A, 1B, and 2.
  • the functionalities described herein, in relation to instructions to implement functions of components of display monitor 100 or 200, and any additional instructions described herein in relation to the storage medium may be implemented as engines or modules including any combination of hardware and programming to implement the functionalities of the modules or engines described herein.
  • the functions of components of display monitor 100 or 200 may also be implemented by a respective processor.
  • the processor may include, for example, one processor or multiple processors included in a single device or distributed across multiple devices.
  • FIG. 3 is a block diagram of an example computing unit 300, externally connected to an example display monitor 302, to provide commands to display an OSD menu in an OSD menu area of display monitor 302.
  • Example computing unit 300 may include a processing unit 306 executing an operating system 308, a touch controller application 310 running on processing unit 306, and a control unit 304 communicatively coupled to processing unit 306.
  • Example control unit 304 may be an embedded controller.
  • touch controller application 310 may receive a first touch event from a touch display panel of display monitor 302 that is externally connected to computing unit 300 via an external port (e.g., a display port).
  • touch controller application 310 may determine that the first touch event matches with an OSD menu open command and generate a first command to display an OSD menu based on the determination.
  • touch controller application 310 may include a logic circuit and a look-up table storing predetermined patterns. When the first touch event is received, touch controller application 310 may determine whether the first touch event complies with the predetermined pattern corresponding to the OSD menu open command. When the first touch event complies with the predetermined pattern, touch controller application 310 may generate and transmit the first command to control unit 304. When the first touch event does not match with the OSD menu open command, touch controller application 310 may transmit the first touch event to operating system 308.
  • control unit 304 may transmit the first command to an OSD controller 314 of display monitor 302 to launch the OSD menu on the touch display panel. Further, control unit 304 may generate and transmit a second command specifying an OSD menu area corresponding to the OSD menu to touch controller 312 of display monitor 302.
  • control unit 304 may receive a touch location corresponding to a second touch event occurred in the OSD menu area from touch controller 312 and transmit the touch location to OSD controller 314.
  • OSD controller 314 may control a display state of the touch display panel via processing an option displayed by the OSD menu corresponding to the touch location.
  • control unit 304 may receive a notification of exiting the OSD menu from OSD controller 314 and notify exiting of the OSD menu to touch controller 312.
  • the components of computing unit 300 may be implemented in hardware, machine-readable instructions, or a combination thereof.
  • control unit 304 may be implemented as engines or modules comprising any combination of hardware and programming to implement the functionalities described herein.
  • control unit 304 may be a combination of circuitry and executable instructions, such as a processor coupled to memory having a control program set of instructions thereon to execute the control program.
  • the execution of the control program of control unit 304 may cause operation of computing unit 300 to provide the commands to display the OSD menu in the OSD menu area of display monitor 302 in accordance with the examples described herein with respect to FIG. 3.
  • Computing unit 300 may include computer-readable storage medium including (e.g., encoded with) instructions executable by a processor to implement functionalities described herein in relation to FIG. 3.
  • the functionalities described herein, in relation to instructions to implement functions of components of computing unit 300, and any additional instructions described herein in relation to the storage medium may be implemented as engines or modules including any combination of hardware and programming to implement the functionalities of the modules or engines described herein.
  • the functions of components of computing unit 300 may also be implemented by a respective processor.
  • the processor may include, for example, one processor or multiple processors included in a single device or distributed across multiple devices.
  • FIG. 4A is a block diagram of an example electronic device 400 to display and control an OSD menu in an OSD menu area based on touch events.
  • Example electronic device 400 may be a desktop computer, where a display monitor 402 may be externally connected to a computing unit 404 via an external port (e.g., a display port) of display monitor 402.
  • display monitor 402 may include a touch display panel 406, an OSD controller 408 coupled to touch display panel 406, and a touch controller 410 coupled to touch display panel 406.
  • computing unit 404 may include a processing unit 412 and a control unit 414 coupled to processing unit 412.
  • processing unit 412 may be communicatively connected to touch controller 410 to receive a touch event. Further, a touch controller application may run on processing unit 412 to determine that the touch event matches with the OSD menu open command and provide a corresponding first command to control unit 414.
  • Example processing unit 412 may include a platform controller hub (PCH). The PCH may control data paths and support functions used in conjunction with central processing units (CPUs).
  • control unit 414 may be communicatively connected to OSD controller 408 to transmit the first command to launch the OSD menu on touch display panel 406. Further, control unit 414 may be communicatively connected to touch controller 410 to transmit a second command specifying an OSD menu area corresponding to the OSD menu.
  • the operation or data flow between different components or process objects of display monitor 402 and computing unit 404 is described in FIG. 4B.
  • FIG. 4B is a sequence diagram 450 illustrating an example data flow between components of electronic device 400 of FIG. 4A.
  • the sequence diagram may represent the interactions and the operations involved in electronic device 400 to display the OSD menu in the OSD menu area based on a touch event.
  • FIG. 4B illustrates process objects including touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414 along with their respective vertical lines originating from them.
  • the vertical lines of touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414 may represent the processes that may exist simultaneously.
  • the horizontal arrows may represent the data flow steps between the vertical lines originating from their respective process objects (e.g., touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414). Further, activation boxes (e.g., 456, 460, and 476) between the horizontal arrows may represent the process that is being performed in the respective process object.
  • touch controller 410 may detect a first touch event that occurred in touch display panel 406. Further, touch controller 410 may transmit the detected first touch event to processing unit 412, at 454. At 456, processing unit 412 may determine that the first touch event matches with the OSD menu open command and generate the first command to display the OSD menu based on the determination. At 458, the first command may be transmitted to control unit 414.
  • control unit 414 may generate the second command to specify the OSD menu area corresponding to the OSD menu, at 460.
  • control unit 414 may transmit the first command to OSD controller 408.
  • control unit 414 may transmit the second command to touch controller 410. In one example, control unit 414 may transmit the first command and the second command simultaneously.
  • touch controller 410 may execute the second command to provide the OSD menu area on touch display panel 406.
  • OSD controller 408 may execute the first command to display the OSD menu in the OSD menu area on touch display panel 406.
  • touch controller 410 may detect a second touch event that occurred in the OSD menu area in touch display panel 406. At 472, touch controller 410 may transmit a touch location corresponding to the second touch event to control unit 414. At 474, control unit 414 may transmit the touch location to OSD controller 408.
  • OSD controller 408 may trigger a corresponding parameter command to control options displayed by the OSD menu.
  • OSD controller 408 may determine whether the touch location corresponds to an OSD close command. When the touch location corresponds to the OSD close command, OSD controller 408 may generate the parameter command to close the OSD menu. Further, OSD controller 408 may notify closing of the OSD menu to control unit 414, at 478. At 480, control unit 414 may notify closing of the OSD menu to touch controller 410. When the touch location does not correspond to the OSD close command, OSD controller 408 may generate the parameter command corresponding to the touch location to control a display state of touch display panel 406.
  • OSD controller 408 may control the display state of touch display panel 406 based on the parameter command.
  • examples described herein may enable use of a graphical user interface (GUI) provided by an application/program running on computing unit 404 to control the display characteristic parameters displayed by the OSD menu through control unit 414.
  • FIG. 5 is an example flow diagram 500 illustrating displaying and controlling an OSD menu in an OSD menu area based on touch events.
  • the process depicted in FIG. 5 represents generalized illustrations, and that other processes may be added, or existing processes may be removed, modified, or rearranged without departing from the scope and spirit of the present application.
  • the processes may represent instructions stored on a computer-readable storage medium that, when executed, may cause a processor to respond, to perform actions, to change states, and/or to make decisions.
  • the processes may represent functions and/or actions performed by functionally equivalent circuits like analog circuits, digital signal processing circuits, application specific integrated circuits (ASICs), or other hardware components associated with the system.
  • the flow charts are not intended to limit the implementation of the present application, but rather the flow charts illustrate functional information to design/fabricate circuits, generate machine-readable instructions, or use a combination of hardware and machine-readable instructions to perform the illustrated processes.
  • an electronic device including a display monitor and a computing unit externally connected to the display monitor
  • a boot sequence (e.g., an initial set of operations to initialize the electronic device)
  • a check is made by a touch controller of the display monitor to determine whether a first touch event has occurred on a touch display panel of the display monitor. For example, when the electronic device is in power on mode, the touch controller may enter a monitoring state for continuously detecting touch events on the touch display panel.
  • the touch controller may transmit the first touch event to a touch controller application running on the computing unit.
  • a check is made by the touch controller application to determine whether the first touch event matches with an OSD menu open command.
  • the first touch event may be transmitted to an operating system of the computing unit to perform an associated function. Further, the process 500 goes to block 504 to detect a next touch event on the touch display panel.
  • a control unit of the computing unit may receive a first command to launch the OSD menu.
  • the control unit may transmit the first command to an OSD controller of the display monitor to launch the OSD menu on the touch display panel, at 514.
  • the control unit may generate and transmit a second command specifying an OSD menu area corresponding to the OSD menu to the touch controller, at 516.
  • a check is made to determine whether the touch controller has detected a second touch event within the OSD menu area.
  • the touch controller may continuously monitor the occurrence of the second touch event within the OSD menu area.
  • the touch controller may transmit a touch location corresponding to the second touch event to the control unit, at 520.
  • the control unit may transmit the touch location to the OSD controller.
  • a check is made by the OSD controller to determine whether the touch location corresponds to an OSD close command. When the touch location does not correspond to the OSD close command, the OSD controller may process OSD menu selection corresponding to the touch location, at 526. Further, the process 500 goes to block 518 to detect a subsequent touch event within the OSD menu area.
  • the OSD controller may generate a parameter command to close the OSD menu and notify closing of the OSD menu to the control unit.
  • the control unit may notify closing of the OSD menu to the touch controller.
  • examples described in FIGs. 1A-5 may eliminate physical buttons, knobs, or touch-sensing buttons, and can resolve the problems caused by using buttons, knobs, or touch-sensing buttons to control the OSD function.
  • Examples described herein may enhance the appearance design of the display monitor by reducing/eliminating the number of physical buttons on the display monitor.
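The FIG. 4B command flow described above (gesture match on the processing unit, first command routed to the OSD controller, second command routed to the touch controller) can be sketched as follows. This is an illustrative sketch only: the class names, method names, dict-based command format, and the fixed menu rectangle are all assumptions, as the patent does not specify an implementation.

```python
# Sketch of the computing-unit side of the FIG. 4B data flow.
# All names and the command format are illustrative assumptions.

OSD_OPEN_GESTURE = "double_tap"  # e.g., two continuous touches on the panel

class TouchControllerApp:
    """Runs on the processing unit; matches touch events against a
    predetermined pattern corresponding to the OSD menu open command."""

    def __init__(self, control_unit, operating_system_queue):
        self.control_unit = control_unit
        self.os_queue = operating_system_queue

    def on_touch_event(self, event):
        if event.get("gesture") == OSD_OPEN_GESTURE:
            # First touch event matches the OSD menu open command:
            # generate the first command (display the OSD menu).
            self.control_unit.on_first_command({"action": "open_osd"})
        else:
            # Unmatched events pass through to the operating system.
            self.os_queue.append(event)

class ControlUnit:
    """Embedded controller; routes commands to the monitor's controllers."""

    def __init__(self, osd_controller, touch_controller):
        self.osd_controller = osd_controller
        self.touch_controller = touch_controller

    def on_first_command(self, first_command):
        # Generate the second command specifying the OSD menu area, then
        # transmit both commands (possibly simultaneously, per the patent).
        area = (0, 0, 200, 300)  # assumed menu rectangle (x, y, w, h)
        self.osd_controller.execute({**first_command, "area": area})
        self.touch_controller.execute({"action": "set_osd_area", "area": area})
```

The two monitor-side controllers are stubs here; the point of the sketch is the routing: one command launches the menu, the other tells the touch controller which region of the panel now belongs to the OSD.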

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In one example, a display monitor may include a touch display panel and a touch controller connected to the touch display panel. The touch controller may transmit a first touch event from the touch display panel to a touch controller application running on a computing unit that is externally connected to the display monitor, receive a first command from the computing unit based on the first touch event, and execute the first command to provide an on-screen display (OSD) menu area on the touch display panel. Further, the display monitor may include an OSD controller connected to the touch display panel to receive a second command from the computing unit based on the first touch event and execute the second command to display an OSD menu in the OSD menu area.

Description

DISPLAY MONITORS WITH TOUCH EVENTS-BASED ON-SCREEN DISPLAY
MENUS
BACKGROUND
[0001] With the advancement in electronic products, touch display panels are integrated into display monitors. The display monitors can be desktop displays, digital photo frames, and the like. Further, the display monitors may be externally connected to computing units such as central processing units (CPUs). To provide a visual experience, the display monitors may be designed to have an on-screen display (OSD) menu including display characteristic parameters such as brightness, hue, contrast, and the like.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Examples are described in the following detailed description and in reference to the drawings, in which:
[0003] FIG. 1A is a block diagram of an example display monitor, including an on-screen display (OSD) controller to display an OSD menu in an OSD menu area;
[0004] FIG. 1B is a block diagram of the example display monitor of FIG. 1A, depicting additional features;
[0005] FIG. 2 is a block diagram of an example display monitor, including an OSD controller to control options displayed by an OSD menu in an OSD menu area;
[0006] FIG. 3 is a block diagram of an example computing unit, externally connected to an example display monitor, to provide commands to display an OSD menu in an OSD menu area of the display monitor;
[0007] FIG. 4A is a block diagram of an example electronic device to display and control an OSD menu in an OSD menu area based on touch events;
[0008] FIG. 4B is a sequence diagram illustrating an example data flow between components of the example electronic device of FIG. 4A; and
[0009] FIG. 5 is an example flow diagram illustrating displaying and controlling an OSD menu in an OSD menu area based on touch events.
DETAILED DESCRIPTION
[0010] A display monitor may include a screen to display graphical data thereon, and a frame to support the screen. Example display monitor may be a desktop monitor, a digital photo frame, or any other stand-alone device. To enable a user to adjust display characteristic parameters (e.g., brightness, hue, contrast, size, position, horizontal/vertical scanning frequency, and so on) of the screen, the display monitor may provide an on-screen display (OSD) function. For example, the display monitor with OSD function may include multiple user-operable buttons on the frame or housing of the display monitor. By operating the buttons, the user can open an OSD menu and adjust the display characteristic parameters.
[0011] However, with the development of technology, the functions of the display monitor may increase, which results in an increased number and complexity of the buttons on the display monitor. Further, the buttons may occupy hardware space on the display monitor. Furthermore, the buttons may have a noise issue caused by mechanical operation and also result in additional cost.
[0012] Examples described herein may provide a display monitor externally connected to a computing unit. The display monitor may include a touch display panel, a touch controller connected to the touch display panel, and an OSD controller connected to the touch display panel. The touch controller may transmit a first touch event received from the touch display panel to a touch controller application (e.g., a touch controller driver) running on the computing unit and execute a first command received from the computing unit to provide an OSD menu area on the touch display panel. Further, the OSD controller may execute a second command received from the computing unit to display an OSD menu in the OSD menu area.
[0013] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present techniques. It will be apparent, however, to one skilled in the art that the present apparatus, devices, and systems may be practiced without these specific details. Reference in the specification to "an example" or similar language means that a particular feature, structure, or characteristic described may be included in at least that one example, but not necessarily in other examples.
[0014] Referring now to the figures, FIG. 1A is a block diagram of an example display monitor 100, including an OSD controller 106 to display an OSD menu in an OSD menu area. Example display monitor 100 may be a desktop monitor, a digital photo frame, an electronic paper, or the like. Display monitor 100 may include a touch display panel 102. In one example, touch display panel 102 may include a display layer and a touch layer installed or stacked on the display layer. The touch layer may receive touch events and the display layer may be used to output data.
[0015] Further, display monitor 100 may include a touch controller 104 connected to touch display panel 102 to transmit a first touch event from touch display panel 102 to a touch controller application. For example, the first touch event may include a predefined gesture or a pattern. The example touch controller application may be running on a computing unit that is externally connected to display monitor 100. Further, touch controller 104 may receive a first command from the computing unit based on the first touch event and execute the first command to provide the OSD menu area on touch display panel 102.
[0016] Furthermore, display monitor 100 may include OSD controller 106 connected to touch display panel 102 to receive a second command from the computing unit based on the first touch event and execute the second command to display the OSD menu in the OSD menu area. In one example, touch controller 104 and OSD controller 106 are to simultaneously receive the first command and the second command, respectively, from the computing unit.
[0017] FIG. 1B is a block diagram of example display monitor 100 of FIG. 1A, depicting additional features. For example, similarly named elements of FIG. 1B may be similar in structure and/or function to elements described with respect to FIG. 1A. As shown in FIG. 1B, display monitor 100 may include an external port 110 to connect display monitor 100 to a computing unit 108. For example, external port 110 may be used to connect a video source to display monitor 100, and may also carry audio, USB, commands, and other forms of data.
[0018] In one example, touch controller 104 may detect a second touch event that occurred within the OSD menu area. Further, OSD controller 106 may trigger a parameter command according to the second touch event output by touch controller 104 and control a display state of touch display panel 102 based on the parameter command. In this example, touch controller 104 may transmit the second touch event to computing unit 108. Further, OSD controller 106 may receive the second touch event from computing unit 108. In other examples, OSD controller 106 may receive the second touch event directly from touch controller 104. For example, the parameter command may control a brightness parameter, a contrast parameter, a volume parameter, an image position control parameter, a color parameter, a display frequency control parameter, or the like.
[0019] FIG. 2 is a block diagram of an example display monitor 200, including an OSD controller 208 to control options displayed by an OSD menu 212 in an OSD menu area 210. As shown in FIG. 2, display monitor 200 may include a touch display panel 204 to receive touch events. Further, display monitor 200 may include a touch controller 206 to transmit a first touch event from touch display panel 204 to a touch controller application 214 running on a computing unit 202. In this example, computing unit 202 may be externally connected to display monitor 200. For example, the first touch event may include a predetermined pattern. The predetermined pattern can be two continuous touches on a surface of touch display panel 204, a pattern such as a circle drawn on touch display panel 204, or the like.
[0020] Further, touch controller 206 may provide OSD menu area 210 on touch display panel 204 via executing a first command received from computing unit 202. Thus, touch display panel 204 may have OSD menu area 210 thereon corresponding to OSD menu 212. In one example, OSD menu 212 displayed on touch display panel 204 may be executed as an object.
[0021] As shown in FIG. 2, display monitor 200 may include OSD controller 208 to display OSD menu 212 in OSD menu area 210 via executing a second command received from computing unit 202. In one example, a user may use touch display panel 204 to drag OSD menu 212 to any position on touch display panel 204. Further, touch controller 206 may receive a second touch event that occurred within OSD menu area 210 and transmit a touch location corresponding to the second touch event to computing unit 202.
[0022] Further, OSD controller 208 may receive the touch location from computing unit 202 and trigger a corresponding parameter command to control options displayed by OSD menu 212. In one example, OSD controller 208 may determine whether the touch location corresponds to an OSD close command. When the touch location corresponds to the OSD close command, OSD controller 208 may generate the parameter command to close OSD menu 212. Further, OSD controller 208 may notify closing of OSD menu 212 to touch controller 206 via computing unit 202. In one example, when the touch location does not correspond to the OSD close command, OSD controller 208 may generate the parameter command corresponding to the touch location to control a display state of touch display panel 204. For example, the parameter command may be a command to control a brightness parameter, a contrast parameter, a volume parameter, an image position control parameter, a color parameter, a display frequency control parameter, or the like.
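To make the decision described in paragraph [0022] concrete, the following sketch shows one way an OSD controller could map a received touch location either to a close command or to a parameter command. The menu geometry, option names, and command strings are illustrative assumptions for explanation only; they are not part of the disclosed implementation.

```python
# Hypothetical sketch of an OSD controller handling a touch location.
# All coordinates and command names below are assumed for illustration.

CLOSE_BUTTON = (280, 300, 0, 20)  # x_min, x_max, y_min, y_max (assumed close-button area)
OPTIONS = {
    "brightness": (0, 100, 40, 60),  # assumed option hit areas within the OSD menu
    "contrast": (0, 100, 70, 90),
}

def _inside(loc, area):
    """Return True if the (x, y) location falls within the rectangular area."""
    x, y = loc
    x_min, x_max, y_min, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max

def handle_touch_location(loc):
    """Map a touch location inside the OSD menu area to a parameter command."""
    if _inside(loc, CLOSE_BUTTON):
        return "CLOSE_OSD_MENU"  # closing is also notified back via the computing unit
    for name, area in OPTIONS.items():
        if _inside(loc, area):
            return f"ADJUST_{name.upper()}"
    return None  # touch hit no displayed option

print(handle_touch_location((290, 10)))
print(handle_touch_location((50, 50)))
```

A real controller would then apply the resulting command to the panel's display state (brightness, contrast, and so on) rather than returning a string.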
[0023] In one example, the components of display monitor 100 or 200 may be implemented in hardware, machine-readable instructions, or a combination thereof. In one example, touch controller 104 or 206 and OSD controller 106 or 208 may be implemented as engines or modules comprising any combination of hardware and programming to implement the functionalities described herein. For example, each of touch controller 104 or 206 and OSD controller 106 or 208 may be a combination of circuitry and executable instructions, such as a processor coupled to memory having a control program set of instructions thereon to execute the control program. The execution of the control program of touch controller 104 or 206 may cause operation of corresponding display monitor 100 or 200 to detect touch events and execution of the control program of OSD controller 106 or 208 may cause operation of corresponding display monitor 100 or 200 to display the OSD menu and control the display state of display monitor 100 or 200 in accordance with the examples described herein with respect to FIGs. 1A-2.
[0024] Display monitor 100 or 200 may include a computer-readable storage medium including (e.g., encoded with) instructions executable by a processor to implement functionalities described herein in relation to FIGs. 1A, 1B, and 2. In some examples, the functionalities described herein, in relation to instructions to implement functions of components of display monitor 100 or 200, and any additional instructions described herein in relation to the storage medium, may be implemented as engines or modules including any combination of hardware and programming to implement the functionalities of the modules or engines described herein. The functions of components of display monitor 100 or 200 may also be implemented by a respective processor. In examples described herein, the processor may include, for example, one processor or multiple processors included in a single device or distributed across multiple devices.
[0025] FIG. 3 is a block diagram of an example computing unit 300, externally connected to an example display monitor 302, to provide commands to display an OSD menu in an OSD menu area of display monitor 302. Example computing unit 300 may include a processing unit 306 executing an operating system 308, a touch controller application 310 running on processing unit 306, and a control unit 304 communicatively coupled to processing unit 306. Example control unit 304 may be an embedded controller. In one example, touch controller application 310 may receive a first touch event from a touch display panel of display monitor 302 that is externally connected to computing unit 300 via an external port (e.g., a display port).
[0026] Further, touch controller application 310 may determine that the first touch event matches with an OSD menu open command and generate a first command to display an OSD menu based on the determination. In one example, touch controller application 310 may include a logic circuit and a look-up table storing predetermined patterns. When the first touch event is received, touch controller application 310 may determine whether the first touch event complies with the predetermined pattern corresponding to the OSD menu open command. When the first touch event complies with the predetermined pattern, touch controller application 310 may generate and transmit the first command to control unit 304. When the first touch event does not match with the OSD menu open command, touch controller application 310 may transmit the first touch event to operating system 308.
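The dispatch logic described above (match the touch event against a look-up table of predetermined patterns, then either generate the first command or pass the event to the operating system) can be sketched as follows. The pattern names, the event representation, and the routing labels are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the touch controller application's dispatch logic.
# Pattern names and the event/command representations are assumed.

# Look-up table of predetermined patterns that correspond to the
# OSD menu open command (e.g., two continuous touches, a drawn circle).
OSD_OPEN_PATTERNS = {"double_tap", "circle"}

def classify(touch_event):
    """Reduce a raw touch event to a named pattern (assumed helper)."""
    return touch_event.get("pattern")

def dispatch(touch_event):
    """Route a touch event: OSD open command to the control unit,
    or pass-through to the operating system for normal input handling."""
    if classify(touch_event) in OSD_OPEN_PATTERNS:
        return ("to_control_unit", "OSD_MENU_OPEN")  # first-command path
    return ("to_operating_system", touch_event)      # normal input path

print(dispatch({"pattern": "circle"}))
print(dispatch({"pattern": "swipe"}))
```

In the described architecture, a gesture recognizer would produce the pattern classification; the table look-up itself is the only part the paragraph specifies.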
[0027] Further, control unit 304 may transmit the first command to an OSD controller 314 of display monitor 302 to launch the OSD menu on the touch display panel. Further, control unit 304 may generate and transmit a second command specifying an OSD menu area corresponding to the OSD menu to touch controller 312 of display monitor 302.
[0028] Furthermore, control unit 304 may receive a touch location corresponding to a second touch event that occurred in the OSD menu area from touch controller 312 and transmit the touch location to OSD controller 314. In one example, OSD controller 314 may control a display state of the touch display panel via processing an option displayed by the OSD menu corresponding to the touch location. Further, when the second touch event corresponds to closing of the OSD menu, control unit 304 may receive a notification of exiting the OSD menu from OSD controller 314 and notify exiting of the OSD menu to touch controller 312.
[0029] In one example, the components of computing unit 300 may be implemented in hardware, machine-readable instructions, or a combination thereof. In one example, control unit 304 may be implemented as an engine or module comprising any combination of hardware and programming to implement the functionalities described herein. For example, control unit 304 may be a combination of circuitry and executable instructions, such as a processor coupled to memory having a control program set of instructions thereon to execute the control program. The execution of the control program of control unit 304 may cause operation of computing unit 300 to provide the commands to display the OSD menu in the OSD menu area of display monitor 302 in accordance with the examples described herein with respect to FIG. 3.
[0030] Computing unit 300 may include a computer-readable storage medium including (e.g., encoded with) instructions executable by a processor to implement functionalities described herein in relation to FIG. 3. In some examples, the functionalities described herein, in relation to instructions to implement functions of components of computing unit 300, and any additional instructions described herein in relation to the storage medium, may be implemented as engines or modules including any combination of hardware and programming to implement the functionalities of the modules or engines described herein. The functions of components of computing unit 300 may also be implemented by a respective processor. In examples described herein, the processor may include, for example, one processor or multiple processors included in a single device or distributed across multiple devices.
[0031] FIG. 4A is a block diagram of an example electronic device 400 to display and control an OSD menu in an OSD menu area based on touch events. Example electronic device 400 may be a desktop computer, where a display monitor 402 may be externally connected to a computing unit 404 via an external port (e.g., a display port) of display monitor 402. As shown in FIG. 4A, display monitor 402 may include a touch display panel 406, an OSD controller 408 coupled to touch display panel 406, and a touch controller 410 coupled to touch display panel 406. Further, computing unit 404 may include a processing unit 412 and a control unit 414 coupled to processing unit 412.
[0032] In one example, processing unit 412 may be communicatively connected to touch controller 410 to receive a touch event. Further, a touch controller application may run on processing unit 412 to determine that the touch event matches with the OSD menu open command and provide a corresponding first command to control unit 414. Example processing unit 412 may include a platform controller hub (PCH). The PCH may control data paths and support functions used in conjunction with central processing units (CPUs).
[0033] In one example, control unit 414 may be communicatively connected to OSD controller 408 to transmit the first command to launch the OSD menu on touch display panel 406. Further, control unit 414 may be communicatively connected to touch controller 410 to transmit a second command specifying an OSD menu area corresponding to the OSD menu. The operation or data flow between different components or process objects of display monitor 402 and computing unit 404 is described in FIG. 4B.
[0034] FIG. 4B is a sequence diagram 450 illustrating an example data flow between components of electronic device 400 of FIG. 4A. For example, similarly named elements of FIG. 4B may be similar in structure and/or function to elements described with respect to FIG. 4A. The sequence diagram may represent the interactions and the operations involved in electronic device 400 to display the OSD menu in the OSD menu area based on a touch event. FIG. 4B illustrates process objects including touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414 along with their respective vertical lines originating from them. The vertical lines of touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414 may represent the processes that may exist simultaneously. The horizontal arrows (e.g., 452, 454, 458, 462, 464, 466, 468, 470, 472, 474, 478, 480, and 482) may represent the data flow steps between the vertical lines originating from their respective process objects (e.g., touch display panel 406, OSD controller 408, touch controller 410, processing unit 412, and control unit 414). Further, activation boxes (e.g., 456, 460, and 476) between the horizontal arrows may represent the process that is being performed in the respective process object.
[0035] At 452, touch controller 410 may detect a first touch event that occurred in touch display panel 406. Further, touch controller 410 may transmit the detected first touch event to processing unit 412, at 454. At 456, processing unit 412 may determine that the first touch event matches with the OSD menu open command and generate the first command to display the OSD menu based on the determination. At 458, the first command may be transmitted to control unit 414.
[0036] Upon receiving the first command, control unit 414 may generate the second command to specify the OSD menu area corresponding to the OSD menu, at 460. At 462, control unit 414 may transmit the first command to OSD controller 408. At 464, control unit 414 may transmit the second command to touch controller 410. In one example, control unit 414 may transmit the first command and the second command simultaneously.
[0037] At 466, touch controller 410 may execute the second command to provide the OSD menu area on touch display panel 406. At 468, OSD controller 408 may execute the first command to display the OSD menu in the OSD menu area on touch display panel 406.
[0038] At 470, touch controller 410 may detect a second touch event that occurred in the OSD menu area in touch display panel 406. At 472, touch controller 410 may transmit a touch location corresponding to the second touch event to control unit 414. At 474, control unit 414 may transmit the touch location to OSD controller 408.
[0039] At 476, OSD controller 408 may trigger a corresponding parameter command to control options displayed by the OSD menu. In one example, OSD controller 408 may determine whether the touch location corresponds to an OSD close command. When the touch location corresponds to the OSD close command, OSD controller 408 may generate the parameter command to close the OSD menu. Further, OSD controller 408 may notify closing of the OSD menu to control unit 414, at 478. At 480, control unit 414 may notify closing of the OSD menu to touch controller 410. When the touch location does not correspond to the OSD close command, OSD controller 408 may generate the parameter command corresponding to the touch location to control a display state of touch display panel 406. At 482, OSD controller 408 may control the display state of touch display panel 406 based on the parameter command. In other examples, a graphical user interface (GUI) provided by an application/program running on computing unit 404 may be used to control the display characteristic parameters displayed by the OSD menu through control unit 414.
[0040] FIG. 5 is an example flow diagram 500 illustrating displaying and controlling an OSD menu in an OSD menu area based on touch events. It should be understood that the process depicted in FIG. 5 represents generalized illustrations, and that other processes may be added, or existing processes may be removed, modified, or rearranged without departing from the scope and spirit of the present application. In addition, it should be understood that the processes may represent instructions stored on a computer-readable storage medium that, when executed, may cause a processor to respond, to perform actions, to change states, and/or to make decisions. In another implementation, the processes may represent functions and/or actions performed by functionally equivalent circuits like analog circuits, digital signal processing circuits, application-specific integrated circuits (ASICs), or other hardware components associated with the system. Furthermore, the flow charts are not intended to limit the implementation of the present application; rather, the flow charts illustrate functional information to design/fabricate circuits, generate machine-readable instructions, or use a combination of hardware and machine-readable instructions to perform the illustrated processes.
[0041] At 502, an electronic device (i.e., a display monitor and a computing unit externally connected to the display monitor) may be switched on, where a boot sequence (e.g., an initial set of operations to initialize the electronic device) may be executed. At 504, a check is made by a touch controller of the display monitor to determine whether a first touch event has occurred on a touch display panel of the display monitor. For example, when the electronic device is in power-on mode, the touch controller may enter a monitoring state for continuously detecting touch events on the touch display panel.
[0042] At 506, the touch controller may transmit the first touch event to a touch controller application running on the computing unit. At 508, a check is made by the touch controller application to determine whether the first touch event matches with an OSD menu open command. At 510, when the first touch event does not match with the OSD menu open command, the first touch event may be transmitted to an operating system of the computing unit to perform an associated function. Further, the process 500 goes to block 504 to detect a next touch event on the touch display panel.
[0043] At 512, when the first touch event matches with the OSD menu open command, a control unit of the computing unit may receive a first command to launch the OSD menu. Upon receiving the first command, the control unit may transmit the first command to an OSD controller of the display monitor to launch the OSD menu on the touch display panel, at 514. Further, the control unit may generate and transmit a second command specifying an OSD menu area corresponding to the OSD menu to the touch controller, at 516.
[0044] At 518, a check is made to determine whether the touch controller has detected a second touch event within the OSD menu area. For example, the touch controller may continuously monitor the occurrence of the second touch event within the OSD menu area. Upon detecting the second touch event, the touch controller may transmit a touch location corresponding to the second touch event to the control unit, at 520. At 522, the control unit may transmit the touch location to the OSD controller. At 524, a check is made by the OSD controller to determine whether the touch location corresponds to an OSD close command. When the touch location does not correspond to the OSD close command, the OSD controller may process the OSD menu selection corresponding to the touch location, at 526. Further, the process 500 goes to block 518 to detect a subsequent touch event within the OSD menu area.
[0045] At 528, when the touch location corresponds to the OSD close command, the OSD controller may generate a parameter command to close the OSD menu and notify closing of the OSD menu to the control unit. At 530, the control unit may notify closing of the OSD menu to the touch controller.
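The control flow of FIG. 5 (blocks 502-530) can be summarized as a simple event loop. The function and callback names below are illustrative assumptions; only the branching structure follows the flow diagram described above.

```python
# Hypothetical event-loop sketch of the FIG. 5 flow. The callbacks stand in
# for the touch controller application and OSD controller behaviors.

def run_osd_session(events, matches_open, in_menu_area, is_close):
    """Process touch events per FIG. 5 and return the list of actions taken."""
    actions = []
    menu_open = False
    for ev in events:
        if not menu_open:
            if matches_open(ev):
                actions.append("launch_osd_menu")   # blocks 512-516
                menu_open = True
            else:
                actions.append("forward_to_os")     # block 510
        elif in_menu_area(ev):
            if is_close(ev):
                actions.append("close_osd_menu")    # blocks 528-530
                menu_open = False
            else:
                actions.append("process_selection")  # block 526
    return actions

# Assumed event stream: a plain tap, the OSD-open gesture, an option touch,
# and a touch on the close control.
acts = run_osd_session(
    ["tap", "circle", "brightness", "close"],
    matches_open=lambda e: e == "circle",
    in_menu_area=lambda e: True,
    is_close=lambda e: e == "close",
)
print(acts)
```

Note that in the disclosed device these branches execute across three components (touch controller, control unit, OSD controller); the single loop here only mirrors the decision points, not the partitioning.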
[0046] Thus, examples described in FIGs. 1A-5 may eliminate physical buttons, knobs, or touch-sensing buttons, and can resolve the problems caused by using buttons, knobs, or touch-sensing buttons to control the OSD function. Examples described herein may enhance the appearance design of the display monitor by reducing/eliminating the number of physical buttons on the display monitor.
[0047] It may be noted that the above-described examples of the present solution are for the purpose of illustration only. Although the solution has been described in conjunction with a specific embodiment thereof, numerous modifications may be possible without materially departing from the teachings and advantages of the subject matter described herein. Other substitutions, modifications, and changes may be made without departing from the spirit of the present solution. All of the features disclosed in this specification (including any accompanying claims, abstract, and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
[0048] The terms "include," "have," and variations thereof, as used herein, have the same meaning as the term "comprise" or an appropriate variation thereof. Furthermore, the term "based on," as used herein, means "based at least in part on." Thus, a feature that is described as based on some stimulus can be based on the stimulus or a combination of stimuli including the stimulus.
[0049] The present description has been shown and described with reference to the foregoing examples. It is understood, however, that other forms, details, and examples can be made without departing from the spirit and scope of the present subject matter that is defined in the following claims.

Claims

WHAT IS CLAIMED IS:
1. A display monitor comprising:
a touch display panel;
a touch controller connected to the touch display panel to:
transmit a first touch event from the touch display panel to a touch controller application running on a computing unit that is externally connected to the display monitor;
receive a first command from the computing unit based on the first touch event; and
execute the first command to provide an on-screen display (OSD) menu area on the touch display panel; and
an OSD controller connected to the touch display panel to:
receive a second command from the computing unit based on the first touch event; and
execute the second command to display an OSD menu in the OSD menu area.
2. The display monitor of claim 1, wherein the touch controller and the OSD controller are to simultaneously receive the first command and the second command, respectively, from the computing unit.
3. The display monitor of claim 1, further comprising:
an external port to connect the display monitor to the computing unit.
4. The display monitor of claim 1, wherein the touch controller is to detect a second touch event occurred within the OSD menu area.
5. The display monitor of claim 4, wherein the OSD controller is to:
trigger a parameter command according to the second touch event output by the touch controller, and
control a display state of the touch display panel based on the parameter command.
6. A display monitor comprising:
a touch display panel;
a touch controller to:
transmit a first touch event from the touch display panel to a touch controller application running on a computing unit that is externally connected to the display monitor; and
provide an on-screen display (OSD) menu area on the touch display panel via executing a first command received from the computing unit; and an OSD controller to:
display an OSD menu in the OSD menu area via executing a second command received from the computing unit, wherein the touch controller is to receive a second touch event occurred in the OSD menu area and transmit a touch location corresponding to the second touch event to the computing unit, and wherein the OSD controller is to receive the touch location from the computing unit and trigger a corresponding parameter command to control options displayed by the OSD menu.
7. The display monitor of claim 6, wherein the OSD controller is to:
determine whether the touch location corresponds to an OSD close command;
generate the parameter command to close the OSD menu when the touch location corresponds to the OSD close command; and
notify closing of the OSD menu to the touch controller via the computing unit.
8. The display monitor of claim 7, wherein the OSD controller is to:
generate the parameter command corresponding to the touch location to control a display state of the touch display panel when the touch location does not correspond to the OSD close command.
9. The display monitor of claim 6, wherein the parameter command is to control a brightness parameter, a contrast parameter, a volume parameter, an image position control parameter, a color parameter, or a display frequency control parameter.
10. The display monitor of claim 6, wherein the first touch event comprises a predetermined pattern.
11. A computing unit comprising:
a processing unit executing an operating system;
a touch controller application running on the processing unit to:
receive a first touch event from a touch display panel of a display monitor that is externally connected to the computing unit via a display port; determine that the first touch event matches with an on-screen display (OSD) menu open command; and
generate a first command to display an OSD menu based on the determination; and
a control unit communicatively coupled to the processing unit to:
transmit the first command to an OSD controller of the display monitor to launch the OSD menu on the touch display panel; and
generate and transmit a second command specifying an OSD menu area corresponding to the OSD menu to a touch controller of the display monitor.
12. The computing unit of claim 11, wherein the control unit is to:
receive a touch location corresponding to a second touch event occurred in the OSD menu area from the touch controller; and
transmit the touch location to the OSD controller, wherein the OSD controller is to control a display state of the touch display panel via processing an option displayed by the OSD menu corresponding to the touch location.
13. The computing unit of claim 12, wherein the control unit is to:
receive a notification of exiting the OSD menu from the OSD controller; and notify exiting of the OSD menu to the touch controller.
14. The computing unit of claim 11, wherein the control unit is an embedded controller.
15. The computing unit of claim 11, wherein the touch controller application running on the processing unit is to transmit the first touch event to the operating system when the first touch event does not match with the OSD menu open command.
PCT/US2018/059095 2018-11-03 2018-11-03 Display monitors with touch events-based on-screen display menus WO2020091815A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2018/059095 WO2020091815A1 (en) 2018-11-03 2018-11-03 Display monitors with touch events-based on-screen display menus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2018/059095 WO2020091815A1 (en) 2018-11-03 2018-11-03 Display monitors with touch events-based on-screen display menus

Publications (1)

Publication Number Publication Date
WO2020091815A1 true WO2020091815A1 (en) 2020-05-07

Family

ID=70461879

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/059095 WO2020091815A1 (en) 2018-11-03 2018-11-03 Display monitors with touch events-based on-screen display menus

Country Status (1)

Country Link
WO (1) WO2020091815A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114237458A (en) * 2022-02-21 2022-03-25 深圳市嘉利达专显科技有限公司 Method, device and storage medium for controlling UI in multidimensional mode independent of OSD position
CN115629686A (en) * 2022-02-21 2023-01-20 深圳市嘉利达专显科技有限公司 Display method and device capable of simultaneously carrying out dual-mode control On Screen Display (OSD)
EP4213014A4 (en) * 2020-10-14 2024-03-27 Huawei Technologies Co., Ltd. Display device control method and display device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080106526A1 (en) * 2006-11-08 2008-05-08 Amtran Technology Co., Ltd. Touch on-screen display control device and control method therefor and liquid crystal display
US20110060987A1 (en) * 2009-09-09 2011-03-10 Qisda Corporation Touch screen display with touch osd and osd control method thereof
US20120098793A1 (en) * 2010-10-20 2012-04-26 Pixart Imaging Inc. On-screen-display module, display device, and electronic device using the same

Similar Documents

Publication Publication Date Title
US11449295B2 (en) Interchangeable device components
US11099534B2 (en) Configuration user interface for a home automation system
WO2020091815A1 (en) Display monitors with touch events-based on-screen display menus
US10509537B2 (en) Display control apparatus, display control method, and program
KR102604570B1 (en) Method for supporting user input and electronic device supporting the same
EP3035177B1 (en) Electronic device and method of controlling object in electronic device
US20130265243A1 (en) Adaptive power adjustment for a touchscreen
CN107111443B (en) Electronic device and display method thereof
US20160026345A1 (en) Operation panel for electronic device
US20160091979A1 (en) Interactive displaying method, control method and system for achieving displaying of a holographic image
US8754872B2 (en) Capacitive touch controls lockout
JP2012504290A5 (en)
KR102580327B1 (en) Electronic device and method for cotrolling of the electronic device
US9516263B2 (en) Automatic configuration of the logical orientation of multiple monitors based on captured images
US20180003520A1 (en) Method for performing function using sensor data and electronic device for providing same
US11216065B2 (en) Input control display based on eye gaze
US20210051245A1 (en) Techniques for presenting video stream next to camera
US20150227300A1 (en) Providing a single-action multi-mode interface
KR102607564B1 (en) Method for displying soft key and electronic device thereof
US20120313838A1 (en) Information processor, information processing method, and computer program product
CN107239222A (en) The control method and terminal device of a kind of touch-screen
KR102272343B1 (en) Method and Electronic Device for operating screen
US20160142624A1 (en) Video device, method, and computer program product
KR20180071873A (en) Screen controlling method and electronic device supporting the same
US20230298492A1 (en) Display control method and terminal device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18938957

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18938957

Country of ref document: EP

Kind code of ref document: A1