US20150002403A1 - Touch screen and method for adjusting touch sensitive object placement thereon - Google Patents


Info

Publication number
US20150002403A1
US20150002403A1 (application US13/927,943)
Authority
US
United States
Prior art keywords
dimension
touch screen
objects
touch
container
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/927,943
Inventor
Martin Dostal
Zdenek Eichler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honeywell International Inc filed Critical Honeywell International Inc
Priority to US13/927,943 priority Critical patent/US20150002403A1/en
Assigned to HONEYWELL INTERNATIONAL INC. reassignment HONEYWELL INTERNATIONAL INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: EICHLER, ZDENEK, Dostal, Martin
Priority to EP14172028.4A priority patent/EP2818994A1/en
Priority to CN201410380328.0A priority patent/CN104252267A/en
Publication of US20150002403A1 publication Critical patent/US20150002403A1/en
Abandoned legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/0017Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G5/0021Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft

Definitions

  • a user interface, for example a touch screen, includes containers, for example, rectangular areas such as menu buttons, toggle buttons, radio buttons, check box buttons, or pull-down menus, that may be modified in accordance with the exemplary embodiments.
  • touch screen refers to a display sensitive to the touch or approach of another object, for example, a finger or a stylus, by determining pressure from the object, a resistance, a capacitance, and the like.
  • These containers may either increase (expand) or decrease (collapse) in size in response to a sensed adverse condition, for example, relative movement between the user interface and the user, that makes it difficult for a user to touch an intended touch-sensitive object (also known as a target or user control), thereby improving the user's ability to reach the desired target on the screen during adverse operating conditions.
  • the most prominent objects may be increased in size to improve the selectability of these interactive screen targets.
  • the prominent objects are increased in size at the expense of non-prominent objects, non-interactive screen areas, or non-prominent screen areas.
  • the expanding of a container increases object spacing within it (the container itself is enlarged), and optionally may increase the size of the objects. Spacing, and optionally size, is increased along the axis or axes in which the increase provides the greatest benefit to the user. Typically, this method increases height and vertical spacing between user controls, because most controls are larger along the horizontal axis than along the vertical axis. A pull-down menu is a typical case.
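The vertical-expansion rule just described can be sketched as follows. The function name and the expansion factor are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical sketch of the first-direction (vertical) expansion: the
# container is enlarged and the vertical spacing between stacked controls
# grows, since most controls are already wide along the horizontal axis.

def expand_menu_vertically(item_height, gap, n_items, scale=1.5):
    """Return (new_gap, new_container_height) after expansion.

    scale is an assumed tunable factor; the patent does not specify one.
    """
    new_gap = int(gap * scale)
    new_height = n_items * item_height + (n_items - 1) * new_gap
    return new_gap, new_height
```

For example, a four-item menu of 40-pixel controls with 8-pixel gaps would grow its gaps to 12 pixels and its container height to 196 pixels.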
  • a container may be expanded in two directions, for example, vertical and horizontal when viewing a screen.
  • if the container cannot be expanded sufficiently in the first direction, it is expanded in a second direction, for example, horizontally.
  • a first portion of the user controls will remain in the same position while a second portion of the user controls will be moved in the second direction by a distance (typically the width of the container in its normal state) in order to improve spacing between controls.
  • the objects of the first portion and the objects of the second portion alternate, wherein the second portion is spaced from the first portion. This change in position of adjacent objects increases spacing between objects to minimize selecting adjacent objects by mistake.
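The alternating repositioning above can be sketched as below; `stagger_objects` is a hypothetical name, and the offset equal to the container's normal width follows the preceding description:

```python
def stagger_objects(positions, container_width):
    """Offset every second object horizontally by the container's normal
    width, so adjacent objects fall into two alternating columns and the
    spacing between neighbouring objects increases."""
    staggered = []
    for i, (x, y) in enumerate(positions):
        offset = container_width if i % 2 == 1 else 0
        staggered.append((x + offset, y))
    return staggered
```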
  • the objects may also expand in size.
  • when the adverse condition has ceased, the container will return to its original shape.
  • the method may be applied to any display controlled by a pointing device or touch, for example in avionics or maritime applications, and may be used on non-integrated Electronic Flight Bags with a touch interface. These are typically not mounted in the pilot's primary view area, so the hand is not supported by an underlying surface (as it is with a cursor control device, for example) and is therefore more likely to be subject to movement.
  • the boundaries of the touch sensitive object generally are related to a symbol associated with the object.
  • the touch screen controller may shift the boundaries of the object so that they change position in relation to the symbol. These modifications may be either symmetric or asymmetric, with the boundary enlarged on only one edge or along only the horizontal axis or the vertical axis, in response to the adverse condition, thereby increasing the area defined by the object.
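A minimal sketch of such per-edge boundary modification, assuming a touch-sensitive rectangle represented as (x0, y0, x1, y1); the function name and parameters are illustrative:

```python
def expand_boundary(bounds, left=0, top=0, right=0, bottom=0):
    """Grow each edge of a touch-sensitive rectangle independently.

    Equal amounts on opposite edges give the symmetric case; unequal
    amounts give the asymmetric case, shifting the sensitive area
    relative to its displayed symbol."""
    x0, y0, x1, y1 = bounds
    return (x0 - left, y0 - top, x1 + right, y1 + bottom)
```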
  • a touch screen is provided for adjusting the positioning of user controls or touch-sensing objects in response to adaptive conditions in which larger touch-sensing objects would be beneficial, for example, movement such as turbulence, aircraft vibration, and/or G forces.
  • the touch screen comprises a display face having a container including a plurality of objects displayed and generally aligned in a first direction.
  • a touch screen controller in response to the adaptive condition, for example, sensed relative movement between the touch screen and the user, is configured to expand the container further in the first direction and enlarge the objects within the container if there is sufficient space for the expansion.
  • otherwise, the container expands in a second direction and repositions second objects, preferably alternating with the first objects, in the second direction.
  • a flight deck display system 100 includes a user interface 102, a processor 104, one or more terrain databases 106 (sometimes referred to as a Terrain Avoidance and Warning System (TAWS)), one or more navigation databases 108, an adaptive (or adverse) condition sensor 124, avionic sensors 112, external data sources 114, and one or more display devices 116.
  • the user interface 102 is in operable communication with the processor 104 and is configured to receive input from a user 109 (e.g., a pilot) and, in response to the user input, supplies command signals to the processor 104 .
  • the user interface 102 may be any one, or combination, of various known user interface devices including, but not limited to, one or more buttons, switches, or knobs (not shown).
  • the user interface 102 includes a touch panel 107 and a touch panel controller 111 .
  • the touch panel controller 111 provides drive signals 113 to a touch panel 107 , and a sense signal 115 is provided from the touch panel 107 to the touch panel controller 111 , which periodically provides a controller signal 117 of the determination of a touch to the processor 104 .
  • the processor 104 interprets the controller signal 117 , determines the application of the digit on the touch panel 107 , and provides, for example, a controller signal 117 to the touch panel controller 111 and a signal 119 to the display device 116 . Therefore, the user 109 uses the touch panel 107 to provide an input as more fully described hereinafter.
  • the processor 104 may be implemented or realized with a general purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination designed to perform the functions described herein.
  • a processor device may be realized as a microprocessor, a controller, a microcontroller, or a state machine.
  • a processor device may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration.
  • the processor 104 includes on-board RAM (random access memory) 103 , and on-board ROM (read-only memory) 105 .
  • the program instructions that control the processor 104 may be stored in either or both the RAM 103 and the ROM 105 .
  • the operating system software may be stored in the ROM 105
  • various operating mode software routines and various operational parameters may be stored in the RAM 103 .
  • the software executing the exemplary embodiment is stored in either the ROM 105 or the RAM 103 . It will be appreciated that this is merely exemplary of one scheme for storing operating system software and software routines, and that various other storage schemes may be implemented.
  • the memory 103 , 105 may be realized as RAM memory, flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
  • the memory 103 , 105 can be coupled to the processor 104 such that the processor 104 can read information from, and write information to, the memory 103 , 105 .
  • the memory 103 , 105 may be integral to the processor 104 .
  • the processor 104 and the memory 103 , 105 may reside in an ASIC.
  • a functional or logical module/component of the display system 100 might be realized using program code that is maintained in the memory 103 , 105 .
  • the memory 103 , 105 can be used to store data utilized to support the operation of the display system 100 , as will become apparent from the following description.
  • the processor 104 is in operable communication with the terrain databases 106 , the navigation databases 108 , and the display devices 116 , and is coupled to receive various types of inertial data from the sensors 112 , and various other avionics-related data from the external data sources 114 .
  • the processor 104 is configured, in response to the inertial data and the avionics-related data, to selectively retrieve terrain data from one or more of the terrain databases 106 and navigation data from one or more of the navigation databases 108 , and to supply appropriate display commands to the display devices 116 .
  • the display devices 116 in response to the display commands, selectively render various types of textual, graphic, and/or iconic information.
  • the adverse condition sensor 124 may be disposed within the display device 116 , on the user 109 , or separate from the display device 116 and the user 109 . However the adverse condition sensor 124 is disposed, it senses adverse conditions, for example, relative movement between the display device 116 and the user 109 .
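One plausible way such a sensor reading could be turned into an adverse-condition flag is a simple deviation threshold over recent accelerometer samples. The function name, threshold, and windowing are assumptions for illustration, not taken from the patent:

```python
def adverse_condition(accel_samples, threshold_g=0.3):
    """Return True when recent acceleration magnitude deviates from its
    mean by more than threshold_g, suggesting turbulence or vibration.

    accel_samples: recent accelerometer magnitudes in g (assumed input)
    threshold_g: assumed tunable threshold
    """
    if len(accel_samples) < 2:
        return False
    mean = sum(accel_samples) / len(accel_samples)
    return max(abs(s - mean) for s in accel_samples) > threshold_g
```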
  • the terrain databases 106 include various types of data representative of the terrain over which the aircraft is flying, and the navigation databases 108 include various types of navigation-related data.
  • the sensors 112 may be implemented using various types of inertial sensors, systems, and/or subsystems, now known or developed in the future, for supplying various types of inertial data, for example, representative of the state of the aircraft including aircraft speed, heading, altitude, and attitude.
  • the ILS 118 provides aircraft with horizontal (or localizer) and vertical (or glide slope) guidance just before and during landing and, at certain fixed points, indicates the distance to the reference point of landing on a particular runway.
  • the GPS receiver 122 is a multi-channel receiver, with each channel tuned to receive one or more of the GPS broadcast signals transmitted by the constellation of GPS satellites (not illustrated) orbiting the earth.
  • the display devices 116 in response to display commands supplied from the processor 104 , selectively render various textual, graphic, and/or iconic information, and thereby supplies visual feedback to the user 109 .
  • the display device 116 may be implemented using any one of numerous known display devices suitable for rendering textual, graphic, and/or iconic information in a format viewable by the user 109 .
  • Non-limiting examples of such display devices include various cathode ray tube (CRT) displays, and various flat panel displays such as various types of LCD (liquid crystal display) and TFT (thin film transistor) displays.
  • the display devices 116 may additionally be implemented as a panel mounted display, or any one of numerous known technologies.
  • the display devices 116 may be configured as any one of numerous types of aircraft flight deck displays. For example, each may be configured as a multi-function display, a horizontal situation indicator, or a vertical situation indicator, just to name a few. In the depicted embodiment, however, one of the display devices 116 is configured as a primary flight display (PFD).
  • the display device 116 is also configured to process the current flight status data for the host aircraft.
  • the sources of flight status data generate, measure, and/or provide different types of data related to the operational status of the host aircraft, the environment in which the host aircraft is operating, flight parameters, and the like.
  • the sources of flight status data may be realized using line replaceable units (LRUs), transducers, accelerometers, instruments, sensors, and other well known devices.
  • the data provided by the sources of flight status data may include, without limitation: airspeed data; groundspeed data; altitude data; attitude data, including pitch data and roll data; yaw data; geographic position data, such as GPS data; time/date information; heading information; weather information; flight path data; track data; radar altitude data; geometric altitude data; wind speed data; wind direction data; etc.
  • the display device 116 is suitably designed to process data obtained from the sources of flight status data in the manner described in more detail herein.
  • a touch screen having at least one container configured to display a plurality of symbols.
  • Symbols as used herein are defined to include alphanumeric characters, icons, signs, words, terms, phrases, and menu items.
  • a particular symbol is selected by sensing the application (touch) of a digit, such as a finger or a stylus, to a touch-sensitive object associated with that symbol.
  • the digit may be swiped, or moved, in a particular direction to enable a desired function.
  • Each display region including a symbol has a touch-sensing object associated therewith for sensing the application and/or movement of the digit or digits.
  • a touch screen 200 in accordance with a first exemplary embodiment includes a face 202 displaying a container, for example a menu 204 , including four objects 206 , 207 , 208 , 209 .
  • the objects are sensitive to a touch for selecting a function and typically contain an icon representative of the function (shown for example, as the letters A, B, C, D, respectively).
  • Objects 206 , 207 , 208 , 209 (shown as dotted lines) are positioned with respect to the icons A, B, C, and D, respectively, and may include a solid outline (not shown), such as a rectangle or circle, surrounding the icons A, B, C, D.
  • the solid outline preferably would cover the same area as the objects 206 , 207 , 208 , 209 , or be slightly within it.
  • the objects 206 , 207 , 208 , 209 are defined by selected pixels as determined by software in the processor 104 . A touching of one of the objects 206 , 207 , 208 , 209 is communicated to the processor 104 , and the function associated with the respective object 206 , 207 , 208 , 209 will be selected.
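The pixel-based selection just described can be sketched as a simple hit test. The rectangle representation and function name are assumptions for illustration:

```python
def hit_test(touch_point, objects):
    """Return the id of the object whose rectangle (x0, y0, x1, y1)
    contains the touch point, or None if no object was hit."""
    tx, ty = touch_point
    for obj_id, (x0, y0, x1, y1) in objects.items():
        if x0 <= tx <= x1 and y0 <= ty <= y1:
            return obj_id
    return None
```

For instance, with objects A and B stacked vertically, a touch landing inside B's rectangle selects B, while a touch outside every rectangle selects nothing.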
  • the container 204 expands in a first direction (vertically as shown in FIG. 3 ) to create the container 304 and the spacing between the objects 306 , 307 , 308 , 309 is increased, thereby increasing the probability of the user being able to touch the intended object 306 , 307 , 308 , 309 .
  • the objects 406 , 407 , 408 , 409 are increased in size over the objects 306 , 307 , 308 , 309 of FIG. 3 .
  • when the container 504 contains a number of objects 506 , 507 , 508 , 509 , 510 , 511 , 512 such that it is unable to expand in a first direction (vertically as shown) and still be displayed on the touch screen 500 , the container 504 expands in a second direction (horizontally as shown) to create the container 604 ( FIG. 6 ), and a first portion of the icons, for example icons 507 , 509 , 511 , are spaced in the second direction from a second portion of the icons, for example, 506 , 508 , 510 , and 512 .
  • alternating objects 506 , 507 , 508 , 509 , 510 , 511 , 512 are repositioned in the second direction as shown.
  • the icons 706 , 707 , 708 , 709 , 710 , 711 , 712 are increased in size over the objects 506 , 507 , 508 , 509 , 510 , 511 , 512 of FIG. 6 .
  • FIG. 8 is a flow chart that illustrates a touch screen process suitable for use with a flight deck display system such as the user interface 102 .
  • Process 800 represents an implementation of a method for selecting symbols on an onboard display element of a host aircraft.
  • the various tasks performed in connection with process 800 may be performed by software, hardware, firmware, or any combination thereof.
  • the following description of process 800 may refer to elements mentioned above in connection with FIGS. 2 through 7 .
  • portions of process 800 may be performed by different elements of the described system, e.g., a processor or a display element. It should be appreciated that process 800 may include any number of additional or alternative tasks, the tasks shown in FIG. 8 need not be performed in the illustrated order, and process 800 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. Moreover, one or more of the tasks shown in FIG. 8 could be omitted from an embodiment of the process 800 as long as the intended overall functionality remains intact.
  • an adaptive condition of a touch screen is determined 802 , the touch screen having a first dimension in a first direction and a second dimension in a second direction.
  • a container is displayed 804 having a third dimension in the first direction that is less than the first dimension and having a plurality of touch sensitive objects aligned in the first direction.
  • the container is further expanded 806 in the first direction to a fourth dimension when the adaptive condition is determined and if the fourth dimension is less than the first dimension.
  • the container is expanded in the second direction and a portion of the objects are repositioned 808 in the second direction when the adaptive condition is determined and if the fourth dimension is greater than the first dimension.
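Steps 802 through 808 can be summarized in one decision function. The expansion factor and return values are illustrative assumptions; the dimension names follow the text, with the fourth dimension being the container's size in the first direction after expansion:

```python
def adapt_container(adaptive, first_dim, third_dim, container_w, scale=1.5):
    """Sketch of process 800: expand in the first direction when the
    expanded (fourth) dimension still fits on the screen; otherwise
    expand in the second direction, where objects would be staggered.

    first_dim: screen size in the first direction
    third_dim: container size in the first direction
    """
    if not adaptive:
        return ('unchanged', third_dim, container_w)
    fourth_dim = int(third_dim * scale)
    if fourth_dim < first_dim:
        return ('expand_first_direction', fourth_dim, container_w)
    return ('expand_second_direction', third_dim, container_w * 2)
```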

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch screen and method are provided for adjusting the positioning of user controls, such as touch-sensing objects, in response to adaptive conditions, for example, movement such as turbulence, aircraft vibration, and/or G forces, in which larger or spaced touch-sensing objects would be beneficial.

Description

    TECHNICAL FIELD
  • The exemplary embodiments described herein generally relate to touch screens and more particularly to modifying touch sensitive object placement.
  • BACKGROUND
  • World wide air traffic is projected to double every ten to fourteen years and the International Civil Aviation Organization (ICAO) forecasts world air travel growth of five percent per annum until the year 2020. Such growth may have an influence on flight performance and may increase the workload of the flight crew. One such influence on flight performance has been the ability for the flight crew to input data while paying attention to other matters within and outside of the cockpit, especially during periods when movement makes it difficult to touch the screen in the desired manner or location. The ability to easily and quickly input data can significantly improve situational awareness of the flight crew.
  • Many electronic devices, such as aircraft flight deck operational equipment, cursor control devices (CCDs), hard knobs, switches, and hardware keyboards, are increasingly being replaced by touch screens. A touch screen offers intuitive input for a computer or other data processing devices, but may be affected by movement of the touch screen and/or the pilot caused by, for example, turbulence, aircraft vibration, and/or G forces.
  • However, owing to screen size, resolution limitations and the amount of information presented on the screen, designing interactive targets (touch sensitive objects) large enough to be suitable for both normal and adverse conditions would cause unwelcome reduction of effective screen space.
  • Accordingly, it is desirable to provide a touch screen whose input is adaptive to adverse conditions, for example, movement caused by turbulence, G forces, and/or equipment vibrations. Furthermore, other desirable features and characteristics of the exemplary embodiments will become apparent from the subsequent detailed description and the appended claims, taken in conjunction with the accompanying drawings and the foregoing technical field and background.
  • BRIEF SUMMARY
  • An apparatus comprising: a system configured to determine an adaptive condition; a touch screen having a first dimension in a first direction and a second dimension in a second direction, and configured to display a container having a third dimension in the first direction that is less than the first dimension, and that displays a plurality of touch sensitive objects aligned in the first direction; and a touch screen controller configured to, when the adaptive condition is sensed: expand the container further in the first direction to a fourth dimension, if the fourth dimension is less than the first dimension; and expand the container in a second direction and reposition a portion of the objects in the second direction if the fourth dimension is greater than the first dimension.
  • Another apparatus comprises: a sensor configured to sense movement; a touch screen having a first dimension in a first direction and a second dimension in a second direction, and configured to display a menu having a third dimension in the first direction that is less than the first dimension, and that displays a plurality of touch sensitive objects aligned in the first direction; and a touch screen controller configured to, when the movement is sensed: expand the menu further in the first direction to a fourth dimension, if the fourth dimension is less than the first dimension; and expand the menu in the second direction and reposition a portion of the objects in the second direction if the fourth dimension would be greater than the first dimension.
  • A method is provided for modifying the size of a container on a touch screen, the touch screen having a first dimension in a first direction and a second dimension in a second direction, comprising: determining an adaptive condition; displaying the container having a third dimension in the first direction that is less than the first dimension and having a plurality of touch sensitive objects aligned in the first direction; expanding the container further in the first direction to a fourth dimension, when the adaptive condition is determined and if the fourth dimension is less than the first dimension; and expanding the container in a second direction and repositioning a portion of the objects in the second direction, when the adaptive condition is determined and if the fourth dimension is greater than the first dimension.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will hereinafter be described in conjunction with the following drawing figures, wherein like numerals denote like elements, and
  • FIG. 1 is a block diagram of an aircraft system for presenting images on a display;
  • FIGS. 2 and 3 are representative diagrams of a touch screen in accordance with a first exemplary embodiment;
  • FIG. 4 is a representative diagram of a touch screen in accordance with a second exemplary embodiment;
  • FIGS. 5 and 6 are representative diagrams of a touch screen in accordance with a third exemplary embodiment;
  • FIG. 7 is a representative diagram of a touch screen in accordance with a fourth exemplary embodiment; and
  • FIG. 8 is a flow chart in accordance with the exemplary embodiment.
  • DETAILED DESCRIPTION
  • The following detailed description is merely illustrative in nature and is not intended to limit the embodiments of the subject matter or the application and uses of such embodiments. Any implementation described herein as exemplary is not necessarily to be construed as preferred or advantageous over other implementations. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary, or the following detailed description.
  • Techniques and technologies may be described herein in terms of functional and/or logical block components, and with reference to symbolic representations of operations, processing tasks, and functions that may be performed by various computing components or devices. Such operations, tasks, and functions are sometimes referred to as being computer-executed, computerized, software-implemented, or computer-implemented. In practice, one or more processor devices can carry out the described operations, tasks, and functions by manipulating electrical signals representing data bits at memory locations in the system memory, as well as other processing of signals. The memory locations where data bits are maintained are physical locations that have particular electrical, magnetic, optical, or organic properties corresponding to the data bits. It should be appreciated that the various components shown in the figures may be realized by any number of hardware, software, and/or firmware components configured to perform the specified functions. For example, an embodiment of a system or a component may employ various integrated circuit components, e.g., memory elements, digital signal processing elements, logic elements, look-up tables, or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • For the sake of brevity, conventional techniques related to graphics and image processing, navigation, flight planning, aircraft controls, aircraft data communication systems, and other functional aspects of certain systems and subsystems (and the individual operating components thereof) may not be described in detail herein. Furthermore, the connecting lines shown in the various figures contained herein are intended to represent exemplary functional relationships and/or physical couplings between the various elements. It should be noted that many alternative or additional functional relationships or physical connections may be present in an embodiment of the subject matter.
  • A user interface, for example a touch screen, includes containers, for example, rectangular areas such as menu buttons, toggle buttons, radio buttons, check boxes, or pull-down menus, that may be modified in accordance with the exemplary embodiments. As used herein, touch screen refers to a display sensitive to the touch or approach of another object, for example, a finger or a stylus, by determining pressure from the object, a resistance, a capacitance, and the like. These containers may either increase (expand) or decrease (collapse) in size in response to a sensed adverse condition, for example, relative movement between the user interface and the user, that makes it difficult for a user to touch an intended touch-sensitive object (also known as a target or user control), in order to improve the user's ability to reach the desired target on the screen during the adverse operating conditions. The most prominent objects may be increased in size to improve selectability of these interactive screen targets. In some embodiments, the prominent objects are increased in size at the expense of non-prominent objects, non-interactive screen areas, or non-prominent screen areas.
  • The expanding of a container increases object spacing in the container (the container itself is enlarged), and optionally may increase the size of the objects. Spacing, and optionally size, of the objects is increased along the axis or axes in which the increase of size and spacing provides the highest benefit for the user. Typically, this method increases height and vertical spacing between user controls, because most controls are larger along the horizontal axis than along the vertical axis. This is typically the case for a pull-down menu.
  • Generally, there are two directions in which a container may be expanded, for example, vertical and horizontal when viewing a screen. In one aspect of the exemplary embodiments, if a container cannot be expanded in a first direction owing to insufficient space along the axis in which the expansion would be most beneficial for the user, for example, vertical, the container is expanded in a second direction, for example, horizontal. In this latter case of horizontal expansion, a first portion of the user controls remains in the same position while a second portion of the user controls is moved in the second direction by a distance (typically the width of the container in its normal state) in order to improve spacing between controls. Preferably, the objects of the first portion and the objects of the second portion alternate, with the second portion spaced from the first portion. This change in position of adjacent objects increases the spacing between objects to minimize selecting adjacent objects by mistake. In some embodiments, the objects may also expand in size.
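  • The two-step expansion rule described above can be sketched as a short layout routine. The following Python is an illustrative sketch only; the function name, parameters, and slot tuples are assumptions for exposition and do not come from the patent.

```python
def adapt_layout(num_objects, obj_height, spacing, extra_spacing, screen_height):
    """Choose between vertical expansion and a two-column horizontal layout.

    Returns one (column, row) slot per object.  Column 0 is the original
    position; column 1 is offset sideways by roughly one container width.
    All names and units here are illustrative.
    """
    # The container height after adding the extra spacing between objects
    # (the "fourth dimension" in the patent's terminology).
    expanded_height = (num_objects * obj_height
                       + (num_objects - 1) * (spacing + extra_spacing))

    if expanded_height <= screen_height:
        # Sufficient room: expand in the first (vertical) direction only.
        return [(0, i) for i in range(num_objects)]

    # Otherwise expand in the second (horizontal) direction: alternating
    # objects are shifted into a second column, which increases the
    # separation between vertically adjacent touch targets.
    return [(i % 2, i) for i in range(num_objects)]
```

With few objects the routine keeps a single enlarged column; once the expanded height would exceed the screen, every second object moves to the offset column.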
  • When the adverse condition has ceased, the container will reduce back to its original shape.
  • The method may be applied to any display controlled by a pointing device or touch, for example, in avionics or maritime applications, and may be used on non-integrated Electronic Flight Bags with a touch interface because they are typically not mounted in the pilot's primary view area; there, the user's hand is not supported by an underlying surface, as it is with a cursor control device, and is therefore more likely to be subject to movement.
  • The boundaries of a touch sensitive object generally are related to a symbol associated with the object. In response to the adverse condition, the touch screen controller may shift the boundary of the object so that it changes position in relation to the symbol, or may enlarge the boundary, thereby increasing the area defined by the object. These modifications may be either symmetric or asymmetric, for example, with the boundary being enlarged on only one edge or along only the horizontal axis or the vertical axis.
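  • The symmetric and asymmetric boundary modifications could look like the following sketch, assuming a touch-sensitive boundary represented as an (x, y, width, height) rectangle; the function name and per-edge margin parameters are hypothetical, not taken from the patent.

```python
def enlarge_hit_area(bounds, left=0, right=0, top=0, bottom=0):
    """Grow a touch-sensitive rectangle around its symbol.

    bounds is (x, y, width, height).  Unequal margins give the
    asymmetric case (e.g. enlarge only one edge); equal margins
    give symmetric growth.  Illustrative names only.
    """
    x, y, w, h = bounds
    return (x - left, y - top, w + left + right, h + top + bottom)
```

For example, enlarging only the right edge leaves the origin untouched while widening the hit area; equal margins on all four edges grow the rectangle symmetrically about the symbol.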
  • More specifically, in one exemplary embodiment, a touch screen is provided for adjusting the positioning of user controls or touch sensing objects in response to adaptive conditions in which larger touch sensing objects would be beneficial, for example, movement such as turbulence, aircraft vibration, and/or G forces. The touch screen comprises a display face having a container including a plurality of objects displayed and generally aligned in a first direction. A touch screen controller, in response to the adaptive condition, for example, sensed relative movement between the touch screen and the user, is configured to expand the container further in the first direction and enlarge the objects within the container if there is sufficient space for the expansion. If there is not sufficient room for expansion in the first direction, the container expands in a second direction and a portion of the objects, preferably alternating objects, are repositioned within the container in the second direction. A method of operating the touch screen, the touch screen having a first dimension in the first direction and a second dimension in the second direction, includes determining an adaptive condition; displaying a container having a third dimension in the first direction that is less than the first dimension and having a plurality of touch sensitive objects aligned in the first direction; expanding the container further in the first direction to a fourth dimension, when the adaptive condition is determined and if the fourth dimension is less than the first dimension; and expanding the container in the second direction and repositioning a portion of the objects in the second direction when the adaptive condition is determined and if the fourth dimension is greater than the first dimension.
  • Though the method and touch panel of the exemplary embodiments may be used in any type of electronic device, for example, vehicles and heavy machinery, and small handheld mobile devices such as smart phones, the use in an aircraft system is described as an example. Referring to FIG. 1, a flight deck display system 100 includes a user interface 102, a processor 104, one or more terrain databases 106 sometimes referred to as a Terrain Avoidance and Warning System (TAWS), one or more navigation databases 108, an adaptive (or adverse) condition sensor 124, avionic sensors 112, external data sources 114, and one or more display devices 116. The user interface 102 is in operable communication with the processor 104 and is configured to receive input from a user 109 (e.g., a pilot) and, in response to the user input, supplies command signals to the processor 104. The user interface 102 may be any one, or combination, of various known user interface devices including, but not limited to, one or more buttons, switches, or knobs (not shown). In the depicted embodiment, the user interface 102 includes a touch panel 107 and a touch panel controller 111. The touch panel controller 111 provides drive signals 113 to the touch panel 107, and a sense signal 115 is provided from the touch panel 107 to the touch panel controller 111, which periodically provides a controller signal 117 of the determination of a touch to the processor 104. The processor 104 interprets the controller signal 117, determines the application of the digit on the touch panel 107, and provides, for example, a controller signal 117 to the touch panel controller 111 and a signal 119 to the display device 116. Therefore, the user 109 uses the touch panel 107 to provide an input as more fully described hereinafter.
  • The processor 104 may be implemented or realized with a general purpose processor, a content addressable memory, a digital signal processor, an application specific integrated circuit, a field programmable gate array, any suitable programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination designed to perform the functions described herein. A processor device may be realized as a microprocessor, a controller, a microcontroller, or a state machine. Moreover, a processor device may be implemented as a combination of computing devices, e.g., a combination of a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other such configuration. In the depicted embodiment, the processor 104 includes on-board RAM (random access memory) 103, and on-board ROM (read-only memory) 105. The program instructions that control the processor 104 may be stored in either or both the RAM 103 and the ROM 105. For example, the operating system software may be stored in the ROM 105, whereas various operating mode software routines and various operational parameters may be stored in the RAM 103. The software executing the exemplary embodiment is stored in either the ROM 105 or the RAM 103. It will be appreciated that this is merely exemplary of one scheme for storing operating system software and software routines, and that various other storage schemes may be implemented.
  • The memory 103, 105 may be realized as RAM memory, flash memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. In this regard, the memory 103, 105 can be coupled to the processor 104 such that the processor 104 can read information from, and write information to, the memory 103, 105. In the alternative, the memory 103, 105 may be integral to the processor 104. As an example, the processor 104 and the memory 103, 105 may reside in an ASIC. In practice, a functional or logical module/component of the display system 100 might be realized using program code that is maintained in the memory 103, 105. For example, the memory 103, 105 can be used to store data utilized to support the operation of the display system 100, as will become apparent from the following description.
  • No matter how the processor 104 is specifically implemented, it is in operable communication with the terrain databases 106, the navigation databases 108, and the display devices 116, and is coupled to receive various types of inertial data from the sensors 112, and various other avionics-related data from the external data sources 114. The processor 104 is configured, in response to the inertial data and the avionics-related data, to selectively retrieve terrain data from one or more of the terrain databases 106 and navigation data from one or more of the navigation databases 108, and to supply appropriate display commands to the display devices 116. The display devices 116, in response to the display commands, selectively render various types of textual, graphic, and/or iconic information.
  • The adverse condition sensor 124 may be disposed within the display device 116, on the user 109, or separate from the display device 116 and the user 109. However the adverse condition sensor 124 is disposed, it senses adverse conditions, for example, relative movement between the display device 116 and the user 109.
  • The terrain databases 106 include various types of data representative of the terrain over which the aircraft is flying, and the navigation databases 108 include various types of navigation-related data. The sensors 112 may be implemented using various types of inertial sensors, systems, and/or subsystems, now known or developed in the future, for supplying various types of inertial data, for example, representative of the state of the aircraft including aircraft speed, heading, altitude, and attitude. The ILS 118 provides aircraft with horizontal (or localizer) and vertical (or glide slope) guidance just before and during landing and, at certain fixed points, indicates the distance to the reference point of landing on a particular runway. The GPS receiver 122 is a multi-channel receiver, with each channel tuned to receive one or more of the GPS broadcast signals transmitted by the constellation of GPS satellites (not illustrated) orbiting the earth.
  • The display devices 116, as noted above, in response to display commands supplied from the processor 104, selectively render various textual, graphic, and/or iconic information, thereby supplying visual feedback to the user 109. It will be appreciated that the display device 116 may be implemented using any one of numerous known display devices suitable for rendering textual, graphic, and/or iconic information in a format viewable by the user 109. Non-limiting examples of such display devices include various cathode ray tube (CRT) displays, and various flat panel displays such as various types of LCD (liquid crystal display) and TFT (thin film transistor) displays. The display devices 116 may additionally be implemented as a panel mounted display, or using any one of numerous known technologies. It is additionally noted that the display devices 116 may be configured as any one of numerous types of aircraft flight deck displays. For example, a display device may be configured as a multi-function display, a horizontal situation indicator, or a vertical situation indicator, just to name a few. In the depicted embodiment, however, one of the display devices 116 is configured as a primary flight display (PFD).
  • In operation, the display device 116 is also configured to process the current flight status data for the host aircraft. In this regard, the sources of flight status data generate, measure, and/or provide different types of data related to the operational status of the host aircraft, the environment in which the host aircraft is operating, flight parameters, and the like. In practice, the sources of flight status data may be realized using line replaceable units (LRUs), transducers, accelerometers, instruments, sensors, and other well known devices. The data provided by the sources of flight status data may include, without limitation: airspeed data; groundspeed data; altitude data; attitude data, including pitch data and roll data; yaw data; geographic position data, such as GPS data; time/date information; heading information; weather information; flight path data; track data; radar altitude data; geometric altitude data; wind speed data; wind direction data; etc. The display device 116 is suitably designed to process data obtained from the sources of flight status data in the manner described in more detail herein.
  • A touch screen is disclosed having at least one container configured to display a plurality of symbols. Symbols as used herein are defined to include alphanumeric characters, icons, signs, words, terms, phrases, and menu items. A particular symbol is selected by sensing the application (touch) of a digit, such as a finger or a stylus, to a touch-sensitive object associated with that symbol. In some exemplary embodiments, the digit may be swiped, or moved, in a particular direction to enable a desired function. Each display region including a symbol has a touch-sensing object associated therewith for sensing the application and/or movement of the digit or digits.
  • Referring to FIG. 2, a touch screen 200 in accordance with a first exemplary embodiment includes a face 202 displaying a container, or menu 204 for example, including four objects 206, 207, 208, 209. The objects are sensitive to a touch for selecting a function and typically contain an icon representative of the function (shown for example, as the letters A, B, C, D, respectively). Objects 206, 207, 208, 209 (shown as dotted lines) are positioned with respect to the icons A, B, C, and D, respectively, and may include a solid outline (not shown), such as a rectangle or circle, surrounding the icons A, B, C, D. The solid outline preferably would cover the same area as the objects 206, 207, 208, 209, or be slightly within. The objects 206, 207, 208, 209 are defined by selected pixels as determined by software in the processor 104. A touching of one of the objects 206, 207, 208, 209 is communicated to the processor 104 and the function associated with the respective object 206, 207, 208, 209 will be selected.
  • However, when the touch screen 200 and/or the user is subject to adverse conditions, for example, turbulence, G forces, and/or equipment vibrations as sensed by the adaptive condition sensor 124, it becomes difficult for the user to touch the intended object since the intended object (the entire touch screen 200) is moving in relation to the user.
  • In accordance with the first exemplary embodiment, when the adverse conditions are determined (sensed) by the system 124, the container 204 expands in a first direction (vertically as shown in FIG. 3) to create the container 304 and the spacing between the objects 306, 307, 308, 309 is increased, thereby increasing the probability of the user being able to touch the intended object 306, 307, 308, 309. Optionally, and in accordance with a second exemplary embodiment of FIG. 4, the objects 406, 407, 408, 409 are increased in size over the objects 306, 307, 308, 309 of FIG. 3.
  • If, as shown with the touch screen 500 of FIG. 5, the container 504 contains a number of objects 506, 507, 508, 509, 510, 511, 512 such that the container 504 is unable to expand in a first direction (vertically as shown) and still be displayed on the touch screen 500, the container 504 expands in a second direction (horizontally as shown) to create the container 604 (FIG. 6), and a first portion of the objects, for example objects 507, 509, 511, is spaced in the second direction from a second portion of the objects, for example, objects 506, 508, 510, and 512. Preferably, alternating objects 506, 507, 508, 509, 510, 511, 512 are repositioned in the second direction as shown. Optionally, and in accordance with a fourth exemplary embodiment of FIG. 7, the objects 706, 707, 708, 709, 710, 711, 712 are increased in size over the objects 506, 507, 508, 509, 510, 511, 512 of FIG. 6.
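  • The transition from FIG. 5 to FIG. 6 amounts to shifting every second object sideways by roughly one container width while the others stay in place. A minimal sketch, with an assumed list of (x, y) object origins; the function name and representation are illustrative only.

```python
def reposition_alternating(origins, container_width):
    """Shift every second object sideways by one container width.

    origins is a list of (x, y) positions in container coordinates.
    Odd-indexed objects (the second, fourth, ...) move sideways,
    matching the alternating pattern described in the text.
    """
    return [(x + container_width, y) if i % 2 else (x, y)
            for i, (x, y) in enumerate(origins)]
```

Because vertically adjacent objects end up in different columns, the nearest neighbor of each target is farther away than before the repositioning.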
  • FIG. 8 is a flow chart that illustrates a touch screen process suitable for use with a flight deck display system such as the user interface 102. Process 800 represents an implementation of a method for selecting symbols on an onboard display element of a host aircraft. The various tasks performed in connection with process 800 may be performed by software, hardware, firmware, or any combination thereof. For illustrative purposes, the following description of process 800 may refer to elements mentioned above in connection with FIGS. 2 through 7. In practice, portions of process 800 may be performed by different elements of the described system, e.g., a processor or a display element. It should be appreciated that process 800 may include any number of additional or alternative tasks, the tasks shown in FIG. 8 need not be performed in the illustrated order, and process 800 may be incorporated into a more comprehensive procedure or process having additional functionality not described in detail herein. Moreover, one or more of the tasks shown in FIG. 8 could be omitted from an embodiment of the process 800 as long as the intended overall functionality remains intact.
  • Referring to the flow chart of FIG. 8, an adaptive condition of a touch screen is determined 802, the touch screen having a first dimension in a first direction and a second dimension in a second direction. A container is displayed 804 having a third dimension in the first direction that is less than the first dimension and having a plurality of touch sensitive objects aligned in the first direction. The container is further expanded 806 in the first direction to a fourth dimension when the adaptive condition is determined and if the fourth dimension is less than the first dimension. The container is expanded in the second direction and a portion of the objects are repositioned 808 in the second direction when the adaptive condition is determined and if the third dimension is greater than the first dimension.
  • While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the invention, it being understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the invention as set forth in the appended claims.

Claims (20)

What is claimed is:
1. An apparatus comprising:
a system configured to determine an adaptive condition;
a touch screen having a first dimension in a first direction and a second dimension in a second direction, and configured to:
display a container, having a third dimension in the first direction that is less than the first dimension, and that displays a plurality of touch sensitive objects aligned in the first direction; and
a touch screen controller configured to, when the adaptive condition is sensed:
expand the container further in the first direction to a fourth dimension, if the fourth dimension is less than the first dimension;
expand the container in the second direction and reposition a portion of the objects in the second direction if the fourth dimension is greater than the first dimension.
2. The apparatus of claim 1 wherein the touch screen controller is configured to:
reposition alternating objects if the fourth dimension is greater than the first dimension.
3. The apparatus of claim 2 wherein the touch screen controller is further configured to:
enlarge the objects when the adaptive condition is determined.
4. The apparatus of claim 1 wherein the touch screen controller is further configured to:
enlarge the objects when the adaptive condition is determined.
5. The apparatus of claim 1 wherein the system, in determining the adaptive condition, is configured to:
determine relative movement between the touch screen and a user.
6. The apparatus of claim 1 wherein the system, in determining the adaptive condition, is configured to:
sense at least one of a vibration, turbulence, and G forces.
7. An apparatus comprising:
a sensor configured to sense movement;
a touch screen having a first dimension in a first direction and a second dimension in a second direction, and configured to:
display a menu having a third dimension in the first direction that is less than the first dimension, and that displays a plurality of touch sensitive objects aligned in the first direction; and
a touch screen controller configured to, when the movement is sensed:
expand the menu further in the first direction to a fourth dimension, if the fourth dimension is less than the first dimension;
expand the menu in the second direction and reposition a portion of the objects in the second direction if the fourth dimension would be greater than the first dimension.
8. The apparatus of claim 7 wherein the touch screen controller is further configured to:
enlarge the objects when the movement is sensed.
9. The touch screen of claim 7 wherein the touch screen controller is further configured to:
enable a function in response to the at least one touch.
10. The touch screen of claim 7 wherein the touch sensitive objects each comprise a boundary and the touch screen controller is configured to:
adjust the boundary.
11. The touch screen of claim 7 wherein the touch sensitive objects each comprise a boundary and the touch screen controller is configured to:
enlarge the boundary.
12. The touch screen of claim 7 wherein the portion of the objects repositioned comprise alternating objects if the fourth dimension would be greater than the first dimension.
13. The touch screen of claim 7 wherein the sensor is further configured to:
determine relative movement between the touch screen and a user.
14. The touch screen of claim 7 wherein the sensor is further configured to:
determine at least one of a vibration, turbulence, and a G force.
15. A method for modifying the size of a container on a touch screen, the touch screen having a first dimension in a first direction and a second dimension in a second direction, comprising:
determining an adaptive condition;
displaying the container having a third dimension in the first direction that is less than the first dimension and having a plurality of touch sensitive objects aligned in the first direction;
expanding the container further in the first direction to a fourth dimension, when the adaptive condition is determined and if the fourth dimension is less than the first dimension; and
expanding the container in the second direction and repositioning a portion of the objects in the second direction when the adaptive condition is determined and if the fourth dimension is greater than the first dimension.
16. The method of claim 15 further comprising:
repositioning alternating objects if the fourth dimension is greater than the first dimension.
17. The method of claim 15 further comprising:
enlarging the objects when the adaptive condition is determined.
18. The method of claim 15 further comprising:
enlarging the objects when the adaptive condition is determined.
19. The method of claim 15 wherein determining an adaptive condition comprises:
sensing relative movement between the touch screen and a user.
20. The method of claim 15 wherein determining an adaptive condition comprises:
sensing at least one of a vibration, turbulence, and G forces.
US13/927,943 2013-06-26 2013-06-26 Touch screen and method for adjusting touch sensitive object placement thereon Abandoned US20150002403A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/927,943 US20150002403A1 (en) 2013-06-26 2013-06-26 Touch screen and method for adjusting touch sensitive object placement thereon
EP14172028.4A EP2818994A1 (en) 2013-06-26 2014-06-11 Touch screen and method for adjusting touch sensitive object placement thereon
CN201410380328.0A CN104252267A (en) 2013-06-26 2014-06-25 Touch screen and method for adjusting touch sensitive object placement thereon


Publications (1)

Publication Number Publication Date
US20150002403A1 true US20150002403A1 (en) 2015-01-01

Family

ID=50943126

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/927,943 Abandoned US20150002403A1 (en) 2013-06-26 2013-06-26 Touch screen and method for adjusting touch sensitive object placement thereon

Country Status (3)

Country Link
US (1) US20150002403A1 (en)
EP (1) EP2818994A1 (en)
CN (1) CN104252267A (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017063141A1 (en) * 2015-10-13 2017-04-20 Motorola Mobility Llc Setting cursor position in text on display device
US10503317B2 (en) * 2016-02-09 2019-12-10 The Boeing Company Turbulence resistant touch system
CN107972878B (en) * 2017-11-20 2021-05-04 中电科航空电子有限公司 Tuning control system for touch screen of cockpit
US20190318711A1 (en) * 2018-04-16 2019-10-17 Bell Helicopter Textron Inc. Electronically Damped Touch Screen Display

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751283A (en) * 1996-07-17 1998-05-12 Microsoft Corporation Resizing a window and an object on a display screen
US20070180401A1 (en) * 2006-02-02 2007-08-02 Mona Singh Methods, systems, and computer program products for displaying windows on a graphical user interface based on relative priorities associated with the windows
US20090201246A1 (en) * 2008-02-11 2009-08-13 Apple Inc. Motion Compensation for Screens
US20110109576A1 (en) * 2009-11-10 2011-05-12 Airbus Operations (S.A.S.) Method of adapting display parameters of a display device, in particular an on-board display device in an aircraft

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI20012209A (en) * 2001-11-14 2003-06-24 Nokia Corp Method for controlling display of information in an electronic device and electronic device
US8631358B2 (en) * 2007-10-10 2014-01-14 Apple Inc. Variable device graphical user interface
US8219937B2 (en) * 2009-02-09 2012-07-10 Microsoft Corporation Manipulation of graphical elements on graphical user interface via multi-touch gestures

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10996793B2 (en) * 2016-06-20 2021-05-04 Ge Aviation Systems Limited Correction of vibration-induced error for touch screen display in an aircraft
US11301087B2 (en) * 2018-03-14 2022-04-12 Maxell, Ltd. Personal digital assistant
US20220236854A1 (en) * 2018-03-14 2022-07-28 Maxell, Ltd. Personal digital assistant
US11947757B2 (en) * 2018-03-14 2024-04-02 Maxell, Ltd. Personal digital assistant
US11262900B1 (en) * 2018-07-30 2022-03-01 The Boeing Company Graphical user interface in a computer system in an aircraft

Also Published As

Publication number Publication date
EP2818994A1 (en) 2014-12-31
CN104252267A (en) 2014-12-31

Similar Documents

Publication Publication Date Title
US8766936B2 (en) Touch screen and method for providing stable touches
EP2383642B1 (en) Touch screen and method for adjusting screen objects
EP2818994A1 (en) Touch screen and method for adjusting touch sensitive object placement thereon
US9916032B2 (en) System and method of knob operation for touchscreen devices
JP6242964B2 Method for magnifying characters displayed on an adaptive touch-screen keypad
US20110187651A1 (en) Touch screen having adaptive input parameter
EP2787428A1 (en) Avionic touchscreen control systems and program products having no look control selection feature
EP2555105A2 (en) Touch screen having adaptive input requirements
EP2584318B1 (en) System and method for dynamically rendering bounded region labels on a moving map display
US20110128235A1 (en) Big key touch input device
EP2741198A2 (en) System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
US20110029919A1 (en) Display and method for selecting images to be displayed
EP2762837A2 (en) System and method for displaying terrain altitudes on an aircraft display
US20090322753A1 (en) Method of automatically selecting degree of zoom when switching from one map to another
US8170789B2 (en) Method for providing search area coverage information
EP2813920B1 (en) A system and method for volumetric computing

Legal Events

Date Code Title Description
AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DOSTAL, MARTIN;EICHLER, ZDENEK;SIGNING DATES FROM 20130620 TO 20130626;REEL/FRAME:030692/0760

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION