WO2010036217A1 - Dual-view touchscreen display system and method of operation - Google Patents

Dual-view touchscreen display system and method of operation

Info

Publication number
WO2010036217A1
WO2010036217A1 (PCT/US2008/011090)
Authority
WO
WIPO (PCT)
Prior art keywords
menu
dual
display system
proximity
sensor
Prior art date
Application number
PCT/US2008/011090
Other languages
French (fr)
Inventor
Dallas D. Hickerson
Original Assignee
Panasonic Automotive Systems Company Of America Division Of Panasonic Corporation Of North America
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Automotive Systems Company Of America Division Of Panasonic Corporation Of North America filed Critical Panasonic Automotive Systems Company Of America Division Of Panasonic Corporation Of North America
Priority to KR1020117009081A priority Critical patent/KR20110066949A/en
Priority to CN200880131310XA priority patent/CN102165381A/en
Priority to JP2011528986A priority patent/JP2012503818A/en
Priority to PCT/US2008/011090 priority patent/WO2010036217A1/en
Priority to EP08816292A priority patent/EP2329326A1/en
Publication of WO2010036217A1 publication Critical patent/WO2010036217A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1438Touch screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/1526Dual-view displays

Definitions

  • the present invention generally relates to electronic display systems.
  • the present invention relates to a dual-view touchscreen display system and method of operation.
  • Some currently available display systems are capable of simultaneously displaying different information depending on the direction from which the screen is being viewed.
  • an automotive implementation of such a display system may provide a map view via a first application to the driver while simultaneously providing a video output such as a DVD movie to the passenger via a second application.
  • a potential problem is identifying which viewer is touching the screen to make a menu selection at a given time.
  • the display system has no way of directing the proper application (map or movie) to respond to a touchscreen command. This problem is particularly acute if the touchscreen menu options on the display have the same physical location for both applications.
  • An exemplary dual-view display system comprises a dual-view touchscreen display that is adapted to display a first image including a first menu to a first user who is positioned at a first location with respect to the dual-view display system and to display a second image including a second menu to a second user who is positioned at a second location with respect to the dual-view display system.
  • the dual-view display system further comprises at least one sensor that is adapted to detect proximity to the dual-view touchscreen display of the first user relative to proximity to the dual-view touchscreen display of the second user and a menu selection logic that is adapted to identify a received menu command as a selection from the first menu or a selection from the second menu based on the proximity to the dual-view touchscreen display of the first user relative to the proximity to the dual-view touchscreen display of the second user.
  • FIG. 1 is a front view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention
  • FIG. 2 is a top view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention
  • FIG. 3 is a schematic diagram of a proximity sensing circuit in accordance with an exemplary embodiment of the present invention.
  • FIG. 4 is a functional block diagram of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention.
  • FIG. 5 is a process flow diagram showing a method in accordance with an exemplary embodiment of the present invention.
  • FIG. 6 is a process flow diagram showing a method of identifying an application to which a menu input command is directed in accordance with an exemplary embodiment of the present invention.
  • FIG. 1 is a front view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention.
  • the front view is generally referred to by the reference number 100.
  • a dual-view touchscreen display system 102 is adapted to present multiple views depending on the direction from which the screen is being viewed.
  • the dual-view touchscreen display system 102 may be positioned such that a first user (the driver) sees a display provided by a first application such as a map application.
  • the dual- view touchscreen display system 102 may present a second display from a second application to a second user (the passenger).
  • the passenger may view a movie from a DVD application at the same time the driver is viewing the map application.
  • the dual-view touchscreen display system 102 includes a touchscreen 104.
  • the touchscreen 104 allows either user to provide input in the form of menu selections depending upon where a user touches the screen.
  • the map application may from time to time display menu options relevant to the current map display being viewed by the driver on the touchscreen 104.
  • the driver may provide input by touching the touchscreen 104 at a location that corresponds to the desired menu command.
  • the DVD application may present menu options relevant to the current display being viewed by the passenger on the touchscreen 104.
  • the passenger may provide input by touching the touchscreen 104 at a location that corresponds to the desired menu command.
  • An exemplary embodiment of the present invention is adapted to distinguish responses or user inputs by touching to the first application from responses or user inputs by touching to the second application even if the physical location of the menu selection areas for the first application physically overlap menu selection areas for the second application.
  • exemplary embodiments of the present invention prevent a second viewer of a dual-view display system from mistakenly entering a menu command that would affect the display being viewed by a first viewer of the system.
  • the dual-view touchscreen display system 102 includes a first proximity sensor 106 and a second proximity sensor 108.
  • the first proximity sensor 106 and the second proximity sensor 108 are adapted to detect proximity to the dual-view touchscreen display of the first user and the proximity to the dual-view touchscreen display of the second user, and to use that proximity information to identify the application to which entry of a given menu command is intended or directed.
  • FIG. 2 is a top view of the dual-view touchscreen display system 102 in accordance with an exemplary embodiment of the present invention.
  • the first proximity sensor 106 provides a first proximity detection field 202.
  • the second proximity sensor 108 provides a second proximity detection field 204.
  • the first proximity sensor is adapted to generate a signal indicating that the hand of the first user is proximate to the touchscreen 104 when the hand of the first user encounters the first proximity detection field 202.
  • FIG. 3 is a schematic diagram of a proximity sensing circuit in accordance with an exemplary embodiment of the present invention.
  • the proximity sensing circuit is generally referred to by the reference number 300.
  • replications of the proximity sensing circuit 300 are used as the first proximity sensor 106 and the second proximity sensor 108.
  • the proximity sensing circuit 300 is adapted to receive an input signal 302, such as the output of an oscillator or square-wave generator (not shown).
  • the exemplary proximity sensing circuit 300 includes a variable capacitor 304, the capacitance of which changes in value when a user is proximate thereto.
  • the variable capacitor 304 is connected as one input to a comparator 306.
  • a reference voltage is provided as the other input to the comparator 306.
  • the input signal 302 and the output of the comparator 306 are delivered as inputs to an Exclusive OR gate 308.
  • the Exclusive OR gate 308 provides an output voltage signal 310 indicative of whether the user is proximate to the variable capacitor 304.
  • the magnitude of the output voltage signal 310 varies depending at least in part upon whether the user's hand is present in a proximity detection field of the proximity sensing circuit 300.
  • proximity sensors that operate based on inductance, infrared signals, optical signals or the like may be used.
  • the choice of a particular sensor type may be made by one of ordinary skill in the art based on system design considerations.
  • FIG. 4 is a functional block diagram of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention.
  • the block diagram is generally referred to by the reference number 400.
  • the dual-view touchscreen display system 400 includes functional blocks for the first proximity sensor 106 and the second proximity sensor 108. Each of the first proximity sensor 106 and the second proximity sensor 108 includes a corresponding and respective one of the exemplary proximity sensing circuit 300.
  • the dual-view touchscreen display system 400 further includes a functional block for the touchscreen 104.
  • the dual-view touchscreen display system 400 includes a menu selection logic block 402 adapted to receive input from the first proximity sensor 106 and the second proximity sensor 108.
  • the menu selection logic 402 determines whether a selection of a menu command or touch input to the touchscreen 104 is intended to apply to a first application or menu associated with a first view or user or to a second application or menu associated with the second view or user.
  • the menu selection logic block 402 may comprise hardware elements (including circuitry), software elements (including computer code stored on a machine readable medium) or a combination of both hardware and software elements.
  • FIG. 5 is a process flow diagram showing a method in accordance with an exemplary embodiment of the present invention.
  • the method is generally referred to by the reference number 500.
  • the method begins.
  • a voltage associated with an X-coordinate direction (i.e., a horizontal direction in a typical X-Y coordinate system) is read, as shown at block 504.
  • a determination is made by the menu selection logic block 402 about whether the voltage read at block 504 indicates that the touchscreen 104 is being touched by a user. If the voltage does not indicate that the touchscreen 104 is being touched, the process flow returns to block 504.
  • a voltage indicative of a position on the touchscreen 104 in the Y-direction is read, as shown at block 508.
  • the menu selection logic block 402 is able to determine a location on the touchscreen 104 based on the voltage readings in the X-direction and the Y-direction.
  • the X-Y coordinates corresponding to the location where the touchscreen 104 is being touched are estimated.
  • the menu selection logic block determines whether the touch input is intended to be a selection or command corresponding to a first menu item of a first menu associated with a first display or a second menu item of a second menu associated with a second display.
  • the menu selection logic block 402 determines to which of two display applications or menus a touch input command is directed. Additional details with respect to the determination of the correct application or menu to which a menu or touch input command is directed are set forth below with respect to FIG. 6. Moreover, FIG. 6 illustrates a process that employs input data from the first proximity sensor 106 and the second proximity sensor 108 to identify the application to which a given menu command is directed.
  • the menu selection logic block 402 correlates the X-Y coordinates estimated at block 510 to an appropriate menu command. In other words, the menu selection logic block 402 determines what menu command has been entered for the correct application. At block 516, the menu selection logic block 402 acts on the appropriate menu command. Process flow then returns to block 502.
  • FIG. 6 is a process flow diagram showing a method of identifying an application to which a menu input command is directed in accordance with an exemplary embodiment of the present invention.
  • the process is generally referred to by the reference number 600.
  • the process 600 shows one exemplary embodiment by which a touchscreen command in a dual-view touchscreen display system is determined to be applied to one of two different viewing applications being viewed by two users.
  • the process 600 is one exemplary method of determining which menu is being accessed and/or actuated, as shown at block 512 of FIG. 5.
  • the process begins.
  • An oscillator is enabled at block 604.
  • the oscillator generates the input signals 302 (FIG. 3) for the first proximity sensor 106 (FIG. 1) and the second proximity sensor 108 (FIG. 1).
  • the menu selection logic block 402 measures output of the first proximity sensor 106 corresponding to the proximity of the first user.
  • the menu selection logic block 402 measures the output of the second proximity sensor 108 corresponding to the proximity of the second user.
  • the oscillator is disabled.
  • the menu selection logic block 402 determines which user is more likely proximate to touchscreen 104 when a particular touch input or menu command is received. In an exemplary embodiment of the present invention, this determination is made by comparing a voltage measured from the first proximity sensor 106 to a voltage measured from the second proximity sensor 108. If the voltage from the first proximity sensor 106 is greater, the menu selection logic block 402 determines that the received touch input originated or was entered by the first user (e.g., the driver), as shown at block 614.
  • the first user e.g., the driver
  • the menu selection logic block 402 determines that the received touch input or menu input originated from or was entered by the second user (e.g., the passenger), as shown at block 616.
  • an exemplary embodiment of the present invention comprises a dual-view touchscreen display system that is able to differentiate between user inputs from a first user viewing the display from a first position and user inputs generated by a second user viewing the display from a second position.
  • a system advantageously allows touchscreen menus from various applications to be designed without regard to whether the physical location of touchscreen menu items overlaps with the location of menu items that might be visible in the alternate view.
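The voltage comparison described above (FIG. 6, blocks 610-616) can be sketched as a short routine. This is a minimal illustration of the decision rule only; the function name and voltage values are hypothetical, not from the patent:

```python
def route_touch(driver_sensor_v, passenger_sensor_v):
    """Decide which user a touch belongs to by comparing the measured
    output voltages of the two proximity sensors (106 and 108).

    The user whose sensor reports the larger voltage is assumed to be
    the one reaching for the touchscreen.
    """
    if driver_sensor_v > passenger_sensor_v:
        return "driver"      # route the menu command to the first application
    return "passenger"       # route the menu command to the second application


# Example: the driver's hand is in the first proximity detection field,
# so the first sensor's voltage dominates and the touch goes to the driver.
print(route_touch(2.4, 1.1))
```

Note the rule is a simple greater-than comparison; a practical implementation might also require a minimum voltage on the winning sensor before accepting the touch at all.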

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Position Input By Displaying (AREA)

Abstract

A dual-view display system (102) includes a dual-view touchscreen display (104) adapted to display a first image including a first menu to a first user positioned at a first location with respect to system (102) and to display a second image including a second menu to a second user positioned at a second location with respect to system (102). The dual-view display system (102) further includes at least one sensor (106, 108) adapted to detect proximity to the dual-view touchscreen display (104) of the first user relative to the proximity to the dual-view touchscreen display (104) of the second user. Menu selection logic (402) identifies a received user touch command as a selection from the first menu or as a selection from the second menu based on the proximity to the dual-view touchscreen display (104) of the first user relative to the proximity to the dual-view touchscreen display (104) of the second user.

Description

DUAL-VIEW TOUCHSCREEN DISPLAY SYSTEM AND METHOD OF OPERATION
FIELD OF THE INVENTION
[0001] The present invention generally relates to electronic display systems. In particular, the present invention relates to a dual-view touchscreen display system and method of operation.
BACKGROUND OF THE INVENTION
[0002] This section is intended to introduce the reader to various aspects of art which may be related to various aspects of the present invention which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present invention. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
[0003] Some currently available display systems are capable of simultaneously displaying different information depending on the direction from which the screen is being viewed. For example, an automotive implementation of such a display system may provide a map view via a first application to the driver while simultaneously providing a video output such as a DVD movie to the passenger via a second application. If both applications require touchscreen input, a potential problem is identifying which viewer is touching the screen to make a menu selection at a given time. Without a method of identifying which user is activating a touchscreen menu option, the display system has no way of directing the proper application (map or movie) to respond to a touchscreen command. This problem is particularly acute if the touchscreen menu options on the display have the same physical location for both applications.
SUMMARY OF THE INVENTION
[0004] There is provided a dual-view display system. An exemplary dual-view display system comprises a dual-view touchscreen display that is adapted to display a first image including a first menu to a first user who is positioned at a first location with respect to the dual-view display system and to display a second image including a second menu to a second user who is positioned at a second location with respect to the dual-view display system. The dual-view display system further comprises at least one sensor that is adapted to detect proximity to the dual-view touchscreen display of the first user relative to proximity to the dual-view touchscreen display of the second user and a menu selection logic that is adapted to identify a received menu command as a selection from the first menu or a selection from the second menu based on the proximity to the dual-view touchscreen display of the first user relative to the proximity to the dual-view touchscreen display of the second user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] The above-mentioned and other features and advantages of the present invention, and the manner of attaining them, will become apparent and be better understood by reference to the following description of one embodiment of the invention in conjunction with the accompanying drawings, wherein:
[0006] FIG. 1 is a front view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention;
[0007] FIG. 2 is a top view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention;
[0008] FIG. 3 is a schematic diagram of a proximity sensing circuit in accordance with an exemplary embodiment of the present invention;
[0009] FIG. 4 is a functional block diagram of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention;
[0010] FIG. 5 is a process flow diagram showing a method in accordance with an exemplary embodiment of the present invention; and
[0011] FIG. 6 is a process flow diagram showing a method of identifying an application to which a menu input command is directed in accordance with an exemplary embodiment of the present invention.
[0012] Corresponding reference characters indicate corresponding parts throughout the several views. The exemplifications set out herein illustrate a preferred embodiment of the invention, in one form, and such exemplifications are not to be construed as limiting in any manner the scope of the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0013] One or more specific embodiments of the present invention will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions may be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
[0014] FIG. 1 is a front view of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention. The front view is generally referred to by the reference number 100. A dual-view touchscreen display system 102 is adapted to present multiple views depending on the direction from which the screen is being viewed. In an automotive application, the dual-view touchscreen display system 102 may be positioned such that a first user (the driver) sees a display provided by a first application such as a map application. The dual- view touchscreen display system 102 may present a second display from a second application to a second user (the passenger). In one example, the passenger may view a movie from a DVD application at the same time the driver is viewing the map application.
[0015] The dual-view touchscreen display system 102 includes a touchscreen 104. The touchscreen 104 allows either user to provide input in the form of menu selections depending upon where a user touches the screen. For example, the map application may from time to time display menu options relevant to the current map display being viewed by the driver on the touchscreen 104. The driver may provide input by touching the touchscreen 104 at a location that corresponds to the desired menu command. In addition, the DVD application may present menu options relevant to the current display being viewed by the passenger on the touchscreen 104. The passenger may provide input by touching the touchscreen 104 at a location that corresponds to the desired menu command.
[0016] An exemplary embodiment of the present invention is adapted to distinguish responses or user inputs by touching to the first application from responses or user inputs by touching to the second application even if the physical location of the menu selection areas for the first application physically overlap menu selection areas for the second application. In so doing, exemplary embodiments of the present invention prevent a second viewer of a dual-view display system from mistakenly entering a menu command that would affect the display being viewed by a first viewer of the system. To accomplish this, the dual-view touchscreen display system 102 includes a first proximity sensor 106 and a second proximity sensor 108. As fully set forth below, the first proximity sensor 106 and the second proximity sensor 108 are adapted to detect proximity to the dual-view touchscreen display of the first user and the proximity to the dual-view touchscreen display of the second user, and to use that proximity information to identify the application to which entry of a given menu command is intended or directed.
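Once the active user has been identified from the proximity sensors, overlapping menu regions cease to be ambiguous: the same touch coordinates are simply hit-tested against that user's menu layout alone. The sketch below illustrates this idea; the menu names, rectangles, and coordinate ranges are illustrative assumptions, not taken from the patent:

```python
# Each view carries its own menu layout.  The regions deliberately occupy
# the same physical screen rectangles to show that overlap is harmless
# once the active user is known.  Rectangles are (x0, y0, x1, y1).
DRIVER_MENU = {"zoom_in": (0, 0, 100, 50), "zoom_out": (0, 60, 100, 110)}
PASSENGER_MENU = {"play": (0, 0, 100, 50), "pause": (0, 60, 100, 110)}


def hit_test(menu, x, y):
    """Return the menu command whose rectangle contains (x, y), or None."""
    for command, (x0, y0, x1, y1) in menu.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None


def dispatch(x, y, active_user):
    """Route a touch at (x, y) to the menu of the identified user only."""
    menu = DRIVER_MENU if active_user == "driver" else PASSENGER_MENU
    return hit_test(menu, x, y)
```

The same physical touch at (50, 25) thus resolves to different commands depending on which proximity field the reaching hand passed through.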
[0017] FIG. 2 is a top view of the dual-view touchscreen display system 102 in accordance with an exemplary embodiment of the present invention. As shown in FIG. 2, the first proximity sensor 106 provides a first proximity detection field 202. Similarly, the second proximity sensor 108 provides a second proximity detection field 204. When the first user reaches for the touchscreen 104 to make a menu selection, the hand of the first user passes through the first proximity detection field 202. As set forth below, the first proximity sensor 106 is adapted to generate a signal indicating that the hand of the first user is proximate to the touchscreen 104 when the hand of the first user encounters the first proximity detection field 202. Similarly, the second proximity sensor 108 is adapted to generate a signal indicating that the second user is proximate to the touchscreen 104 when the hand of the second user passes through the second proximity detection field 204.

[0018] FIG. 3 is a schematic diagram of a proximity sensing circuit in accordance with an exemplary embodiment of the present invention. The proximity sensing circuit is generally referred to by the reference number 300. In an exemplary embodiment of the present invention, replications of the proximity sensing circuit 300 are used as the first proximity sensor 106 and the second proximity sensor 108. The proximity sensing circuit 300 is adapted to receive an input signal 302, such as the output of an oscillator or square-wave generator (not shown). The exemplary proximity sensing circuit 300 includes a variable capacitor 304, the capacitance of which changes in value when a user is proximate thereto. The variable capacitor 304 is connected as one input to a comparator 306. A reference voltage is provided as the other input to the comparator 306.
[0019] In the exemplary proximity sensing circuit 300, the input signal 302 and the output of the comparator 306 are delivered as inputs to an Exclusive OR gate 308. The Exclusive OR gate 308 provides an output voltage signal 310 indicative of whether the user is proximate to the variable capacitor 304. Moreover, the magnitude of the output voltage signal 310 varies depending at least in part upon whether the user's hand is present in a proximity detection field of the proximity sensing circuit 300.
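By way of non-limiting illustration, the Exclusive OR behavior described above may be sketched in software. This digital model is not part of the original disclosure; the waveforms, sample counts, and delay value are hypothetical. It shows only the principle that a hand near the variable capacitor 304 delays the comparator transitions relative to the drive signal, so the XOR output is high for a larger fraction of each cycle and its average level (the output voltage signal 310) rises.

```python
def xor_sense(input_wave, comparator_wave):
    """XOR the drive waveform with the comparator output, sample by sample.

    A delayed comparator output (caused by extra capacitance from a
    nearby hand) produces mismatched samples, driving the XOR high.
    """
    return [a ^ b for a, b in zip(input_wave, comparator_wave)]


def average_level(wave):
    """Average logic level over one cycle; a proxy for output voltage."""
    return sum(wave) / len(wave)


# Drive signal 302: one square-wave cycle sampled at 8 points (hypothetical).
drive = [1, 1, 1, 1, 0, 0, 0, 0]

# No hand present: the comparator output tracks the drive signal.
aligned = [1, 1, 1, 1, 0, 0, 0, 0]

# Hand present: added capacitance delays the comparator by 2 samples.
delayed = [0, 0, 1, 1, 1, 1, 0, 0]

no_hand_level = average_level(xor_sense(drive, aligned))   # 0.0
hand_level = average_level(xor_sense(drive, delayed))      # 0.5
```

In this sketch the average XOR level is higher when the hand is present, which is the quantity the menu selection logic can later compare between the two sensors.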
[0020] Those of ordinary skill in the art will appreciate that, while a capacitive proximity sensor is illustrated in FIG. 3, the use of other types of proximity sensors is within the scope of the present invention. By way of example, proximity sensors that operate based on inductance, infrared signals, optical signals or the like may be used. The choice of a particular sensor type may be made by one of ordinary skill in the art based on system design considerations.
[0021] FIG. 4 is a functional block diagram of a dual-view touchscreen display system in accordance with an exemplary embodiment of the present invention. The block diagram is generally referred to by the reference number 400. The dual-view touchscreen display system 400 includes functional blocks for the first proximity sensor 106 and the second proximity sensor 108. Each of the first proximity sensor 106 and the second proximity sensor 108 includes a respective instance of the exemplary proximity sensing circuit 300. The dual-view touchscreen display system 400 further includes a functional block for the touchscreen 104. In addition, the dual-view touchscreen display system 400 includes a menu selection logic block 402 adapted to receive input from the first proximity sensor 106 and the second proximity sensor 108. Based on this input, the menu selection logic 402 determines whether a selection of a menu command or touch input to the touchscreen 104 is intended to apply to a first application or menu associated with a first view or user or to a second application or menu associated with the second view or user. Those of ordinary skill in the art will appreciate that the menu selection logic block 402 may comprise hardware elements (including circuitry), software elements (including computer code stored on a machine readable medium) or a combination of both hardware and software elements.
[0022] FIG. 5 is a process flow diagram showing a method in accordance with an exemplary embodiment of the present invention. The method is generally referred to by the reference number 500. At block 502, the method begins. At block 504, a voltage associated with an X-coordinate direction (i.e., a horizontal direction in a typical X-Y coordinate system) of the touchscreen 104 is read by the menu selection logic block 402. At decision block 506, a determination is made about whether the voltage read at block 504 indicates that the touchscreen 104 is being touched by a user. If the voltage does not indicate that the touchscreen 104 is being touched, the process flow returns to block 504.
[0023] If the voltage indicates that the touchscreen is being touched, a voltage indicative of a position on the touchscreen 104 in the Y-direction is read, as shown at block 508. Those of ordinary skill in the art will appreciate that the menu selection logic block 402 is able to determine a location on the touchscreen 104 based on the voltage readings in the X-direction and the Y-direction. At block 510, the X-Y coordinates corresponding to the location where the touchscreen 104 is being touched are estimated.

[0024] At block 512, the menu selection logic block 402 determines whether the touch input is intended to be a selection or command corresponding to a first menu item of a first menu associated with a first display or a second menu item of a second menu associated with a second display. In other words, the menu selection logic block 402 determines to which of two display applications or menus a touch input command is directed. Additional details with respect to the determination of the correct application or menu to which a menu or touch input command is directed are set forth below with respect to FIG. 6. Moreover, FIG. 6 illustrates a process that employs input data from the first proximity sensor 106 and the second proximity sensor 108 to identify the application to which a given menu command is directed.
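By way of non-limiting illustration, one pass of the polling sequence of blocks 504 through 510 may be sketched as follows. The voltage-reading callbacks and the threshold value are hypothetical stand-ins for the touchscreen hardware interface and do not appear in the original disclosure.

```python
def poll_for_touch(read_x_voltage, read_y_voltage, threshold=0.1):
    """One pass of blocks 504-510: sample the X-axis voltage, decide
    whether the screen is touched, then sample the Y-axis voltage.

    Returns the (X, Y) voltage pair from which touch coordinates are
    estimated, or None if no touch is detected (in which case the
    process flow returns to block 504).
    """
    vx = read_x_voltage()
    if vx <= threshold:
        return None              # screen not touched
    vy = read_y_voltage()
    return (vx, vy)              # voltages used to estimate X-Y coordinates


# Hypothetical readings: a touch produces a voltage above the threshold.
touched = poll_for_touch(lambda: 1.8, lambda: 0.9)
untouched = poll_for_touch(lambda: 0.0, lambda: 0.9)
```

In a real system the callbacks would read the touchscreen's resistive or capacitive sense lines, and a calibration step would map the voltage pair to screen coordinates.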
[0025] At block 514, the menu selection logic block 402 correlates the X-Y coordinates estimated at block 510 to an appropriate menu command. In other words, the menu selection logic block 402 determines what menu command has been entered for the correct application. At block 516, the menu selection logic block 402 acts on the appropriate menu command. Process flow then returns to block 502.
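The correlation of estimated X-Y coordinates to a menu command at block 514 can be illustrated as a simple hit test against per-menu region tables. This sketch is not part of the original disclosure; the command names and rectangle coordinates are hypothetical placeholders chosen only to show that the same touch location can map to different commands depending on which menu the proximity logic has identified as active.

```python
def correlate_to_command(x, y, menu_regions):
    """Return the menu command whose rectangular hit region contains
    the point (x, y), or None if no region matches.

    menu_regions maps a command name to a rectangle (x0, y0, x1, y1).
    """
    for command, (x0, y0, x1, y1) in menu_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None


# Hypothetical, physically overlapping layouts for the two views.
driver_menu = {"zoom_in": (0, 0, 100, 50), "zoom_out": (0, 60, 100, 110)}
passenger_menu = {"play": (0, 0, 100, 50), "pause": (0, 60, 100, 110)}

# The same touch point resolves to a different command per active menu.
driver_cmd = correlate_to_command(40, 30, driver_menu)        # "zoom_in"
passenger_cmd = correlate_to_command(40, 30, passenger_menu)  # "play"
```

This overlap is precisely the situation the proximity-based determination at block 512 resolves before block 514 runs.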
[0026] FIG. 6 is a process flow diagram showing a method of identifying an application to which a menu input command is directed in accordance with an exemplary embodiment of the present invention. The process is generally referred to by the reference number 600. The process 600 shows one exemplary embodiment by which a touchscreen command in a dual-view touchscreen display system is determined to be applied to one of two different viewing applications being viewed by two users. Moreover, the process 600 is one exemplary method of determining which menu is being accessed and/or actuated, as shown at block 512 of FIG. 5.
[0027] At block 602, the process begins. An oscillator is enabled at block 604. In an exemplary embodiment of the present invention, the oscillator generates the input signals 302 (FIG. 3) for the first proximity sensor 106 (FIG. 1) and the second proximity sensor 108 (FIG. 1). At block 606, the menu selection logic block 402 measures the output of the first proximity sensor 106 corresponding to the proximity of the first user. At block 608, the menu selection logic block 402 measures the output of the second proximity sensor 108 corresponding to the proximity of the second user. At block 610, the oscillator is disabled.
[0028] At decision block 612, the menu selection logic block 402 determines which user is more likely proximate to the touchscreen 104 when a particular touch input or menu command is received. In an exemplary embodiment of the present invention, this determination is made by comparing a voltage measured from the first proximity sensor 106 to a voltage measured from the second proximity sensor 108. If the voltage from the first proximity sensor 106 is greater, the menu selection logic block 402 determines that the received touch input originated with or was entered by the first user (e.g., the driver), as shown at block 614. If the voltage measured from the first proximity sensor 106 is not greater than the voltage measured by the second proximity sensor 108, the menu selection logic block 402 determines that the received touch input or menu input originated with or was entered by the second user (e.g., the passenger), as shown at block 616.
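The decision of blocks 612 through 616 reduces to a single comparison, which may be sketched as follows. This is an illustrative model, not the original disclosure; the example voltage values are hypothetical.

```python
def identify_touch_source(v_first, v_second):
    """Attribute a touch input to a user, per blocks 612-616.

    If the first proximity sensor's measured voltage exceeds the second
    sensor's, the touch is attributed to the first user (e.g., the
    driver); otherwise, including a tie, it is attributed to the second
    user (e.g., the passenger), matching the decision as described.
    """
    return "first" if v_first > v_second else "second"


# Hypothetical sensor readings: a reaching hand raises its sensor's output.
driver_touch = identify_touch_source(2.4, 0.3)     # "first"
passenger_touch = identify_touch_source(0.3, 2.4)  # "second"
```

Note that under the comparison as stated, an exact tie is resolved in favor of the second user; a practical implementation might instead require a minimum voltage difference before committing to either menu.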
[0029] As set forth herein, an exemplary embodiment of the present invention comprises a dual-view touchscreen display system that is able to differentiate between user inputs from a first user viewing the display from a first position and user inputs generated by a second user viewing the display from a second position. Such a system advantageously allows touchscreen menus from various applications to be designed without regard to whether the physical location of touchscreen menu items overlaps with the location of menu items that might be visible in the alternate view.
[0030] While the invention may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and have been described in detail herein. However, it should be understood that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is intended to cover all modifications, equivalents and alternatives falling within the spirit and scope of the invention as defined by the following appended claims.

Claims

What is claimed is:
1. A dual-view display system (102), comprising: a dual-view touchscreen display (104) adapted to display a first image including a first menu to a first user positioned at a first location with respect to the dual-view display system (102) and to display a second image including a second menu to a second user positioned at a second location with respect to the dual-view display system (102); at least one sensor (106, 108) adapted to detect a proximity of at least one of the first user and the second user to the dual-view touchscreen display (104); and a menu selection logic (402) adapted to identify a received menu command as a selection from the first menu or as a selection from the second menu dependent at least in part upon the proximity detected by said at least one sensor.
2. The dual-view display system (102) recited in claim 1, wherein the at least one sensor (106, 108) comprises a first proximity sensor (106) that provides a first proximity detection field (202) to detect a first proximity, and a second proximity sensor (108) that provides a second proximity detection field (204) to detect a second proximity, and wherein said menu selection logic is adapted to identify a received menu command as a selection from the first menu or as a selection from the second menu dependent at least in part upon said first and second proximities.
3. The dual-view display system (102) recited in claim 1, wherein the at least one sensor (106, 108) comprises a variable capacitor (304).
4. The dual-view display system (102) recited in claim 1, wherein the at least one sensor (106, 108) comprises an inductive sensor.
5. The dual-view display system (102) recited in claim 1, wherein the at least one sensor (106, 108) comprises an infrared sensor.
6. The dual-view display system (102) recited in claim 1, wherein the at least one sensor (106, 108) comprises an optical sensor.
7. The dual-view display system (102) recited in claim 1, wherein the first image comprises a map view.
8. The dual-view display system (102) recited in claim 1, wherein the second image comprises a movie.
9. A method (500) of operating a dual-view display system (102) adapted to display a first image including a first menu to a first user positioned at a first location with respect to the dual-view display system (102) and to display a second image including a second menu to a second user positioned at a second location with respect to the dual-view display system (102), the method comprising: receiving a menu input command via a touchscreen (104) of the dual-view display system (102); determining whether the menu input command is directed to the first menu or the second menu dependent at least in part upon the proximity to the dual-view display system (102) of the first user relative to the proximity to the dual-view display system (102) of the second user; and responding to the menu command dependent at least in part upon said determining step.
10. The method (500) recited in claim 9, wherein the determining step comprises comparing an output of a first proximity sensor (106) to an output of a second proximity sensor (108).
11. The method (500) recited in claim 10, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises a variable capacitor (304).
12. The method (500) recited in claim 11, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an inductive sensor.
13. The method (500) recited in claim 11, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an infrared sensor.
14. The method (500) recited in claim 11, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an optical sensor.
15. The method (500) recited in claim 9, wherein the first image comprises a map view.
16. The method (500) recited in claim 9, wherein the second image comprises a movie.
17. A dual-view display system (102), comprising: a dual-view touchscreen display (104) adapted to display a first image including a first menu to a first user positioned at a first location with respect to the dual-view display system (102) and to display a second image including a second menu to a second user positioned at a second location with respect to the dual-view display system (102); a first proximity sensor (106) providing a first proximity detection field (202), the first proximity sensor (106) providing an indication that the first menu is active when the first user encounters the first proximity detection field (202); a second proximity sensor (108) providing a second proximity detection field (204), the second proximity sensor (108) providing an indication that the second menu is active when the second user encounters the second proximity detection field (204); and a menu selection logic (402) adapted to identify a received menu command as a selection from the first menu if the first proximity sensor (106) provides the indication that the first menu is active or to identify the received menu command as a selection from the second menu if the second proximity sensor (108) provides the indication that the second menu is active.
18. The dual-view display system (102) recited in claim 17, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises a variable capacitor (304).
19. The dual-view display system (102) recited in claim 17, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an inductive sensor.
20. The dual-view display system (102) recited in claim 17, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an infrared sensor.
21. The dual-view display system (102) recited in claim 17, wherein the first proximity sensor (106) and the second proximity sensor (108) each comprises an optical sensor.
22. The dual-view display system (102) recited in claim 17, wherein the first image comprises a map view.
23. The dual-view display system (102) recited in claim 17, wherein the second image comprises a movie.
24. In a dual-view touchscreen display system (102) adapted to display a first image having a first menu to a first user located at a first position relative to the system (102) and a second image having a second menu to a second user located at a second position relative to the system (102), a method of determining the menu to which user touch input to the touchscreen display (104) is directed, said method comprising: receiving via the touchscreen display (104) a touch input from one of the first and second user; detecting a proximity of said one of the first and second user to the touchscreen display (104); determining, dependent at least in part upon said detecting step, whether the touch input is from the first or second user; and correlating, dependent at least in part upon said determining step, the touch input to an actuated one of the first or second menus.
25. The method of claim 24, comprising the further step of executing, dependent at least in part upon said correlating step, a command of said actuated one of the first and second menus that corresponds to the touch input.
PCT/US2008/011090 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation WO2010036217A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
KR1020117009081A KR20110066949A (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation
CN200880131310XA CN102165381A (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation
JP2011528986A JP2012503818A (en) 2008-09-25 2008-09-25 Dual view touch screen display system and method of operation
PCT/US2008/011090 WO2010036217A1 (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation
EP08816292A EP2329326A1 (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2008/011090 WO2010036217A1 (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation

Publications (1)

Publication Number Publication Date
WO2010036217A1 true WO2010036217A1 (en) 2010-04-01

Family

ID=42059969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/011090 WO2010036217A1 (en) 2008-09-25 2008-09-25 Dual-view touchscreen display system and method of operation

Country Status (5)

Country Link
EP (1) EP2329326A1 (en)
JP (1) JP2012503818A (en)
KR (1) KR20110066949A (en)
CN (1) CN102165381A (en)
WO (1) WO2010036217A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013152561A (en) * 2012-01-24 2013-08-08 Japan Display West Co Ltd Touch panel, display device, and electronic apparatus
EP2757407A1 (en) 2013-01-18 2014-07-23 Lite-On It Corporation Multiple-view display system with user recognition and operation method thereof
DE102014200025A1 (en) * 2014-01-06 2015-07-09 Volkswagen Aktiengesellschaft Method and device for outputting information in a vehicle
US9620042B2 (en) 2013-01-18 2017-04-11 Magna Electronics Solutions Gmbh Multiple-view display system with user recognition and operation method thereof
DE102022105769A1 (en) 2022-03-11 2023-09-14 Audi Aktiengesellschaft Operating a touch-sensitive multiview display screen

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9436301B2 (en) 2011-06-29 2016-09-06 Google Technology Holdings LLC Portable electronic device having interchangeable user interfaces and method thereof
FR2995836B1 (en) * 2012-09-27 2015-05-22 Valeo Systemes Thermiques CONTROL MODULE
CN103777796A (en) * 2012-10-22 2014-05-07 联想(北京)有限公司 Information processing method and electronic device
CN103105987A (en) * 2012-12-28 2013-05-15 苏州瀚瑞微电子有限公司 Device and method for realizing approach switch function by utilizing capacitive touch screen
CN103268033B (en) * 2013-05-16 2016-03-30 京东方科技集团股份有限公司 A kind of Double-vision touch display device and preparation method thereof
KR101611205B1 (en) * 2013-11-11 2016-04-11 현대자동차주식회사 A displaying apparatus, a vehicle the displaying apparatus installed in and method of controlling the displaying apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070262953A1 (en) * 2006-05-15 2007-11-15 Zackschewski Shawn R Multiple-view display system having user manipulation control and method
US20080133133A1 (en) * 2006-12-04 2008-06-05 Abels Steven M System and method of enabling features based on geographic imposed rules

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2405545A (en) * 2003-08-30 2005-03-02 Sharp Kk Multiple view directional display with parallax optic having colour filters.
JP4530267B2 (en) * 2003-08-30 2010-08-25 シャープ株式会社 Multiple view display
JP4542407B2 (en) * 2004-10-04 2010-09-15 パイオニア株式会社 Information display device
TWI446004B (en) * 2005-06-14 2014-07-21 Koninkl Philips Electronics Nv Combined single/multiple view-display
US9411181B2 (en) * 2005-09-21 2016-08-09 Koninklijke Philips N.V. Display device
DE112007001143T5 (en) * 2006-06-05 2009-04-23 Mitsubishi Electric Corp. Display system and method for limiting its operation
JP4356763B2 (en) * 2007-01-30 2009-11-04 トヨタ自動車株式会社 Operating device



Also Published As

Publication number Publication date
EP2329326A1 (en) 2011-06-08
JP2012503818A (en) 2012-02-09
CN102165381A (en) 2011-08-24
KR20110066949A (en) 2011-06-17


Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase (Ref document number: 200880131310.X; Country of ref document: CN)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 08816292; Country of ref document: EP; Kind code of ref document: A1)
REEP Request for entry into the european phase (Ref document number: 2008816292; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2008816292; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2011528986; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20117009081; Country of ref document: KR; Kind code of ref document: A)