WO2015078826A1 - User control device, user control method, system comprising said user control device, and record carrier - Google Patents

User control device, user control method, system comprising said user control device, and record carrier

Info

Publication number
WO2015078826A1
WO2015078826A1 (PCT/EP2014/075446 / EP2014075446W)
Authority
WO
WIPO (PCT)
Prior art keywords
parameter
zone
controllable
touch
value
Prior art date
Application number
PCT/EP2014/075446
Other languages
English (en)
Inventor
Niels LAUTE
Jurriën Carl GOSSELINK
Original Assignee
Koninklijke Philips N.V.
Priority date
2013-11-26
Filing date
2014-11-25
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2015078826A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • User control device, user control method, system comprising the user control device and record carrier
  • the present invention relates to a user control device for controlling applications involving a plurality of controllable parameters.
  • the present invention further relates to a user control method for controlling applications involving a plurality of controllable parameters.
  • the present invention further relates to a system including a user control device for allowing a user to control a plurality of controllable parameters of said system.
  • the present invention further relates to a computer program product.
  • RELATED ART US2012188193 pertains to a display device with a multi-point touch-sensitive display.
  • the display device includes a position detection unit that detects the XY coordinates of each of two touched positions on the touch panel. When the position detection unit detects that two positions are touched, a calculation unit computes a distance between the X-coordinates as well as a distance between the Y-coordinates of the touched positions.
  • a control execution unit controls a first controlled object according to the computed distance in the X-direction and controls a second controlled object according to the computed distance in the Y-direction.
  • the known device is sensitive to unintended touch events. If a user inadvertently touches the multi-point touch-sensitive display, this may cause undesired changes in the control of the first and the second controlled object.
  • a user control device for controlling an application involving a plurality of controllable parameters.
  • the user control device comprises a multi-point touch-sensitive display having an input to receive display data and an output to provide position information indicative for a first and a second touch position on said multi-point touch-sensitive display for which it is currently detected that the multi-point touch-sensitive display is touched.
  • the user control device further comprises a graphics engine to provide said display data to display on said multi-point touch-sensitive display in a plurality of regions respective visual representations of respective ones of said controllable parameters. Each region includes a parameter identification zone and a parameter control zone.
  • the user control device further comprises an association facility to receive the position information and to identify a parameter identification zone and a parameter control zone of a respective controllable parameter associated with the received position information for the first and the second touch position respectively.
  • the user control device further comprises a controller to set a value of a controllable parameter in accordance with a relative position of the second touch position within the associated parameter control zone if said controllable parameter is the controllable parameter corresponding to the identified parameter identification zone and the identified parameter control zone.
  • controllable parameters of the application to be controlled are assigned a respective region with a parameter identification zone and a parameter control zone. If a user touches both the parameter identification zone and the parameter control zone within the region corresponding to a controllable parameter, the controllable parameter is set in accordance with a relative position within the associated parameter control zone of the region for said parameter.
  • the present invention further relates to a system comprising such a user control device as well as an application that comprises one or more controllable objects having an operational state that is at least partly determined by said plurality of controllable parameters.
  • the invention is in particular suitable for applications wherein an output is provided as a composition of layers, for example an output image composed of image layers.
  • the items to be controlled are image layers and the controlled value may be a relative depth of a layer, or a relative contribution of each layer to the output image.
  • the invention may be applied to allow an operator to compose a display with various physiologic parameters.
  • the invention may be used to control elements in a lighting system.
  • a control method to control an application involving a plurality of controllable parameters comprises the steps of:
  • generating display data to display, in a plurality of regions of a multi-point touch-sensitive display, respective visual representations of respective ones of the controllable parameters, each region including a parameter identification zone and a parameter control zone; providing position information indicative for a first and a second position touched by a user; identifying the parameter identification zone and the parameter control zone associated with the received positions; verifying whether both positions are associated with the same controllable parameter; and, only if so, setting a value of that controllable parameter in accordance with a relative position within the parameter control zone
  • a computer program product comprises a computer program, which when executed by a programmable processor causes the programmable processor to carry out any of the steps of this method and variations thereof.
  • the user control device comprises a first storage part to store said value as a temporary value in a first operational mode allowing user input for setting the value and a second storage part to store said temporary value as an effective value in a second operational mode following said first operational mode, wherein the graphics engine has a display component for displaying a visual indication of the temporary value, and wherein the effective value is used to control the application.
  • the user control device comprising a storage part for storing said value as an effective value in a first operational mode allowing user input for setting the value and wherein the effective value is used to control the application. This allows the user to immediately observe the effects of user input.
  • the association facility comprises a disambiguation module to isolate from said position information a first position indicating a parameter identification zone and a second position indicating a parameter control zone.
  • disambiguation may be based on the absolute position of the detected positions. However, in certain embodiments, wherein a very compact display is desired, parameter identification zones may overlap parameter control zones. Alternative solutions are provided to enable disambiguation in that case.
  • the disambiguation module is arranged to identify, as a position that indicates a parameter identification zone of a parameter, a first position where the multi-point touch-sensitive display is touched for the first time after an absence of detection that the multi-point touch-sensitive display has been touched, and to identify, as a position that indicates a parameter control zone of a parameter, a second position where it is detected that the multi-point touch-sensitive display is touched subsequently while it is still detected that the multi-point touch-sensitive display is touched at said first position.
  • the disambiguation module is arranged to determine a spatial relationship between positions in a list of at least two positions and, based on said determined spatial relationship, to identify a first position of said list as a position that indicates a parameter identification zone of a parameter and a second position of said list as a position that indicates a parameter control zone of a parameter. For example, the leftmost one of the positions is identified as the position that indicates a parameter identification zone and the second-leftmost one of the positions is identified as the position that indicates the parameter control zone. Other detected positions, if any, may be ignored or used as control input for another purpose. Alternatively, control may be suppressed if it is detected that the multi-point touch-sensitive display is touched at more than two positions.
  • a visual indication of the value set for a parameter is displayed in the region associated with the parameter. This is in particular suitable if the user input does not immediately affect the controllable application, but may also be used as an additional indication in applications where an immediate effect is perceivable.
  • the visual indication is in the form of a circle segment. This provides for a very compact and clear representation.
  • the regions corresponding to the controllable parameters are substantially elongate regions and these regions are mutually arranged according to a direction transverse to their length direction. This allows the user to rapidly select a region of one of the plurality of controllable parameters and to subsequently control the selected parameter.
  • the parameter control zone of a parameter comprises a first subzone and a second subzone, and the method comprises the conditional steps of decreasing the value for said parameter upon detection that the first subzone is touched and increasing the value for said parameter upon detection that the second subzone is touched.
  • the menu for controlling the plurality of controllable parameters may be permanently present in an operational state of the user control device or a system having the user control device.
  • an activation step may be involved preceding the step of generating the visual representations. This is particularly advantageous in applications wherein the multi-point touch-sensitive display is also used for other purposes.
  • said activation step is induced by touching a position on the multi-point touch-sensitive display, and wherein said visual representations are provided at a position depending on said touched position.
  • said control area having the control regions for the various parameters can be provided near the touched position, which is particularly useful for large area multi-point touch-sensitive displays.
  • Fig. 1 schematically shows a user control device according to an embodiment of the present invention
  • Fig. 2 schematically shows an example of a menu as displayed on the multipoint touch-sensitive display of a user control device according to the present invention
  • Fig. 2A schematically shows another example of a menu as displayed on the multi-point touch-sensitive display of a user control device according to the present invention
  • Fig. 3 shows an embodiment of a user control device according to the present invention in more detail
  • Fig. 4 shows a component of the embodiment of Fig. 3 in more detail
  • Fig. 5 shows an alternative embodiment of a user control device according to the present invention in more detail
  • Fig. 6 schematically shows another example of a menu as displayed on the multi-point touch-sensitive display of a user control device according to the present invention
  • Fig. 7A, 7B illustrate operation of a component in an embodiment of a user control device according to the present invention
  • Fig. 7A indicates a sequence of events
  • Fig. 7B shows a state diagram of the component
  • Fig. 8 illustrates operation of an alternative version of said component in an embodiment of a user control device according to the present invention
  • Fig. 9 illustrates an embodiment of a system according to the present invention
  • Fig. 10 illustrates another embodiment of a system according to the present invention
  • Fig. 11A, 11B, 11C illustrate a visual feedback provided in an embodiment of the present invention
  • Fig. 12 schematically shows a method according to the present invention.
  • FIG. 1 schematically shows a user control device 1 according to the present invention.
  • the user control device 1 is provided for controlling an application 90 involving a plurality of controllable parameters P1, P2, P3.
  • the user control device 1 and the application that is controlled therewith form a system 100.
  • the user control device 1 comprises a multi-point touch-sensitive display 10 that has an input 11 to receive display data Dd and an output 12 to provide position information Ip indicative for positions touched on said multi-point touch-sensitive display.
  • the user control device 1 further comprises a graphics engine 20 to provide said display data Dd.
  • the graphics engine provides the display data Dd to display on said multi-point touch-sensitive display 10 in a plurality of regions R1, R2, R3 respective visual representations of respective ones of said controllable parameters P1, P2, P3.
  • Each region R1, R2, R3 includes a parameter identification zone Z1i, Z2i, Z3i and a parameter control zone Z1c, Z2c, Z3c.
  • the user control device 1 further comprises an association facility 30 to receive the position information Ip and to identify a parameter identification zone and a parameter control zone of a respective controllable parameter associated with the received position information.
  • the association facility 30 provides a first signal Izi indicative for the parameter identification zone that is associated with one of the touched positions and a second signal Izc indicative for the parameter control zone that is associated with another one of the touched positions.
  • the association facility 30 further provides a third signal Ipc indicative for the position of said another one of the touched positions within the associated parameter control zone indicated Izc.
  • the user control device 1 still further comprises a controller 40 to set a value of a controllable parameter P1, P2, P3 in accordance with the relative position within the associated parameter control zone if said controllable parameter is the controllable parameter corresponding to the identified parameter identification zone and the identified parameter control zone.
  • positions pa, pb respectively are the positions within the parameter identification zone Z1i and the parameter control zone Z1c of the region R1 for parameter P1.
  • Position pc is a position within the parameter control zone Z2c of the region R2 for parameter P2. If the user touches the positions pa and pb it is determined that the parameter identification zone Z1i identified by pa and the parameter control zone Z1c identified by pb correspond to the same parameter P1. In that case the controller 40 sets a value of the controllable parameter P1 in accordance with the relative position of the position pb within the associated parameter control zone Z1c.
  • a position as such indicates a location in coordinates on the multi-point touch-sensitive display 10, and the relative position is the position of said location relative to the associated parameter control zone Z1c. If the user touches the positions pa and pc it is determined that the parameter identification zone Z1i identified by pa and the parameter control zone Z2c identified by pc correspond to mutually different parameters P1, P2. In that case the controller 40 does not change a value for the parameter P2 in accordance with the relative position of the position pc within the parameter control zone Z2c. This reduces the risk that a user inadvertently changes a setting by accidentally touching the multi-point touch-sensitive display 10.
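  • For illustration, the sketch below is a minimal Python analogue of this check, not the patent's implementation; the Rect and Region helpers, the example region list and the linear mapping of the touch position to a value in [0, 1] are assumptions introduced for this example only.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

@dataclass
class Region:
    name: str          # controllable parameter, e.g. "P1"
    ident_zone: Rect   # parameter identification zone, e.g. Z1i
    control_zone: Rect # parameter control zone, e.g. Z1c

def set_value_if_same_region(regions, pos_ident, pos_ctrl):
    """Return (parameter, value in [0, 1]) only when both touch positions fall in
    the zones of one and the same region; otherwise return None (no change)."""
    for r in regions:
        if r.ident_zone.contains(*pos_ident) and r.control_zone.contains(*pos_ctrl):
            # relative position of the control touch along the control zone width
            value = (pos_ctrl[0] - r.control_zone.x) / r.control_zone.w
            return r.name, value
    return None  # zones of different parameters: ignore, to avoid accidental changes

# pa and pb lie in the zones of region R1, so P1 is set; pa and pc do not match.
r1 = Region("P1", Rect(0, 0, 40, 20), Rect(40, 0, 160, 20))
r2 = Region("P2", Rect(0, 20, 40, 20), Rect(40, 20, 160, 20))
print(set_value_if_same_region([r1, r2], (10, 10), (120, 10)))  # ('P1', 0.5)
print(set_value_if_same_region([r1, r2], (10, 10), (120, 30)))  # None
```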
  • the parameter identification zone and a parameter control zone of a region have a fixed spatial relationship.
  • the parameter control zone Z1c of a parameter P1 is arranged to the right of the parameter identification zone Z1i for that parameter.
  • the regions R1, R2, R3 corresponding to the controllable parameters are substantially elongate regions and these regions are mutually arranged according to a direction y transverse to their length direction x. This allows the user to rapidly select a region of one of the plurality of controllable parameters and to subsequently control the selected parameter.
  • the parameter control zone of a region has a first and a second part arranged on mutually opposite sides of the parameter identification zone of that region.
  • the parameter control zone of a parameter P1 has a first part Z1ca that is arranged to the left of the parameter identification zone Z1i for that parameter and a second part Z1cb that is arranged to the right of the parameter identification zone Z1i.
  • This may assist the user in easily recognizing the effect of control input.
  • the left part Z1ca of the parameter control zone may be dedicated to a lower range of possible values for the controllable parameter and the right part Z1cb of the parameter control zone may be dedicated to a higher range of possible values for the controllable parameter.
  • the user may be allowed to decrease the value by tapping on the left part Z1ca and to increase the value by tapping on the right part Z1cb.
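  • A minimal sketch of this tap-based variant is given below; the step size, the value range and the use of a single split coordinate between Z1ca and Z1cb are assumptions chosen for illustration only.

```python
def adjust_value(value, tap_x, split_x, step=0.1, lo=0.0, hi=1.0):
    """Decrease the value when the tap falls in the left part of the control zone
    (left of split_x, i.e. Z1ca) and increase it when the tap falls in the right
    part (Z1cb); the result is clamped to the allowed range."""
    value = value - step if tap_x < split_x else value + step
    return max(lo, min(hi, value))

print(adjust_value(0.5, tap_x=10, split_x=50))  # 0.4 (tap in the left part)
print(adjust_value(0.5, tap_x=90, split_x=50))  # 0.6 (tap in the right part)
```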
  • a separate activation step may be required to activate the user control device.
  • a menu with identification and control zones for controllable parameters appears only after this activation step.
  • the activation step may be induced by a separate touch event wherein the user touches the multi-point touch-sensitive display.
  • the user control device may also have a time-out unit for its deactivation. For example the menu may disappear automatically if no touches are detected in a certain time-period e.g. in the range of 30 to 60 seconds.
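  • A small sketch of such a time-out unit is shown below; only the 30 to 60 second range is taken from the text, while the 45-second default and the method names are assumptions made for this example.

```python
import time

class MenuTimeout:
    """Hide the menu automatically when no touch has been detected for a while."""

    def __init__(self, timeout_s: float = 45.0):   # chosen within the 30-60 s range
        self.timeout_s = timeout_s
        self.last_touch = time.monotonic()

    def on_touch(self) -> None:
        self.last_touch = time.monotonic()          # any touch restarts the period

    def should_deactivate(self) -> bool:
        return time.monotonic() - self.last_touch > self.timeout_s

menu_timeout = MenuTimeout()
print(menu_timeout.should_deactivate())             # False right after activation
```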
  • the parameter identification zone and a parameter control zone of a region may have a dynamic spatial relationship.
  • the parameter control zone of a region may normally appear on the right side of the parameter identification zone of that region, but may instead appear on the left side if the region is close to the right border of the multi-point touch-sensitive display.
  • FIG. 3 shows an embodiment in more detail.
  • the association facility 30 of the user control device comprises a disambiguation module 31 to isolate from the position information Ip a first position pi indicating a parameter identification zone and a second position pc indicating a parameter control zone.
  • the disambiguation module 31 signals the first position pi by signal Izi to an identification zone processing part 32 that identifies which parameter is associated with the identification zone comprising position pi.
  • the identification zone processing part 32 issues a multi-bit one-hot signal Ipi having a respective bit Ipi1, Ipi2, Ipi3 to indicate a respective parameter P1, P2, P3.
  • the disambiguation module 31 further signals the second position pc by signal Izc to a control zone processing part 33 that identifies which parameter is associated with the control zone comprising position pc.
  • the control zone processing part 33 issues a multi-bit one-hot signal Ipc having a respective bit Ipc1, Ipc2, Ipc3 to indicate a respective parameter P1, P2, P3. If a parameter is associated with a control zone comprising a position pc then a corresponding bit of the multi-bit one-hot signal Ipc is set.
  • a verification module 41 determines if the parameter identified by the identification zone processing part 32 is the same as the parameter identified by the control zone processing part 33. In response the verification module 41 issues a multi-bit one-hot sample control signal having components Sc1, etc. If the verification module 41 determines for a particular parameter, e.g. P1, that both a touch is detected at a first position pa in its identification zone, e.g. Z1i as indicated by Ipi1, and a touch is detected at a second position pb in its control zone Z1c, as indicated by Ipc1, then the sample control signal for that parameter, in this case Sc1, is enabled.
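  • The sketch below illustrates this verification with the one-hot signals named in the text (Ipi from the identification zone processing, Ipc from the control zone processing); modelling them as boolean lists and combining them with a per-bit AND is an assumed software realisation, not necessarily the patent's circuit.

```python
def sample_control(ipi: list, ipc: list) -> list:
    """Sc is enabled for a parameter only when both its identification zone and
    its control zone are currently touched (per-bit AND of the one-hot signals)."""
    return [a and b for a, b in zip(ipi, ipc)]

# Identification zone of P1 and control zone of P1 touched -> Sc1 enabled.
print(sample_control([True, False, False], [True, False, False]))  # [True, False, False]
# Identification zone of P1 but control zone of P2 -> nothing is enabled.
print(sample_control([True, False, False], [False, True, False]))  # [False, False, False]
```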
  • the control zone processing part 33 includes a control zone association part 331 that generates the signal Ipc and a value computation part 332 that is arranged to compute a value of a controllable parameter P1, P2, P3 in accordance with a relative position within the associated parameter control zone.
  • the value computation part 332 has respective value computation elements that are each arranged to provide a signal Ipv1, ..., Ipv3 indicative for the value of a controlled parameter P1, P2, P3.
  • each value computation element of value computation part 332 is selectively enabled by a respective bit of output signal Ipc, to prevent that unnecessary results are computed.
  • the value computation elements may have a mutually different implementation. For example, one value computation element may compute a value that is linearly dependent on the relative position of the touch position within the associated control zone. Another value computation element may compute a value that is non-linearly dependent on that relative position, for example according to a logarithmic function. Again another value computation element may compute a binary value that is True for a first range of relative positions and that is False for a second range of relative positions.
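  • The sketch below shows three such value computation elements for a relative position t in [0, 1]; the output ranges, the particular logarithmic mapping and the 0.5 threshold are assumptions chosen for illustration.

```python
def linear_value(t: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Linear dependence on the relative position within the control zone."""
    return lo + t * (hi - lo)

def logarithmic_value(t: float, lo: float = 1.0, hi: float = 1000.0) -> float:
    """Non-linear (logarithmic-scale) dependence: equal movement gives equal ratio."""
    return lo * (hi / lo) ** t

def binary_value(t: float, threshold: float = 0.5) -> bool:
    """True for one range of relative positions, False for the other (e.g. on/off)."""
    return t >= threshold

print(linear_value(0.25))                # 25.0
print(round(logarithmic_value(0.5), 1))  # 31.6
print(binary_value(0.7))                 # True
```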
  • the user control device 1 of FIG. 3 comprises a storage part 42 with respective storage locations for each parameter P1, P2, P3.
  • the thin lines between control zone processing part 33 and the verification module 41 carry the respective signal bits Ipc1, Ipc2, Ipc3 of the multi-bit one-hot signal Ipc.
  • the thick lines between the control zone processing part 33 and the storage part 42 carry the signals Ipv1, Ipv2, Ipv3.
  • the storage locations of the storage part 42 are sample and hold elements S/H.
  • the sample and hold elements S/H each receive a respective sample control signal, e.g. Sc1.
  • while the sample control signal for a parameter is enabled, the user is allowed to change the value of the corresponding parameter, e.g. the value indicated by the corresponding signal Ipv1 for said parameter P1 is sampled.
  • the application 90 directly receives the value for the parameter sampled by the corresponding sample and hold element S/H of the storage part 42. Therewith the sampled value is an effective value. In a second operational mode the sampled value is no longer overwritten by new samples and is therewith stored.
  • This embodiment has the advantage that the user can directly observe the behavior of the application in response to the provided input. This may be very suitable for example for use in lighting systems, having a plurality of lighting elements. Another suitable application is for example a CAD system, wherein the user can indicate the contribution of various drawing planes to a composed image.
  • supplemental feedback may be provided to the multi-point touch-sensitive display 10, as indicated by the dashed lines towards the graphics engine.
  • FIG. 5 shows an alternative embodiment that allows the user to first observe how the controlled values of the parameters change as a function of user input before these changes affect the application.
  • the user control device 1 comprises a first storage part 42a that has respective sample and hold elements for each parameter P1, P2, P3.
  • a sample and hold element assumes a first operational mode. Therein the value that is determined in accordance with a relative position within the associated parameter control zone of the parameter for said sample and hold element is sampled and therewith stored in said sample and hold element as a temporary value.
  • Outputs of the sample and hold elements of the first storage part 42a are coupled to the graphics engine 20 to enable the latter to display in the region for the currently controlled parameter a visual indication of the value currently set for that parameter.
  • the outputs of the sample and hold elements of the first storage part 42a are also coupled to respective sample and hold elements of a second storage part 42b.
  • the respective sample control signals, e.g. Sc1, provided to the sample and hold elements of the first storage part 42a are inverted by the NOT gates and the inverted signals, e.g. /Sc1, are provided to the corresponding sample and hold elements of the second storage part 42b.
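  • As a software analogue of the two sample and hold stages described above, the sketch below tracks the user input as a temporary value while the sample control signal is enabled and latches it as the effective value when the signal is released; the class and method names are assumptions introduced for this example.

```python
class TwoStageStore:
    """First stage: temporary value shown to the user; second stage: effective
    value that actually controls the application (latched on the inverted signal)."""

    def __init__(self, initial: float = 0.0):
        self.temporary = initial   # analogue of the first storage part 42a
        self.effective = initial   # analogue of the second storage part 42b

    def update(self, sc_enabled: bool, computed_value: float) -> None:
        if sc_enabled:
            self.temporary = computed_value   # first operational mode: track the input
        else:
            self.effective = self.temporary   # /Sc: take over the temporary value

store = TwoStageStore()
store.update(sc_enabled=True, computed_value=0.8)   # user still touching: preview only
print(store.temporary, store.effective)             # 0.8 0.0
store.update(sc_enabled=False, computed_value=0.8)  # touch released: value takes effect
print(store.temporary, store.effective)             # 0.8 0.8
```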
  • the position information Ip indicates the positions, also denoted as 'touch positions', for which it is currently detected that the multi-point touch-sensitive display is touched.
  • the disambiguation module 31 determines which touch position indicates a parameter identification zone and which indicates a parameter control zone.
  • In a first embodiment, mutually non-overlapping areas may be assigned to the parameter identification zones and the parameter control zones, see for example FIG. 2. In that case the disambiguation module 31 merely needs to determine whether a position is in an area assigned to the parameter identification zones or in an area assigned to the parameter control zones.
  • FIG. 6 shows an example of a control menu for 8 parameters.
  • Z1i, ..., Z8i are the parameter identification zones for parameters P1, ..., P8 and Z1c, ..., Z8c are the parameter control zones for these parameters.
  • the parameter identification zones Z5i, ..., Z8i overlap the parameter control zones Z1c, ..., Z4c. It is noted that for clarity a slight y-offset is applied to the parameter identification zones Z4i, ..., Z8i and to the parameter control zones Z4c, ..., Z8c. In this case it is not immediately clear whether a position px for example is a position indicating parameter identification zone Z5i or parameter control zone Z1c. The following embodiments of the disambiguation module address this issue.
  • the disambiguation module assigns the detected touch positions dependent on the order of their occurrence.
  • a first touch position is considered a touch position where the multi-point touch-sensitive display is touched for a first time after an absence of detection that the multi-point touch-sensitive display has been touched or after a reference event.
  • a second touch position is a subsequent touch position, occurring while the first touch position is still valid.
  • a reference event is for example start-up of the device or a separate step wherein the user control device is activated.
  • the disambiguation module 31 identifies the first touch position as a position that indicates a parameter identification zone.
  • the subsequent, second touch position is identified as a position that indicates a parameter control zone.
  • FIG. 7A shows an exemplary order of events
  • FIG. 7B shows a state diagram of the embodiment of the disambiguation module.
  • the user control device 1 assumes an initial state S0, wherein a menu, as shown for example in FIG. 2, is not yet visible.
  • the user may activate the menu by causing a transition to state S1 in a separate step.
  • the menu may already be active upon start-up.
  • the disambiguation module 31 identifies a touch position as a position indicating a parameter identification zone.
  • a next state S2 is assumed, wherein a touch position is identified as a position that indicates a parameter control zone.
  • the user activates the menu at time t0, for example by tapping on the multi-point touch-sensitive display 10, or pushing a button. This results in a transition from state S0 to S1.
  • In state S1 the disambiguation module 31 waits for detection of a first touch position.
  • a first touch position p(t1) is detected. This position is identified as the position pi that indicates the parameter identification zone currently selected by the user. Detection of the first touch position causes a transition from state S1 to S2. If the user releases the multi-point touch-sensitive display 10 the disambiguation module 31 reassumes state S1. However, in this case a second touch position p(t2) is detected at time t2.
  • This second touch position is identified as the position pc that indicates the parameter control zone currently indicated by the user.
  • the user releases the first touch position at time t3, causing a transition to state S3.
  • the user also releases the second position, having the effect that state S1 is reassumed.
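  • The sketch below is one possible rendering of this state behaviour in Python; the event method names and the handling of a release that occurs before a second touch are assumptions made to stay consistent with the sequence of FIG. 7A and the states of FIG. 7B.

```python
class Disambiguator:
    """States: S0 menu hidden, S1 waiting for first touch, S2 first touch held,
    S3 first touch released while the second (control) touch is still held."""

    def __init__(self):
        self.state = "S0"
        self.pi = None   # position taken as indicating a parameter identification zone
        self.pc = None   # position taken as indicating a parameter control zone

    def activate(self):                      # t0: tap or button press shows the menu
        if self.state == "S0":
            self.state = "S1"

    def touch(self, pos):
        if self.state == "S1":
            self.pi, self.state = pos, "S2"  # t1: first touch -> identification zone
        elif self.state == "S2" and self.pc is None:
            self.pc = pos                    # t2: second touch -> control zone

    def release_first(self):                 # t3
        if self.state == "S2":
            self.state = "S3" if self.pc is not None else "S1"
            self.pi = None

    def release_second(self):                # t4
        if self.state == "S3":
            self.pc, self.state = None, "S1" # both released: ready for a new selection

d = Disambiguator()
d.activate(); d.touch((10, 10)); d.touch((120, 10))
d.release_first(); print(d.state)            # S3
d.release_second(); print(d.state)           # S1
```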
  • the disambiguation module 31 is arranged to determine a spatial relationship between positions in a list of at least two positions and, based on said determined spatial relationship, to identify a first position of said list as a position that indicates a parameter identification zone of a parameter and a second position of said list as a position that indicates a parameter control zone of a parameter. This is illustrated in FIG. 8. Therein the disambiguation module 31 receives a list of positions px1, ..., px5. The disambiguation module 31 may for example select the leftmost one of the positions in the list as the position pi indicating the parameter identification zone and the second-leftmost one of the positions in the list as the position pc indicating the parameter control zone.
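  • A minimal sketch of this spatial rule is given below; the leftmost and second-leftmost selection and the ignoring of further positions follow the text, while the coordinate representation and example values are assumptions.

```python
def disambiguate_by_position(positions):
    """Take the leftmost position as pi (identification zone) and the
    second-leftmost as pc (control zone); other positions are ignored."""
    if len(positions) < 2:
        return None
    ordered = sorted(positions, key=lambda p: p[0])   # order by x coordinate
    return ordered[0], ordered[1]

touches = [(150, 40), (20, 35), (60, 38), (200, 42), (310, 36)]   # px1 ... px5
print(disambiguate_by_position(touches))   # ((20, 35), (60, 38))
```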
  • FIG. 9 shows a typical example of a system 100 comprising a user control device 1 and a controlled application 90 comprising a plurality of controllable objects.
  • the controlled application 90 is a lighting arrangement and the controllable objects are lighting elements L1, ..., L5.
  • the lighting elements L1, ..., L5 are controllable in accordance with a plurality of parameters.
  • Lighting element L1 is for example controllable according to a first parameter P11 indicative for its operational state (on/off), a second parameter P12 indicative for its light intensity, and a third parameter P13 indicative for its color temperature.
  • the other lighting elements L2, ..., L5 may be analogously or otherwise controllable.
  • the multi-point touch-sensitive display 10 of the user control device 1 has assigned respective columns C1, ..., C5 for the parameter identification zones of the parameters of each lighting element L1, ..., L5.
  • column C1 includes the parameter identification zones Z11i, Z12i, Z13i for identifying the parameters P11, P12, P13 of the first lighting element L1.
  • the associated parameter control zones extend to the right of the parameter identification zones.
  • parameter control zone Z11c extends to the right of the associated parameter identification zone Z11i.
  • This embodiment of the system 100 may for example have a user control device 1 as shown in and described with reference to FIG. 3 and 4, or FIG. 5.
  • the user control device 1 may have a disambiguation module as described with reference to FIG. 7A, 7B or FIG. 8 for example.
  • FIG. 10 shows another example of a system 100 comprising a user control device and a controllable application comprising a plurality of controllable objects.
  • the controllable application is a CAD system and the controllable objects are drawing layers.
  • the user control device herein enables the user to control a depth ranking of the drawing layers.
  • the user control device may enable the user to control a weighting factor with which individual images are weighted, when composing therefrom a resulting image. This is also applicable in an image processing application.
  • the user control device and the controllable application share a common multi-point touch-sensitive display 10.
  • a first area 14 thereof is assigned to the controllable application, and a second area 15 thereof is assigned to the user control device.
  • Within the second area a menu is presented having a plurality of regions R1, ..., R5 for controlling respective parameters.
  • a visual indication of a value set for a controlled parameter may be displayed in the region associated with that parameter.
  • the visual indication may for example be in the form of a bar extending in the parameter control zone, or may be in the form of a color indicating a particular state, e.g. on/off.
  • FIG. 11A, 11B, 11C show a particularly advantageous embodiment wherein the visual indication is in the form of a circle segment of a circle surrounding the parameter identification zone of a parameter. The size of the circle segment is proportional to the value set by the user input.
  • FIG. 11A, 11B, 11C respectively show examples wherein a circle segment extending over an angle of 90, 180 and 270 degrees indicates a value set to 25%, 50% and 75% of the maximum possible value for the parameter.
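  • The mapping from set value to swept angle can be sketched as follows; the proportionality and the three example values follow the figures, while the 360-degree full circle and the formatting are assumptions for this example.

```python
def segment_angle(value: float, max_value: float = 1.0, full_circle: float = 360.0) -> float:
    """Swept angle of the circle segment, proportional to the value that was set."""
    return full_circle * (value / max_value)

for v in (0.25, 0.5, 0.75):
    print(f"{v:.0%} -> {segment_angle(v):.0f} degrees")
# 25% -> 90 degrees, 50% -> 180 degrees, 75% -> 270 degrees
```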
  • FIG. 12 schematically shows a control method according to the present invention for controlling an application involving a plurality of controllable parameters.
  • the method comprises a step S1, wherein display data is generated to display in a plurality of regions of a multi-point touch-sensitive display 10 respective visual representations of respective ones of the controllable parameters. Each region includes a parameter identification zone and a parameter control zone.
  • In a second step S2 position information Ip is provided, indicative for a first and a second position touched by a user on the multi-point touch-sensitive display 10.
  • In a third step S3 a parameter identification zone and a parameter control zone of a respective controllable parameter associated with the received first and second positions, as indicated by the position information Ip, are identified.
  • the identification involves a disambiguation to determine which of the positions indicates a parameter identification zone and which indicates a parameter control zone.
  • In a fourth step S4 it is verified whether the received first position and the received second position are associated with the same controllable parameter.
  • Only if it is determined in this verification step S4 that the received first position and the received second position are associated with the same controllable parameter, a value of that same controllable parameter is set in step S5, in accordance with a relative position within the parameter control zone.
  • the elements listed in the system and device claims are meant to include any hardware (such as separate or integrated circuits or electronic elements) or software (such as programs or parts of programs) which reproduce in operation or are designed to reproduce a specified function, be it solely or in conjunction with other functions, be it in isolation or in co-operation with other elements.
  • the invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the apparatus claim enumerating several means, several of these means can be embodied by one and the same item of hardware.
  • 'Computer program product' is to be understood to mean any software product stored on a computer-readable medium, such as a floppy disk, downloadable via a network, such as the Internet, or marketable in any other manner.
  • Although the terms first, second, third, etc. may be used herein to describe various elements, components, modules and/or units, these elements, components, modules and/or units should not be limited by these terms. These terms are only used to distinguish one element, component, module and/or unit from another element, component, module and/or unit. Thus, a first element, component, module and/or unit discussed herein could be termed a second element, component, module and/or unit without departing from the teachings of the present invention.
  • the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a user control device (1) for controlling an application (90) involving a plurality of controllable parameters (P1, P2, P3). The user control device (1) comprises a multi-point touch-sensitive display (10) and a graphics engine (20) for providing display data to display, in a plurality of regions (R1, R2, R3) of said multi-point touch-sensitive display, a visual representation of each of the controllable parameters. Each region includes a parameter identification zone (Z1i, Z2i, Z3i) and a parameter control zone (Z1c, Z2c, Z3c). An association facility (30) receives position information (Ip) from the multi-point touch-sensitive display (10) and identifies a parameter identification zone and a parameter control zone of a respective controllable parameter associated with the received position information. A controller (40) sets the value of a controllable parameter (P1, P2, P3) in accordance with a relative position within the associated parameter control zone if said controllable parameter is the one corresponding to the identified parameter identification zone and the identified parameter control zone.
PCT/EP2014/075446 2013-11-26 2014-11-25 User control device, user control method, system comprising said user control device, and record carrier WO2015078826A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP13194435.7 2013-11-26
EP13194435 2013-11-26

Publications (1)

Publication Number Publication Date
WO2015078826A1 (fr)

Family

ID=49683514

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/075446 WO2015078826A1 (fr) 2013-11-26 2014-11-25 User control device, user control method, system comprising said user control device, and record carrier

Country Status (1)

Country Link
WO (1) WO2015078826A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120188193A1 (en) 2009-09-29 2012-07-26 Nec Corporation Display device, control method and recording medium
US20130207915A1 (en) * 2012-02-13 2013-08-15 Konica Minolta Business Technologies, Inc. Image forming apparatus, method of controlling the same, and recording medium
US20130249829A1 (en) * 2012-03-23 2013-09-26 Canon Kabushiki Kaisha Display control apparatus and control method for the same

Similar Documents

Publication Publication Date Title
US9829975B2 (en) Gaze-controlled interface method and system
JP6264293B2 (ja) Display control device, display control method, and program
US20150268802A1 (en) Menu control method and menu control device including touch input device performing the same
US20090265659A1 (en) Multi-window display control system and method for presenting a multi-window display
US20110199386A1 (en) Overlay feature to provide user assistance in a multi-touch interactive display environment
JP6401831B2 (ja) Pressure touch method for a touch input device
KR20130135983A (ko) Apparatus and method for providing access to internal components
CN104951129A (zh) Method and system for combining overlay data with a video image, and display system
KR102205283B1 (ko) Electronic device executing at least one application and control method thereof
KR101421369B1 (ko) Terminal and method for setting a touch lock layer
CN104951127A (zh) System and method for generating a display overlay, and display system
JP2016139180A5 (fr)
US9927914B2 (en) Digital device and control method thereof
US20150355819A1 (en) Information processing apparatus, input method, and recording medium
US9013507B2 (en) Previewing a graphic in an environment
WO2015078826A1 (fr) User control device, user control method, system comprising said user control device, and record carrier
US9501210B2 (en) Information processing apparatus
US20160077618A1 (en) Touch display device including visual accelerator
US9417780B2 (en) Information processing apparatus
JP6157971B2 (ja) Desk-type display device
KR102320203B1 (ko) Electronic device displaying a three-dimensional virtual space and control method thereof
KR20150131761A (ko) Input processing apparatus and method
CN104820489B (zh) System and method for managing low-latency direct control feedback
US11474693B2 (en) OSDs for display devices
KR102138501B1 (ko) Display device for capturing digital content and control method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14802444

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14802444

Country of ref document: EP

Kind code of ref document: A1