EP1425653A2 - Gestionnaire de bureau electronique - Google Patents

Gestionnaire de bureau electronique

Info

Publication number
EP1425653A2
Authority
EP
European Patent Office
Prior art keywords
user interface
input device
virtual window
freedom
enlargement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP02777079A
Other languages
German (de)
English (en)
Inventor
Bernd Gombert
Bernhard Von Prittwitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
3DConnexion GmbH
Original Assignee
3DConnexion GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from DE10155030A1 (de)
Application filed by 3DConnexion GmbH
Publication of EP1425653A2 (fr)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04805Virtual magnifying lens, i.e. window or frame movable on top of displayed information to enlarge it for better reading or selection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The present invention relates to a method for managing user interfaces, to a computer software program for implementing such a method, and to the use of a force/torque sensor for such a method.
  • The general background of the present invention is the management of graphical user interfaces on which symbols are arranged, the arrangement being generally freely selectable by the user.
  • “Desktop” is the designation for the visible work surface of the graphical user interface of, for example, Microsoft Windows or OS/2.
  • The desktop is normally a work area on the screen that contains symbols and menus in order to simulate the surface of a desk.
  • A desktop is characteristic of window-oriented programs such as Microsoft Windows. The purpose of such a desktop is the intuitive operation of a computer, since the user can move the pictures of objects and start and end tasks in almost the same way as he is used to from a real desk.
  • Since, according to the invention, a force/torque sensor is used as an input device for such a desktop program, the state of the art with regard to force/torque sensors is explained first.
  • DE 199 52 560 A1 discloses a method for setting and/or adjusting a seat of a motor vehicle using a multifunctional, manually operated input device with a force/torque sensor.
  • Such a force/torque sensor is shown in FIG. 6 of DE 199 52 560 A1.
  • For the technical details of such a sensor, reference is made to this figure and the associated description of DE 199 52 560 A1.
  • The input device has a user interface on which a number of areas are provided for entering at least one pressure pulse.
  • The input device also has a device for evaluating and recognizing a pressure pulse detected by means of the force/torque sensor and converted into a force and torque vector pair.
  • The selected device can then be controlled linearly by means of an analog signal from the force/torque sensor.
  • The selection of a function and its subsequent activation are thus two processes that are separated from one another in time.
  • The central finding of the invention is that a user of a real desk arranges various documents on the desk surface in accordance with an intuitive, user-specific working behavior. This aspect has already been taken into account in classic desktop technology, i.e. translated into the world of the graphical user interface.
  • With the invention it is now possible for the first time to navigate a virtual window - similar to microfiche technology (microfilm with microcopies arranged in rows) - relative to a user interface.
  • The user interface under the virtual window can be moved in three dimensions, for example.
  • Depending on the setting, the user interface can therefore be larger than the desktop. In this case the entire user interface is not displayed on the monitor at once. However, it is also possible to equate the size of the desktop with the entire user interface.
  • A further finding of the present invention is that the user of a real desk first moves back a certain distance (“lean back”) in order to obtain an overview of the workplace. After recognizing the desired documents etc. by means of this overview, the focus is then gradually directed towards certain working documents. The invention reproduces this in that the enlargement/reduction factor of a virtual window can be changed, which essentially corresponds to a zoom effect with regard to the objects located within the window: the view of the user is gradually directed towards certain screen objects (working documents, icons, etc.).
  • According to the invention, this effect is achieved, more precisely, in that objects are first arranged on a user interface, for example by the user.
  • The user can add, delete or move objects and scale the display size of the objects.
  • This step corresponds to arranging documents on a desk, for example.
  • A virtual window with an adjustable enlargement/reduction factor can then be navigated with respect to the user interface, which corresponds to a focus that can be changed in terms of position and viewing angle.
  • For this purpose, an input device is used which provides control signals in at least three mutually independent degrees of freedom. It is thus possible to navigate three-dimensionally with respect to the user interface, control signals in two degrees of freedom being used for positioning the virtual window and a control signal in a third degree of freedom for setting the enlargement/reduction factor.
  • A method for managing objects on a graphical user interface is thus provided. First, objects are arranged on the user interface by the user. Finally, a virtual window can be navigated with respect to the overall user interface configured in this way, the content of the window being displayed on the screen in each case.
  • Control signals in two degrees of freedom are used for the positioning of the virtual window with respect to the user interface, and a control signal in a third degree of freedom is used for setting the enlargement/reduction factor of the window.
  • The input device can provide control signals in at least three translational and/or rotational degrees of freedom.
  • This input device can in particular be a force/torque sensor.
  • Alternatively, an input device for two-dimensional navigation (for example a computer mouse) can be used which is physically assigned an element for generating a control signal in a third degree of freedom.
  • This element can be, for example, an additional switch, a rotary wheel or a button.
  • The virtual window can correspond to the entire display area of a screen.
  • If the zoom function is then carried out, the size of all objects on the total user interface changes to the same extent.
  • Alternatively, the virtual window can be defined as only part of the total display area of the screen. If the entire user interface is then displayed on the display area of the screen, the input device can be used to navigate the virtual window as a kind of “magnifying glass” with an adjustable magnification factor over the user interface, so that the user interface can, so to speak, be moved under the “magnifying glass” (a minimal coordinate sketch of this magnifying-glass behavior is given after this description).
  • The software programs to be managed can in particular be office applications, such as word processing or spreadsheets.
  • The objects on the user interface can be windows of files whose display size can be changed. These files can be displayed in an active, i.e. immediately callable and executable, state. It is therefore not necessary to start an application program after activating such an object.
  • The objects can be displayed on the user interface in a pseudo-3D view.
  • The invention also relates to a computer software program which implements a method of the type mentioned above when it runs on a computer.
  • Finally, the invention proposes the use of a force/torque sensor for a method of one of the above-mentioned types.
  • FIG. 1 shows a system having a 3D input device and a computer with a desktop surface.
  • FIG. 2 shows a modification of this system in which a computer mouse is used as the input device.
  • FIGS. 3 to 5 show a further embodiment in which the virtual window has been defined as the entire screen.
  • FIG. 6 shows a schematic flow diagram of a sequence for carrying out the present invention
  • FIG. 7 shows the evaluation step S3 from FIG. 6 in detail.
  • A PC 4 is used to implement the invention.
  • This PC 4 has a monitor 6 on which a desktop 3, that is to say a section of the user interface, is displayed.
  • Several graphic objects 5, 10 are arranged on this displayed section of the user interface.
  • A 3D input device 1 has an operating part 7 which can be manipulated by the fingers or the hand of a user and which is movably mounted, for example, in three mutually independent rotational and three translational degrees of freedom with respect to a base part 8. A relative movement between the operating part 7 and the base part 8 is evaluated, and the result of the evaluation is transmitted to the computer 4 in the form of control signals.
  • The input device 1 can of course also output control signals in further degrees of freedom if further rotary dials, buttons or switches are physically assigned to it on the operating part 7 or on the base part 8.
  • The input device 1 can be used to navigate a virtual window of adjustable size with respect to the total area of the user interface.
  • In a particularly advantageous embodiment, the display scale of objects within the virtual window can also be freely selected, within certain limits, by means of the input device 1.
  • Control signals in two degrees of freedom of the input device 1 are used for navigating the virtual window with respect to the user interface 3 (up/down or left/right).
  • A control signal in a third degree of freedom of the input device 1 - if this option is provided - serves for real-time setting of an enlargement/reduction factor for the objects lying within the virtual window.
  • This enlargement/reduction factor can be changed continuously, with corresponding pixel scaling, or discretely, for example in the case of defined font-size levels.
  • Increasing the enlargement/reduction factor within the virtual window can be understood as an approach towards the screen objects, whereupon the screen objects are shown larger and the section of the user interface 3 shown on the screen becomes smaller.
  • Such a virtual window is designated by the reference symbol 2 in FIG.
  • The size of this window 2 is set such that it occupies only part of the display area of the screen 6. Accordingly, it can be navigated selectively, for example, as shown, over the object 10, so that the object 10 lies within the window area. If the enlargement/reduction factor of the virtual window 2 is now increased by means of the input device 1, which can be done in steps or continuously, the result is the enlarged representation 10′ of the object 10, which is shown schematically in FIG.
  • FIGS. 3 to 5 show the case in which the virtual window 2 is set such that it corresponds to the entire display area of the screen 6.
  • The user interface 3 is thus moved with respect to the desktop.
  • The display size of all objects represented on the display area changes when the enlargement/reduction factor changes. If the user has arranged a group 11 on the user interface 3, he can enlarge its display continuously (pixel scaling) or step by step until, for example (see FIG. 5), only the document 12 from this group 11 is legibly displayed. This corresponds to zooming in on the user interface 3.
  • A computer mouse 1′ is symbolically provided in FIG. 2 as an input device.
  • This computer mouse 1′, which by itself can only provide control signals in two degrees of freedom (x-y axes), is physically assigned a further element 9 which can generate a control signal in at least one further degree of freedom.
  • Here this further element is a rotary wheel 9, which is arranged on the top of the computer mouse 1′. By rotating this wheel 9 forwards, the display size of a single screen object 10, 10′ can be enlarged (selective focus) or all screen objects 5, 10 can be shown enlarged (general focus).
  • The reduction function can accordingly be carried out by rotating the wheel 9 in the reverse direction (in the case of the three-dimensional input device, by pressing or tilting the operating part 7 backwards), which intuitively corresponds to the user leaning back in order to get a better overview of the objects 5, 10 on the user interface 3.
  • The objects 5, 10 on the user interface 3 can display files from application programs, such as word-processing or spreadsheet files.
  • These file objects can be displayed in an active state.
  • If the corresponding object is enlarged or reduced, it is therefore not merely an icon serving as a symbol for the corresponding application program that is scaled; rather, the document/spreadsheet itself can be enlarged or reduced.
  • Several screen objects can be actively displayed on the user interface 3 at the same time, their respective display scale being freely selectable. The user can thus, for example, arrange documents of any size at any position on the screen surface 3.
  • FIG. 6 shows schematically the sequence in the implementation of the present invention.
  • Output signals of the force/torque sensor are generated in a step S1. These are then fed (step S2) to the data input of an EDP system.
  • This can be done, for example, by means of a so-called USB interface.
  • USB (Universal Serial Bus) is an interface for connecting peripheral devices such as a mouse, modem, printer, keyboard, scanner, etc.
  • The transfer rate of USB version 1.1 is already 12 Mbit/s.
  • In a step S3, the signals input by the force/torque sensor are evaluated. This step S3 is explained in detail below with reference to FIG. 7; a minimal sketch of such an evaluation is also given at the end of this description. Depending on the evaluation in step S3, the graphical user interface (GUI) is then controlled in a step S4 before data from the force/torque sensor are evaluated again.
  • Step S3 of the sequence of FIG. 6 will now be explained in more detail.
  • Data in three different degrees of freedom x, y and z are evaluated, for example to determine whether the corresponding signal is in the positive or negative range.
  • In the degree of freedom “z”, a positive signal can be used to enlarge and a negative signal to reduce the virtual window with respect to the entirety of the graphical user interface.
  • In the degree of freedom “x”, a positive signal can shift the virtual window to the left and a negative signal to the right (always with respect to the entirety of the graphical user interface).
  • The virtual window can therefore also be designed as a fixed marking bar under which the user interface is navigated. Objects that come to lie under the virtual window are automatically marked (“highlighted”) and preselected for a possible subsequent click or other activation; a sketch of this preselection is given after this description.
  • This procedure is particularly advantageous if a directory structure (directory tree) is navigated under the fixed window: subdirectories located in the window can be selected automatically, so that in principle arbitrarily large structures can be navigated without the user's hand having to leave the input device.
  • A “grasp” to change the image section, which is required with known techniques as soon as the cursor reaches the edge of the screen, is no longer necessary.
  • In the degree of freedom “y”, a positive signal can move the window upwards and a negative signal downwards. This can also be seen, analogously, as an inverse movement of the user interface “under” the virtual window.
  • The display size or the document size can be freely selected on the user interface.
  • A single device, such as a 3D input device or a 2D input device with additional elements, suffices for operation.
  • The arrangement and the size of the screen objects on the desktop surface can be freely selected.
  • The recognition value of freely arranged areas is significantly higher, since visual recognition features, and not just pure memorization, come into play here.
  • Real, intuitive working behavior is thus largely reproduced.
  • Real working behavior usually means that the user works at the workplace making use of the visually perceptible sector.
  • Focusing on a working document and leaning back to gain an overview are a natural part of working with real objects. The present invention now makes it possible for the first time to transfer such intuitive behavior to virtual objects, namely objects that are displayed on a user interface.
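
To make the “magnifying glass” case more concrete, the following is a minimal sketch of the underlying coordinate mapping, assuming a window that occupies part of the screen and shows a sub-rectangle of a larger user interface at an adjustable enlargement/reduction factor. It is an illustration only, not taken from the patent: the class name VirtualWindow, the method names and the pixel-based coordinate convention are assumptions, and Python is used merely as convenient notation; the actual drawing would be left to whatever GUI toolkit is in use.

```python
# Illustrative sketch (an assumption, not the patented implementation) of the
# "magnifying glass": a virtual window on the screen shows a sub-rectangle of a
# larger user interface at an adjustable enlargement/reduction factor.
from dataclasses import dataclass
from typing import Tuple


@dataclass
class VirtualWindow:
    center_x: float   # point of the user interface lying under the window centre
    center_y: float
    screen_w: float   # size of the window on the screen, in pixels
    screen_h: float
    factor: float     # enlargement/reduction factor (> 1 enlarges)

    def source_rect(self) -> Tuple[float, float, float, float]:
        """Rectangle of the user interface that is visible inside the window."""
        src_w = self.screen_w / self.factor
        src_h = self.screen_h / self.factor
        return (self.center_x - src_w / 2, self.center_y - src_h / 2, src_w, src_h)

    def to_screen(self, ui_x: float, ui_y: float,
                  win_left: float, win_top: float) -> Tuple[float, float]:
        """Map a point of the user interface into screen coordinates of the window."""
        left, top, _, _ = self.source_rect()
        return (win_left + (ui_x - left) * self.factor,
                win_top + (ui_y - top) * self.factor)


if __name__ == "__main__":
    lens = VirtualWindow(center_x=400, center_y=300,
                         screen_w=200, screen_h=150, factor=2.0)
    print(lens.source_rect())                # a 100 x 75 patch of the interface is shown
    print(lens.to_screen(400, 300, 50, 50))  # the patch centre maps to the window centre
```

Navigating the window then amounts to changing center_x/center_y, and zooming to changing factor, which is what the two panning degrees of freedom and the third degree of freedom of the input device would control in this reading.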
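
The evaluation of step S3 can be sketched in the same spirit: the sign and magnitude of the signal in each of the three degrees of freedom x, y and z are mapped onto the window position and the enlargement/reduction factor, either continuously or in discrete steps. Again this is a hedged sketch, not the patented implementation: the gains, the limits, the coordinate convention (user-interface y grows upwards) and names such as SensorSample or evaluate_s3 are invented for the example.

```python
# Illustrative sketch of evaluation step S3: the sign of each degree of freedom
# decides pan direction and zoom direction. Gains and limits are assumptions.
from dataclasses import dataclass


@dataclass
class SensorSample:
    x: float   # left/right: positive shifts the window to the left in this sketch
    y: float   # up/down: positive moves the window upwards (y grows upwards)
    z: float   # positive enlarges, negative reduces


@dataclass
class WindowState:
    pos_x: float = 0.0    # position of the virtual window on the user interface
    pos_y: float = 0.0
    factor: float = 1.0   # enlargement/reduction factor


PAN_GAIN = 5.0                      # user-interface units per unit of deflection
ZOOM_GAIN = 0.02                    # relative change of the factor per unit
FACTOR_MIN, FACTOR_MAX = 0.25, 8.0  # clamp to keep the factor in a sensible range


def evaluate_s3(sample: SensorSample, state: WindowState,
                discrete_steps: bool = False) -> WindowState:
    """Update window position and zoom factor from one sensor sample (step S3)."""
    state.pos_x -= PAN_GAIN * sample.x   # positive x -> window shifts to the left
    state.pos_y += PAN_GAIN * sample.y   # positive y -> window moves upwards
    if discrete_steps:
        # Stepped zoom, e.g. for defined font-size levels.
        if sample.z > 0:
            state.factor *= 1.25
        elif sample.z < 0:
            state.factor /= 1.25
    else:
        # Continuous zoom with corresponding pixel scaling.
        state.factor *= 1.0 + ZOOM_GAIN * sample.z
    state.factor = max(FACTOR_MIN, min(FACTOR_MAX, state.factor))
    return state


if __name__ == "__main__":
    state = WindowState()
    # Stand-ins for steps S1/S2 (sensor output fed in, e.g. over USB);
    # step S4 (redrawing the GUI) is represented here by the print call.
    for sample in (SensorSample(1.0, 0.0, 0.0), SensorSample(0.0, -0.5, 3.0)):
        print(evaluate_s3(sample, state))
```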
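
Finally, the automatic preselection under a fixed marking bar reduces to a rectangle intersection test between the bar and the objects of the user interface, evaluated each time the interface is panned underneath it. The data structures and the hit test below are assumptions chosen for illustration; the patent does not prescribe any particular implementation.

```python
# Illustrative sketch of the "fixed marking bar": every object that comes to lie
# under the bar is highlighted and thus preselected for a subsequent activation.
from dataclasses import dataclass
from typing import List


@dataclass
class UiObject:
    name: str
    x: float             # bounding box on the user interface
    y: float
    w: float
    h: float
    highlighted: bool = False


def preselect(objects: List[UiObject],
              bar_x: float, bar_y: float, bar_w: float, bar_h: float) -> List[UiObject]:
    """Highlight every object whose bounding box intersects the marking bar."""
    for obj in objects:
        obj.highlighted = (obj.x < bar_x + bar_w and obj.x + obj.w > bar_x and
                           obj.y < bar_y + bar_h and obj.y + obj.h > bar_y)
    return [obj for obj in objects if obj.highlighted]


if __name__ == "__main__":
    tree = [UiObject("docs", 0, 0, 80, 20), UiObject("src", 0, 30, 80, 20)]
    # As the directory tree is panned under the bar, the bar position expressed in
    # user-interface coordinates changes; objects now under it are preselected.
    print([o.name for o in preselect(tree, 0, 25, 200, 20)])   # -> ['src']
```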

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an electronic desktop manager program which extends the graphical user interface (3) of conventional monitors and personal computers by allowing the displayed section of the user interface to be positioned freely by means of a 3D input device (1, 1'), so that the user can determine for himself which part of the user interface (3) of a monitor (6) and personal computer (4) is visible. This visible part, which takes the form of a virtual window (2), can be selected by means of an input device (1, 1') with at least three degrees of freedom. Two degrees of freedom serve to navigate the virtual window (2) over the user interface (3). A further degree of freedom serves to set an enlargement/reduction factor for the objects of the user interface (3) within the virtual window (2). The virtual window can thus be defined as only part of the total display area of the screen (6). When the user interface (3) is displayed on the display area of the screen (6), the input device makes it possible to guide the virtual window as a “magnifying glass” with an adjustable magnification factor over the user interface (3).
EP02777079A 2001-09-13 2002-09-12 Gestionnaire de bureau electronique Withdrawn EP1425653A2 (fr)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
DE10145185 2001-09-13
DE10145185 2001-09-13
DE10155030 2001-11-09
DE10155030A DE10155030A1 (de) 2001-09-13 2001-11-09 Desktopmanager
PCT/EP2002/010246 WO2003023592A2 (fr) 2001-09-13 2002-09-12 Gestionnaire de bureau electronique

Publications (1)

Publication Number Publication Date
EP1425653A2 true EP1425653A2 (fr) 2004-06-09

Family

ID=26010126

Family Applications (1)

Application Number Title Priority Date Filing Date
EP02777079A Withdrawn EP1425653A2 (fr) 2001-09-13 2002-09-12 Gestionnaire de bureau electronique

Country Status (3)

Country Link
US (1) US20040046799A1 (fr)
EP (1) EP1425653A2 (fr)
WO (1) WO2003023592A2 (fr)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005070898A (ja) * 2003-08-20 2005-03-17 Toshiba Corp 情報処理装置および表示制御方法
US7899756B2 (en) * 2004-12-01 2011-03-01 Xerox Corporation Critical parameter/requirements management process and environment
US8819569B2 (en) * 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming
JP4653561B2 (ja) * 2005-05-31 2011-03-16 株式会社東芝 情報処理装置および表示制御方法
US20070268317A1 (en) * 2006-05-18 2007-11-22 Dan Banay User interface system and method for selectively displaying a portion of a display screen
US8914786B2 (en) 2007-03-23 2014-12-16 Zumobi, Inc. Systems and methods for controlling application updates across a wireless interface
US8595642B1 (en) * 2007-10-04 2013-11-26 Great Northern Research, LLC Multiple shell multi faceted graphical user interface
WO2011104269A1 (fr) 2008-02-26 2011-09-01 Jenavalve Technology Inc. Stent pour le positionnement et l'ancrage d'une prothèse valvulaire dans un site d'implantation dans le cœur d'un patient
US8289288B2 (en) * 2009-01-15 2012-10-16 Microsoft Corporation Virtual object adjustment via physical object detection
US9443257B2 (en) * 2010-10-21 2016-09-13 Yahoo! Inc. Securing expandable display advertisements in a display advertising environment
US9843665B2 (en) * 2011-05-27 2017-12-12 Microsoft Technology Licensing, Llc Display of immersive and desktop shells
US10417018B2 (en) 2011-05-27 2019-09-17 Microsoft Technology Licensing, Llc Navigation of immersive and desktop shells
US10133355B2 (en) 2014-03-21 2018-11-20 Dell Products L.P. Interactive projected information handling system support input and output devices
US9965038B2 (en) 2014-03-21 2018-05-08 Dell Products L.P. Context adaptable projected information handling system input environment
US20150268739A1 (en) * 2014-03-21 2015-09-24 Dell Products L.P. Projected Information Handling System Input Environment with Object Initiated Responses
US9304599B2 (en) 2014-03-21 2016-04-05 Dell Products L.P. Gesture controlled adaptive projected information handling system input and output devices
US20160196013A1 (en) * 2015-01-07 2016-07-07 Blackberry Limited Electronic device and method of controlling display of information
CN105867754B (zh) * 2015-01-22 2019-11-26 阿里巴巴集团控股有限公司 应用界面处理方法及装置

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341466A (en) * 1991-05-09 1994-08-23 New York University Fractal computer user centerface with zooming capability
US5670984A (en) * 1993-10-26 1997-09-23 Xerox Corporation Image lens
JP2813728B2 (ja) * 1993-11-01 1998-10-22 インターナショナル・ビジネス・マシーンズ・コーポレイション ズーム/パン機能付パーソナル通信機
EP0693852A3 (fr) * 1994-07-22 1997-05-28 Eastman Kodak Co Méthode et appareil pour l'application d'une fonction à un domaine localisé d'une image numérique utilisant une fenêtre
US6037939A (en) * 1995-09-27 2000-03-14 Sharp Kabushiki Kaisha Method for enabling interactive manipulation of data retained in computer system, and a computer system for implementing the method
US5999169A (en) * 1996-08-30 1999-12-07 International Business Machines Corporation Computer graphical user interface method and system for supporting multiple two-dimensional movement inputs
US6097393A (en) * 1996-09-03 2000-08-01 The Takshele Corporation Computer-executed, three-dimensional graphical resource management process and system
US6128006A (en) * 1998-03-26 2000-10-03 Immersion Corporation Force feedback mouse wheel and other control wheels
US6275232B1 (en) * 1998-12-14 2001-08-14 Sony Corporation Polymorphic event handling for zooming graphical user interface
US20020060691A1 (en) * 1999-11-16 2002-05-23 Pixel Kinetix, Inc. Method for increasing multimedia data accessibility

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO03023592A2 *

Also Published As

Publication number Publication date
WO2003023592B1 (fr) 2004-03-25
WO2003023592A2 (fr) 2003-03-20
US20040046799A1 (en) 2004-03-11
WO2003023592A3 (fr) 2004-02-12

Similar Documents

Publication Publication Date Title
EP1425653A2 (fr) Gestionnaire de bureau electronique
DE60024655T2 (de) Verfahren zur benutzung von mit einem anzeigegerät verbundenen tasten für den zugriff und die ausführung von damit verbundenen funktionen
DE69724416T2 (de) Zeigersteuerung mit benutzerrückführungsmechanismus
EP1428110B1 Appareil d'entree 3d a ecran tactile integre
EP1272921B1 Procede pour naviguer entre des fenetres d'un espace d'affichage
DE102009014555A1 (de) Verfahren zum Unterstützen der Steuerung der Bewegung eines Positionsanzeigers mittels eines Tastfelds
DE102012109058A1 (de) Steuerverfahren und elektronische Einrichtung
DE19744861A1 (de) Verfahren zum Einsatz einer dreidimensionalen Mouse im WINDOWS-Betriebssystem
DE102012014098A1 (de) Verfahren zur Imitation der Touchscreen-Steuerung durch eine Maus
DE102012020607B4 (de) Kraftwagen mit einer Gestensteuerungseinrichtung sowie Verfahren zum Steuern eines Auswahlelements
DE102012014603A1 (de) System und Verfahren für den synchronisierten Betrieb einer Touch-vorrichtung
WO2017144298A1 Interface utilisateur pourvue de plusieurs écrans d'affichage et procédé de positionnement de contenus sur plusieurs écrans d'affichage
DE102012220062A1 (de) Einstellung mehrerer benutzereingabeparameter
DE10140874A1 (de) Graphische Benutzeroberfläche
DE102013203918A1 (de) Verfahren zum Betreiben einer Vorrichtung in einer sterilen Umgebung
EP2877910B1 Dispositif d'entrée à surface sensible au toucher enfonçable
DE10084249T5 (de) Zusätzliches LCD-Feld mit Sensorbildschirm
WO2006032442A1 (fr) Dispositif de commande pour affichages
DE10155030A1 (de) Desktopmanager
EP1444566A2 Appareil de saisie, webcam et ecran a fonction d'entree vocale
DE102009003995A1 (de) Verfahren zur Vergrößerung eines Darstellungsbereichs auf einer Darstellungseinrichtung
EP1484666A2 Dispositif d'entrée multidimensionnel pour navigation et sélection d'objets virtuels
WO2017178313A1 Unité de commande et de configuration ainsi que procédé de commande et de configuration d'un microscope
EP2570907B1 Procédé de fonctionnement d'une interface utilisateur d'un agencement de traitement de données
DE102008011072B4 (de) System zur Steuerung der Sichtbarkeit von Fenstern in fensterorientierten Bedienoberflächen

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20040202

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LI LU MC NL PT SE SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

17Q First examination report despatched

Effective date: 20070817

REG Reference to a national code

Ref country code: DE

Ref legal event code: 8566

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20071228