WO2007148210A2 - Device feature activation - Google Patents

Device feature activation

Info

Publication number
WO2007148210A2
Authority
WO
WIPO (PCT)
Prior art keywords
input
text
display
orientation
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2007/001685
Other languages
English (en)
French (fr)
Other versions
WO2007148210A3 (en)
WO2007148210B1 (en)
Inventor
Mikko A. Nurmi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Inc
Original Assignee
Nokia Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Inc filed Critical Nokia Inc
Priority to JP2009515986A priority Critical patent/JP2009541835A/ja
Publication of WO2007148210A2 publication Critical patent/WO2007148210A2/en
Publication of WO2007148210A3 publication Critical patent/WO2007148210A3/en
Publication of WO2007148210B1 publication Critical patent/WO2007148210B1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text

Definitions

  • the disclosed embodiments relate to touch screen devices and, more particularly, to activating features of touch screen devices.
  • the disclosed embodiments are directed to activating functions of a device.
  • the method includes detecting at least one input to a touch display of the device, determining at least one dimension of a movement of the input, and activating or deactivating a function of the device in dependence upon the movement of the input.
  • in another aspect, a method includes detecting an input of text on a touch enabled display of a device, determining an orientation of an input sequence of the inputted text, and opening an application of the device that is associated with the orientation of the input sequence of the inputted text.
  • an apparatus includes a display processor coupled to a touch screen, an input detection unit coupled to the display processor that receives a first input in the form of a user forming text on the touch screen with a pointing device, an input recognition unit coupled to the display processor that detects an orientation of a sequence of the text being inputted and a processing unit that activates at least one function or application of the apparatus that is associated with the detected orientation.
  • a computer program product includes a computer useable medium having computer readable code means embodied therein for causing a computer to activate functions of a device.
  • the computer readable code means in the computer program product includes computer readable code means for causing a computer to detect at least one input to a touch display of the device, computer readable code means for causing a computer to determine at least one dimension of a movement of the input and computer readable code means for causing a computer to activate or deactivate a function of the device in dependence upon the movement of the input.
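
Taken together, the claimed steps amount to a small event-handling routine: collect the touch samples, measure the movement, and toggle a function. The following Python sketch illustrates the idea; the `TouchEvent` type, the 10-pixel threshold, and the activate/deactivate callbacks are hypothetical stand-ins for whatever the device platform actually provides.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TouchEvent:
    """One sampled touch point on the display (hypothetical event type)."""
    x: float
    y: float

def measure_movement(points: List[TouchEvent]) -> dict:
    """Determine dimensions of the input's movement from its samples."""
    dx = points[-1].x - points[0].x
    dy = points[-1].y - points[0].y
    return {"dx": dx, "dy": dy, "length": (dx * dx + dy * dy) ** 0.5}

def on_stroke(points: List[TouchEvent],
              activate: Callable[[], None],
              deactivate: Callable[[], None]) -> None:
    """Activate or deactivate a device function depending on the movement."""
    m = measure_movement(points)
    if m["length"] < 10:      # ignore taps too short to be deliberate gestures
        return
    if m["dx"] >= 0:          # e.g. a left-to-right stroke activates...
        activate()
    else:                     # ...and a right-to-left stroke deactivates
        deactivate()
```
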
  • FIG. 1 shows a device incorporating features of an embodiment
  • FIG. 2 shows another device incorporating features of an embodiment
  • FIGS. 3 and 4 illustrate text input directions in accordance with an embodiment
  • FIG. 5A illustrates a device incorporating features of an embodiment
  • FIG. 5B illustrates a device incorporating features of an embodiment
  • FIG. 6 is a flow diagram of a method in accordance with an embodiment
  • FIG. 7 is a block diagram of one embodiment of a typical apparatus incorporating features that may be used to practice the present invention.
  • FIG. 8 shows another device in accordance with an embodiment.
  • Figure 1 illustrates a system incorporating features of one exemplary embodiment.
  • although the present embodiments will be described with reference to the exemplary embodiments shown in the drawings and described below, it should be understood that the present invention could be embodied in many alternate forms of embodiments.
  • any suitable size, shape or type of elements or materials could be used.
  • Figure 1 shows a device 10 including a touch screen display 110 and a pointing device 20.
  • the pointing device 20, such as, for example, a stylus, a pen or simply the user's finger, can be used with the touch screen display 110. In alternate embodiments any suitable pointing device may be used.
  • the display 110 and the pointing device 20 form a user interface of the device 10, which may be configured as a graphical user interface.
  • the device 10 may also include a display processor 130 coupled to a memory 140 that stores a gesture or stroke based algorithm for causing the display processor 130 to operate in accordance with this invention.
  • the memory 140 may also store one or more software applications that run on the device 10.
  • a processing unit 190 may be coupled to the display processor 130 and the memory 140 for initiating or launching the software applications.
  • a first communication or data link or connection may exist between the display 110 and the processor 130 for the processor 130 to receive coordinate information that is descriptive or indicative of the location of the tip or end of the pointing device 20 relative to the surface of the display 110.
  • the display 110 is typically pixelated, and may contain liquid crystal (LC) or some other type of display pixels.
  • the display may be configured to recognize simultaneous inputs (e.g. touch) where the simultaneous inputs occur at different places on the display. In alternate embodiments any display may be utilized.
  • the device may include a touch sensitive keypad as shown in Figure 8. The keys of the touch sensitive keypad may be used in a conventional manner while at the same time being configured to function in a manner substantially similar to that of a touch screen display.
  • a user may make a mark such as the letter "A" in the center of the keypad 810 using any suitable pointing device (e.g. the user finger or a stylus) so that the letter "A" appears at the center of the display 820.
  • the embodiments described below apply equally to a display such as, for example, a touch screen display and the touch sensitive keypad.
  • the display processor 130 may generally provide display data directly or indirectly to the display 110 over, for example, a second communication or data link or connection for activating desired pixels, as is well known in the art.
  • a given coordinate location such as for example an x-y location on the surface of the display 110 may correspond directly or indirectly to one or more display pixels, depending on the pixel resolution and the resolution of the touch screen itself.
  • a single point on the touch screen display 110 (a single x-y location) may thus correspond to one pixel or to a plurality of adjacent pixels.
  • Differing from a single point, a path, stroke, line or gesture (as these terms are used interchangeably herein) that may be used to form text or activate a device function may have a starting x-y point and an ending x-y point, and may include some number of x-y locations between the start and end points.
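
Because a stroke carries a start point, an end point, and intermediate x-y locations, its overall direction can be estimated directly from the endpoints. A minimal helper along those lines, assuming screen coordinates with y increasing downward:

```python
import math

def stroke_angle(points):
    """Angle from the stroke's start point to its end point, in degrees
    counter-clockwise from the positive x axis. Screen y grows downward,
    so dy is negated to recover conventional mathematical angles."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    return math.degrees(math.atan2(-(y1 - y0), x1 - x0)) % 360.0

# Example: stroke_angle([(0, 0), (4, 3), (10, 0)]) -> 0.0, a left-to-right stroke.
```
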
  • as used herein, text refers to a single alphanumeric character or a string of alphanumeric characters (i.e. words, sentences and the like), including punctuation marks.
  • any suitable gestures such as lines or graphical marks, may be used.
  • Bringing an end of the pointing device 20 in proximity to or in contact with the surface of the display 110 may mark a starting point of the text. Subsequently moving or lifting the end of the pointing device 20 away from the surface of the display 110 may mark the end point of the text. In one embodiment, the pointing device 20 does not need to make contact with the surface of the display 110 to cause the formation of, or recognition of, an input signal to form a gesture.
  • the device 10 may be for example, the PDA 100 illustrated in Figure 1.
  • the PDA 100 may have a keypad 120, a touch screen display 110 and a pointing device 20 for use on the touch screen display 110.
  • the device 10 may be a mobile cellular device 200 shown in Figure 2.
  • the device 200 may also have a touch screen display 110, a keypad 120 and a pointing device 20.
  • the device 10 may be a personal communicator, a tablet computer, a laptop or desktop computer, or any other suitable device capable of containing the touch screen display 110 and supported electronics such as the display processor 130 and memory 140.
  • in alternate embodiments, components of the device 10 may be peripheral devices that are not located within the body of the device 10.
  • the device 10 may have multiple displays where, for example, input on one display may affect the behavior (e.g. what is presented, orientation of objects, etc.) of the other displays.
  • the exemplary embodiments herein will be described with reference to the PDA 100 for exemplary purposes only, and it should be understood that the embodiments could be applied equally to any suitable device incorporating a touch screen display.
  • An input can include, for example, a marking such as a line that can be straight, wavy or jagged, a string of characters (e.g. a word or sentence) or a single character (e.g. a single letter or number).
  • the input could be a random series of markings that is inputted on the screen of the device.
  • the orientation of the input will be in a certain direction with respect to the display 110.
  • text may be input in direction 300 from the top 340 of the PDA 100 to the bottom 350 of the PDA 100 or vice versa as indicated by arrow 320.
  • Text may also be input from the left side 370 of the PDA to the right side 360 of the PDA 100 as indicated by arrow 330 or vice versa as indicated by arrow 310.
  • the text may be input diagonally as shown in Figure 4 and indicated by arrows 400, 410, 420, 430.
  • These different text input directions will be referred to herein as "text orientations" and may be facilitated by rotating the PDA 100 to an angle corresponding to a desired text orientation. For example, if a user desires to input text in orientation 310, the user may rotate the PDA 100 so that the top 340 of the PDA 100 is closest to the user, when for example the English language is being used. In alternate embodiments any suitable user language may be used with the touch screen device and the text orientations may change according to a specified user language. For example, when the Arabic language is used, text is normally written from right to left, so when text is input in orientation 310 the bottom 350 of the PDA 100 would be closest to the user.
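
One plausible way to recognize these eight text orientations is to snap the measured stroke angle to the nearest 45-degree sector. In the sketch below the assignment of sectors to the reference numerals 300-330 follows the horizontal and vertical arrows described above; the diagonal assignments 400-430 are illustrative guesses only.

```python
# Hypothetical mapping of 45-degree sectors (counter-clockwise from
# "rightward") to the text orientations of Figs. 3 and 4.
ORIENTATIONS = {
    0: 330,  # left-to-right (arrow 330)
    1: 400,  # diagonals 400-430: illustrative assignment
    2: 320,  # bottom-to-top (arrow 320)
    3: 410,
    4: 310,  # right-to-left (arrow 310)
    5: 420,
    6: 300,  # top-to-bottom (arrow 300)
    7: 430,
}

def classify_orientation(angle_deg: float) -> int:
    """Snap a stroke angle in degrees to the nearest 45-degree sector."""
    return ORIENTATIONS[round(angle_deg / 45.0) % 8]

# Example: classify_orientation(350.0) -> 330, a left-to-right input.
```
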
  • the above described text orientations may represent shortcuts to a specified device function or application that is associated with a given text orientation.
  • the memory 140 of the PDA 100 may include algorithms that cause the display processor 130 to automatically recognize the different text orientations 300-330 and 400-430, as well as the text itself, as a user inputs the text.
  • the memory 140 may also include algorithms that may be used by processor 190 and display processor 130 for launching and causing features, functions and applications of the PDA 100 to activate. For example, software applications or functions can be activated when a certain sequence of movement and direction of the input to the device 10 is detected.
  • a messaging application may be opened when text is input in orientation 330 or a notes application may be opened when text is input in orientation 310.
  • the function, feature or application to be associated with and activated by any given text orientation may be predefined during manufacture of the device or it may be set by the user of the PDA 100.
  • certain text orientations may be associated with applications of the touch screen device such as e-mails, short messages (SMS), multimedia messages (MMS), instant messages (IM), notepads, word processors, calendars, To-Dos, spreadsheets or any other suitable functionality that may be stored and run within the touch screen device.
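In software, such an association is naturally a dispatch table keyed by orientation; predefining it at manufacture or letting the user edit it merely changes who writes the table. A minimal sketch, with a hypothetical `launch` hook standing in for the platform's real application launcher:

```python
def launch(app_name: str) -> None:
    """Placeholder for the platform's real application launcher."""
    print(f"launching {app_name}")

# Hypothetical shortcut table: text orientation -> application to open.
shortcuts = {
    330: "messaging",  # text input in orientation 330 opens messaging
    310: "notes",      # text input in orientation 310 opens notes
    300: "calendar",
}

def on_text_orientation(orientation: int) -> None:
    app = shortcuts.get(orientation)
    if app is not None:
        launch(app)
```
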
  • each text orientation may be associated with more than one function in that, for example, the display processor may recognize function names as well as the direction of the written text.
  • the display processor recognizes both the word "calendar" and the direction 330 and causes the calendar application to be launched.
  • the display processor similarly recognizes both the word "notes" and the direction 330 and causes a notes application to be launched instead of the calendar function.
  • a combination of a word and a direction may be used to launch an application in different orientations. For example, if the word "notes" is input on the display in the direction 330, the notepad application may be launched so that the contents of the notepad application are read from left to right. If the word "notes" is input in direction 310, the notepad application may be launched so the contents of the notepad application are read from right to left.
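
Recognizing the word as well as the direction simply widens the dispatch key to a (word, orientation) pair, which is also enough to select a reading direction for the launched application. A sketch along those lines; the orientation codes and the `open_app` hook are illustrative assumptions:

```python
def open_app(name: str, reading: str) -> None:
    """Placeholder for launching an application with a reading direction."""
    print(f"opening {name}, contents read {reading}")

# Hypothetical (recognized word, orientation) -> (application, reading) table.
combined_shortcuts = {
    ("calendar", 330): ("calendar", "left to right"),
    ("notes", 330):    ("notes",    "left to right"),
    ("notes", 310):    ("notes",    "right to left"),
}

def on_recognized_text(word: str, orientation: int) -> None:
    entry = combined_shortcuts.get((word.lower(), orientation))
    if entry is not None:
        open_app(*entry)
```
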
  • any suitable method of associating the device functions with a specified text orientation may be used.
  • a user may associate text orientation 330 with a calendar application so that when text is input in a direction 330, an algorithm within the memory 140 may cause the display processor 130 to display, for example, the calendar 500 of the PDA 100 as can be seen in Figure 5A.
  • the touch screen device 10 may have up to eight shortcuts associated with the text orientations; however, the embodiments are not limited to eight shortcuts, as any number (more or fewer than eight) of text orientation/device software application or function associations can be envisioned using the concept of the embodiments.
  • a combination of the direction of an input and a location, e.g. corner 380 of the device, may be associated with the calendar application of the device so that when an input is made, for example, starting in corner 380 in direction 300, the calendar may be launched and the contents of the calendar may be presented on the display to read from top 340 to bottom 350.
  • the shortcut description may be displayed along a corresponding side of the touch screen display 110 itself such as when, for example, a user configures the shortcuts.
  • the display of the shortcut definition directly on the touch screen display may allow the shortcut definition to be easily changed when a user redefines the shortcut.
  • the shortcut description may be displayed or presented in any suitable manner on any suitable area of the touch screen device.
  • a user of the device 10, such as PDA 100, may, for example, input text, such as text 530, in direction 330 by placing the pointing device 20 on or near the touch screen 110 and writing a desired text (Fig. 6, Block 600).
  • the display processor 130 may detect or recognize the direction (i.e. direction 330) in which the text is being input (Fig. 6, Block 610).
  • the detection of direction 330 by the display processor 130 may cause processor 190, via an algorithm within the memory 140, to open a software application or function associated with direction 330 that is to be displayed by the display processor 130 on the touch screen display 110 (Fig. 6, Block 620).
  • the calendar application 500 will be associated with the text orientation 330.
  • the calendar 500 may be displayed on the touch screen 110 having the look of a conventional paper calendar.
  • the calendar may be a personalized calendar including the month 550, the date 560, a day planner 540, a notes section 510 and a "month at a glance" section 520.
  • the day planner may contain hourly entries for the day that may be categorized in groups such as by work, family, or hobbies groups.
  • the display processor 130 may be configured to display the software function in such a manner that the display corresponds with the orientation of the input text (Fig. 6, Block 630).
  • the display processor 130 may automatically "rotate" the items (e.g. the software application) shown on the display 110 in accordance with the detected text input direction.
  • the calendar function 500 may be displayed to be read from the left side 370 of the PDA 100 to the right side 360 of the PDA 100.
  • when text is input in direction 320, for example, the display processor may automatically "rotate" the items (e.g. icons, character strings, pictures, graphics, etc.) corresponding to the software application on the touch screen display 110 so that, when displayed, the contents of the software application, such as for example the contents 580 of a notepad 570, may be read from the bottom 350 of the PDA 100 to the top 340 of the PDA 100.
  • the device may present a choice to the user via, for example, a dialogue box or the user may configure the device as to whether or not the device is to rotate the items on the display to correspond to the detected text input direction.
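
Rotating the displayed items to match the detected input direction reduces to picking a rotation angle per orientation before the application is drawn. In the sketch below the angles, the `confirm` dialogue and the `render` call are hypothetical placeholders for the device's real display pipeline:

```python
# Illustrative rotation, in degrees, per detected text orientation.
ROTATION_FOR_ORIENTATION = {
    330: 0,    # left-to-right input: no rotation
    320: 90,   # bottom-to-top input (illustrative angle)
    310: 180,  # right-to-left input
    300: 270,  # top-to-bottom input
}

def confirm(question: str) -> bool:
    """Placeholder dialogue box; a real device would show a prompt."""
    return input(question + " [y/n] ").strip().lower() == "y"

def render(app_name: str, rotation_degrees: int) -> None:
    """Placeholder for compositing the application onto the display."""
    print(f"rendering {app_name} rotated {rotation_degrees} degrees")

def present(app_name: str, orientation: int, ask_user: bool = False) -> None:
    angle = ROTATION_FOR_ORIENTATION.get(orientation, 0)
    if angle and ask_user and not confirm(f"Rotate display by {angle} degrees?"):
        angle = 0
    render(app_name, angle)
```
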
  • the display processor may direct the input text 530 to a certain area of the calendar such as the day planner 540 (Fig. 6, Block 640).
  • the area the text is directed to may be preset during the manufacture of the device or it may be user defined.
  • a new set of application-specific text orientation shortcuts may be invoked (Fig. 6, Block 650).
  • the application-specific shortcuts may also be definable by a user of the device.
  • a set of shortcuts may be configured or defined so that text written in direction 320 will be entered in the day planner section 540 under the work category while text entered in direction 330 may be entered in the day planner section 540 under the family category.
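
Once an application is open, the same recognizer can be pointed at an application-specific table; here the day-planner categories from the example above. The category names and table are illustrative and, like the global shortcuts, could be user-defined:

```python
# Hypothetical application-specific shortcuts active inside the calendar.
calendar_shortcuts = {
    320: "work",    # text written in direction 320 -> work category
    330: "family",  # text written in direction 330 -> family category
}

def enter_planner_text(day_planner: dict, text: str, orientation: int) -> None:
    """Route recognized text into the day planner category for its direction."""
    category = calendar_shortcuts.get(orientation, "uncategorized")
    day_planner.setdefault(category, []).append(text)

# Example: writing "dentist 3pm" in direction 320 files it under "work".
planner: dict = {}
enter_planner_text(planner, "dentist 3pm", 320)
```
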
  • FIG. 7 is a block diagram of one embodiment of a typical apparatus 700 incorporating features that may be used to practice the present invention.
  • a computer system 702 may be linked to another computer system 704, such that the computers 702 and 704 are capable of sending information to each other and receiving information from each other.
  • computer system 702 could include a server computer adapted to communicate with a network 706.
  • Computer systems 702 and 704 can be linked together in any conventional manner including, for example, a modem, hard wire connection, or fiber optic link.
  • Computers 702 and 704 are generally adapted to utilize program storage devices embodying machine readable program source code which is adapted to cause the computers 702 and 704 to perform the method steps of the present invention.
  • the program storage devices incorporating features of the invention may be devised, made and used as a component of a machine utilizing optics, magnetic properties and/or electronics to perform the procedures and methods of the present invention.
  • the program storage devices may include magnetic media such as a diskette or computer hard drive, which is readable and executable by a computer.
  • the program storage devices could include optical disks, read-only memory ("ROM"), floppy disks and semiconductor materials and chips.
  • Computer systems 702 and 704 may also include a microprocessor for executing stored programs.
  • Computer 702 may include a data storage device 708 on its program storage device for the storage of information and data.
  • the computer program or software incorporating the processes and method steps incorporating features of the present invention may be stored in one or more computers 702 and 704 on an otherwise conventional program storage device.
  • computers 702 and 704 may include a user interface 710, and a display interface 712 from which features of the present invention can be accessed.
  • the user interface 710 and the display interface 712 can be adapted to allow the input of queries and commands to the system, as well as present the results of the commands and queries.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
PCT/IB2007/001685 2006-06-23 2007-06-21 Device feature activation Ceased WO2007148210A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009515986A JP2009541835A (ja) 2006-06-23 2007-06-21 Device feature activation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/473,836 2006-06-23
US11/473,836 US20070295540A1 (en) 2006-06-23 2006-06-23 Device feature activation

Publications (3)

Publication Number Publication Date
WO2007148210A2 true WO2007148210A2 (en) 2007-12-27
WO2007148210A3 WO2007148210A3 (en) 2008-04-24
WO2007148210B1 WO2007148210B1 (en) 2008-06-19

Family

ID=38833818

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2007/001685 Ceased WO2007148210A2 (en) 2006-06-23 2007-06-21 Device feature activation

Country Status (5)

Country Link
US (1) US20070295540A1 (en)
JP (1) JP2009541835A (ja)
CN (1) CN101506763A (zh)
TW (1) TW200813798A (zh)
WO (1) WO2007148210A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009266201A (ja) * 2008-04-24 2009-11-12 Kotatsu Kokusai Denshi Kofun Yugenkoshi User interface switching method, electronic device using the method, and recording medium
WO2010061052A1 (en) * 2008-11-30 2010-06-03 Nokia Corporation Item and view specific options
WO2011017006A1 (en) * 2009-08-05 2011-02-10 Apple Inc. Multi-operation user interface tool
US8627207B2 (en) 2009-05-01 2014-01-07 Apple Inc. Presenting an editing tool in a composite display area
US8698844B1 (en) 2005-04-16 2014-04-15 Apple Inc. Processing cursor movements in a graphical user interface of a multimedia application

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7159194B2 (en) * 2001-11-30 2007-01-02 Palm, Inc. Orientation dependent functionality of an electronic device
US7999789B2 (en) * 2007-03-14 2011-08-16 Computime, Ltd. Electrical device with a selected orientation for operation
US8127254B2 (en) * 2007-06-29 2012-02-28 Nokia Corporation Unlocking a touch screen device
US8446371B2 (en) * 2007-12-19 2013-05-21 Research In Motion Limited Method and apparatus for launching activities
US20110242043A1 (en) * 2010-04-06 2011-10-06 Mark Yarvis Device with capacitive touchscreen panel and method for power management
US8854318B2 (en) 2010-09-01 2014-10-07 Nokia Corporation Mode switching
US20120256857A1 (en) * 2011-04-05 2012-10-11 Mak Genevieve Elizabeth Electronic device and method of controlling same
US8872773B2 (en) 2011-04-05 2014-10-28 Blackberry Limited Electronic device and method of controlling same
US20120256846A1 (en) * 2011-04-05 2012-10-11 Research In Motion Limited Electronic device and method of controlling same
US20130249810A1 (en) * 2012-03-22 2013-09-26 Microsoft Corporation Text entry mode selection
JP6189680B2 (ja) * 2013-08-23 2017-08-30 Sharp Corp Interface device, interface method, interface program, and computer-readable recording medium storing the program

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04186425A (ja) * 1990-11-21 1992-07-03 Hitachi Ltd Menu display system
US5614926A (en) * 1993-05-17 1997-03-25 Sharp Kabushiki Kaisha Word processor with a handwriting text processing function
JP2000123114A (ja) * 1998-10-15 2000-04-28 Casio Comput Co Ltd 手書き文字入力装置及び記憶媒体
US6938221B2 (en) * 2001-11-30 2005-08-30 Microsoft Corporation User interface for stylus-based user input
US11275405B2 (en) * 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US20040036680A1 (en) * 2002-08-26 2004-02-26 Mark Davis User-interface features for computers with contact-sensitive displays
WO2004111816A2 (en) * 2003-06-13 2004-12-23 University Of Lancaster User interface
GB2410662A (en) * 2004-01-29 2005-08-03 Siemens Plc Activation of an operation by cursor movement
US20060007176A1 (en) * 2004-07-06 2006-01-12 Chung-Yi Shen Input method and control module defined with an initial position and moving directions and electronic product thereof
US7671845B2 (en) * 2004-11-30 2010-03-02 Microsoft Corporation Directional input device and display orientation control
US9395905B2 (en) * 2006-04-05 2016-07-19 Synaptics Incorporated Graphical scroll wheel

Also Published As

Publication number Publication date
TW200813798A (en) 2008-03-16
JP2009541835A (ja) 2009-11-26
US20070295540A1 (en) 2007-12-27
WO2007148210A3 (en) 2008-04-24
CN101506763A (zh) 2009-08-12
WO2007148210B1 (en) 2008-06-19

Similar Documents

Publication Publication Date Title
WO2007148210A2 (en) Device feature activation
US8667412B2 (en) Dynamic virtual input device configuration
EP3220252B1 (en) Gesture based document editor
AU2010295574B2 (en) Gesture recognition on computing device
CA2501118C (en) Method of combining data entry of handwritten symbols with displayed character data
US7623119B2 (en) Graphical functions by gestures
US9354771B2 (en) Controlling application windows in an operating system
US20070236468A1 (en) Gesture based device activation
US20160110230A1 (en) System and Method for Issuing Commands to Applications Based on Contextual Information
US6384815B1 (en) Automatic highlighting tool for document composing and editing software
US10241670B2 (en) Character entry apparatus and associated methods
CN104461338A (zh) 可携式电子装置及控制可携式电子装置的方法
EP4309070A1 (en) Converting text to digital ink
JP5634617B1 (ja) 電子機器および処理方法
JP7109448B2 (ja) 動的スペースバー
US11435893B1 (en) Submitting questions using digital ink
WO2022197458A1 (en) Duplicating and aggregating digital ink instances
HK1137529A (en) Device feature activation
EP2587355A1 (en) Electronic device and method of character entry
CN109656460B (zh) 提供键盘的可选择的键的电子设备和方法
WO2022197436A1 (en) Ink grouping reveal and select
HK1177023A (en) Method and device for editing objects

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
    Ref document number: 200780030846.8
    Country of ref document: CN
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 07766575
    Country of ref document: EP
    Kind code of ref document: A2
WWE Wipo information: entry into national phase
    Ref document number: 2009515986
    Country of ref document: JP
NENP Non-entry into the national phase
    Ref country code: DE
NENP Non-entry into the national phase
    Ref country code: RU
122 Ep: pct application non-entry in european phase
    Ref document number: 07766575
    Country of ref document: EP
    Kind code of ref document: A2