EP3776150A1 - Display apparatus - Google Patents
Info
- Publication number
- EP3776150A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- motion
- information
- display screen
- display
- erase
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/10—Text processing
- G06F40/166—Editing, e.g. inserting or deleting
- G06F40/171—Editing, e.g. inserting or deleting by use of digital ink
Definitions
- the present invention relates to a display apparatus for displaying erasable information.
- Display apparatuses are used in a wide range of portable and moveable devices, such as mobile telephones, tablet computer devices and laptops.
- one of the challenges for the designers of portable devices is to design a display apparatus that is light, battery efficient and convenient to use.
- the present invention provides display apparatus comprising a display screen to display information in a plurality of different regions; a memory to store information for display on the different regions of the display screen; a motion detecting arrangement to detect motion of the display apparatus; and a processor programmed to control the display of the information on the display screen in the different regions and responsive to motion detected by the motion detecting arrangement, to control the display screen to erase at least a part of the information displayed on the display screen, wherein the motion detecting arrangement is configured to detect a plurality of motion types, and the processor is programmed to identify the detected motion types and to selectively erase information displayed on the display screen in the different regions in dependence upon the identified motion types.
- the present invention also provides a method of controlling a display apparatus having a display screen and a motion detector arrangement to detect motion, the method comprising displaying information on the display screen in a plurality of different regions; detecting a plurality of types of motion of the display apparatus; and selectively erasing information displayed on the display screen in different regions in dependence upon the detected motion types.
- the present invention also provides a display apparatus comprising a display screen to display a plurality of types of information; a memory to store information for display on the display screen; a motion detecting arrangement to detect motion of the display apparatus; and a processor programmed to control the display of the information on the display screen to display the different information types and responsive to motion detected by the motion detecting arrangement, to control the display screen to erase at least a part of the information displayed on the display screen, wherein the motion detecting arrangement is configured to detect a plurality of motion types, and the processor is programmed to identify the detected motion types and to selectively erase information types displayed on the display screen in dependence upon the identified motion types.
- the present invention also provides a method of controlling a display apparatus having a display screen and a motion detector arrangement to detect motion, the method comprising displaying information of a plurality of different information types on the display screen; detecting a plurality of types of motion of the display apparatus; and selectively erasing information types displayed on the display screen in dependence upon the detected motion types.
- the present invention also provides a display apparatus comprising a display screen to display information; a memory to store motion type erase data comprising identifications of motion types and corresponding erase operations, and information for display on the display screen; a motion detecting arrangement to detect motion of the display apparatus; a user input arrangement to receive a user input identifying erase operations during a learning phase; and a processor programmed, in the learning phase when a user input is received identifying a control operation, to identify a motion type from motion detected by the motion detecting arrangement and to store the received user input in correspondence with data identifying the motion type in the motion type erase data, and, subsequent to the learning phase when the motion detecting arrangement detects motion, to use the motion type erase data to identify a motion type and execute the corresponding erase operation to erase at least a part of the information displayed on the display screen.
- the present invention also provides a method of controlling a display apparatus having a motion detector arrangement to detect motion of the device, the method comprising, during a learning phase: receiving a user input identifying an erase operation; detecting motion of the display apparatus using the motion detector arrangement; identifying a motion type from the motion detected by the motion detecting arrangement; storing the received user input in correspondence with data identifying the motion type as motion type erase data; and repeating the receiving, detecting, identifying and storing steps; and in an operation phase: displaying information on a display screen; detecting motion of the display apparatus using the motion detector arrangement; and using the motion type erase data to identify a motion type and execute the corresponding erase operation to erase at least a part of the information displayed on the display screen.
- Figure 1 is a schematic diagram illustrating an example display apparatus
- Figure 2 is a schematic diagram illustrating an alternative display of an example display apparatus
- Figure 3 is a flow diagram illustrating an example learning phase
- Figure 4 is a flow diagram illustrating an example operational phase
- Figure 5 is a schematic diagram illustrating the electronic components of an example display apparatus.
- data store or memory is intended to encompass any computer readable storage medium and/or device (or collection of data storage mediums and/or devices).
- data stores include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disks, floppy disks, etc.), memory circuits (e.g., solid state drives, random-access memory (RAM), etc.), and/or the like.
- the functions or algorithms described herein are implemented in hardware, software or a combination of software and hardware in one embodiment.
- the software comprises computer executable instructions stored on computer readable carrier media such as memory or other type of storage devices.
- described functions may correspond to modules, which may be software, hardware, firmware, or any combination thereof. Multiple functions are performed in one or more modules as desired, and the embodiments described are merely examples.
- the software is executed on a digital signal processor, ASIC, microprocessor, or other type of processor.
- Some embodiments implement the functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
- the exemplary process flow is applicable to software, firmware, and hardware implementations.
- a generalized embodiment provides a display apparatus comprising a display screen to display information; a memory to store information for display on the display screen; a motion detecting arrangement to detect motion of the display apparatus; and a processor programmed to control the display of the information on the display screen and responsive to motion detected by the motion detecting arrangement, to control the display screen to erase at least a part of the information displayed on the display screen, wherein the motion detecting arrangement is configured to detect a plurality of motion types, and the processor is programmed to identify the detected motion types and to selectively erase information displayed on the display screen in dependence upon the identified motion types.
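- By way of illustration only, the following minimal Python sketch shows the control flow of this generalized embodiment: recorded motion samples are identified as a motion type and only the information associated with that type is selectively erased. The class, the mapping and the crude dominant-axis classifier are assumptions for illustration, not the disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Display:
    # Region name -> information currently shown in that region of the screen.
    regions: dict = field(default_factory=dict)

    def erase(self, region: str) -> None:
        """Erase only the information displayed in the named region."""
        if region in self.regions:
            self.regions[region] = ""

# Hypothetical mapping from identified motion type to the region it erases.
MOTION_TYPE_TO_REGION = {
    "shake_x": "typed_text",
    "shake_y": "handwritten_text",
    "shake_z": "image",
}

def classify_motion(samples):
    """Crude stand-in classifier: label the motion by its dominant axis."""
    energy = [sum(abs(s[axis]) for s in samples) for axis in range(3)]
    return ("shake_x", "shake_y", "shake_z")[energy.index(max(energy))]

def on_motion_detected(display: Display, samples) -> None:
    """Processor response to detected motion: identify the type, then selectively erase."""
    region = MOTION_TYPE_TO_REGION.get(classify_motion(samples))
    if region is not None:
        display.erase(region)

if __name__ == "__main__":
    d = Display(regions={"typed_text": "Hello", "handwritten_text": "note", "image": "photo"})
    on_motion_detected(d, [(9.0, 0.1, 0.2), (-8.5, 0.2, 0.1)])  # strong x-axis shake
    print(d.regions)  # only the typed text region has been erased
```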
- the invention is suited to portable display apparatus but is not limited to such.
- the display apparatus is applicable to any display apparatus that can be moved in a reciprocal, rotational or shaking motion. Such a display apparatus is typically portable and wireless but could be simply moveable and connected by wires.
- the display apparatus could be part of another apparatus and removable therefrom in use to allow for the shaking, rotating or reciprocating motion to be effected by a user to erase displayed information.
- the display could be rigid or flexible.
- the display may be detachable. There could be more than one display and the displays can be of different types, e.g. one flexible and one rigid.
- at least a part of the information displayed on the display screen is erased in response to shaking or reciprocating motion detected by the motion detecting arrangement that is greater than a threshold. This avoids unintentional erasing of displayed information due to a jolt of the display apparatus, for example while travelling in a car, train or airplane. The erasing of information can require more vigorous movement to initiate the erasing.
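- A minimal sketch of this threshold test, assuming acceleration samples in m/s² and illustrative threshold values: a brief jolt produces too few strong samples to trigger an erase, while a sustained, vigorous shake produces many.

```python
import math

ERASE_THRESHOLD = 15.0   # assumed peak acceleration magnitude in m/s^2
MIN_STRONG_SAMPLES = 5   # assumed number of samples that must exceed it

def should_erase(samples) -> bool:
    """True only for motion vigorous enough to count as a deliberate shake."""
    strong = sum(
        1 for (x, y, z) in samples if math.sqrt(x * x + y * y + z * z) > ERASE_THRESHOLD
    )
    # A single jolt (e.g. in a car or train) produces one or two strong samples
    # and is ignored; a deliberate, vigorous shake produces many and triggers erasing.
    return strong >= MIN_STRONG_SAMPLES
```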
- the erasing of the displayed information also causes the information to be erased or deleted in memory at least for the set of information being displayed, so that the action is equivalent to a delete key on a keyboard (although the information may still be resident in memory to allow an unerase operation).
- the motion may be detected in multiple directions, and a direction of the shaking or reciprocating motion can be detected to selectively erase information displayed on the display screen.
- the apparatus can be programmed, or the user can select, so that certain directions of motion of the display apparatus or certain types of motion of the display apparatus, such as different patterns of motion, can cause different erase operations.
- the different erase operations can relate to erase operations in different screen regions, such as frames on a web page or input regions in an application executing on the display apparatus, or to the erasing of different displayed information types.
- one type of motion or a different direction of motion may cause the erasing of a displayed image, another the erasing of typed text, another the erasing of handwritten text, another the erasing of a table or spreadsheet, etc.
- Types of motion that the user can use for erase operations can comprise the shaking or reciprocating motion of the display apparatus, which can be any of a rectilinear motion, a reciprocating motion, a rotational motion, a swaying motion, a see-sawing motion, a swinging motion or a rocking motion, for example.
- the displacement or amplitude of the 'shake', the frequency or rate of the 'shake' and/or the velocity or vigour of the 'shake' can also be components of the type of motion of the display apparatus.
- the motion can comprise a complex mix or aggregation of motions or motion types, e.g. a rotation simultaneously with a shake, or a sequence of motions.
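- The sketch below illustrates how the amplitude, rate and vigour of a recorded 'shake' could be estimated from one axis of acceleration samples; the sampling rate and the simple estimators are assumptions.

```python
SAMPLE_RATE_HZ = 100.0   # assumed accelerometer sampling rate

def shake_features(axis_samples):
    """Return (peak amplitude, shake rate in Hz, vigour) for one axis of motion."""
    if not axis_samples:
        return 0.0, 0.0, 0.0
    peak = max(abs(a) for a in axis_samples)                        # amplitude of the shake
    # Two zero crossings correspond to roughly one back-and-forth cycle of the shake.
    crossings = sum(1 for a, b in zip(axis_samples, axis_samples[1:]) if a * b < 0)
    duration_s = len(axis_samples) / SAMPLE_RATE_HZ
    rate_hz = (crossings / 2) / duration_s if duration_s else 0.0   # frequency or rate
    vigour = sum(abs(a) for a in axis_samples) / len(axis_samples)  # mean |acceleration|
    return peak, rate_hz, vigour
```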
- the display apparatus can be programmed to allow the user to teach it motion types and erase operations to implement in response to the detected motion type.
- the display apparatus may display erase options in a learning or startup phase, with the request to the user to initiate a desired motion type to be used for that erase operation.
- the user may be required to perform the operation a few times for the processor to implement a learning process to learn and store the motion type corresponding to a selected erase operation.
- the motion types and erase operations can be learned for individual users. For example, a user could enter their name or sign in to use the display apparatus and the learnt motion types and erase operations are then stored for that user for use when the user enters their name or signs in.
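- One possible per-user storage of the learnt motion types and erase operations is sketched below; the JSON file and its layout are illustrative assumptions only.

```python
import json
from pathlib import Path

PROFILE_PATH = Path("motion_profiles.json")   # hypothetical storage location

def save_user_profile(user: str, motion_to_erase: dict) -> None:
    """Store the learnt motion-type/erase-operation table under the user's name."""
    profiles = json.loads(PROFILE_PATH.read_text()) if PROFILE_PATH.exists() else {}
    profiles[user] = motion_to_erase
    PROFILE_PATH.write_text(json.dumps(profiles, indent=2))

def load_user_profile(user: str) -> dict:
    """Return the learnt table for this user when they sign in, empty if none yet."""
    if not PROFILE_PATH.exists():
        return {}
    return json.loads(PROFILE_PATH.read_text()).get(user, {})
```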
- the display apparatus in one example can include an input arrangement to receive an input for the display of information.
- the input arrangement can comprise a touch sensitive display and/or a pen.
- a pen allows a user to input handwritten text and drawings.
- a carrier medium such as a non-transient storage medium storing code for execution by a processor of a machine to carry out the method, or a transient medium carrying processor executable code for execution by a processor of a machine to carry out the method.
- Embodiments can be implemented in programmable digital logic that implements computer code. The code can be supplied to the programmable logic, such as a processor or microprocessor, on a carrier medium.
- a carrier medium is a transient medium i.e. a signal such as an electrical, electromagnetic, acoustic, magnetic, or optical signal.
- Another form of carrier medium is a non-transitory storage medium that stores the code, such as a solid-state memory, magnetic media (hard disk drive), or optical media (Compact disc (CD) or digital versatile disc (DVD)).
- the display apparatus can, in one example, be used as the display unit in any of the examples of the display apparatus disclosed in co-pending application GB 1722249.8 filed on 29th December 2017, the contents of which are hereby incorporated by reference in their entirety, or with the device disclosed in the co-pending UK patent application entitled "Device Operation Control" filed on the same date as this application, the contents of which are hereby incorporated by reference in their entirety.
- FIG. 1 is a schematic diagram illustrating a display apparatus.
- the display apparatus 2 comprises a display screen 1 capable of displaying information.
- the display screen is displaying three types of information, namely typed text 11, handwritten text 41 and an image 31.
- the information can be displayed by any application executing on the apparatus, such as a web browser, email application, word processor, spreadsheet, image editor etc.
- the displayed information can for example be a document or form that has regions to be completed or filled in, e.g. an address region and a name region in a standard form letter, or fields in a form.
- the handwritten text can be input using a pen device 50.
- the display screen 1 can be sensitive to the proximity of the pen device 50 to detect the coordinates of the pen device 50, to track its location and hence to generate the input handwritten text 41.
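- A minimal sketch of such pen tracking, assuming simple pen-down/pen-move/pen-up events: the coordinates reported while the pen is in proximity are accumulated into strokes that form the handwritten text 41.

```python
class HandwritingInput:
    """Accumulates tracked pen coordinates into strokes for display as handwriting."""

    def __init__(self):
        self.strokes = []      # each stroke is a list of (x, y) points
        self._current = None   # stroke currently being drawn, if any

    def pen_down(self, x: float, y: float) -> None:
        self._current = [(x, y)]

    def pen_move(self, x: float, y: float) -> None:
        if self._current is not None:
            self._current.append((x, y))   # track the pen's location

    def pen_up(self) -> None:
        if self._current:
            self.strokes.append(self._current)   # the finished stroke becomes ink
        self._current = None
```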
- the display screen 1 can be touch sensitive to allow for input of information and commands to the display apparatus e.g. by the display of touch sensitive options and/or a keyboard.
- the motion directions that the display apparatus can experience as a method of erasing displayed information can be a back and forth linear or reciprocating motion in any of three directions along three axes x, y and z in three dimensions as shown by the intersecting arrows. Also, or alternatively, the motion can be non-linear, curved or rotational as shown by the curved arrows.
- a motion of the display apparatus can erase all of the displayed information or selective parts, such as one or more of the typed text 11, the handwritten text 41 or the image 31.
- Figure 2 illustrates an alternative display screen 20 of a display apparatus.
- a typed text display region 10 with displayed typed text 11, an image display region 30 displaying an image 31, and a handwritten text display region 40 displaying handwritten text 41, created using a pen 50.
- the motion of the display apparatus of this example can be a back and forth linear or reciprocating motion in any of three directions along three axes x, y and z in three dimensions as shown by the intersecting arrows of figure 1. Also, or alternatively, the motion can be non-linear, curved or rotational as shown by the curved arrow in figure 1.
- a motion of the display apparatus can erase all of the displayed information or selective parts, such as information in one or more of the display regions 10, 30 and 40.
- a motion type detected by a motion detecting arrangement of the display apparatus can detect any one of a plurality of types of patterns of motion in any dimension or direction.
- Examples of different patterns of motion are: shaking back and forth, shaking side to side, shaking up and down, turning side to side (tilting side to side), turning up and down (tilting up and down) and rotating.
- Each of these represents a lateral (shaking) or rotational movement along or about one of the three axes illustrated in figure 1. It is possible to detect two types of each of these motion types, based on the starting motion. For example, back and forth could be back and forth or forth and back, side to side could be left to right or right to left, rotating could be clockwise or anticlockwise rotating etc.
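- The sketch below shows one way these patterns could be distinguished: the dominant axis of the recorded samples selects the pattern, and the sign of the first significant excursion separates its two variants (e.g. left-to-right versus right-to-left). The labels and the threshold are assumptions.

```python
AXIS_LABELS = ("side_to_side", "up_and_down", "back_and_forth")
START_THRESHOLD = 3.0   # assumed acceleration (m/s^2) that counts as the starting movement

def classify_pattern(samples):
    """Return e.g. 'side_to_side:+' or 'back_and_forth:-' from (x, y, z) samples."""
    energy = [sum(abs(s[axis]) for s in samples) for axis in range(3)]
    axis = energy.index(max(energy))
    # The sign of the first sample on the dominant axis that exceeds the threshold
    # distinguishes, for example, left-to-right from right-to-left shaking.
    polarity = "+"
    for s in samples:
        if abs(s[axis]) > START_THRESHOLD:
            polarity = "+" if s[axis] > 0 else "-"
            break
    return f"{AXIS_LABELS[axis]}:{polarity}"
```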
- the memory of the display apparatus can store a set of erase commands to implement by the processor for each type of motion detected by the motion detecting arrangement and identified by the processor from the signal from the motion detecting arrangement. These commands can be predefined or downloaded from a data source or they can be user defined.
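- Such a stored command set could resemble the following sketch, in which each identified motion type maps to an erase command and user-defined entries learnt in the learning phase override the predefined ones; all entries shown are illustrative.

```python
# Predefined (or downloaded) motion-type erase commands; all entries are examples only.
DEFAULT_MOTION_TYPE_ERASE_DATA = {
    "side_to_side:+": "erase_handwritten_text",
    "side_to_side:-": "erase_typed_text",
    "up_and_down:+": "erase_image",
    "back_and_forth:+": "erase_all_regions",
}

def erase_command_for(motion_type: str, user_defined=None):
    """User-defined entries learnt in the learning phase override the defaults."""
    table = {**DEFAULT_MOTION_TYPE_ERASE_DATA, **(user_defined or {})}
    return table.get(motion_type)
```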
- the display apparatus can be programmed to allow the user to teach it motion types and erase operations to implement in response to the detected motion type.
- the display apparatus may display erase options in a learning or startup phase, with the request to the user to initiate a desired motion type to be used for that erase operation.
- the user may be required to perform the operation a few times for the processor to implement a learning process to learn and store the motion type corresponding to a selected erase operation.
- the display apparatus then stores a pattern of detected motion for each erase operation to be used in future motion detection operations for the execution of erase operations.
- Figure 3 is a flow diagram illustrating a learning phase of the display apparatus.
- in step S1 the display apparatus enters the learning phase. This can be automatic on start-up of the display apparatus or as a result of some other operation of the display apparatus, or it can be as a result of a user selection, i.e. a user input, such as using an input arrangement of the display apparatus.
- in step S2 a display of the display apparatus displays a menu of defined erase operations that a user can choose to execute using motion as a user input.
- the displayed menu can simply be a list, or a drop-down menu for example.
- the list could be organized into categories of types of erase operations, e.g. erase one letter, erase one word, erase text, erase all text in a display region or frame, erase image, erase displayed history data, erase in a frame or display region, etc.
- in step S3 a user selection of an erase operation menu item is received and then in step S4, the display apparatus waits for the motion detector arrangement, e.g. an accelerometer, to detect motion of the display apparatus.
- the motion is recorded over a short period of time appropriate to capture the motion pattern e.g. 1-2 seconds.
- in step S5 the process loops back to require the user to repeat the motion input a number (N) of times, e.g. anything from 3 to 10 times. More iterations improve the accuracy of the matching process but reduce the usability of the display apparatus: a user will lose patience with a requirement to repeat the motion pattern too many times.
- in step S6 the recorded motion patterns for the repeated recordings are averaged and the average is used in step S7 for storage as part of the information on the motion type in the stored motion type erase data.
- Data indicating the degree of deviation between the recorded motion patterns can also be stored as part of the motion type data to assist with the matching process during the operational phase i.e. to assist with assessing whether a match lies within an expected deviation of a stored average motion pattern.
- An identifier identifying the corresponding erase operation is also stored as part of the stored motion type erase data.
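- Steps S4 to S7 could be implemented along the lines of the sketch below, in which the N recordings are averaged sample by sample and the spread between repetitions is stored alongside the erase operation identifier; the alignment and the statistics used are assumptions.

```python
from statistics import mean, pstdev

def learn_motion_type(recordings, erase_operation: str):
    """recordings: N lists of acceleration magnitudes, one per repetition of the motion."""
    length = min(len(r) for r in recordings)
    trimmed = [r[:length] for r in recordings]              # align the repetitions
    average = [mean(r[i] for r in trimmed) for i in range(length)]
    deviation = [pstdev(r[i] for r in trimmed) for i in range(length)]
    # One entry of the stored motion type erase data: the averaged pattern, the
    # spread between repetitions, and the identifier of the chosen erase operation.
    return {
        "average_pattern": average,
        "deviation": deviation,
        "erase_operation": erase_operation,
    }
```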
- the process will then return to step S2 to allow the user to select another erase operation from the menu. If the user has finished selecting erase operations, the user can select to exit the learning phase.
- Figure 4 is a flow diagram illustrating an operational phase of the display apparatus following the learning phase.
- in step S10 the display device enters the operational phase and in step S11 information is displayed on a display screen.
- the information can be displayed in different regions, such as frames or windows, as shown in figures 1 and 2.
- the displayed information can also be of different information types, e.g. text or images.
- Motion of the display apparatus is detected in step S12.
- the pattern of motion is compared with the stored patterns for the motion type in the motion type erase data in step S13 to determine if the motion matches a motion type. If no match is found (step S14), the process returns to step S12 to detect further motion.
- in step S15 a corresponding erase operation is identified in the motion type erase data and in step S16 the erase operation is executed.
- the erase operation matched may be an operation to erase information in one of the regions or to erase an information type such as text or images.
- the process then returns to step S12 to detect the next motion of the device for further control operations.
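- The matching of steps S12 to S16 could proceed as in the sketch below, which accepts a detected pattern if it stays within the stored deviation of a learnt average pattern (scaled by an assumed tolerance) and returns that entry's erase operation; the record format follows the learning-phase sketch above.

```python
TOLERANCE = 2.0   # assumed number of deviations a sample may differ by and still match

def match_motion(detected, motion_type_erase_data):
    """Return the erase operation of the first matching stored entry, or None (step S14)."""
    for entry in motion_type_erase_data:
        avg, dev = entry["average_pattern"], entry["deviation"]
        length = min(len(detected), len(avg))
        if length == 0:
            continue
        within = all(
            abs(detected[i] - avg[i]) <= TOLERANCE * max(dev[i], 0.5)
            for i in range(length)
        )
        if within:
            return entry["erase_operation"]   # steps S15/S16: execute this erase operation
    return None                               # no match: keep detecting motion (step S12)
```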
- an average of motion data for a number of recorded motion operations is taken.
- the invention is not restricted to taking an average and only one motion detection may be used, or a number may be recorded and they can be stored individually for individual matching with a motion pattern during the operation phase.
- a user is presented with a menu to select an erase operation to teach the display apparatus the motion type for that erase operation.
- the user may also be able to define their own erase operation using the user input arrangement.
- the display apparatus may automatically, e.g. on start-up, enter the learning phase and require the user to enter a sequence of motion types for a sequence of control operations.
- the motion types and erase operations can be learned for individual users. For example, a user could enter their name or sign in to use the display apparatus and the learnt motion types and erase operations are then stored for that user for use when the user enters their name or signs in.
- Figure 5 is a schematic diagram illustrating the electronic components of an example display apparatus.
- a processor 100 is connected to static or non-volatile memory 102 storing code used by the processor 100, such as operating system code and for the storage of data and information for display in a non-volatile manner.
- Volatile memory 103 is provided for the storage of application code for implementation by the processor 100 and data and information for display and for erase control.
- the volatile memory can store code (e.g. a code module) for identifying motion patterns detected by the motion sensor 104, and code (e.g. a code module) for controlling the display screen 105 in response to the identified motion patterns to erase displayed information.
- the display screen 105 is connected to the processor 100 for the display of information.
- the display screen 105 can comprise a touch sensitive display to provide an input arrangement.
- a pen 50 shown in figure 1 can be used with the display screen 105 to allow for the input of freehand text or drawings.
- a camera 106 can also optionally be connected to the processor 100 to capture images. The camera 106 can also be used as an input arrangement for the input of information and commands using gesture recognition implemented by the processor on the images captured by the camera 106.
- a network connector 101 is connected to the processor 100 for communication with a communications or computer network. This communication link can pass data, information and controls between the display apparatus and other display apparatuses or computers over the network.
- the network connector 101 can be connected to a communications or computer network using a wired or wireless connection.
- a wired connection could be an ethernet connection to a LAN or a telephone connection, such that the network connector 101 acts as a modem or router.
- a wireless connection could be WIFI, WiMAX, CDMA, GSM, GPRS, wireless local loop or WAN.
- Bluetooth™ may be used.
- a motion sensor 104 is connected to the processor 100 to detect motion of the display apparatus.
- the motion sensor 104 can comprise any conventional motion sensor, such as an accelerometer.
- the motion sensor 104 is capable of detecting motion of the display apparatus in any direction i.e. it is a multi-axis motion sensor to detect motion along any of three orthogonal axes in three dimensions.
- the display apparatus may have a display screen on both sides, so that it can be turned over to view information on both sides.
- the erase operation could apply to both sides or to just one side. This, in one example, can be selectively set.
- the display or displays can also be detached from a mount frame.
- the display apparatus can comprise a docking part or mount to which a display screen is mounted in a detachable or demountable manner so that the display screen can be used separately from the mount.
- the mount and display screen can communicate between each other using a wireless link so that the mount can contain some components of the display apparatus, such as the processor and storage, while the display screen can contain only those components necessary to operate the display screen, detect motion of the display screen, and communicate with the processor.
- more than one display screen can be provided detachable from a mount, with each display screen operating independently to provide the erase operation by motion detection using a shared processor in the mount.
- the display apparatus can include one or more loudspeakers for audio output and one or more microphones for audio input. Also, the display apparatus can connect to a peripheral device such as a keyboard and/or pointing device by a wired or wireless connection to allow for input of information and commands to the display apparatus.
- in any of the examples above, the display apparatus can include other input arrangements, such as a keyboard, pointer device (e.g. a mouse), or a touch screen, to enable a user to input controls and information in conjunction with the motion detection control input method. The input arrangement can also be used to perform conventional erase operations to supplement the motion erase capability of the display apparatus, e.g. keyboard deletions, selections of text and displayed deletion operations, etc. Also, a camera can be provided to allow image capture to enable gesture input as a method of control or information input.
- a camera in addition to or alternatively to the motion control input, can be provided in the apparatus to capture video in a direction facing the user so that the user can use gestures to control the erasing of information displayed on the display screen in dependence upon the type of identified gesture.
- the gesture control can comprise the recognition of movements of a user’s body parts i.e. hands, facial features or the overall body position or the recognition of an object moved by the user, such as a stick, ruler or something worn by the user.
- Pattern recognition techniques can be used to recognize the type of gestures and to identify them.
- Example gesture types can include sweeping of a hand to the left or right, or up or down, or a waving of a hand.
- the gesture control erase operation can be used, for example, where the user is unable to pick up the display apparatus, e.g. they have wet or dirty hands.
- the display screen can be controlled to be responsive to the detected gesture type, to identify a comparable or associated motion type that controls the same erase operation, and to mimic the motion command equivalent to the gesture command. For example, if the motion command to erase text in a display region is a left to right rotation, then when an equivalent or associated gesture command, e.g. waving the hand from left to right, is identified, the display screen can cause the displayed content to rotate from left to right as confirmation back to the user that the gesture command has been recognized as the motion command.
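- A sketch of this gesture-to-motion confirmation, with hypothetical gesture names and a stand-in display object: the recognized gesture is mapped to its associated motion command, the same erase operation is executed, and the display briefly mimics that motion as feedback to the user.

```python
class DisplayStub:
    """Stand-in for the display controller on the real apparatus."""
    def erase(self, operation: str) -> None:
        print(f"executing {operation}")
    def animate(self, motion: str) -> None:
        print(f"mimicking {motion} on screen as confirmation")

# Hypothetical association between gesture commands and their equivalent motion commands.
GESTURE_TO_MOTION = {
    "wave_left_to_right": "rotate_left_to_right",
    "sweep_hand_up": "shake_up_and_down",
}

def on_gesture(gesture: str, motion_to_erase: dict, display) -> None:
    motion = GESTURE_TO_MOTION.get(gesture)
    if motion is None:
        return
    erase_operation = motion_to_erase.get(motion)
    if erase_operation is not None:
        display.erase(erase_operation)
        display.animate(motion)   # confirm the gesture was recognized as the motion command

if __name__ == "__main__":
    on_gesture("wave_left_to_right", {"rotate_left_to_right": "erase_text_in_region"}, DisplayStub())
```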
- the invention provides gesture control instead of motion control.
- the display apparatus can comprise a display screen to display information; a memory to store information for display on the display screen; a gesture detecting arrangement to detect a gesture (e.g. a camera and recognition software); and a processor programmed to control the display of the information on the display screen and responsive to gesture detected by the gesture detecting arrangement, to control the display screen to erase at least a part of the information displayed on the display screen, wherein the gesture detecting arrangement is configured to sense a plurality of gesture types, and the processor is programmed to identify the sensed gesture types and to selectively erase information displayed on the display screen in dependence upon the identified gesture types.
- the gesture recognition operates in the same way as the motion detection and hence, in the motion detection examples described herein above, motion can be replaced with gesture in this aspect.
- the display apparatus can be mobile or fixed/attached.
- a camera can be provided in the apparatus to capture video in a direction facing the user so that the user can use gestures to control the erasing of information displayed on the display screen in dependence upon the type of identified gesture.
- the gesture control can comprise the recognition of movements of a user’s body parts i.e. hands, facial features or the overall body position. Pattern recognition techniques can be used to recognize the type of gestures and to identify them.
- Example gesture types can include sweeping of a hand to the left or right, or up or down, or a waving of a hand.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Devices For Indicating Variable Information By Combining Individual Elements (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1805278.7A GB2572437A (en) | 2018-03-29 | 2018-03-29 | Display apparatus |
PCT/GB2019/050931 WO2019186201A1 (en) | 2018-03-29 | 2019-03-29 | Display apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3776150A1 (en) | 2021-02-17 |
Family
ID=62142353
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19717550.8A Withdrawn EP3776150A1 (en) | 2018-03-29 | 2019-03-29 | Display apparatus |
Country Status (7)
Country | Link |
---|---|
EP (1) | EP3776150A1 (ja) |
JP (1) | JP7379364B2 (ja) |
KR (1) | KR20210002509A (ja) |
CN (1) | CN112041792A (ja) |
GB (1) | GB2572437A (ja) |
SG (1) | SG11202009633PA (ja) |
WO (1) | WO2019186201A1 (ja) |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4853302B2 (ja) | 2007-01-19 | 2012-01-11 | 日本電気株式会社 | 携帯端末用コマンド入力装置および携帯端末用コマンド入力方法 |
US20110029869A1 (en) * | 2008-02-29 | 2011-02-03 | Mclennan Hamish | Method and system responsive to intentional movement of a device |
KR101737829B1 (ko) * | 2008-11-10 | 2017-05-22 | 삼성전자주식회사 | 휴대 단말기의 모션 입력 장치 및 그의 운용 방법 |
US20120120000A1 (en) | 2010-11-12 | 2012-05-17 | Research In Motion Limited | Method of interacting with a portable electronic device |
WO2012078654A1 (en) * | 2010-12-07 | 2012-06-14 | Google Inc. | Editing based on force-based physical cues |
KR101457116B1 (ko) * | 2011-11-07 | 2014-11-04 | 삼성전자주식회사 | 음성 인식 및 모션 인식을 이용한 전자 장치 및 그의 제어 방법 |
US9841893B2 (en) * | 2012-03-30 | 2017-12-12 | Nokia Technologies Oy | Detection of a jolt during character entry |
CN102681667B (zh) * | 2012-04-24 | 2015-11-25 | 华为终端有限公司 | 文字输入的回退方法及终端 |
CN102681695B (zh) * | 2012-04-25 | 2016-12-07 | 北京三星通信技术研究有限公司 | 光标控制方法及装置 |
JP2015114700A (ja) | 2013-12-09 | 2015-06-22 | 株式会社Nttドコモ | オブジェクト表示装置 |
US9405377B2 (en) * | 2014-03-15 | 2016-08-02 | Microsoft Technology Licensing, Llc | Trainable sensor-based gesture recognition |
CN105183325B (zh) * | 2015-10-27 | 2019-06-18 | 上海斐讯数据通信技术有限公司 | 一种自定义输入法及系统 |
JP6663131B2 (ja) | 2016-01-28 | 2020-03-11 | カシオ計算機株式会社 | 表示装置、表示制御方法及びプログラム |
CN106850956A (zh) * | 2016-12-21 | 2017-06-13 | 珠海市魅族科技有限公司 | 信息删除方法和装置 |
CN107122106A (zh) * | 2017-04-25 | 2017-09-01 | 北京洋浦伟业科技发展有限公司 | 一种使用记录的删除方法及装置 |
- 2018
- 2018-03-29 GB GB1805278.7A patent/GB2572437A/en not_active Withdrawn
- 2019
- 2019-03-29 SG SG11202009633PA patent/SG11202009633PA/en unknown
- 2019-03-29 WO PCT/GB2019/050931 patent/WO2019186201A1/en active Application Filing
- 2019-03-29 JP JP2020552779A patent/JP7379364B2/ja active Active
- 2019-03-29 EP EP19717550.8A patent/EP3776150A1/en not_active Withdrawn
- 2019-03-29 CN CN201980029053.7A patent/CN112041792A/zh active Pending
- 2019-03-29 KR KR1020207031286A patent/KR20210002509A/ko unknown
Also Published As
Publication number | Publication date |
---|---|
JP7379364B2 (ja) | 2023-11-14 |
WO2019186201A1 (en) | 2019-10-03 |
GB2572437A (en) | 2019-10-02 |
CN112041792A (zh) | 2020-12-04 |
KR20210002509A (ko) | 2021-01-08 |
GB201805278D0 (en) | 2018-05-16 |
SG11202009633PA (en) | 2020-10-29 |
JP2021519975A (ja) | 2021-08-12 |
Legal Events
Code | Title | Description |
---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed | Effective date: 20201015 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent | Extension state: BA ME |
DAV | Request for validation of the european patent (deleted) | |
DAX | Request for extension of the european patent (deleted) | |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn | Effective date: 20210518 |