EP2548103B9 - Device for navigating a projected user interface - Google Patents

Device for navigating a projected user interface

Info

Publication number
EP2548103B9
Authority
EP
European Patent Office
Prior art keywords
content
cursor
user device
invisible
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP11712673.0A
Other languages
German (de)
English (en)
Other versions
EP2548103B1 (fr)
EP2548103A1 (fr)
Inventor
Pär-Anders ARONSSON
Erik Backlund
Andreas Kristensson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Mobile Communications AB
Original Assignee
Sony Mobile Communications AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Mobile Communications AB filed Critical Sony Mobile Communications AB
Publication of EP2548103A1
Publication of EP2548103B1
Application granted
Publication of EP2548103B9
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen

Definitions

  • the size of the display of the user device is relatively small in comparison to the size of a computer monitor or a television.
  • the size of the user interface displayed to the user is correspondingly limited.
  • WO 01/03106 A1 discloses a method for remotely controlling a computer that has an associated screen for displaying output from the computer and an internal cursor generated by the computer; the method includes detecting at least one property of an external cursor.
  • the invention includes a computer connected to a projector, which projects an image of the computer output onto an external screen.
  • a camera is used to capture the image of the projected computer output.
  • An optical pointer, such as a laser pointer, is used to generate and transmit an external cursor having various properties, such as color, shape, or intensity.
  • the image captured by the camera is used to detect and process at least one property of the external cursor to generate a corresponding command or commands to control the computer. Commands may be used to emulate control of the computer typically provided by a pointing device such as a mouse or track ball.
  • US 2008/180395 A1 discloses a computer pointing input device, which allows a user to determine the position of a cursor on a computer display.
  • the position of the input device in relation to the display controls the position of the cursor, so that when a user points directly at the display, the cursor appears at the intersection of the display and the line of sight from the input device.
  • the cursor appears to move on the display in exact relation to the input device.
  • a cursor command unit allows the user to virtually operate the input device wherein changes in the position of the device allow the user to spatially invoke mouse functions.
  • the computer pointing input device is designed to operate with a computer having a processor through a computer communication device.
  • the present invention provides a method according to claim 1 and an apparatus according to claim 8. Further aspects of the invention are outlined in the dependent claims. Embodiments which do not fall within the scope of the claims do not form part of the present invention.
  • a method may include projecting a content associated with a user device; receiving, by the user device, a projected content including an invisible cursor content; determining a position of the invisible cursor content with respect to the projected content; outputting, by the user device, a visible cursor having a position in correspondence to the invisible cursor; receiving, by the user device, an input from a pointer device; mapping, by the user device, a position of a visible cursor content to the content; and performing, by the user device, an input operation that interacts with the content and corresponds to the input from the pointer device and the position of the visible cursor content.
  • the method may include receiving, by the pointer device, the input; and transmitting, by the pointer device, the input to the user device, wherein the pointer device corresponds to an infrared laser pointer device.
  • the determining may include detecting an illumination level associated with the invisible cursor content; and comparing the illumination level with other illumination levels associated with coordinates of the projected content.
  • the method may include performing a calibration to calculate a translation between a coordinate space associated with the projected content and a coordinate space associated with the content.
  • mapping may include mapping one or more coordinates of the position of the invisible cursor content to a corresponding one or more coordinates of the content.
  • the input operation may correspond to one of scrolling, selecting, highlighting, dragging-and-dropping, a multi-touch operation, or navigating within a menu.
  • the content may correspond to one of a user interface of the user device or content accessible by the user device.
  • the method may include outputting, by the user device, a calibration image.
  • the determining may include recognizing a user's gesture based on a positional path associated with the invisible cursor content.
  • a user device may comprise components configured to receive a projected content including an invisible cursor content, wherein the projected content corresponds to a content associated with the user device; determine a position of the invisible cursor content with respect to the projected content; output a visible cursor having a position in correspondence to the invisible cursor; receive a user input from a pointer device; map a position of a visible cursor content to the content; and perform an input operation that interacts with the content, wherein the input operation corresponds to the user input and the position of the visible cursor content.
  • the user device may comprise a radio telephone and a display capable of displaying the content.
  • the user device may further comprise at least one of a projector or a camera capable of detecting infrared light and visible light, and wherein the visible cursor is partially transparent.
  • the components may be further configured to detect at least one of an illumination level associated with the invisible cursor content, a frequency associated with the invisible cursor content, or a positioning path associated with the invisible cursor content.
  • components may be further configured to remove noise based on a Kalman filter or a low-pass filter to stabilize the position of the visible cursor.
  • the components may be further configured to receive a projected content corresponding to a calibration image; and map a coordinate space between the projected content corresponding to the calibration image and a content associated with the user device.
  • the input operation may correspond to one of scrolling, selecting, highlighting, dragging-and-dropping, a multi-touch operation, or navigating within a menu.
  • the content may correspond to one of a user interface of the user device or content accessible by the user device.
  • a computer-readable medium may include instructions executable by at least one processing system.
  • the computer-readable medium may store instructions to receive a projected content including an infrared laser cursor content, wherein the projected content corresponds to a user interface associated with the user device; determine a position of the infrared laser cursor content with respect to the projected content; project a visible cursor having a position in correspondence to the infrared laser cursor content; receive a user input from a pointer device; map the position of the infrared laser cursor content to the user interface; and perform an input operation that interacts with the user interface, wherein the input operation corresponds to the user input and the position of the infrared laser cursor content.
  • the computer-readable medium may further store one or more instructions to detect at least one of an illumination level associated with the infrared laser cursor content or a frequency associated with the infrared laser cursor content, in combination with a size of the infrared laser cursor content; and filter a spectrum associated with the received projected content based on the frequency.
  • the user device in which the computer-readable medium resides may comprise a radio telephone.
  • the input operation may correspond to one of scrolling, selecting, highlighting, dragging-and-dropping, a multi-touch operation, or navigating within a menu.
  • the term "user interface" is intended to be broadly interpreted to include a user interface of a user device, a user interface of another device communicatively coupled to the user device, or content accessible by the user device (e.g., Web content, etc.).
  • a user may interact with and navigate through a user interface using a pointer device, a projected user interface, and a camera.
  • the user interface of the user device may be projected by a projector on a surface (e.g., a screen, a wall, etc.).
  • the user may interact with the projected user interface using the pointer device.
  • the pointer device emits an invisible laser beam.
  • the pointer device may correspond to an infrared (IR) pointer device or an ultraviolet pointer device.
  • the projected user interface and an invisible cursor created by the invisible laser beam may be captured by the camera.
  • the camera may provide the user interface image to the user device.
  • the user device may detect the position of the invisible cursor and generate a visible cursor that may be output to the projector and projected by the projector.
  • the user device may generate the visible cursor having particular characteristics (e.g., shape, design, size, and/or transparency). These characteristics may be configured by the user on the user device.
  • the visible cursor may have a particular transparency so that the visible cursor does not obscure an underlying portion of the user interface.
  • the visible cursor may be partially transparent or translucent. This is in contrast to a red pointer device or some other type of pointer device that emits visible light and provides a visible cursor that obscures an underlying portion of the user interface, or whose intensity, speckle effect, etc., makes it difficult for the user to focus on the item to which the user is pointing.
  • the visible cursor may take the form of various shapes or designs, such as, cross-hairs, a pointer, a finger, etc.
  • the user device may generate the visible cursor to have a particular size. For example, the size of the cursor may be reduced to provide the user with greater precision when selecting an object, etc., from the user interface. Additionally, or alternatively, the size of the cursor may be dynamically reduced depending on how long the visible cursor resides over a particular area of the user interface. For example, the visible cursor may be reduced dynamically, in an incremental fashion, after a period of time has transpired. This is in contrast to pointer devices that emit visible light and provide a variable cursor size depending on the user's proximity to the projected image.
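As a rough illustration of the dwell-based resizing described above, the following Python sketch shrinks the visible cursor in increments once it has rested near the same spot for a while and restores it when the cursor moves away. The class name, sizes, and timing thresholds are illustrative assumptions, not values from the patent.

```python
import time


class CursorSizer:
    """Hypothetical helper: shrink the visible cursor while it dwells in one area."""

    def __init__(self, full_size=32, min_size=8, step=4, dwell_seconds=0.5, radius=10):
        self.full_size = full_size
        self.min_size = min_size
        self.step = step
        self.dwell_seconds = dwell_seconds
        self.radius = radius
        self.size = full_size
        self._anchor = None          # (x, y) where the current dwell started
        self._dwell_start = None

    def update(self, x, y, now=None):
        now = time.monotonic() if now is None else now
        moved_away = (self._anchor is None
                      or abs(x - self._anchor[0]) > self.radius
                      or abs(y - self._anchor[1]) > self.radius)
        if moved_away:
            # Cursor moved to a new area: restore full size and restart the dwell timer.
            self._anchor, self._dwell_start = (x, y), now
            self.size = self.full_size
        elif now - self._dwell_start >= self.dwell_seconds:
            # Cursor has dwelled long enough: shrink one increment and rearm the timer.
            self.size = max(self.min_size, self.size - self.step)
            self._dwell_start = now
        return self.size
```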
  • the visible cursor may be used by the user to interact with the user interface.
  • the pointer device may comprise various input mechanisms (e.g., buttons, a scroll wheel, a switch, a touchpad, etc.) that may be communicated to the user device to allow the user to navigate and/or interact with the user interface associated with the user device.
  • the pointer device may correspond to an Infrared (IR) pointer device.
  • the pointer device may emit some other spectrum of light (e.g., ultraviolet), which is invisible to the human eye.
  • the laser beam and the cursor may be invisible to the user (and others).
  • In contrast, a red pointer device or some other pointer device (e.g., green, blue, yellow, etc.) that emits visible light produces a visible cursor that may be distracting to the user (and others) (e.g., because of the intensity and/or speckle effect) and may obscure an underlying portion of the projected image.
  • the pointer device may comprise one or multiple input mechanisms (e.g., buttons, a scroll wheel, a switch, a touchpad, etc.).
  • the pointer device may comprise one or multiple buttons to allow the user to interact with and navigate through the user interface.
  • the pointer device may comprise pressure detector(s). The pressure detector(s) may operate in conjunction with the one or multiple buttons to allow for different user interaction and user navigation functionality.
  • the pointer device may comprise an anti-shake component (e.g., a gyroscope, an accelerometer, etc.).
  • the pointer device may emit a laser beam at varying levels of luminosity in correspondence to a coding method with which the user may interact and navigate.
  • the projector may correspond to a conventional projector.
  • the projector may project the user interface of the user device onto a surface, such as, for example, a wall, a screen, or the like.
  • the camera may capture the projected image of the user interface.
  • the camera may capture the cursor image associated with the pointer device and the cursor image generated by the user device.
  • the user device may correspond to a variety of mobile, portable, handheld, or stationary devices. According to some of the exemplary embodiments, the user device may comprise the camera and/or the projector. According to other exemplary embodiments, the user device may not comprise the camera and/or the projector. That is, the camera and/or the projector may be devices internal to the user device or external to the user device.
  • the user device responds to the user's interaction with the pointer device and the cursor (e.g., the visible cursor) on the projected user interface.
  • the camera may provide the captured images to the user device, which in turn, may interpret the captured images and appropriately respond to the user's interaction.
  • Fig. 1A is a diagram of an exemplary environment 100 in which one or more exemplary embodiments described herein may be implemented.
  • environment 100 may include a user device 105, a projector 110, a screen 115, a pointer device 120, a camera 125, a projected user interface 130, an invisible cursor 135, a visible cursor 140, and a user 145.
  • Environment 100 may include wired and/or wireless connections among the devices illustrated.
  • environment 100 may include more devices, fewer devices, different devices, and/or differently arranged devices than those illustrated in Fig. 1A .
  • user device 105 may comprise projector 110 and/or camera 125. Additionally, or alternatively, in other exemplary embodiments, environment 100 may not include screen 115.
  • projected user interface 130 may be projected on a wall or some surface (e.g., a flat surface).
  • some functions described as being performed by a particular device may be performed by a different device or a combination of devices.
  • pointer device 120 and/or camera 125 may perform one or more functions described as being performed by user device 105.
  • User device 105 may correspond to a portable device, a mobile device, a handheld device, or a stationary device.
  • user device 105 may comprise a telephone (e.g., a smart phone, a cellular phone, an Internet Protocol (IP) telephone, etc.), a PDA device, a data organizer device, a Web-access device, a computer (e.g., a tablet computer, a laptop computer, a palmtop computer, a desktop computer), and/or some other type of user device.
  • user device 105 may determine the position of invisible cursor 135 with respect to projected user interface 130.
  • User device 105 may generate visible cursor 140.
  • User device 105 may comprise a mapping function, which will be described further below, so that visible cursor 140 is projected onto screen 115 in the determined position of invisible cursor 135.
  • User 145 may navigate and interact with the user interface associated with user device 105, and user device 105 may correspondingly respond.
  • Projector 110 may comprise a device having the capability to project images.
  • projector 110 may include a micro-electromechanical systems (MEMS) projection system (e.g., a digital micro-mirror device (DMD) component, a digital light processing (DLP) component, or a grating light valve (GLV) component).
  • projector 110 may include, for example, a liquid crystal display (LCD) projection system, a liquid crystal on silicon (LCOS) projection system, or some other type of projection system.
  • Projector 110 may include a transmissive projector or a reflective projector.
  • Projector 110 may provide for various user settings, such as color, tint, resolution, etc. Projector 110 may also permit user 145 to identify other parameters that may affect the quality of the projected content. For example, user 145 may indicate the color of the surface, the type of surface (e.g., a screen, a wall, etc.) on which content will be projected; a type of light in the environment, the level of light in the environment, etc.
  • Screen 115 may comprise a surface designed to display projected images. Screen 115 may be designed for front or back projection.
  • Pointer device 120 may comprise a device having the capability to emit light. According to an exemplary implementation, pointer device 120 may emit light invisible to user 145 (e.g., IR light or ultraviolet light). According to an exemplary implementation, pointer device 120 may comprise a laser pointer device. According to other exemplary implementations, pointer device 120 may comprise a different type of device that emits invisible light (e.g., a flash light that includes an adjustable focus). Pointer device 120 may comprise one or more input mechanisms (e.g., buttons, a scroll wheel, a switch, a touchpad, etc.). According to an exemplary implementation, the button(s) may comprise pressure-sensitive detectors to detect the pressure associated with user's 145 pressing of the button(s). As described further below, the input mechanism(s) may allow user 145 to emulate mouse-like inputs, single touch inputs, and/or multi-touch inputs. Pointer device 120 may comprise a communication interface to allow pointer device 120 to communicate with user device 105.
  • pointer device 120 may include an anti-shake component.
  • the anti-shake component may include one or more gyroscopes, accelerometers, and/or another component to detect and compensate for unintentional shaking and/or angular movements caused by user 145.
  • pointer device 120 may include a gesture detector.
  • the gesture detector may comprise one or more gyroscopes, accelerometers, and/or some other type of in-air gesture technology.
  • pointer device 120 may not include the gesture detector and/or the anti-shake component.
  • Camera 125 may comprise a device having the capability to capture images.
  • the images may include visible light and invisible light (e.g., IR light or ultraviolet light).
  • camera 125 may capture projected user interface 130, invisible cursor 135, and a visible cursor (not illustrated).
  • Camera 125 may provide for various user settings (e.g., lighting conditions, resolutions, etc.).
  • Projected user interface 130 may correspond to a user interface associated with user device 105.
  • Invisible cursor 135 may correspond to invisible light emitted from pointer device 120 that impinges on screen 115.
  • Visible cursor 140 may correspond to a cursor generated by user device 105, which is projected by projector 110 onto screen 115.
  • Figs. 1B - 1F are diagrams illustrating exemplary operations that may be performed according to an exemplary embodiment of user navigation using a projected user interface and a pointer device.
  • user device 105 may perform a calibration process. The calibration process may assist user device 105 to generate a mapping of coordinate spaces between projected user interface 130 and the user interface of user device 105.
  • user device 105 may output a user interface 150 to projector 110.
  • user interface 150 may correspond to a default calibration image (e.g., a rectangular image or some other type of image).
  • user interface 150 may correspond to a user interface produced by user device 105 during normal operation (e.g., a desktop image, etc.).
  • Projector 110 may project a user interface 152 to screen 115.
  • Camera 125 may capture user interface 152 from screen 115 and provide a user interface 154 to user device 105.
  • User device 105 may calculate a mapping 156 of coordinate spaces between user interface 150 and user interface 154.
  • user device 105 may provide a user interface 128 to projector 110.
  • Projector 110 may project user interface 130 onto screen 115.
  • User 145 may point pointer device 120 toward projected user interface 130 causing invisible cursor 135 to be present on projected user interface 130.
  • camera 125 may capture projected user interface 130 that includes invisible cursor 135 and provide a user interface 132 to user device 105.
  • User device 105 may generate and map 134 the visible cursor 140.
  • User device 105 may send user interface 128 and visible cursor 140 to projector 110.
  • Projector 110 may project visible cursor 140 and user interface 128 onto screen 115.
  • User device 105 may continuously perform this process so that visible cursor 140 is projected in a position that corresponds to a position of invisible cursor 135. In this way, visible cursor 140 may track or have a position that corresponds to or substantially corresponds to the position of invisible cursor 135.
  • user 145 may make particular input selections 170 into pointer device 120 (e.g., user 145 may press some buttons, etc.) while visible cursor 140 is positioned on a particular spot with respect to projected user interface 130.
  • Input selections 170 may be sent to user device 105.
  • Camera 125 may capture projected user interface 130, invisible cursor 135, and visible cursor 140. Camera 125 may provide this image to user device 105.
  • User device 105 may map 175 the position of visible cursor 140 to a user interface space and interpret input selections 170.
  • user device 105 may cause a new projected user interface 180 to be projected on screen 115, as illustrated in Fig. 1F .
  • Fig. 2 is a diagram of an exemplary user device 105 in which exemplary embodiments described herein may be implemented.
  • user device 105 may comprise a housing 205, a microphone 210, speakers 215, keys 220, and a display 225.
  • user device 105 may comprise fewer components, additional components, different components, and/or a different arrangement of components than those illustrated in Fig. 2 and described herein.
  • user device 105 may not comprise microphone 210, speakers 215, and/or keys 220.
  • user device 105 may comprise a camera component and/or a projector component.
  • while user device 105 is depicted as having a landscape configuration, in other embodiments user device 105 may have a portrait configuration or some other type of configuration.
  • Housing 205 may comprise a structure to contain components of user device 105.
  • housing 205 may be formed from plastic, metal, or some other type of material.
  • Housing 205 may support microphone 210, speakers 215, keys 220, and display 225.
  • Microphone 210 may transduce a sound wave to a corresponding electrical signal. For example, a user may speak into microphone 210 during a telephone call or to execute a voice command. Speakers 215 may transduce an electrical signal to a corresponding sound wave. For example, a user may listen to music or listen to a calling party through speakers 215.
  • Keys 220 may provide input to user device 105.
  • Keys 220 may comprise a standard telephone keypad, a QWERTY keypad, and/or some other type of keypad (e.g., a calculator keypad, etc.).
  • keys 220 may comprise pushbuttons.
  • Keys 220 may comprise special purpose keys to provide a particular function (e.g., send, call, e-mail, etc.) and/or permit a user to select, navigate, etc., objects or icons displayed on display 225.
  • Display 225 may operate as an output component. Additionally, in some implementations, display 225 may operate as an input component. For example, display 225 may comprise a touch-sensitive screen. In such instances, display 225 may correspond to a single-point input device (e.g., capable of sensing a single touch) or a multipoint input device (e.g., capable of sensing multiple touches that occur at the same time). Further, display 225 may implement a variety of sensing technologies, including but not limited to, capacitive sensing, surface acoustic wave sensing, resistive sensing, optical sensing, pressure sensing, infrared sensing, or gesture sensing. Display 225 may also comprise an auto-rotating function.
  • Display 225 may comprise a liquid crystal display (LCD), a plasma display panel (PDP), a field emission display (FED), a thin film transistor (TFT) display, or some other type of display technology.
  • Display 225 may be capable of displaying text, pictures, and/or video.
  • Display 225 may also be capable of displaying various images (e.g., icons, a keypad, etc.) that may be selected by a user to access various applications and/or enter data.
  • Display 225 may operate as a viewfinder when user device 105 comprises a camera or a video capturing component.
  • Fig. 3 is a diagram illustrating exemplary components of user device 105.
  • user device 105 may comprise a processing system 305, a memory/storage 310 that may comprise applications 315, a communication interface 320, an input 325, and an output 330.
  • user device 105 may comprise fewer components, additional components, different components, or a different arrangement of components than those illustrated in Fig. 3 and described herein.
  • Processing system 305 may comprise one or multiple processors, microprocessors, data processors, co-processors, application specific integrated circuits (ASICs), controllers, programmable logic devices, chipsets, field programmable gate arrays (FPGAs), application specific instruction-set processors (ASIPs), system-on-chips (SOCs), and/or some other component that may interpret and/or execute instructions and/or data.
  • Processing system 305 may control the overall operation or a portion of operation(s) performable by user device 105.
  • Processing system 305 may perform one or more operations based on an operating system and/or various applications (e.g., applications 315).
  • Processing system 305 may access instructions from memory/storage 310, from other components of user device 105, and/or from a source external to user device 105 (e.g., a network or another device). Processing system 305 may provide for different operational modes associated with user device 105.
  • Memory/storage 310 may comprise one or multiple memories and/or one or more secondary storages.
  • memory/storage 310 may comprise a random access memory (RAM), a dynamic random access memory (DRAM), a read only memory (ROM), a programmable read only memory (PROM), a flash memory, and/or some other type of memory.
  • Memory/storage 310 may comprise a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) or some other type of computer-readable medium, along with a corresponding drive.
  • Memory/storage 310 may comprise a memory, a storage device, or storage component that is external to and/or removable from user device 105, such as, for example, a Universal Serial Bus (USB) memory stick, a hard disk, mass storage, off-line storage, etc.
  • The term "computer-readable medium" is intended to be broadly interpreted to comprise, for example, a memory, a secondary storage, a compact disc (CD), a digital versatile disc (DVD), or the like.
  • the computer-readable medium may be implemented in a single device, in multiple devices, in a centralized manner, or in a distributed manner.
  • Memory/storage 310 may store data, applications 315, and/or instructions related to the operation of user device 105.
  • Applications 315 may comprise software that provides various services or functions.
  • applications 315 may comprise an e-mail application, a telephone application, a voice recognition application, a video application, a multi-media application, a music player application, a visual voicemail application, a contacts application, a data organizer application, a calendar application, an instant messaging application, a texting application, a web browsing application, a location-based application (e.g., a GPS-based application), a blogging application, and/or other types of applications (e.g., a word processing application, a spreadsheet application, etc.).
  • Applications 315 may comprise one or more applications related to the mapping function, cursor generation, cursor detection, calibration, and/or cursor stabilization.
  • Communication interface 320 may permit user device 105 to communicate with other devices, networks, and/or systems.
  • communication interface 320 may comprise one or multiple wireless and/or wired communication interfaces.
  • communication interface 320 may comprise an Ethernet interface, a radio interface, a microwave interface, a Universal Serial Bus (USB) interface, or some other type of wireless communication interface and/or wired communication interface.
  • Communication interface 320 may comprise a transmitter, a receiver, and/or a transceiver. Communication interface 320 may operate according to various protocols, standards, or the like.
  • Input 325 may permit an input into user device 105.
  • input 325 may comprise microphone 210, keys 220, display 225, a touchpad, a button, a switch, an input port, voice recognition logic, fingerprint recognition logic, a web cam, and/or some other type of visual, auditory, tactile, etc., input component.
  • Output 330 may permit user device 105 to provide an output.
  • output 330 may comprise speakers 215, display 225, one or more light emitting diodes (LEDs), an output port, a vibratory mechanism, and/or some other type of visual, auditory, tactile, etc., output component.
  • User device 105 may perform operations in response to processing system 305 executing software instructions contained in a computer-readable medium, such as memory/storage 310.
  • the software instructions may be read into memory/storage 310 from another computer-readable medium or from another device via communication interface 320.
  • the software instructions stored in memory/storage 310 may cause processing system 305 to perform various processes described herein.
  • user device 105 may perform operations based on hardware, hardware and firmware, and/or hardware, software and firmware.
  • Fig. 4 is a diagram illustrating exemplary functional components of user device 105.
  • user device 105 may include a calibrator 405, a cursor detector 410, a cursor stabilizer 415, a mapper 420, an input manager 425, and a cursor generator 430.
  • Calibrator 405, cursor detector 410, cursor stabilizer 415, mapper 420, input manager 425, and/or cursor generator 430 may be implemented as a combination of hardware (e.g., processing system 305, etc.) and software (e.g., applications 315, etc.) based on the components illustrated and described with respect to Fig. 3 .
  • calibrator 405, cursor detector 410, cursor stabilizer 415, mapper 420, input manager 425, and/or cursor generator 430 may be implemented as hardware based on the components illustrated and described with respect to Fig. 3 .
  • calibrator 405, cursor detector 410, cursor stabilizer 415, mapper 420, input manager 425, and/or cursor generator 430 may be implemented as hardware or hardware and software in combination with firmware.
  • Calibrator 405 may interpret data received from camera 125 to translate points associated with a projected image to points associated with a user interface image of user device 105. As previously described, according to an exemplary implementation, calibrator 405 may utilize a default calibration image. According to other exemplary implementations, calibrator 405 may utilize a user interface associated with normal operation of user device 105 (e.g., a desktop image, etc.).
  • calibrator 405 may detect points associated with the projected image. For example, when the projected image has a shape of a four-sided figure (e.g., a square or a rectangle), calibrator 405 may detect corner points of the projected image received from camera 125. Additionally, or alternatively, the projected image may include distinctive focal points (e.g., colored spots on a white background, bright or illuminated spots, etc.) that calibrator 405 may detect. Calibrator 405 may utilize the detected points to calculate a translation between the coordinate space associated with the projected image and a coordinate space associated with the user interface of user device 105.
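A minimal sketch, assuming a four-corner calibration image and standard homography estimation with NumPy, of the kind of coordinate translation calibrator 405 could compute from the detected corner points and mapper 420 could then apply. The function names and the homography formulation are illustrative assumptions, not taken from the patent.

```python
import numpy as np


def compute_homography(projected_corners, ui_corners):
    """Solve a 3x3 homography H mapping projected-image points to UI points.

    projected_corners, ui_corners: four (x, y) pairs in corresponding order,
    e.g., the detected corners of the projected calibration image and the
    corners of the device's own user-interface coordinate space.
    """
    rows = []
    for (x, y), (u, v) in zip(projected_corners, ui_corners):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    # The homography is the null-space vector of A (last right-singular vector).
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)


def map_point(H, x, y):
    """Translate one projected-image coordinate into UI coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```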
  • Cursor detector 410 may detect invisible cursor 135 (e.g., an invisible spot of light from pointer device 120). According to an exemplary implementation, cursor detector 410 may detect invisible cursor 135 based on its illumination level or its light intensity. That is, the illumination of invisible cursor 135 may be greater than the illumination associated with the projected image. Cursor detector 410 may compare the illumination of invisible cursor 135 with other illumination levels associated with coordinates of the projected image to determine the position of invisible cursor 135. Additionally, or alternatively, cursor detector 410 may detect invisible cursor 135 based on a frequency associated with the invisible light. For example, cursor detector 410 may use information about the frequency of the invisible light to filter out the spectrum of the projected image and detect invisible cursor 135.
  • the size of invisible cursor 135 may also be used to detect invisible cursor 135 in combination with illumination level and/or frequency. Additionally, or alternatively, cursor detector 410 may detect invisible cursor 135 based on its movement. That is, the projected image may be relatively motionless or static whereas invisible cursor 135 may move due to the user's hand shaking, etc. Furthermore, in instances when the projected image includes a portion having an illumination level substantially the same as invisible cursor 135, cursor detector 410 may detect the position of invisible cursor 135 based on the movement of invisible cursor 135.
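One plausible realization of the illumination-based detection described above, sketched in Python with NumPy under the assumption that the camera delivers a grayscale, IR-sensitive frame; the function name and the margin threshold are assumptions, not values from the patent.

```python
import numpy as np


def detect_invisible_cursor(ir_frame, min_margin=30):
    """Return the (x, y) of the brightest spot, or None if nothing stands out.

    ir_frame: 2-D NumPy array of per-pixel illumination from the IR-sensitive
    camera. The cursor is assumed to be noticeably brighter than the rest of
    the projected image; min_margin is the required illumination margin over
    the background level.
    """
    flat_index = int(np.argmax(ir_frame))
    y, x = np.unravel_index(flat_index, ir_frame.shape)
    background = float(np.median(ir_frame))
    if ir_frame[y, x] - background < min_margin:
        return None  # No spot sufficiently brighter than the projected image.
    return x, y
```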
  • Cursor stabilizer 415 may stabilize the position of visible cursor 140 once the position of invisible cursor 135 is detected.
  • the position of invisible cursor 135 may be unstable due to the unsteadiness of the user's hand.
  • cursor stabilizer 415 may comprise filters (e.g., Kalman filters or low-pass filters) to remove noise and provide an estimation of an intended position of invisible cursor 135 and to stabilize such a position.
  • a low-pass filter may operate over a range of approximately 0 Hz to 5 Hz.
  • Visible cursor 140 may correspondingly maintain a stable position.
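As a simple stand-in for the Kalman or low-pass filtering mentioned above, the sketch below applies a first-order low-pass (exponential smoothing) filter to the detected cursor coordinates; the class name and the smoothing factor are illustrative assumptions.

```python
class CursorLowPass:
    """First-order low-pass filter for cursor coordinates.

    A minimal substitute for the Kalman/low-pass filtering described above;
    alpha controls the smoothing (smaller alpha = stronger smoothing).
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self._state = None

    def filter(self, x, y):
        if self._state is None:
            self._state = (float(x), float(y))
        else:
            sx, sy = self._state
            self._state = (sx + self.alpha * (x - sx),
                           sy + self.alpha * (y - sy))
        return self._state
```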
  • Mapper 420 may map the position of invisible cursor 135 to the coordinate space associated with the user interface of user device 105. Mapper 420 may map the position of invisible cursor 135 based on translation information provided by calibrator 405.
  • Input manager 425 may interpret inputs received from pointer device 120.
  • pointer device 120 may comprise input mechanisms (buttons, a scroll wheel, a switch, a touchpad, etc.).
  • a user may perform various operations using these input mechanisms.
  • these operations may include selecting, scrolling, highlighting, dragging-and-dropping, navigating in a menu (e.g., within a pull-down menu), navigating within other portions of the user interface, and/or other operations associated with the user interface.
  • the user may interact with a projected user interface and have available various operations corresponding to those provided by a mouse, single or multi-touch user interactions, or other types of input devices.
  • input manager 425 may interpret a user's gestures with pointer device 120.
  • Cursor generator 430 may generate visible cursor 140. Cursor generator 430 may allow the user to configure various characteristics (e.g., shape, design, size, and/or transparency) associated with visible cursor 140.
  • user device 105 may include fewer functional components, additional functional components, different functional components, and/or a different arrangement of functional components than those illustrated in Fig. 4 and described.
  • user device 105 may include a prediction component to predict the position of invisible cursor 135.
  • the prediction component may use previous invisible cursor 135 positions as a basis for identifying a user's navigational patterns. For example, a user may frequently access a particular application(s) or area of the user interface.
  • the prediction component may minimize a lag time associated with detecting the position of invisible cursor 135 and paralleling its position with visible cursor 140.
  • There may be a latency (e.g., a latency t) between when invisible cursor 135 is detected and when visible cursor 140 is presented to the user.
  • User device 105 may use information about the speed and acceleration of invisible cursor 135 at current and previously detected points to extrapolate where visible cursor 140 should be a latency t after invisible cursor 135 is detected (i.e., at the time the cursor is presented to the user).
  • user device 105 may thereby provide a better estimate of the position of visible cursor 140 at the time it is presented to the user. From the perspective of the user, the latency may be perceived as minimal. Understandably, the longer the latency t, the greater the importance of the prediction. However, at frequencies of, for example, 50 Hz to 60 Hz or above, the effect of the latency should become less and less perceivable by the user.
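A rough sketch of this extrapolation: velocity and acceleration are estimated by finite differences from the three most recent detections and the position is projected forward by the latency t. The function name and the constant-acceleration assumption are illustrative, not the patent's method.

```python
def predict_position(samples, dt, latency):
    """Extrapolate where the cursor will be `latency` seconds after the last sample.

    samples: list of at least three recent (x, y) detections, oldest first.
    dt: time between detections (e.g., 1/50 s at a 50 Hz camera).
    latency: delay between detection and the cursor being shown to the user.
    """
    (x0, y0), (x1, y1), (x2, y2) = samples[-3:]
    vx, vy = (x2 - x1) / dt, (y2 - y1) / dt                          # current velocity
    ax, ay = (x2 - 2 * x1 + x0) / dt ** 2, (y2 - 2 * y1 + y0) / dt ** 2  # acceleration
    t = latency
    return (x2 + vx * t + 0.5 * ax * t ** 2,
            y2 + vy * t + 0.5 * ay * t ** 2)
```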
  • one or more operations described as being performed by a particular functional component may be performed by one or more other functional components, in addition to or instead of the particular functional component, and/or one or more functional components may be combined.
  • Fig. 5 is a diagram illustrating an exemplary pointer device 120.
  • pointer device 120 may comprise a housing 200, a switch 205, buttons 210 and 215, a scroll wheel 220, and a portal 225.
  • pointer device 120 may comprise fewer components, additional components, different components, and/or a different arrangement of components than those illustrated in Fig. 5 and described herein.
  • pointer device 120 may not comprise scroll wheel 220.
  • Housing 200 may comprise a structure capable of encasing components and supporting components of pointer device 120.
  • housing 200 may be formed from plastic, metal, or some other type of material.
  • Housing 200 may be formed of any shape and/or design.
  • housing 200 is illustrated as having a pen-like shape.
  • Switch 205 may turn on and turn off pointer device 120.
  • a user may rotate switch 205 to on and off positions.
  • Button 210, button 215, and scroll wheel 220 may each comprise a component capable of providing input into pointer device 120.
  • buttons 210 and 215 may permit a user to perform various operations, such as selecting, highlighting, dragging-and-dropping, and/or navigating.
  • Buttons 210 and 215 may also cause user device 105 to operate in different modes.
  • user device 105 may operate in a single touch mode or a multi-touch mode.
  • Scroll wheel 220 may permit a user to scroll.
  • Portal 225 may permit light (e.g., laser light or some other light) to be emitted from pointer device 120.
  • Fig. 6 is a diagram illustrating exemplary components of pointer device 120.
  • pointer device 120 may comprise a laser 605, input 610, and a communication interface 615.
  • pointer device 120 may comprise fewer components, additional components, different components, or a different arrangement of components than those illustrated in Fig. 6 and described herein.
  • pointer device 120 may not correspond to a laser device but to some other type of device (e.g., a flashlight).
  • Laser 605 may comprise laser circuitry to generate a laser beam.
  • laser 605 may comprise a laser diode.
  • the laser diode may correspond to an IR laser diode.
  • the laser diode may correspond to an ultraviolet laser diode.
  • Laser 605 may also comprise a controller and a power source (not illustrated).
  • Input 610 may permit an input into pointer device 120.
  • input 610 may comprise buttons 210 and 215, scroll wheel 220, an input port, and/or some other type of input component.
  • Communication interface 615 may permit pointer device 120 to communicate with other devices.
  • communication interface 615 may comprise a wireless communication interface and/or a wired communication interface.
  • communication interface 615 may comprise a radio interface, a microwave interface, a USB interface, or some other type of wireless communication interface and/or wired communication interface.
  • Communication interface 615 may comprise a transmitter, a receiver, and/or a transceiver.
  • Communication interface 615 may operate according to various protocols, standards, or the like.
  • a user may perform various operations (e.g., selecting, scrolling, highlighting, dragging-and-dropping, navigating, etc.) based on input 610 of pointer device 120. Additionally, the user may cause user device 105 to operate in various modes based on the user's input. For example, the user may perform multi-touch type operations according to particular inputs and/or gestures.
  • Figs. 7A - 7C are diagrams illustrating an exemplary scenario in which multi-touch operations may be performed utilizing pointer device 120 and user device 105.
  • a projected page 705 is displayed.
  • User 145 may press one of buttons 210 or 215 while positioning invisible cursor 135 and visible cursor 140 somewhere on projected page 705.
  • the press may include a particular pressure, duration, number of pressings, etc.
  • pointer device 120 may send an input selection 710 to user device 105.
  • User 145 may press one of buttons 210 or 215 again while positioning invisible cursor 135 and visible cursor 140 somewhere on projected page 705.
  • the press may include a particular pressure, duration, number of pressings, etc.
  • pointer device 120 may send an input selection 715 to user device 105.
  • user device 105 may operate in a multi-touch mode.
  • user 145 may position invisible cursor 135 and visible cursor 140 somewhere on projected page 705 and make a rotation gesture 720 (e.g., counterclockwise), which may cause projected page 705 to correspondingly rotate.
  • cursor detector 410 of user device 105 may recognize the gesture associated with invisible cursor 135 (and visible cursor 140) (e.g., an arcing positional path of invisible cursor 135 (and visible cursor 140)) and provide this gesture information to input manager 425 so as to cause projected page 705 to correspondingly rotate.
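A sketch of how an arcing positional path could be reduced to a signed rotation amount, assuming the path is a list of (x, y) cursor positions sampled over the gesture; the function name and the centroid-based angle measure are assumptions, not the patent's recognition method.

```python
import math


def rotation_from_path(path):
    """Estimate the signed rotation (radians) swept by an arcing cursor path.

    path: list of (x, y) positions. Each point's angle is measured around the
    path's centroid; consecutive angle differences are unwrapped and summed,
    so a counterclockwise arc yields a positive total.
    """
    cx = sum(p[0] for p in path) / len(path)
    cy = sum(p[1] for p in path) / len(path)
    angles = [math.atan2(y - cy, x - cx) for x, y in path]
    total = 0.0
    for a0, a1 in zip(angles, angles[1:]):
        d = a1 - a0
        # Unwrap jumps across the -pi/pi boundary.
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return total
```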
  • In implementations in which pointer device 120 comprises an accelerometer and/or a gyroscope, pointer device 120 may send an input selection (e.g., gesture information) to user device 105.
  • buttons 210 and/or 215 may allow user 145 to enter a variety of input commands.
  • the pressure applied by user 145 may regulate a level of luminosity (e.g., the greater the pressure, the greater the luminosity of the light) emitted from pointer device 120.
  • the pressure parameter may be coupled with other parameters (e.g., duration of press, number of presses, button sequences, etc.) to allow user 145 to perform multi-touch operations without necessarily requiring user 145 to select multiple portions (e.g., areas) of the projected user interface.
  • Figs. 8A and 8B are flow diagrams illustrating an exemplary process 800 for providing user navigation via a projected user interface and a pointer device.
  • process 800 may be performed by user device 105, projector 110, pointer device 120, and camera 125.
  • projector 110 and/or camera 125 may be internal or external to user device 105.
  • Process 800 is described with the assumption that a calibration process has been successfully completed.
  • Process 800 may include projecting a content associated with a user device on a surface (block 805).
  • projector 110 may project a user interface associated with user device 105 on a surface (e.g., screen 115, a wall, etc.).
  • An invisible cursor may be emitted from a pointer device onto the projected content (block 810).
  • pointer device 120 may be pointed towards the projected content causing invisible cursor 135 (e.g., an invisible spot) to be positioned on the projected content.
  • pointer device 120 may emit an IR laser beam.
  • pointer device 120 may emit some other type of laser beam (e.g., an ultraviolet beam) or light that is invisible to the human eye and does not create a visible cursor.
  • a visible cursor may be generated by the user device and the visible cursor may be projected on the surface (block 815).
  • camera 125 may capture the projected content that includes invisible cursor 135, and provide this image to user device 105.
  • User device 105 (e.g., cursor generator 430) may generate visible cursor 140.
  • user device 105 (e.g., cursor detector 410, cursor stabilizer 415) may determine the position of invisible cursor 135.
  • visible cursor 140 may appear on the projected content in a position that corresponds to or is substantially the same as the position of invisible cursor 135.
  • User device 105 (e.g., mapper 420) may map the position of invisible cursor 135 so that visible cursor 140 is projected in a corresponding position.
  • An input may be received by the pointer device (block 820).
  • pointer device 120 may receive a user input via button 210, button 215, and/or scroll wheel 220.
  • the user input may correspond to selecting, highlighting, a multi-touch type operation, etc.
  • the projected content and the invisible/visible cursor may be captured by a camera (block 825).
  • camera 125 may capture the image of the projected content, invisible cursor 135, and visible cursor 140.
  • the projected content and the cursor may be received by the user device (block 830).
  • camera 125 may provide the image associated with the projected content, invisible cursor 135, and visible cursor 140 to user device 105.
  • the pointer device input may be received by the user device (block 835).
  • user device 105 may receive the user input from pointer device 120.
  • user device 105 (e.g., input manager 425) may interpret the user input from pointer device 120.
  • the user inputs may include selecting, scrolling, etc.
  • the position of the visible cursor with respect to the content may be mapped (block 840).
  • user device 105 may map the position of visible cursor 140 to the content associated with user device 105.
  • mapper 420 of user device 105 may map the position of visible cursor 140 to the coordinate space associated with user device 105 based on translation information provided by calibrator 405 during a calibration process.
  • An input operation that interacts with the content may be performed by the user device (block 845).
  • user device 105 may perform an input operation (e.g., selecting, highlighting, scrolling, a multi-touch operation, etc.) that interacts with the content.
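To tie blocks 805 through 845 together, a possible control loop is sketched below. The camera, projector, pointer, and ui objects are hypothetical wrappers standing in for the devices described above, and the helpers reuse the earlier sketches (detect_invisible_cursor, CursorLowPass, map_point); this illustrates the flow only, not the patented implementation.

```python
def projection_input_loop(camera, projector, pointer, ui, calibration_H):
    """Illustrative main loop for process 800 (hypothetical device wrappers)."""
    smoother = CursorLowPass()                        # from the stabilization sketch
    while True:
        frame = camera.capture_ir_frame()             # blocks 825/830: capture projected content
        spot = detect_invisible_cursor(frame)         # detect the invisible cursor, if any
        if spot is None:
            continue
        x, y = smoother.filter(*spot)                 # stabilize the detected position
        ui_x, ui_y = map_point(calibration_H, x, y)   # block 840: map into UI coordinates
        projector.draw_cursor(ui.render(), (x, y))    # block 815: project the visible cursor
        event = pointer.poll_input()                  # blocks 820/835: pointer-device input
        if event is not None:
            ui.apply_input(event, (ui_x, ui_y))       # block 845: perform the input operation
```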
  • Although Figs. 8A and 8B illustrate an exemplary process 800, in other implementations, process 800 may include additional operations, fewer operations, and/or different operations than those illustrated and described with respect to Figs. 8A and 8B. Additionally, depending on the components of user device 105, pointer device 120, and/or camera 125, process 800 may be modified such that some functions described as being performed by a particular device may be performed by another device. In addition, while a series of blocks has been described with regard to process 800 illustrated in Figs. 8A and 8B, the order of the blocks may be modified in other implementations. Further, non-dependent blocks may be performed in parallel.
  • This logic or component may include hardware, such as a processing system (e.g., one or more processors, one or more microprocessors, one or more ASICs, one or more FPGAs, etc.), a combination of hardware and software (e.g., applications 315), or a combination with firmware, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Claims (15)

  1. A method comprising:
    projecting content associated with a user device (105) onto a surface;
    projecting, by a pointer device (120) using invisible light, an invisible cursor (135) onto the projected content;
    receiving, by the user device (105), an image of the projected content including an image of an invisible cursor content of the invisible cursor (135);
    determining a position of the invisible cursor content with respect to the projected content;
    generating, by the user device (105), the content and a visible cursor (140) having a position corresponding to the invisible cursor (135);
    projecting the content and the visible cursor (140) onto the surface;
    receiving, by the user device (105), an input from the pointer device (120);
    mapping, by the user device (105), the position of the visible cursor (140) to the content; and
    performing, by the user device (105), an input operation that interacts with the content and corresponds to the input from the pointer device (120) and to the position of the visible cursor (140);
    wherein, when determining the position of the invisible cursor content with respect to the projected content, the method further comprises detecting a size associated with the invisible cursor (135) and at least one of an illumination level associated with the invisible cursor content and a frequency associated with the invisible light, and using the size in combination with the illumination level and/or the frequency to detect the invisible cursor (135).
  2. The method of claim 1, further comprising:
    receiving, by the pointer device (120), the input; and
    transmitting, by the pointer device (120), the input to the user device (105), wherein the pointer device (120) corresponds to an infrared laser pointer device.
  3. The method of claim 1, wherein the determining comprises:
    detecting an illumination level associated with the invisible cursor content; and
    comparing the illumination level with other illumination levels associated with coordinates of the projected content.
  4. The method of claim 1, further comprising:
    performing a calibration to calculate a translation between a coordinate space associated with the projected content and a coordinate space associated with the content.
  5. The method of claim 4, wherein the mapping comprises:
    mapping one or more coordinates of the position of the invisible cursor content to one or more corresponding coordinates of the content.
  6. Procédé selon la revendication 1, comprenant en outre :
    la génération, par le dispositif utilisateur (105), d'une image d'étalonnage, dans lequel le contenu correspond à l'un d'une interface utilisateur du dispositif utilisateur (105) ou d'un contenu accessible par le dispositif utilisateur (105).
  7. Procédé selon la revendication 1, dans lequel la détermination comprend :
    la reconnaissance d'un geste de l'utilisateur sur la base d'un chemin de positionnement associé au contenu de curseur invisible, dans lequel l'opération d'entrée correspond à l'un du défilement, de la sélection, de la surbrillance, du glisser-déposer, d'une opération tactile multipoint ou de la navigation à l'intérieur d'un menu.
  8. Dispositif utilisateur (105) comprenant des composants configurés pour :
    recevoir une image d'un contenu projeté comportant une image d'un contenu de curseur invisible du curseur invisible (135), dans lequel le contenu projeté est projeté sur une surface et correspond à un contenu associé au dispositif utilisateur (105), et dans lequel le curseur invisible (135) est projeté par un dispositif de pointage (120) utilisant une lumière invisible ;
    déterminer une position du contenu de curseur invisible par rapport au contenu projeté ;
    générer le contenu et un curseur visible (140) ayant une position en correspondance avec le curseur invisible (135) ;
    recevoir une entrée utilisateur provenant du dispositif de pointage (120) ;
    mettre en correspondance la position du curseur visible (140) avec le contenu ; et
    mettre en oeuvre une opération d'entrée qui interagit avec le contenu, dans lequel l'opération d'entrée correspond à l'entrée utilisateur et à la position de curseur visible (140) ;
    dans lequel, lors de la détermination de la position du contenu de curseur invisible par rapport au contenu projeté, les composants sont en outre configurés pour détecter une taille associée au curseur invisible (135) et au moins l'un d'un niveau d'éclairage associé au contenu de curseur invisible et d'une fréquence associée à la lumière invisible, et les composants sont en outre configurés pour utiliser la taille en combinaison avec le niveau d'éclairage et/ou la fréquence pour détecter le curseur invisible (135).
  9. Dispositif utilisateur selon la revendication 8, comprenant en outre :
    un radiotéléphone ; et
    un écran capable d'afficher le contenu.
  10. Dispositif utilisateur selon la revendication 8, comprenant en outre au moins l'un d'un projecteur ou d'une caméra capable de détecter une lumière infrarouge et une lumière visible, et dans lequel le curseur visible (140) est partiellement transparent.
  11. Dispositif utilisateur selon la revendication 8, dans lequel lors de la détermination, les composants sont en outre configurés pour :
    détecter au moins l'un d'un niveau d'éclairage associé au contenu de curseur invisible, d'une fréquence associée au contenu de curseur invisible ou d'un chemin de positionnement associé au contenu de curseur invisible.
  12. Dispositif utilisateur selon la revendication 8, dans lequel les composants sont en outre configurés pour :
    supprimer le bruit sur la base d'un filtre de Kalman ou d'un filtre passe-bas pour stabiliser la position du curseur visible (140) .
  13. Dispositif utilisateur selon la revendication 8, dans lequel les composants sont en outre configurés pour :
    recevoir un contenu projeté correspondant à une image d'étalonnage ; et
    mettre en correspondance un espace de coordonnées entre le contenu projeté correspondant à l'image d'étalonnage et un contenu associé au dispositif utilisateur (105).
  14. Dispositif utilisateur selon la revendication 8, dans lequel l'opération d'entrée correspond à l'un du défilement, de la sélection, de la surbrillance, du glisser-déposer, d'une opération tactile multipoint ou de la navigation à l'intérieur d'un menu.
  15. Dispositif utilisateur selon la revendication 8, dans lequel le contenu correspond à l'un d'une interface utilisateur du dispositif utilisateur (105) ou d'un contenu accessible par le dispositif utilisateur (105), et les composants sont en outre configurés pour :
    détecter au moins l'un d'un niveau d'éclairage associé au contenu de curseur invisible ou d'une fréquence associée au contenu de curseur invisible, en combinaison avec une taille du contenu de curseur invisible ; et
    filtrer un spectre associé au contenu projeté reçu sur la base de la fréquence.
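
The detection, mapping, and smoothing steps recited in the claims above (illumination-level and size-based detection of the invisible cursor, a calibration that relates the projected-content coordinate space to the device's content coordinate space, and low-pass stabilization of the visible cursor) can be illustrated with a minimal sketch. This is not the patented implementation: the threshold value, spot-size bounds, homography, function names, and the synthetic frame in the usage example are all hypothetical stand-ins.

# Illustrative sketch only -- hypothetical names and values, not the claimed implementation.
import numpy as np

IR_THRESHOLD = 200          # assumed illumination level for the invisible cursor spot
MIN_SPOT_PIXELS = 4         # assumed lower bound on cursor spot size
MAX_SPOT_PIXELS = 400       # assumed upper bound on cursor spot size

def detect_invisible_cursor(ir_frame):
    """Return the (x, y) centroid of a bright, compact spot, or None."""
    mask = ir_frame >= IR_THRESHOLD                 # illumination-level test
    count = int(mask.sum())
    if not (MIN_SPOT_PIXELS <= count <= MAX_SPOT_PIXELS):
        return None                                 # size test rejects noise and glare
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())       # spot centroid in camera space

def map_to_content(point, homography):
    """Map a camera-space point into content coordinates via a calibration homography."""
    x, y = point
    v = homography @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

class LowPassCursor:
    """Simple exponential low-pass filter to stabilize the visible cursor position."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None
    def update(self, point):
        p = np.asarray(point, dtype=float)
        self.state = p if self.state is None else (
            self.alpha * p + (1.0 - self.alpha) * self.state)
        return tuple(self.state)

# Usage with synthetic data standing in for a real infrared frame and an
# identity matrix standing in for a real calibration (both purely hypothetical):
if __name__ == "__main__":
    frame = np.zeros((480, 640), dtype=np.uint8)
    frame[100:103, 200:203] = 255                   # synthetic cursor spot
    H = np.eye(3)                                    # placeholder calibration homography
    smoother = LowPassCursor()
    spot = detect_invisible_cursor(frame)
    if spot is not None:
        print(smoother.update(map_to_content(spot, H)))
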
EP11712673.0A 2010-03-17 2011-02-17 Dispositif permettant de parcourir une interface utilisateur projetée Active EP2548103B9 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US12/725,841 US20110230238A1 (en) 2010-03-17 2010-03-17 Pointer device to navigate a projected user interface
PCT/IB2011/050673 WO2011114244A1 (fr) 2010-03-17 2011-02-17 Dispositif permettant de parcourir une interface utilisateur projetée

Publications (3)

Publication Number Publication Date
EP2548103A1 EP2548103A1 (fr) 2013-01-23
EP2548103B1 EP2548103B1 (fr) 2019-08-14
EP2548103B9 true EP2548103B9 (fr) 2019-12-25

Family

ID=43988615

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11712673.0A Active EP2548103B9 (fr) 2010-03-17 2011-02-17 Dispositif permettant de parcourir une interface utilisateur projetée

Country Status (3)

Country Link
US (1) US20110230238A1 (fr)
EP (1) EP2548103B9 (fr)
WO (1) WO2011114244A1 (fr)

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8538367B2 (en) * 2009-06-29 2013-09-17 Qualcomm Incorporated Buffer circuit with integrated loss canceling
US20110119638A1 (en) * 2009-11-17 2011-05-19 Babak Forutanpour User interface methods and systems for providing gesturing on projected images
WO2012012262A1 (fr) * 2010-07-19 2012-01-26 Google Inc. Déclenchement de pointage prédictif
US20120054667A1 (en) * 2010-08-31 2012-03-01 Blackboard Inc. Separate and simultaneous control of windows in windowing systems
CN102915157B (zh) * 2011-08-05 2015-12-09 英华达(上海)科技有限公司 影像输入系统及其输入方法
JP2013058138A (ja) * 2011-09-09 2013-03-28 Sony Corp 画像処理装置及び方法、並びにプログラム
US9459716B2 (en) * 2011-10-03 2016-10-04 Furuno Electric Co., Ltd. Device having touch panel, radar apparatus, plotter apparatus, ship network system, information displaying method and information displaying program
US9274642B2 (en) 2011-10-20 2016-03-01 Microsoft Technology Licensing, Llc Acceleration-based interaction for multi-pointer indirect input devices
US9658715B2 (en) * 2011-10-20 2017-05-23 Microsoft Technology Licensing, Llc Display mapping modes for multi-pointer indirect input devices
US8933896B2 (en) 2011-10-25 2015-01-13 Microsoft Corporation Pressure-based interaction for indirect touch input devices
CN103092432B (zh) * 2011-11-08 2016-08-03 深圳市中科睿成智能科技有限公司 人机交互操作指令的触发控制方法和系统及激光发射装置
US9389679B2 (en) 2011-11-30 2016-07-12 Microsoft Technology Licensing, Llc Application programming interface for a multi-pointer indirect touch input device
US20130342704A1 (en) * 2012-06-23 2013-12-26 VillageTech Solutions Interactive audiovisual device
US10341627B2 (en) * 2012-06-28 2019-07-02 Intermec Ip Corp. Single-handed floating display with selectable content
US9880193B2 (en) 2012-08-15 2018-01-30 Qualcomm Incorporated Device driven inertial interference compensation
RU2494441C1 (ru) * 2012-10-12 2013-09-27 Закрытое акционерное общество "ЭВЕРЕСТ ПЛЮС" Интерактивный учебный комплекс
PT2925200T (pt) 2012-11-29 2019-06-07 Vorwerk Co Interholding Robô de cozinha
PL2925198T3 (pl) 2012-11-29 2019-05-31 Vorwerk Co Interholding Robot kuchenny
US10682016B2 (en) * 2012-11-29 2020-06-16 Vorwerk & Co. Interholding Gmbh Food processor
US9513776B2 (en) * 2012-12-05 2016-12-06 At&T Mobility Ii, Llc Providing wireless control of a visual aid based on movement detection
JP6372487B2 (ja) * 2013-06-26 2018-08-15 ソニー株式会社 情報処理装置、制御方法、プログラム、および記憶媒体
JP2015014882A (ja) * 2013-07-04 2015-01-22 ソニー株式会社 情報処理装置、操作入力検出方法、プログラム、および記憶媒体
EP3398029B1 (fr) * 2015-12-31 2021-07-07 Robert Bosch GmbH Système intelligent de contrôle d'une pièce
US10216289B2 (en) 2016-04-29 2019-02-26 International Business Machines Corporation Laser pointer emulation via a mobile device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08334832A (ja) * 1995-04-07 1996-12-17 Seiko Epson Corp 映像投射システム及びその位置マーク表示制御方法
US6050690A (en) * 1998-01-08 2000-04-18 Siemens Information And Communication Networks, Inc. Apparatus and method for focusing a projected image
US6952198B2 (en) * 1999-07-06 2005-10-04 Hansen Karl C System and method for communication with enhanced optical pointer
US6275214B1 (en) * 1999-07-06 2001-08-14 Karl C. Hansen Computer presentation system and method with optical tracking of wireless pointer
US6346933B1 (en) * 1999-09-21 2002-02-12 Seiko Epson Corporation Interactive display presentation system
US6729731B2 (en) * 2001-06-11 2004-05-04 Info Valley Corporation Untethered laser pointer for use with computer display
US6910778B2 (en) * 2001-09-28 2005-06-28 Fujinon Corporation Presentation system using laser pointer
US6840627B2 (en) * 2003-01-21 2005-01-11 Hewlett-Packard Development Company, L.P. Interactive display device
US6948820B2 (en) * 2003-08-28 2005-09-27 Scram Technologies, Inc. Interactive display system having an optical channeling element
US20080180395A1 (en) * 2005-03-04 2008-07-31 Gray Robert H Computer pointing input device
TWM300825U (en) * 2006-05-25 2006-11-11 Everest Display Inc Projector capable of extracting image and briefing system using the same
US20080279453A1 (en) * 2007-05-08 2008-11-13 Candelore Brant L OCR enabled hand-held device
US20090079944A1 (en) * 2007-09-24 2009-03-26 Mustek Systems, Inc. Contactless Operating Device for a Digital Equipment and Method for the Same
US7862179B2 (en) * 2007-11-07 2011-01-04 Omnivision Technologies, Inc. Dual-mode projection apparatus and method for locating a light spot in a projected image
KR20090063679A (ko) * 2007-12-14 2009-06-18 삼성전자주식회사 포인팅 기능을 구비한 영상표시장치 및 그 방법
FR2933212B1 (fr) * 2008-06-27 2013-07-05 Movea Sa Pointeur a capture de mouvement resolue par fusion de donnees

Also Published As

Publication number Publication date
EP2548103B1 (fr) 2019-08-14
US20110230238A1 (en) 2011-09-22
WO2011114244A1 (fr) 2011-09-22
EP2548103A1 (fr) 2013-01-23

Similar Documents

Publication Publication Date Title
EP2548103B9 (fr) Dispositif permettant de parcourir une interface utilisateur projetée
US10416789B2 (en) Automatic selection of a wireless connectivity protocol for an input device
US7173605B2 (en) Method and apparatus for providing projected user interface for computing device
JP5694867B2 (ja) 携帯端末装置、プログラムおよび表示制御方法
WO2019212575A1 (fr) Système, appareil et procédé d'optimisation de l'expérience de visualisation sur un terminal intelligent
US20190012000A1 (en) Deformable Electronic Device with Methods and Systems for Controlling the Deformed User Interface
EP2772844A1 (fr) Dispositif terminal et procédé de lancement rapide d'un programme
US20120299876A1 (en) Adaptable projection on occluding object in a projected user interface
KR20170126295A (ko) 헤드 마운티드 디스플레이 장치 및 그것의 제어방법
WO2012133227A1 (fr) Appareil électronique, procédé de commande et programme de commande
JP2013524311A (ja) 近接性に基づく入力のための装置および方法
JP2011216088A (ja) タッチ検出投影像を有する投影システム
US20170068418A1 (en) Electronic apparatus, recording medium, and operation method of electronic apparatus
KR20140126949A (ko) 터치스크린을 구비하는 전자 장치의 메뉴 운용 방법 및 장치
US10474324B2 (en) Uninterruptable overlay on a display
CN110417960B (zh) 一种可折叠触摸屏的折叠方法及电子设备
WO2018171112A1 (fr) Procédé et dispositif de commutation d'angle d'observation
CN110858860B (zh) 响应于指纹传感器上手指旋转的电子设备控制及相应方法
US20100031201A1 (en) Projection of a user interface of a device
KR20140082434A (ko) 전자장치에서 화면 표시 방법 및 장치
KR20150000656A (ko) 휴대 단말에서 화면 이미지 출력 방법 및 장치
CN112099717B (zh) 折叠屏状态检测方法、装置、电子设备和显示系统
JP2023511156A (ja) 撮影方法及び電子機器
US9946333B2 (en) Interactive image projection
CN106293036B (zh) 一种交互方法及电子设备

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20120829

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20170922

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602011061242

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06F0003030000

Ipc: G06F0003038000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/0346 20130101AFI20190322BHEP

Ipc: G06F 3/038 20130101ALI20190322BHEP

INTG Intention to grant announced

Effective date: 20190411

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 3/038 20130101AFI20190401BHEP

Ipc: G06F 3/0346 20130101ALI20190401BHEP

Ipc: G06F 3/0354 20130101ALI20190401BHEP

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 1167785

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602011061242

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: PK

Free format text: BERICHTIGUNG B9

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20190814

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191114

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191216

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191114

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1167785

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190814

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191214

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191115

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602011061242

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG2D Information on lapse in contracting state deleted

Ref country code: IS

26N No opposition filed

Effective date: 20200603

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20200217

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20200229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200217

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200217

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200217

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200229

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20210120

Year of fee payment: 11

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190814

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602011061242

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220901