EP2661669A1 - Method and apparatus for gesture-based controls - Google Patents

Method and apparatus for gesture-based controls

Info

Publication number
EP2661669A1
Authority
EP
European Patent Office
Prior art keywords
video
gesture
playing
recited
gestures
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12732016.6A
Other languages
English (en)
French (fr)
Other versions
EP2661669A4 (de)
Inventor
Robin Hayes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Adeia Media Solutions Inc
Original Assignee
Tivo Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/986,054 (US20120179967A1)
Priority claimed from US12/986,060 (US9430128B2)
Application filed by Tivo Inc filed Critical Tivo Inc
Publication of EP2661669A1
Publication of EP2661669A4

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/005 Reproducing at a different information rate from the information rate of recording
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224 Touch pad or touch panel provided on the remote control
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 Monitoring of end-user related data

Definitions

  • the present invention relates to the use of gestures. Specifically, the invention relates to gesture-based controls for multimedia content.
  • Multimedia content such as web pages, images, video, slides, text, graphics, sound files, audio/video files etc. may be displayed or played on devices.
  • Commands related to playing or displaying of content on devices may be submitted by a user on the device itself or on a separate device functioning as a remote control.
  • Figure 1 is a block diagram illustrating an example system in accordance with one or more embodiments.
  • Figure 4 shows a block diagram that illustrates a system upon which an embodiment of the invention may be implemented.
  • FIG. 1 is a block diagram illustrating an example system (100) in accordance with one or more embodiments.
  • the example system (100) includes one or more components that function as content sources, touch screen interface devices, multimedia devices (e.g., devices that play audio and/or video content), and/or content management devices.
  • Each of these components is presented to clarify the functionalities described herein and may not be necessary to implement one or more embodiments.
  • input device (110) may include a touch screen interface (115) configured to detect one or more gestures, as described herein.
  • Input device (110) may be configured to detect a gesture, a path of a gesture, a speed of a gesture, an acceleration of the gesture, a direction of a gesture, etc.
  • input device (110) may include a resistive system where an electrical current runs through two layers which make contact at spots/areas on the touch screen interface (115) that are touched. The coordinates of the contact points or contact spots may be compared to gesture information stored in a data repository (150) to identify a gesture performed by a user on the touch screen interface (115).
  • input device (110) may include a capacitive system with a layer that stores electrical charge, a part of which is transferred to a user where the user touches the touch screen interface (115).
  • input device (110) may include a surface acoustic wave system with two transducers with an electrical signal being sent from one transducer to another transducer. Any interruption of the electrical signal (e.g., due to a user touch) may be used to detect a contact point on the touch screen interface (115).
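
The coordinate-matching idea in the resistive-system bullet above can be sketched in a few lines. The following Python fragment is a minimal sketch of how contact points read from the touch screen interface (115) might be compared against gesture information in a data repository (150); the repository schema, thresholds, and function names are assumptions for illustration, not taken from the patent.

```python
# Hypothetical sketch: comparing contact coordinates against stored gesture
# information, as in the resistive-system description above. All names and
# thresholds here are illustrative assumptions.
GESTURE_REPOSITORY = {
    # name: (min x displacement, min y displacement), as fractions of screen size
    "swipe_right": (0.2, 0.0),
    "swipe_left": (-0.2, 0.0),
    "swipe_down": (0.0, 0.2),
}

def identify_gesture(contact_points: list[tuple[float, float]]) -> str:
    """Identify a gesture from the first and last detected contact points."""
    if not contact_points:
        return "none"
    (x0, y0), (x1, y1) = contact_points[0], contact_points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < 0.05 and abs(dy) < 0.05:
        return "tap"  # contact and release with negligible movement
    for name, (min_dx, min_dy) in GESTURE_REPOSITORY.items():
        # A zero threshold means that axis is unconstrained for this gesture.
        x_ok = (min_dx == 0) or (dx >= min_dx if min_dx > 0 else dx <= min_dx)
        y_ok = (min_dy == 0) or (dy >= min_dy if min_dy > 0 else dy <= min_dy)
        if x_ok and y_ok:
            return name
    return "unknown"
```
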
  • input device (110) may be configured to first detect an initial user touch on a visual representation of the data displayed on the touch screen interface.
  • input device (110) may include one or more of: Read Only Memory (ROM) (206), a Central Processing Unit (CPU), Random Access Memory (RAM), an Infrared Control Unit (ICU), a key pad scan, a key pad, Non-Volatile Memory (NVM), one or more microphones, a general purpose input/output (GPIO) interface, a speaker/tweeter, a key transmitter/indicator, a radio, an Infrared (IR) blaster, a display screen, a Radio Frequency (RF) antenna, a QWERTY keyboard, a network card, network adapter, network interface controller (NIC), network interface card, Local Area Network adapter, Ethernet network card, and/or any other component that can receive information over a network.
  • input device (110) generally represents any device which may be configured for detecting a gesture as user input.
  • a user may perform a gesture by touching the touch screen interface (115) on the input device (110).
  • a user may perform a gesture by tapping the touch screen interface (115) with a finger or sliding a finger on the touch screen interface (115).
  • Gestures relating to touching or making contact with the touch screen interface (115), as referred to herein, may include hovering over a touch screen interface (115) with a finger (or other input instrument) without necessarily touching the touch screen interface (115), such that the touch screen interface (115) detects the finger (e.g., due to transfer of electrical charge at a location on the touch screen interface (115)).
  • a tap gesture may be performed by touching a particular location on the touch screen interface (115) and then releasing contact with the touch screen interface (115).
  • a tap gesture may be detected by detecting a contact to a touch screen interface (115) at a particular location followed by detecting that the contact is released.
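
As a concrete illustration of the two-step tap detection just described (contact at a location followed by release), a minimal, hypothetical detector might look as follows; the event handler names and the pixel tolerance are assumptions, not part of the patent.

```python
# A minimal sketch of tap detection: a contact at a particular location
# followed by a release, with an assumed movement tolerance.
class TapDetector:
    TOLERANCE = 10  # max movement in pixels before the touch stops being a tap

    def __init__(self):
        self.down_at = None

    def on_touch_down(self, x: float, y: float) -> None:
        self.down_at = (x, y)  # remember where contact began

    def on_touch_up(self, x: float, y: float) -> bool:
        """Return True if the completed contact qualifies as a tap."""
        if self.down_at is None:
            return False
        x0, y0 = self.down_at
        self.down_at = None
        return abs(x - x0) <= self.TOLERANCE and abs(y - y0) <= self.TOLERANCE
```
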
  • a flick gesture may be performed by touching a particular location on the touch screen interface (115) of the input device (110) with a finger (or any other item, e.g., a stylus), and sliding the finger away from the particular location while maintaining contact with the touch screen interface (115) for a portion of the sliding action performed by the user and continuing the sliding action even after contact with the touch screen interface (115) has ended.
  • the touch screen interface (115) may be configured to detect the proximity of the finger after physical contact with the touch screen interface (115) has ended.
  • a swipe gesture may be performed by touching a particular location on the touch screen interface (115) of the input device (110) with a finger and sliding the finger away from the particular location while maintaining contact with the touch screen interface (115) during the sliding action.
  • a sliding action (e.g., a swipe or a flick) may be detected before the sliding action is completed.
  • a right-direction sliding gesture may be detected by detecting contact at a first location followed by contact at a second location that is to the right of the first location (or within a particular degree in the right direction). The user may continue the sliding gesture to a third location that is right of the second location; however, the direction of the sliding gesture may already be detected using the first location and the second location.
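
This early direction detection from just two contact samples can be sketched as follows; the 30-degree sector tolerance and the function name are assumed parameters for illustration.

```python
# Sketch: classify a slide's direction before the sliding action completes,
# using only the first two contact locations. Assumed tolerance of 30 degrees.
import math

def sliding_direction(first: tuple[float, float],
                      second: tuple[float, float],
                      tolerance_deg: float = 30.0) -> str | None:
    dx = second[0] - first[0]
    dy = first[1] - second[1]  # flip so positive dy means "up" in screen coords
    angle = math.degrees(math.atan2(dy, dx))  # 0 = right, 90 = up, 180 = left
    for name, center in (("right", 0), ("up", 90), ("left", 180), ("down", -90)):
        # minimal angular distance between angle and the sector center
        if abs((angle - center + 180) % 360 - 180) <= tolerance_deg:
            return name
    return None  # not yet within any recognized sector
```
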
  • a flick gesture and a slide gesture may be mapped to different commands.
  • a flick gesture to the left may correspond to a twenty-second rewind command and a swipe gesture to the left may correspond to a command for selecting the previous bookmarked scene in a video.
  • a scene may be bookmarked, for example, by a user or hard coded into a media recording such as selectable scenes from a movie recorded on a Digital Video Disc (DVD).
  • a slide gesture may be performed with multiple input instruments being used concurrently. For example, a user may slide two fingers across a touch screen interface at the same time. Further, the user may concurrently slide the two fingers in parallel (e.g., sliding two fingers in the same direction from left to right).
  • concurrently, as referred to herein, includes approximately concurrent.
  • two fingers concurrently performing a parallel gesture may refer to two fingers of different lengths performing the same gesture at slightly different times. For example, one finger may lag in time behind another finger for starting and/or finishing the gesture.
  • the two fingers may start and finish the gesture at different start and/or finish times.
  • the touch screen interface (115) includes a gesture area.
  • a gesture area is at least a portion of the touch screen interface (115) that is configured to detect a gesture performed by a user.
  • the gesture area may include the entire touch screen interface (115) or a portion of the touch screen interface (115).
  • the gesture area may display a blank box or one or more items.
  • the gesture area may display a video.
  • the gesture area may display information on how to perform gestures.
  • a gesture may be detected within a gesture area without a user's interaction with any visual objects that may be displayed in the gesture area. For example, a swipe gesture across a cellular phone's touch screen interface (115) may be detected in a gesture area that is an empty box on the touch screen interface. In another example, a progress indicator displayed in the gesture area is not touched by a detected swipe gesture associated with a rewind command.
  • the touch screen interface (115) may include multiple gesture areas.
  • a gesture detected within one gesture area may be mapped to a different command than the same gesture performed in a different gesture area.
  • a device may be configured to identify an area in which a gesture is performed and determine an action based on the gesture and the gesture area in which the action was performed.
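
To make the area-dependent mapping concrete, here is a minimal sketch in Python; the area names, gesture names, and commands are invented for illustration and do not come from the patent.

```python
# Hypothetical sketch: the same gesture resolves to different commands
# depending on the gesture area in which it was performed.
COMMAND_MAP = {
    ("navigation_area", "two_finger_tap"): "select_second_menu_item",
    ("playback_area", "two_finger_tap"): "slow_motion_playback",
    ("playback_area", "swipe_left"): "rewind",
}

def resolve_command(gesture_area: str, gesture: str) -> str | None:
    """Determine an action based on the gesture and the area it occurred in."""
    return COMMAND_MAP.get((gesture_area, gesture))
```
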
  • a gesture may be mapped to (or associated with) a command.
  • a command mapped to a gesture may be a video playback command related to the playback of a video.
  • the command may be related to playback of a video on the device on which the command was received or on a different device.
  • a command may specify a video playing speed and direction.
  • the command may select rewinding at a particular rewinding speed or fast-forwarding at a particular fast-forwarding speed.
  • video playback commands include, but are not limited to, pausing the playing of the video, resuming the playing of the video, replaying a played portion of the video, stopping playing of the video, stopping playing of the video and resuming playing of the video at a particular playing position, playing the video in slow motion, frame-stepping through a video, playing the video from the beginning, playing one or more videos from a next playlist, playing the video from a particular scene forward, bookmarking a playing position in the video, stopping playing and resuming playing at a bookmarked position, or rating the video.
  • a command may select a particular option out of a list of options.
  • a list of available media content may be displayed on a screen and the command may select particular media content of the available media content.
  • a list of configuration settings may be displayed and the command may select a particular setting for modification.
  • detecting a gesture may include detecting interface contact at an initial location that is a part of the detected gesture (Step 202).
  • the initial contact on the touch screen interface may be made with a user finger, a stylus, or any other item which may be used to perform a gesture on a touch screen interface.
  • the initial contact with the touch screen interface may involve a quick touch at the initial location (e.g., a tap gesture) or a touch that is maintained at the initial location for any period of time (e.g., a millisecond, a second, two seconds, etc.).
  • the initial contact with the touch screen interface may be brief, as when made by a finger already moving in a direction. For example, a finger may move in the air without making contact and, during the movement, make the initial contact with a portion of the touch screen interface.
  • the initial contact as referred to herein may include a finger (or other item) being close enough to a touch screen interface that the touch screen interface detects the finger.
  • a part of the electrical charge may be transferred to a user where the user touches the touch screen interface or where a user simply hovers close to the touch screen interface without touching.
  • initial contact or maintained contact as referred to herein may include a user hovering a finger or other item over a touch screen interface.
  • detecting a gesture may further include detecting interface contact at additional locations on the touch screen interface (Step 204).
  • detecting a flick gesture or a swipe gesture may include detecting interface contact at additional locations in a chronological sequence along a path from the initial contact location.
  • interface contact may be detected continuously in a left-direction path away from an initial contact location on the touch screen interface.
  • a gesture may be identified based on contact detected at one or more locations on the touch screen interface (Step 206). For example, detecting concurrent contact at three locations on a remote control interface followed by a release of contact at all three locations may be identified as a three finger tap gesture.
  • detecting a gesture may include identifying a path along which contact was detected on the touch screen interface. For example, a circle gesture may be identified in response to detecting contact along a circular path on a touch screen interface. A flick gesture or a swipe gesture may be identified based on contact points in a chronological sequence on a touch screen interface.
  • identifying a gesture may include determining a number of concurrent parallel gestures (Step 208). For example, initial contact may be detected concurrently at multiple locations on a touch screen interface. Subsequent to the initial contact at each initial location, contact along paths beginning from the initial locations may be detected. If the paths are determined to be parallel, the number of paths may be identified to determine the number of concurrent parallel gestures.
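
Steps 202 through 208 can be summarized in a short sketch. The code below assumes, hypothetically, that each concurrent touch has already been grouped into a path of (x, y) samples; the 20-degree tolerance for treating paths as parallel is likewise an assumption.

```python
# Sketch of Steps 202-208: paths are compared pairwise for direction, and the
# number of parallel paths gives the concurrent parallel gesture count.
import math

def path_direction(path: list[tuple[float, float]]) -> float:
    """Overall direction of a path, in degrees, from first to last sample."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

def count_parallel_gestures(paths: list[list[tuple[float, float]]],
                            tolerance_deg: float = 20.0) -> int:
    """Count paths whose directions agree with the first path's direction."""
    if not paths:
        return 0
    reference = path_direction(paths[0])
    return sum(
        1 for p in paths
        if abs((path_direction(p) - reference + 180) % 360 - 180) <= tolerance_deg
    )
```
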
  • a command is determined based on an identified gesture (Step 210).
  • the command may be determined while the gesture is still being performed or after the gesture is completed.
  • determining a command may include determining that a particular detected gesture is mapped to a command in a database. For example, a two fingered swipe to the right may be queried in a command database to identify a command associated with the two fingered swipe. In another example, a two fingered flick toward the bottom of the gesture area may be associated with a command for selecting the second menu item out of items currently displayed in a menu.
  • the number of parallel fingers used in a gesture may be used to determine a playback speed for the playing of multimedia content. For example, detection of two parallel gestures may be mapped to a command for playback speed which is two times a normal playback speed.
  • a command may include resuming playing of a video at particular bookmarks (e.g., user defined bookmarks or manufacturer defined bookmarks).
  • a number of fingers used to perform a concurrent parallel gesture may be used to select the bookmark. For example, in response to detecting a two-fingered flick downward, the playing of a video may be resumed at the second bookmark from a current playing position.
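
The two uses of the finger count described above, as a playback-speed multiplier and as a bookmark selector, can be illustrated together; the function names and signatures below are hypothetical.

```python
# Illustrative sketch: parallel-gesture count drives playback speed, and
# finger count selects which upcoming bookmark to resume at.
def playback_speed(num_parallel_gestures: int, normal_speed: float = 1.0) -> float:
    """Two parallel gestures yield two times the normal playback speed, etc."""
    return num_parallel_gestures * normal_speed

def bookmark_to_resume_at(bookmarks: list[float], current_position: float,
                          num_fingers: int) -> float:
    """A two-fingered flick resumes at the second bookmark past the current
    playing position; generalized here to any finger count."""
    upcoming = sorted(b for b in bookmarks if b > current_position)
    if not upcoming:
        return current_position  # no bookmark ahead; stay put
    index = max(0, min(num_fingers, len(upcoming)) - 1)
    return upcoming[index]
```
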
  • an action corresponding to the command is performed (Step 212).
  • the action may be performed by a device that detects the command. For example, if a gesture for a fast-forward command is detected on a hand-held touch screen phone that is playing a video, the hand-held touch screen phone may play the video in fast-forward mode.
  • Figure 3 illustrates an example screen shot for an input device configured to detect gestures.
  • the gestures, commands, mapping between gestures and commands, gesture areas, visual objects, and any other items discussed in relation to Figure 3 are examples and should not be construed as limiting in scope.
  • One or more of the items described in relation to Figure 3 may not be necessarily implemented and other items described may be implemented in accordance with one or more embodiments.
  • Figure 3 illustrates an example interface (300) with a circular gesture area (305) and a square gesture area (310). Any gestures detected in circular gesture area (305) are mapped to navigation commands. For example, a two fingered tap detected in circular gesture area (305) may be associated with a command selecting a second item on any currently displayed menu. If the second item is a folder, the items within the folder may be displayed in response to detecting the two fingered tap.
  • square gesture area (310) may identify commands that are associated with one or more gestures detected within the square gesture area (310).
  • the square gesture area (310) may include graphics illustrating that a single finger swipe gesture to the left corresponds to a rewind command, a single finger tap gesture corresponds to a pause command, a single finger swipe gesture to the right corresponds to a fast-forward command, a two fingered swipe gesture to the left corresponds to a ten-second rewind command, a two fingered tap gesture corresponds to a slow motion playback command, and a two fingered swipe to the right corresponds to a skip-to-next-bookmark command.
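
Collected as data, the square gesture area's mappings described above amount to a simple lookup table; the key structure below (finger count plus gesture name) is an assumed representation, not taken from the patent.

```python
# The square gesture area (310) mappings above, transcribed into a
# hypothetical lookup table keyed by (finger count, gesture name).
SQUARE_AREA_COMMANDS = {
    (1, "swipe_left"): "rewind",
    (1, "tap"): "pause",
    (1, "swipe_right"): "fast_forward",
    (2, "swipe_left"): "rewind_ten_seconds",
    (2, "tap"): "slow_motion_playback",
    (2, "swipe_right"): "skip_to_next_bookmark",
}
```
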
  • the example interface (300) may also include a tool (e.g., a drop-down box) to select a particular media device to be controlled by detected gestures.
  • the example interface (300) may include an option to switch between input mechanisms (e.g., gesture-based input, buttons, text boxes, radio boxes, etc.).
  • the remote control device can also receive information from the media device indicating the extent of the cache bar (325) which indicates the amount of multimedia content stored or recorded by the media device. If the media device is in the process of recording or caching a multimedia content, the cache bar (325) will increase in size as the media device records or caches more content. If the media device is playing a recorded multimedia content, then the cache bar (325) extends the length of the trickplay bar (330).
  • the remote control device may be configured to receive a time stamp closest to the frame being displayed.
  • the remote control device may also be configured to use a step function, e.g., next frame or previous frame from the time stamp, if no frame is an exact match to the time stamp.
  • the remote control device may continuously receive images (e.g., bitmaps, display instructions, etc.) of the progress indicator from the media device to display on the remote control device.
  • the remote control device may receive a particular starting position and a display rate for use by the remote control device to determine the playing position of the multimedia content.
  • a digital video recorder may transmit an initial playing position in the playing of the multimedia content to the remote control device with a rate of progress (e.g., change of the slider (320) per unit of time, frame rate, etc.).
  • the remote control device may use the information to first display a progress indicator based on the initial playing position.
  • the remote control device may further receive updates selecting specific playing positions or indicating changes in the rate of progress. For example, a user may submit one or more commands to pause the playing of multimedia content at a current playing position, then skip back 10 seconds before the current playing position and then resume playing.
  • a media device may provide information to the remote control device to pause the slider (320), display a new playing position corresponding to 10 seconds before the current playing position by moving the slider (320), and then resume periodically updating the slider (320).
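
The update scheme in the last few bullets (an initial playing position plus a rate of progress, corrected by occasional updates from the media device) can be sketched as follows; the class and method names are assumptions, and positions are kept in seconds for simplicity.

```python
# Sketch: the remote extrapolates the playing position from an initial
# position and a rate of progress, and applies corrections on each update.
import time

class ProgressSlider:
    def __init__(self, initial_position: float, rate: float):
        self.position = initial_position  # playing position, in seconds
        self.rate = rate                  # rate of progress (1.0 = normal)
        self.updated_at = time.monotonic()

    def current_position(self) -> float:
        """Extrapolate the playing position since the last known update."""
        return self.position + self.rate * (time.monotonic() - self.updated_at)

    def apply_update(self, position: float, rate: float) -> None:
        """Apply a correction from the media device (e.g., pause or skip)."""
        self.position = position
        self.rate = rate
        self.updated_at = time.monotonic()
```

Under this sketch, a pause would arrive as apply_update(position, 0.0), a ten-second skip back as apply_update(position - 10, 0.0), followed by another update resuming the normal rate.
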
  • the slider (320) may be updated when the remote control device is activated.
  • the remote control device may request playing position information from a media device.
  • the remote control device may include an accelerometer configured to detect motion and/or a touch screen interface configured to detect touch.
  • the media device may provide playing position information to the remote control device.
  • the remote control device may then display the slider (320) indicating a current playing position of multimedia content based on the playing position information received from the media device.
  • information related to the playing position of the multimedia content may be continuously received by the remote control device for the remote control device to constantly update the slider (320).
  • the information related to the playing position of the multimedia content may be periodically received and the remote control device may update the slider each time the information is received.
  • the remote control device may transmit the multimedia content to the multimedia device for display by the multimedia device.
  • the remote control device may obtain a video stream over the internet and send the video stream to a multimedia device for display on the multimedia device.
  • the remote control device may determine the display position of the slider (320) based on playing position information determined by the remote control device itself.
  • the remote control device may compute the playing position information based on a frame being sent to the multimedia device from the remote control device.
  • the sliding gesture is detected without detecting selection of any video progress indicator displayed within the particular area.
  • the slide gesture may be detected in the particular area while displaying at least a portion of the video in the particular area.
  • the slide gesture may be detected in the particular area while displaying information on how to perform one or more gestures in the particular area.
  • identifying the video playback command is further based on the particular area, in which the slide gesture was detected, from a plurality of areas on the touch screen interface.
  • performing the action comprises a first device sending information to a second device, the information based on the video playback command.
  • Performing the action associated with the video may comprise performing the action on a same device as the device detecting the slide gesture.
  • the video playback command may select a playing speed and direction.
  • the slide gesture comprises a swipe gesture from the first location to a second location.
  • the slide gesture may comprise a flick gesture starting at the first location.
  • the video playback command is for one or more of: pausing the playing of the video; resuming the playing of the video; replaying a played portion of the video; stopping playing of the video; stopping playing of the video and resuming playing of the video at a particular playing position; playing the video in slow motion; playing the video from the beginning; playing one or more videos from a next playlist; playing the video from a particular scene forward; bookmarking a playing position in the video; stopping playing and resuming playing at a bookmarked position; or rating the video.
  • a method comprises concurrently detecting a plurality of parallel gestures on a touch screen interface of a device; determining a number of the plurality of parallel gestures; selecting a command from a plurality of commands based on the number of the plurality of parallel gestures; performing an action associated with the command.
  • selecting the command comprises selecting a menu option based on the number of the plurality of parallel gestures.
  • the plurality of parallel gestures may comprise a plurality of parallel sliding gestures performed in a same direction.
  • the techniques described herein are implemented by one or more special-purpose computing devices.
  • the special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination.
  • Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques.
  • the special-purpose computing devices may be desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
  • FIG. 4 is a block diagram that illustrates a computer system 400 upon which an embodiment of the invention may be implemented.
  • Computer system 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404 coupled with bus 402 for processing information.
  • Hardware processor 404 may be, for example, a general purpose microprocessor.
  • Computer system 400 also includes a main memory 406, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404.
  • Main memory 406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 404.
  • Such instructions when stored in non-transitory storage media accessible to processor 404, render computer system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Computer system 400 further includes a read only memory (ROM) 408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404.
  • a storage device 410, such as a magnetic disk or optical disk, is provided and coupled to bus 402 for storing information and instructions.
  • Computer system 400 may be coupled via bus 402 to a display 412, such as a cathode ray tube (CRT), for displaying information to a computer user.
  • An input device 414 is coupled to bus 402 for communicating information and command selections to processor 404.
  • Another type of user input device is cursor control 416, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • Computer system 400 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 400 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions may be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor 404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • Non-volatile media includes, for example, optical or magnetic disks, such as storage device 410.
  • Volatile media includes dynamic memory, such as main memory 406.
  • Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 402.
  • transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 404 for execution.
  • the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to computer system 400 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal.
  • An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 402.
  • Bus 402 carries the data to main memory 406, from which processor 404 retrieves and executes the instructions.
  • the instructions received by main memory 406 may optionally be stored on storage device 410 either before or after execution by processor 404.
  • the received code may be executed by processor 404 as it is received, and/or stored in storage device 410, or other non-volatile storage for later execution.
  • an apparatus is a combination of one or more hardware and/or software components described herein.
  • a subsystem for performing a step is a combination of one or more hardware and/or software components that may be configured to perform the step.
EP12732016.6A 2011-01-06 2012-01-05 Method and apparatus for gesture-based controls Withdrawn EP2661669A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12/986,054 US20120179967A1 (en) 2011-01-06 2011-01-06 Method and Apparatus for Gesture-Based Controls
US12/986,060 US9430128B2 (en) 2011-01-06 2011-01-06 Method and apparatus for controls based on concurrent gestures
PCT/US2012/020306 WO2012094479A1 (en) 2011-01-06 2012-01-05 Method and apparatus for gesture based controls

Publications (2)

Publication Number Publication Date
EP2661669A1 (de) 2013-11-13
EP2661669A4 EP2661669A4 (de) 2017-07-05

Family

ID=46457709

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12732016.6A Withdrawn EP2661669A4 (de) Method and apparatus for gesture-based controls

Country Status (5)

Country Link
EP (1) EP2661669A4 (de)
JP (2) JP6115728B2 (de)
CN (1) CN103329075B (de)
CA (1) CA2823388A1 (de)
WO (1) WO2012094479A1 (de)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014186577A (ja) * 2013-03-25 2014-10-02 Konica Minolta Inc Viewer device and image forming apparatus
TW201543268A (zh) * 2014-01-07 2015-11-16 Thomson Licensing System and method for controlling media playback using gestures
EP3158426B1 (de) * 2014-06-18 2019-09-11 Google LLC Methods, systems, and media for controlling playback of video using a touchscreen
KR20160018268A (ko) * 2014-08-08 2016-02-17 삼성전자주식회사 Method and apparatus for controlling content using line interaction
CN105446608A (zh) * 2014-09-25 2016-03-30 阿里巴巴集团控股有限公司 Information search method, information search apparatus, and electronic apparatus
CN107341259B (zh) * 2014-11-25 2020-11-20 北京智谷睿拓技术服务有限公司 Search method and apparatus
CN105744322B (zh) * 2014-12-10 2019-08-02 Tcl集团股份有限公司 Method and apparatus for controlling screen focus
CN104657059A (zh) * 2015-03-16 2015-05-27 联想(北京)有限公司 Data processing method and electronic device
WO2017044326A2 (en) * 2015-09-09 2017-03-16 Cpg Technologies, Llc Device location and antitheft system
CN105389118B (zh) * 2015-12-10 2018-12-11 广东欧珀移动通信有限公司 Audio file switching method and user terminal
US10397632B2 (en) 2016-02-16 2019-08-27 Google Llc Touch gesture control of video playback
CN105867775A (zh) * 2016-03-28 2016-08-17 乐视控股(北京)有限公司 Method for adjusting video playback progress and mobile terminal
DE202016105915U1 (de) * 2016-10-21 2016-11-16 Ma Lighting Technology Gmbh Lighting control console with rotary control
US10699746B2 (en) 2017-05-02 2020-06-30 Microsoft Technology Licensing, Llc Control video playback speed based on user interaction
JP6483306B1 (ja) * 2018-03-30 2019-03-13 ピーシーフェーズ株式会社 Video playback control system
CN109753148A (zh) * 2018-11-15 2019-05-14 北京奇艺世纪科技有限公司 Control method and apparatus for a VR device, and control terminal
CN113728621A (zh) 2019-04-09 2021-11-30 麦克赛尔株式会社 Head-mounted information processing device
KR102349629B1 (ko) * 2020-03-25 2022-01-12 주식회사 트윙클소프트 Method for providing a content sharing service using an intuitive UI/UX
CN112104915B (zh) * 2020-09-14 2022-08-26 腾讯科技(深圳)有限公司 Video data processing method, apparatus, and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070177804A1 (en) * 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
JP2001325071A (ja) * 2000-05-17 2001-11-22 Tokai Rika Co Ltd Touch operation input device
EP2000894B1 (de) * 2004-07-30 2016-10-19 Apple Inc. Mode-based graphical user interfaces for touch sensitive input devices
EP1662358B1 (de) * 2004-11-24 2009-10-21 Research In Motion Limited System and method for selectively activating a communication device
EP1672471A1 (de) * 2004-12-14 2006-06-21 Thomson Multimedia Broadband Belgium Content playback device with a touch-sensitive screen
US20080084400A1 (en) * 2006-10-10 2008-04-10 Outland Research, Llc Touch-gesture control of video media play on handheld media players
US7956847B2 (en) * 2007-01-05 2011-06-07 Apple Inc. Gestures for controlling, manipulating, and editing of media files using touch sensitive devices
DE202007014957U1 (de) * 2007-01-05 2007-12-27 Apple Inc., Cupertino Multimedia communication device with touch screen responsive to gestures for controlling, manipulating and editing media files
US7889175B2 (en) * 2007-06-28 2011-02-15 Panasonic Corporation Touchpad-enabled remote controller and user interaction methods
US20090156251A1 (en) * 2007-12-12 2009-06-18 Alan Cannistraro Remote control protocol for media systems controlled by portable devices
KR20090077480A (ko) * 2008-01-11 2009-07-15 삼성전자주식회사 Method for providing a UI that displays an operation guide, and multimedia device applying the same
JP4666053B2 (ja) * 2008-10-28 2011-04-06 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5233708B2 (ja) * 2009-02-04 2013-07-10 ソニー株式会社 Information processing apparatus, information processing method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2012094479A1 *

Also Published As

Publication number Publication date
JP6220953B2 (ja) 2017-10-25
EP2661669A4 (de) 2017-07-05
JP2014506369A (ja) 2014-03-13
CN103329075B (zh) 2017-12-26
JP6115728B2 (ja) 2017-04-19
WO2012094479A1 (en) 2012-07-12
CN103329075A (zh) 2013-09-25
JP2017054538A (ja) 2017-03-16
CA2823388A1 (en) 2012-07-12

Similar Documents

Publication Publication Date Title
US9430128B2 (en) Method and apparatus for controls based on concurrent gestures
US20120179967A1 (en) Method and Apparatus for Gesture-Based Controls
JP6220953B2 (ja) Gesture-based control method and apparatus
US10921980B2 (en) Flick to send or display content
US11792256B2 (en) Directional touch remote
US20100101872A1 (en) Information processing apparatus, information processing method, and program
US20120308204A1 (en) Method and apparatus for controlling a display of multimedia content using a timeline-based interface
US20090153389A1 (en) Scroll bar with video region in a media system
US20160253087A1 (en) Apparatus and method for controlling content by using line interaction
KR20100125784A (ko) Electronic device with touch input and control method thereof

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130802

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TIVO SOLUTIONS INC.

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TIVO SOLUTIONS INC.

RA4 Supplementary search report drawn up and despatched (corrected)

Effective date: 20170608

RIC1 Information provided on ipc code assigned before grant

Ipc: G11B 27/10 20060101ALI20170601BHEP

Ipc: H04N 21/41 20110101ALI20170601BHEP

Ipc: H04N 21/422 20110101ALI20170601BHEP

Ipc: G06F 3/0488 20130101AFI20170601BHEP

Ipc: G06F 1/32 20060101ALI20170601BHEP

Ipc: G11B 27/00 20060101ALI20170601BHEP

Ipc: G11B 27/34 20060101ALI20170601BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: TIVO SOLUTIONS INC.

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20201104

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210316