KR20140044881A - Video peeking - Google Patents

Video peeking

Info

Publication number
KR20140044881A
Authority
KR
South Korea
Prior art keywords
video
swiping
screen
video image
different
Prior art date
Application number
KR1020147002241A
Other languages
Korean (ko)
Inventor
매튜 야곱 워드나
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201161515578P priority Critical
Priority to US61/515,578 priority
Application filed by Thomson Licensing
Priority to PCT/US2012/025878 priority patent/WO2013022486A1/en
Publication of KR20140044881A publication Critical patent/KR20140044881A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34Indicating arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network, synchronizing decoder's clock; Client middleware
    • H04N21/438Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving MPEG packets from an IP network
    • H04N21/4383Accessing a communication channel, e.g. channel tuning

Abstract

Display systems and new user interfaces for video display devices with a touch screen make it possible to peek at, i.e., briefly watch, selected video content while viewing other content. During the video peek, the video from a second video source partially displaces and temporarily replaces a portion of the video currently being watched. The selection of different video sources can be controlled, for example, by swiping with one, two, three, or four fingertips, and by swiping in from any of the four edges of the video display. Also, the video currently being watched can be exchanged for the video being peeked at.
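The abstract describes selecting among peek sources by the edge a swipe starts from and the number of fingertips used. A minimal sketch of that selection logic is below; the function name, the 16-way mapping, and the channel list are all illustrative assumptions, not taken from the patent itself.

```python
# Hypothetical sketch of the source-selection logic described in the abstract:
# a swipe is characterized by the edge it starts from and the number of
# fingertips used; each (edge, finger-count) pair maps to one peek source.
# All names and the mapping scheme are illustrative, not from the patent.

EDGES = ("left", "right", "top", "bottom")

def select_peek_source(edge, finger_count, sources):
    """Map a swipe (edge, 1-4 fingertips) to one of up to 16 video sources."""
    if edge not in EDGES:
        raise ValueError(f"unknown edge: {edge}")
    if not 1 <= finger_count <= 4:
        raise ValueError("swipes use one to four fingertips")
    index = EDGES.index(edge) * 4 + (finger_count - 1)
    return sources[index % len(sources)]

# Example: 16 numbered channels, one per (edge, fingers) combination.
channels = [f"channel-{n}" for n in range(16)]
print(select_peek_source("left", 1, channels))    # channel-0
print(select_peek_source("bottom", 4, channels))  # channel-15
```

Four edges times four fingertip counts gives up to sixteen directly addressable peek sources, which matches the gesture vocabulary the abstract enumerates.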

Description

Video peeking {VIDEO PEEKING}

This application claims the benefit, under 35 U.S.C. §119(e), of the filing date of Provisional Patent Application No. 61/515,578, filed on August 5, 2011, the subject matter of which is incorporated by reference. As far as the background of the present invention is concerned, the present application relates to jointly owned PCT/US2010/049772, which has an international filing date of September 22, 2010.

The background of the invention relates to the general operation of a touch-activatable screen of a video tablet or other display device. Descriptions of touch-activated screens of video tablets or other display devices found herein may also be found in PCT/US2010/049772. The patents, published applications, and articles related to the general operation of touch-activated screens mentioned in the original international search report for PCT/US2010/049772 are: US2009210819A1; US20070013708A1; US20080079972A1; US20090153478A1; US2009019924A1; EP1450277A2; and SHIRAZI J: "Java Performance Tuning - Chapter 4 Object Creation". The elements shown in the figures may be implemented in various forms of hardware, software, or a combination thereof. Preferably, these elements are implemented in a combination of hardware and software in one or more suitably programmed general-purpose devices that may include a processor, memory, and input/output interfaces. As used herein, the phrase "coupled" is defined to mean directly connected or indirectly connected through one or more intermediate components. Such intermediate components may include both hardware- and software-based components.

This description illustrates the principles of the present disclosure. It will thus be understood that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the present disclosure and are included within its scope and spirit.

All examples and conditional language recited herein are intended for educational purposes to aid the reader in understanding the concepts and principles contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.

In addition, all statements herein that list the principles, aspects, and embodiments of the present disclosure, as well as specific examples thereof, are intended to include both structural and functional equivalents thereof. In addition, these equivalents are intended to include all currently known equivalents, as well as equivalents to be developed in the future, that is, any developed elements that perform the same function regardless of structure.

Thus, for example, it will be understood by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the present disclosure. Similarly, it will be understood that any flowcharts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in a computer-readable medium and so executed by a computer or processor, whether or not such a computer or processor is explicitly shown.

The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read-only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage.

Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.

In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function, or b) software in any form, including, therefore, firmware, microcode, or the like, combined with appropriate circuitry for executing that software to perform the function. The present disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.

The present disclosure provides a grid display, which is a graphical user interface view that enables a user to navigate a set of data elements in two-dimensional space (i.e., the x and y directions). The grid display may have a two-dimensional (2D) pattern, such as columns and rows, but may take other forms. Navigation of the grid display can be implemented using commands, such as gestures, to locate the desired element. An entry of the grid display is tapped or otherwise selected to initiate additional actions, e.g., to play or display the associated content. This interface mechanism is for use in media applications where items on the grid display can be represented graphically, such as by audio album covers or video poster images. Certain embodiments describe an apparatus and method associated with optimizations for views of a grid display implementation so that the number of display elements is minimized, independent of the number of items in the complete dataset. Embodiments also address issues with navigation through the database so that it can be smooth and efficient with respect to visual perception of the displayed portion. The apparatus and method may be specifically adapted for use in a content distribution network that includes controlled access to a large database of media content.
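The optimization described above, keeping the number of on-screen display elements independent of the size of the dataset, is essentially a virtualized viewport. A hedged sketch follows; the function and parameter names are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch (not from the patent) of keeping the number of display
# elements fixed regardless of dataset size: only cells inside the current
# viewport window are materialized, and scrolling re-binds those same elements
# to new dataset indices instead of creating one element per item.

def visible_cells(total_items, columns, first_visible_row, visible_rows):
    """Return the dataset indices that need on-screen elements right now."""
    start = first_visible_row * columns
    stop = min(start + visible_rows * columns, total_items)
    return list(range(start, stop))

# A 100,000-item media library rendered in a 4-column, 3-row viewport
# needs only 12 display elements at any moment.
cells = visible_cells(total_items=100_000, columns=4,
                      first_visible_row=250, visible_rows=3)
print(len(cells))  # 12
print(cells[0])    # 1000
```

Because the element count depends only on the viewport dimensions, scrolling through the grid stays smooth however large the content database grows, which is the perceptual goal the paragraph above describes.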

Navigation through the user interface of the present disclosure is facilitated by a mechanism for moving quickly, simply, and accurately across displays such as televisions, monitors, or touch screens. In one embodiment, an input device such as a motion-sensing remote controller is provided. In another embodiment, a touch screen or panel remote device is used, with a cursor on the screen that essentially tracks the user's finger or fingers as they move across the screen of the remote controller. As the user traverses the grid display, the graphical elements representing content in the database move in response to the user's input such that certain graphical elements disappear and new graphical elements appear. In the embodiment of a touch screen or panel remote device, it will be appreciated that the touch screen or panel remote device may serve as the display device itself, or may simply act as a tool for navigation. In a further embodiment, a conventional hand-held remote controller is used, with an input mechanism or at least one button disposed on the surface of the remote controller to navigate the grid display.

Initially, systems for delivering various types of content to a user will be described. Subsequently, a method and user interface for retrieving content will be detailed in accordance with embodiments of the present disclosure.

Turning now to FIG. 1, shown is a block diagram of one embodiment of a system 100 for delivering content to a home or end user. The content originates from a content source 102 such as a movie studio or production house. The content may be supplied in at least one of two forms. One form may be a broadcast form of content. Broadcast content is typically provided to a broadcast affiliate manager 104, such as a national broadcast service, e.g., the American Broadcasting Company (ABC), National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), and the like. The broadcast affiliate manager may collect and store the content and may schedule delivery of the content over a delivery network, shown as delivery network 1 106. Delivery network 1 106 may include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 106 may also include local content delivery using local delivery systems such as over-the-air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to the receiving device 108 in the user's home, where the content will later be retrieved by the user. It will be appreciated that the receiving device 108 can take many forms and can be implemented as a set-top box/digital video recorder (DVR), a gateway, a modem, or the like. In addition, the receiving device 108 may act as an entry point or gateway for a home network system that includes additional devices configured as clients or peer devices in the home network.

The second form of content is referred to as special content. Special content may include content delivered as premium viewing, pay-per-view, or other content not otherwise provided to the broadcast affiliate manager, e.g., movies, video games, or other video elements. In many cases, the special content may be content requested by the user. The special content may be communicated to the content manager 110. The content manager 110 may be, for example, a service provider, such as an Internet website, affiliated with a content provider, broadcast service, or delivery network service. The content manager 110 may also incorporate Internet content into the delivery system. The content manager 110 may deliver the content to the user's receiving device 108 over delivery network 2 112, which is a separate delivery network. Delivery network 2 112 may include high-speed broadband Internet type communication systems. It is to be appreciated that content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 112, and content from the content manager 110 may be delivered using all or parts of delivery network 1 106. In addition, the user may obtain content directly from the Internet via delivery network 2 112 without necessarily having the content managed by the content manager 110.

Several adaptations may be possible for using separately delivered content. In one possible approach, certain content is provided as an augmentation to broadcast content that provides alternative displays, purchase and sale options, enhancement material, and the like. In other embodiments, certain content may completely replace some programming content provided as broadcast content. Finally, the particular content may be completely separate from the broadcast content, which may simply be a media alternative that the user may choose to use. For example, the particular content may be a library of movies that are not yet available as broadcast content.

Receiving device 108 may receive different types of content from one or both of delivery network 1 and delivery network 2. The receiving device 108 processes the content and provides separation of the content based on user preferences and commands. Receiving device 108 may also include a storage device, such as a hard drive or an optical disc drive, for recording and playing back audio and video content. Further details regarding the operation of the receiving device 108 and features associated with playing the stored content will be described below with respect to FIG. 2. The processed content is provided to the display device 114. The display device 114 may be a conventional 2-D type display, or alternatively it may be an advanced 3-D display.

A block diagram of one embodiment of the receiving device 200 is shown in FIG. The receiving device may operate similar to the receiving device 108 described in FIG. 1 and may be included as part of a gateway device, modem, set top box, or other similar communication device. The device 200 shown may also be incorporated into other systems, including the display device 114 itself. In either case, the various components necessary for the complete operation of the system are not shown for brevity, but they are well known to those skilled in the art.

At device 200, content is received at input signal receiver 202. Input signal receiver 202 may be one of several known receiver circuits used to receive, demodulate, and decode signals provided over one of several possible networks, including terrestrial, cable, satellite, Ethernet, fiber, and phone line networks. The desired input signal may be selected and retrieved at input signal receiver 202 based on user input provided through a control interface (not shown). The decoded output signal is provided to an input stream processor 204. Input stream processor 204 performs final signal selection and processing, and includes separation of the video content from the audio content of the content stream. The audio content is provided to an audio processor 206 for conversion from the received format, such as a compressed digital signal, to an analog waveform signal. The analog waveform signal is provided to an audio interface 208 and further to the display device 114 or an audio amplifier (not shown). Alternatively, the audio interface 208 may provide a digital signal to a display device or audio output device using an alternative audio interface, such as via a high-definition multimedia interface (HDMI) cable or Sony/Philips Digital Interconnect Format (SPDIF). The audio processor 206 also performs any necessary conversion for the storage of the audio signals.

The video output from the input stream processor 204 is provided to the video processor 210. The video signal may be one of several formats. Video processor 210 provides the necessary conversion of video content based on the input signal format. Video processor 210 also performs any necessary conversion for the storage of video signals.

The storage device 212 stores the audio and video content received at the input. The storage device 212, under the control of the controller 214 and based on commands received from the user interface 216, such as navigation instructions like fast-forward (FF) and rewind (Rew), allows later retrieval and playback of content. The storage device 212 may be a hard disk drive, one or more large-capacity integrated electronic memories, such as static RAM (SRAM) or dynamic RAM (DRAM), or an interchangeable optical disc storage system, such as a compact disc (CD) drive or digital video disc (DVD) drive.

The converted video signal from the video processor 210, from the input or from the storage device 212, is provided to the display interface 218. The display interface 218 further provides a display signal to a display device of the type described above. Display interface 218 may be an analog signal interface, such as Red-Green-Blue (RGB), or it may be a digital interface, such as HDMI. It will be appreciated that display interface 218 will generate various screens for representing search results in a three-dimensional grid, as described in more detail below.

Controller 214 is interconnected via a bus to several of the components of device 200, including input stream processor 204, audio processor 206, video processor 210, storage device 212, and user interface 216. The controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 214 also manages the retrieval and playback of stored content. Furthermore, as will be described below, the controller 214 performs the retrieval of content and the creation and adjustment of a grid display representing the content, whether stored or to be delivered via the delivery networks described above. The controller 214 is further coupled to control memory 220 (e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), or electronically erasable programmable ROM (EEPROM)) for storing information and instruction code for controller 214. Further, the implementation of the memory may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit connected or coupled together to form a shared or common memory. Still further, the memory may be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit.

To operate effectively, the user interface 216 of the present disclosure employs an input device that moves a cursor around the display. To further enhance the user experience and to facilitate the display of, and navigation around, a database such as a movie library, the touch panel device 300, shown in FIG. 3A, may be interfaced to the receiving device 108. The touch panel device 300 allows operation of the receiving device or set-top box based on hand movements or gestures and actions translated through the panel into commands for the set-top box. In one embodiment, the touch panel 300 may simply serve as a navigational tool for navigating the grid display. In other embodiments, the touch panel 300 will additionally serve as a display device, allowing the user to interact more directly with the navigation through a grid display of content.

Alternatively, a mouse device, a remote controller with navigation features, or a gesture based remote controller can also be used, as described below.

User interface control may be included in the receiving device 200 as part of the user interface 216 or as part of the controller 214. User interface controls incorporate features useful for display and navigation through a grid representing content in a database, as well as for video display of content. The user interface, and more specifically the grid user interface element, is incorporated into a video media player interface that includes scripting or programming functions for manipulation of graphics. The video media player and interface may be implemented at the receiving device 200 using any combination of hardware, software, or firmware. Alternatively, some of the control and video display operations may be included in the touch panel device 300 and may also be part of the information transmitted over the home network.

In another embodiment, the input device is a remote controller with a form of motion detection, such as a gyroscope or accelerometer, that allows the user to move a cursor freely about the screen or display. An exemplary hand-held angle-sensing remote controller 301 is shown in FIG. 3B. The remote controller 301 includes a thumb button 302 positioned on the top side of the controller 301, to be selectively activated by the user's thumb. Activation of the thumb button 302 will also be referred to as a "click", a command often associated with the activation or launch of a selected function. The controller 301 further includes a trigger button 304 positioned on the bottom side of the controller 301, to be selectively activated by the user's index (or "trigger") finger. Activation of the trigger button 304 will also be referred to as a "trigger", and angular movement of the controller 301 (i.e., pitch, yaw, and/or roll) while the trigger is depressed will be referred to as "trigger-drag". Trigger-drag commands are often associated with movement of a cursor, virtual cursor, or other user interactive position indication on the display, such as a change of state (i.e., a highlighted or outlined cell), and are commonly used to navigate and select entries from an interactive display. In addition, a plurality of buttons 306 are provided for entering numbers and/or letters. In one embodiment, the plurality of buttons 306 are configured similar to a telephone-type keypad.

The use of a hand-held angle-sensing remote controller, such as controller 301 described in FIG. 3B, provides for a number of types of user interaction. When using the angle-sensing controller, changes in yaw map to left-and-right motions, changes in pitch map to up-and-down motions, and changes in roll map to rotational motions along the longitudinal axis of the controller. These inputs are used to define gestures, and the gestures, in turn, define specific contextual commands. A combination of yaw and pitch can be used to define any two-dimensional motion, such as a diagonal, and a combination of yaw, pitch, and roll can be used to define any three-dimensional motion, such as a swing. A number of gestures are illustrated in FIG. 3. Gestures are interpreted in context and are identified by defined movements ("trigger-drag" movements) of the controller 301 while the trigger button 304 is held.
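The yaw/pitch/roll-to-motion mapping above can be sketched as follows; the function name, gain value, and sign conventions are illustrative assumptions rather than anything specified in the patent.

```python
# Hedged sketch of the angle-to-motion mapping described above: changes in yaw
# move the cursor left/right, changes in pitch move it up/down, and roll maps
# to rotation about the controller's longitudinal axis. The gain and the
# screen-coordinate sign convention are illustrative assumptions.

def map_angles_to_motion(d_yaw, d_pitch, d_roll, gain=10.0):
    """Convert angular deltas (degrees) into a 2-D cursor delta plus rotation."""
    dx = gain * d_yaw      # yaw change -> horizontal cursor motion
    dy = -gain * d_pitch   # pitch up -> cursor up (screen y decreases)
    rotation = d_roll      # roll change -> rotation, passed through
    return dx, dy, rotation

# Equal yaw and pitch changes combine into a diagonal motion, as noted above.
print(map_angles_to_motion(1.0, 1.0, 0.0))  # (10.0, -10.0, 0.0)
```

Combining all three deltas in one call likewise yields the three-dimensional "swing" motion the paragraph mentions.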

Bumping 320 is defined as a two-stroke drawing indicating one direction of up, down, left, or right. Bump gestures are associated with specific commands in the context. For example, in TimeShifting mode, the left-bump gesture 320 represents rewinding and the right-bump gesture represents fast-forwarding. In other contexts, the bump gesture 320 is interpreted to increase a certain value in the direction specified by the bump. Checking 330 is defined as drawing a checkmark. This is similar to the downward bump gesture 320. Checking is identified in the context by specifying a reminder and a user tag, or selecting an item or element. Circle 340 is defined as drawing a circle in both directions. It is possible that both directions can be distinguished. However, to avoid confusion, circles are identified by a single command regardless of direction. Dragging 350 is defined as the angled movement (change in pitch and / or yaw) of the controller while holding trigger button 304 (ie, “trigger drag”). The dragging gesture 350 is used for navigation, speed, distance, time-shifting, rewinding, and forwarding. Dragging 350 may be used to move a change in state such as highlighting, outlining, or selection on a cursor, virtual cursor, or display. Dragging 350 may be in any direction and is generally used to navigate in two dimensions. However, in certain interfaces, it is desirable to modify the response to the dragging command. For example, in some interfaces, motion in one dimension or direction is preferred over other dimensions or directions, depending on the location or direction of movement of the virtual cursor. Nodding 360 is defined as two fast trigger-drag up-and-down vertical movements. The naming 360 is used to indicate "yes" or "accept". X-ing 370 is defined as drawing the letter "X". X-ing 370 is used for "Delete" or "Block" commands. Wagging 380 is defined as two trigger-drag fast back-and-forth horizontal movements. 
The wagging gesture 380 is used to indicate "no" or "cancel".
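As an illustrative, non-authoritative sketch (not part of the original disclosure), the trigger-drag gestures described above can be classified from a sequence of per-stroke yaw/pitch changes roughly as follows; all function names and thresholds here are assumptions:

```python
# Hypothetical sketch: classifying simple trigger-drag gestures from a
# sequence of per-stroke (yaw, pitch) angular deltas recorded while the
# trigger button is held. Names and conventions are assumptions.

def stroke_direction(d_yaw, d_pitch):
    """Reduce one stroke's angular change to a coarse direction."""
    if abs(d_yaw) >= abs(d_pitch):
        return "right" if d_yaw > 0 else "left"
    return "up" if d_pitch > 0 else "down"

def classify_gesture(strokes):
    """strokes: list of (d_yaw, d_pitch) tuples, one per stroke."""
    dirs = [stroke_direction(dy, dp) for dy, dp in strokes]
    if len(dirs) == 2:
        if dirs in (["up", "down"], ["down", "up"]):
            return "nod"              # indicates "yes" / "accept"
        if dirs in (["left", "right"], ["right", "left"]):
            return "wag"              # indicates "no" / "cancel"
        if dirs[0] == dirs[1]:
            return f"bump-{dirs[0]}"  # e.g. rewind / fast-forward
    if len(dirs) == 1:
        return f"drag-{dirs[0]}"      # navigation
    return "unknown"
```

A real recognizer would also check stroke timing and amplitude; this sketch only captures the direction patterns named in the text.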

In addition to conventional controls for video playback, the input device also includes a mechanism for invoking or executing at least three separate options for any element selected in the display or screen. These options will be referred to as "additional information", "play", and "additional search". The "additional information" function is used to display more information about the currently selected element. The "play" function selects the element for playback, assuming playback is available for the selected element, which may require a secondary user interface for content purchase or the like. The "additional search" function represents a mechanism that enables a user to employ any element as a source for a further, advanced search that generates an entirely new set of content and updates the screen based on criteria defined by the selected element. These three options may be associated with, for example, predefined or new gestures on the touch panel 300, or each option may be assigned to a predetermined one of the plurality of buttons 306 on the remote controller 301.

It will be appreciated that at least some of the components described above in connection with FIGS. 1-3 will form an apparatus and / or system for generating a user interface.

FIG. 4 shows a graphical flow diagram for the navigational aspects of a grid display of the present disclosure and the operation of a user interface related to the display. Initially, at step 402, video content 401 from a broadcast source or a particular managed source may be displayed. At step 404, the main menu of the interface may be entered by tapping on or otherwise selecting the video screen. The main menu screen may include a number of user information elements 403 and may also include a portion of the display that still shows the previous video content 401. The video content may continue to run or may be placed in pause mode.

Navigation to the content library may include using a search, browse, or recommend button by tapping on or otherwise selecting the desired button. In step 406, selection of the search or recommend button accesses a linear display structure of specific objects in a library or database based on actor, genre, or title criteria, and includes additional criteria for the search or recommend features to limit the covered portion of the database. The linear grid 405 may be particularly useful for these access functions because of the limitations placed on access to the library and the resulting reduction of searchable content.

In step 408, selecting the browse function as a navigation tool pulls up a separate two-dimensional grid display 407 for content selection. The browse function provides access to the complete library or database and imposes very few restrictions on navigation around the database. The grid display 407 and navigation of the library or database will be described in more detail below. Entry or selection of an element from the content display (e.g., by tapping on the element), after the element has been highlighted or enlarged by one of the previous functional operations, opens a detail screen at step 410 that provides additional details regarding the selected content entry. The detail screen also provides additional options for playing, renting, recording, or purchasing content, as well as options for returning to the previous content navigation functions described above.

FIG. 5 shows a detailed view of one embodiment of a grid display 500 using aspects of the present disclosure. Grid display 500 operates in a manner similar to that of grid display 407 described in FIG. 4. Grid display 500 is displayed on display device 114 and can be manipulated or navigated using touch panel 300 or the other navigation devices described above. The interface screen may also be displayed on the touch panel device 300 as a remote display, enabling the user to interact more directly with navigation through the grid display of content.

Grid display 500 consists of several graphical elements 502 arranged in a two-dimensional grid. The two-dimensional grid may include rows and columns, or may include some other two-dimensional pattern arrangement, such as a radial or elliptical pattern around one or more center points. In one embodiment, all the elements move together as a contiguous unit. Each graphical element 502 represents a single data entry location from a library or database of content, referred to as the model of the control software, which will be described below with respect to FIG. 7. For example, grid display 500 includes graphical elements representing movie posters. Grid displays showing graphical elements representing book covers, album or CD covers, and the like may also be used. In this case, the current item 504 is highlighted by adjusting the appearance of the item, such as by enlarging and centering the element within the view area. When the item is highlighted in response to a user input, additional information concerning the particular content associated with the graphical element may be provided with the graphical element. In addition, the particular content associated with the graphical element may be executed in response to further user input, such as a movie to be played, a game to be loaded, or a website to be launched.

The grid display uses screen real estate, or display area, to provide additional options and context in multiple dimensions. Navigation of the grid display is not limited to a single, generally horizontal dimension. Data in the grid, such as movie content, audio content, and the like, can be organized arbitrarily or explicitly in a two-dimensional space. When explicitly organized, the data or graphical elements 502 are arranged in accordance with at least one variable associated with the particular content each graphical element represents. For example, rows may represent an alphabetical organization, while columns may represent genres.

Each selectable element 502 may consist of an image representing the specific content, such as a frame taken from a piece of recorded content, supplied from a network or from a user, or drawn from a library of common elements manually or automatically assigned to the content. Some of these elements may be augmented with additional smaller elements indicating the type of content, and/or with text superimposed on or alongside the element itself. For example, elements representing content stored locally at a receiving device, such as the receiving device 108 described in FIG. 1, may include a small element depicting a disk drive in the bottom right-hand corner of the larger image representing the content itself. The elements are configured to be detailed enough for the user to clearly identify the type of content they represent. Elements, including elements for content currently playing on a broadcast channel, can also be created partially or fully dynamically. For example, an element showing a scene from recently broadcast video can be dynamically generated (delivered locally or from a network) and then combined with an indication or logo of the channel on which it is broadcast. This enables the user to see at a glance what is currently being broadcast on many channels.

FIG. 6 illustrates user operation and navigation of a grid display using aspects of the present disclosure. Interaction with the grid display 600 shown in FIG. 6 is described in connection with the touch panel device 300 shown in FIG. 3A. Gestures made by the user's hand on a touch- or capacitively-sensitive panel are translated into messages delivered to the receiving device 108 via a network, such as the home network described in FIG. 1. The messages are converted by the controller in the touch panel device into changes processed by the set-top box or receiving device. It is important to note that the messages producing changes may be interpreted by the receiving device 108 in a manner that causes different effects within the display structure portion of the implementation (known as the view) and the physical data structure representing the content library (known as the model).

As an example, when a drag motion from the lower left to the upper right is initiated on the touch panel, as shown in FIG. 3A, the elements of grid 600 move so that the item at position B (element 602), as shown in FIG. 6, moves off the screen of the display device toward the upper right and is replaced by the item at position A (element 604); additionally, the item at position C (element 606) moves to position A. The movements can also be animated on the display for a smooth transition, and momentum effects can be applied to improve the physics of the view. For example, the speed at which the gesture is made may be converted into the distance by which the display is shifted through the grid and/or grid display.
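The momentum effect suggested above, in which the speed of the gesture is converted into the distance the display continues to shift, can be sketched as follows. This is a hypothetical illustration only; the friction constant and frame rate are assumptions, not values from the disclosure:

```python
# Hypothetical momentum sketch: convert the release velocity of a drag
# gesture into the total distance the grid continues to coast, by
# summing a geometrically decaying per-frame displacement.

def momentum_distance(velocity_px_per_s, friction=0.9, frame_dt=1 / 60):
    """Total coast distance (pixels) until per-frame motion is negligible."""
    distance = 0.0
    v = velocity_px_per_s * frame_dt   # displacement in the first frame
    while abs(v) > 0.5:                # stop when motion is sub-pixel scale
        distance += v
        v *= friction                  # decay each frame
    return distance
```

A faster flick yields a proportionally longer coast, which matches the text's point that gesture speed maps to shift distance.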

It is important to note that the touch panel interface is only one of many input devices that can be used for input to the apparatus or system. For example, use of the portable angle-sensing controller shown in FIG. 3B provides many types of user interaction.

FIG. 7 illustrates a state control diagram for an implementation of a grid display using aspects of the present disclosure. The implementation of the grid display and interface follows the model-view-controller (MVC) coding structure. The model portion 702, or database library, holds the full set of data (or provides access to the full set of data) and correlates virtual x/y coordinates 704 with specific data items arranged in a two-dimensional matrix. The model portion 702 also tracks the currently selected item 706 in the data set based on the virtual coordinates (ideally, the selected item is centered when created in the view). The controller portion 708 converts mouse and other messages 710 (e.g., from the remote input device) into relative x/y coordinate changes that are submitted to the model, which in turn updates the virtual location 704. The view portion 712 subscribes to events from the model 702 and creates the grid for display based on the updates. The events handled by view portion 712 include location updates and item detail updates.
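A minimal sketch of the model-view-controller split just described might look as follows. The class and method names are illustrative assumptions, not the actual implementation; the model clamps coordinates so that, as described later, navigation cannot pass the edge of the data set:

```python
# Illustrative MVC sketch for the grid, assuming simple callback-based
# event subscription. All names here are assumptions for illustration.

class Model:
    """Holds virtual coordinates 704 and the selected item 706."""
    def __init__(self, cols, rows):
        self.cols, self.rows = cols, rows
        self.x = self.y = 0
        self.listeners = []            # views subscribed to model events
    def move(self, dx, dy):
        # clamp so navigation cannot go beyond the edge of the data set
        self.x = min(max(self.x + dx, 0), self.cols - 1)
        self.y = min(max(self.y + dy, 0), self.rows - 1)
        for callback in self.listeners:
            callback(self.x, self.y)
    @property
    def selected(self):
        return (self.x, self.y)

class Controller:
    """Converts input messages 710 into relative coordinate changes."""
    def __init__(self, model):
        self.model = model
    def on_message(self, dx, dy):
        self.model.move(dx, dy)

class View:
    """Subscribes to model events and tracks the grid center to draw."""
    def __init__(self, model):
        self.center = model.selected
        model.listeners.append(self.update)
    def update(self, x, y):
        self.center = (x, y)
```

In this arrangement the view never touches the input directly: the controller submits deltas to the model, and the model notifies the subscribed view, mirroring the flow in FIG. 7.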

In addition, the implementation may include a control interpreter for a remote input device, such as a touch panel using gesture interaction. Messages from the remote input device are communicated to the grid display implementation via interface software and/or hardware and are interpreted by the controller 708 for input into the model 702.

A method for optimizing the display and interface of a grid display will be described with reference to FIGS. 8 to 13.

Initially, at step 802, the total number of items in the database of graphical elements is determined. The data or graphical elements are arranged in a two-dimensional array that correlates to a two-dimensional grid of virtual space. The extent of the virtual space depends on the height and width of the individual grid elements multiplied by the number of rows and columns of data in the data set. The data set need not be arranged symmetrically in the horizontal and vertical dimensions. Each data item in the data set includes at least one of an image, a title, a rating, a uniform resource locator (URL), and other metadata associated with a particular piece of content, such as a feature film in the example described above.

FIG. 9 illustrates an embodiment of a data structure 900 using aspects of the present disclosure. Data structure 900 may include an arrangement of a plurality of data elements arranged based on a pattern for display. For example, data structure 900 may include a data set of 6,400 items with data elements arranged in an array of 80 columns by 80 rows (the row and column dimensions need not be the same).

Data structure 900 shows two-dimensional indexing for each data element 902 as it relates to the display area and the arrangement on the display. As one illustrative example of an embodiment, if each of the 80x80 elements has a virtual dimension of 150x200 pixels, the grid occupies a virtual space of 12,000x16,000 pixels. Rather than loading images for all 6,400 items across the full 12,000x16,000 pixels, the method of the present disclosure, at step 804, generates a first subset of graphical elements covering only a fraction of the entire space. This is implemented, at step 806, by selecting a data set "window" that constitutes a visible area and an additional border area, to facilitate caching sufficient to support smooth navigation to adjacent areas of the grid. FIG. 10 shows a single-element border 1002 around the visible area 1004. As described in more detail below, all elements 1001 within the border 1002 are created, but only the elements 1003 in the area 1004 are visible. In cases where it is desirable to cache a full screen's worth of information in any direction, the border area 1002 can be enlarged to support fast gesture movement across the entire screen width.
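The arithmetic of the example above, and the selection of a windowed subset of elements, can be sketched as follows. The window-selection helper is a hypothetical illustration; only the 80x80 grid and 150x200-pixel element sizes come from the text:

```python
# Worked numbers from the example: an 80x80 grid of elements, each
# 150x200 virtual pixels, yields a 12,000x16,000-pixel virtual space.
cols, rows = 80, 80
elem_w, elem_h = 150, 200
virtual_w, virtual_h = cols * elem_w, rows * elem_h
assert (virtual_w, virtual_h) == (12_000, 16_000)

# Rather than creating all 6,400 elements, only a window (visible area
# plus a border of pre-generated elements) is created. The helper below
# is an assumed illustration of selecting that window's indices.
def window_indices(center_col, center_row, vis_cols, vis_rows, border=1):
    """Return (col, row) indices of the elements to generate: the
    visible area centered on (center_col, center_row) plus the border."""
    half_c, half_r = vis_cols // 2, vis_rows // 2
    return [(c, r)
            for r in range(center_row - half_r - border,
                           center_row + half_r + border + 1)
            for c in range(center_col - half_c - border,
                           center_col + half_c + border + 1)]
```

With a 5x3 visible area and a single-element border, only 7x5 = 35 elements exist at a time, versus 6,400 in the full data set.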

Loading priorities for data (such as images) may be established. The images are preferentially loaded outward from the center toward the edge of the border 1002. Once the direction of movement is known, the image-loading priority is weighted toward the elements entering the view relative to the elements leaving the view. FIG. 10 further shows the visible area 1004, the border area 1002, and the non-generated virtual data space 1006. The generated visual elements are labeled AA (element 1001), BA (element 1005), and so on. It will be appreciated that graphical elements in the virtual data space 1006, such as element 1007, are not created and are designated as containers or placeholders. When element 1007 enters region 1002 in response to user input, the container will be loaded and the graphical element associated with it will be loaded or created.
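A hedged sketch of such a loading-priority scheme, assuming Manhattan distance from the center and a fixed bonus for elements lying in the direction of motion (i.e., entering the view), might be:

```python
# Illustrative sketch: order image loads outward from the center, with
# elements entering the view (in the direction of motion) prioritized
# over elements leaving it. The bonus value is an assumption.

def load_order(indices, center, motion=(0, 0), entering_bonus=2.0):
    """Sort element (col, row) indices so higher-priority loads come first."""
    cx, cy = center
    mx, my = motion
    def priority(idx):
        c, r = idx
        dist = abs(c - cx) + abs(r - cy)        # outward from center
        # positive dot product with motion => element is entering the view
        entering = (c - cx) * mx + (r - cy) * my > 0
        return dist - (entering_bonus if entering else 0.0)
    return sorted(indices, key=priority)
```

With no motion, loads proceed purely center-outward; once the drag direction is known, elements about to scroll into view jump ahead in the queue.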

In step 808, a portion of the first subset of graphical elements is displayed. As shown in FIG. 10, elements in the visible area 1004 are created and visible on the display, while elements of the pre-generated border area 1002 are created but not visible on the display. In step 810, the position of at least one graphical element in the first subset is adjusted to a center point on the display in response to user input, and in step 812 a second subset of graphical elements is displayed with the at least one graphical element at the center point. It will be appreciated that, as the elements are moved, the elements of the border area 1002 appear quickly in the visible area 1004 because they have already been created. For example, if the user selects element CE (37, 25) and drags it to the upper left, at least element GC (41, 23) will be moved from the visible area to the border area 1002. In addition, at least element AG (35, 27) will be moved from the border area 1002 to the visible area 1004, and since element AG has already been created and cached in the border area, the transition will appear seamless.

Certain exceptions, such as when edge or corner data is accessed, can be handled. The grid display is configured to avoid navigating beyond the edge. The extreme position at the edge will ideally be centered (and highlighted as the selected item) in the view, as shown in FIG. 11. Here, element (1, 1) (element 1101) is centered in the visible area 1004. If the user selects element (1, 1) (element 1101) and attempts to move toward the lower right corner of the display, the elements will either stay in the same position or be locked. In this way, user input will not result in a blank screen.

FIG. 12 is a flowchart of an optimization process for displaying a portion of a larger database or library in a grid display using aspects of the present disclosure. The optimization involves reusing visual elements rather than allocating and de-allocating visual elements as needed. The method begins at step 1202 and proceeds at step 1204 with repositioning of an element based on a change, i.e., in response to user input. In step 1206, it is determined whether the position of the element exceeds the border area 1002. When the display element moves out of the pre-generated border area 1002, the element is moved to the opposite edge of the border area in step 1208. In step 1210, the view element queries the model for the data related to its new location in the virtual space. The virtual space position is determined from the current virtual position offset by the actual display coordinates.

FIG. 13 illustrates one exemplary embodiment of the movement of display elements in a grid display in response to user input that shifts the window of displayed elements within a database or library. The display diagram shows how, when the grid is moved diagonally, elements shift to the bottom and left to fill in the visual elements needed on the top and right. For example, element IB 1302 on the right side shifts its physical location to IB' 1304 on the left side. As part of this transition, the visual element queries the model to obtain the data related to its new virtual location (in the case of IB' 1304, the data element at (34, 22)). In the case of visual element IA 1306, there are actually two movements: one movement from right to left to IA' 1308, and another movement from top to bottom to the final position IA'' 1310. In this way, when an element is moved out of the border area 1002, it is moved to another location within the border area so that the underlying container or placeholder for the data or graphical element does not need to be unloaded; the container is reused, and only the graphical element, which is a less resource-intensive load than creating a new container, needs to be loaded.
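The wrap-to-the-opposite-edge recycling step described in FIGS. 12-13 can be sketched as a simple modular wrap of window-relative coordinates; the coordinate convention here is an illustrative assumption:

```python
# Hypothetical sketch of element recycling: when a visual element's
# window-relative position drifts past the pre-generated border, wrap
# it to the opposite edge of the border region; the caller then
# re-queries the model for the data at the new virtual position.

def recycle(col, row, min_c, max_c, min_r, max_r):
    """Wrap a window-relative (col, row) back inside the inclusive
    border region [min_c..max_c] x [min_r..max_r]."""
    span_c = max_c - min_c + 1
    span_r = max_r - min_r + 1
    new_c = (col - min_c) % span_c + min_c
    new_r = (row - min_r) % span_r + min_r
    return new_c, new_r
```

An element pushed one cell past the right edge reappears at the left edge (and analogously for a diagonal move affecting both axes), so the container itself is never destroyed and recreated.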

The grid display of the present disclosure can be used to browse content libraries of hundreds or even thousands of items, such as movies. Creating a visual element for every item can be processor-intensive. The techniques of this disclosure minimize the number of display elements required to create a grid display view while maintaining the illusion of navigating a much larger virtual space of information.

One embodiment of software code that can be used to operate a video display with a touch-activated screen as described above can be found in Provisional Patent Application No. 61/515,578 and PCT/US2010/049772. The software code represents a means for enabling and implementing the features of the various inventions taught herein, which were not known when PCT/US2010/049772 was filed. The software code is illustrative, and it will be understood by those skilled in the art that other software code may be developed to implement the features of the invention taught herein. Thus, the noted software code is deemed to be understood by one skilled in the art and need not be repeated herein.

So far, apparatus and methods have been described for navigating through and displaying a database or library of elements representing available content. The elements may be image objects such as album covers or movie posters. The structure arranges the objects in a grid display that uses two dimensions, vertical and horizontal, for navigation purposes. Navigational aspects associated with the user interface include gesture-based movements converted into display changes for the cover grid. Optimization of the view of the grid display implementation has been described such that the number of display elements is minimized and is independent of the number of items in the complete data set, and such that navigation through the database is smooth and efficient in the visual perception of the displayed portion.

Although presently preferred embodiments of the apparatus, method, and user interface for grid navigation have been described, which are intended to be illustrative and not limiting, it is noted that modifications and variations can be made by those skilled in the art in light of the foregoing teachings.

Through the present invention, it is desired to obtain a method and system capable of peeking at second or other desirable video content while viewing first video content.

A new user interface and display system for a video display device with a touch screen makes it possible to peek at, i.e., briefly watch, second or other desired video content while viewing first video content. During the video peek, video from the second video source partially displaces and temporarily replaces a portion of the video currently being watched. The selection of different video sources can be controlled, for example, by swiping with one, two, three, or four fingers or finger tips, and by swiping in from any of the sides of the video display. The video currently being watched can also be exchanged for the video being peeked at.

Publications of interest relating to aspects of video peeking, beyond patents, published applications, and basic touch screen operation, include: US7864163B2, related to detecting, on a touch screen display, at least a portion of a structured electronic document comprising a plurality of content boxes, and detecting a first gesture at a location of the displayed portion of the structured electronic document, such that a first box of the plurality of boxes at the location of the first gesture is determined and the first box is then enlarged and substantially centered on the touch screen display; US2010077433A1, related to a method for displaying program content from a subscription television service on a display and receiving a signal to initiate a multi-panel browsing mode on the display, the method including displaying a multi-panel view on the display, the multi-panel view including a panel having program content and a panel having a top program based on top program information received from a server, and additional panels in the multi-panel view possibly including interactive games or other content available from the subscription television service; US2003126605B1, related to an interactive television system designed to populate an electronic program guide (EPG) that provides video-clip previews on demand, in which, after browsing and navigating through the grid guide of the EPG to a highlighted cell of a program title, a video clip preview automatically starts by remaining in the highlighted cell during a predetermined delay, and the display process may be a "no-touch display" process that does not require any selection by the viewer while browsing; and "Indirect Multi-Touch Interaction for Brushing in Parallel Coordinates", Kosara, R., Univ. N. Carolina, Visualization and Data Analysis 2011, vol. 7868, SPIE-INT SOC OPTICAL ENGINEERING, 2011, regarding the use of multi-touch interaction to provide fast and convenient interaction with parallel coordinates by using a multi-touch trackpad rather than the screen directly, so that the user's hand does not obscure the visualization during the interaction, with the user using one, two, three, or four fingers on the trackpad to make complex selections in the data set.

Arrangements of the present invention for video peeking are a substantial improvement over, for example, picture-in-picture (PIP) functionality, because the PIP picture is so small that it has reduced resolution, while making the PIP picture larger impairs the relative completeness of the primary video source or main picture. The picture-in-picture function also does not provide video from a secondary video source that appears to flow in from a side of the video display, nor does it provide automatic disappearance of the video from the secondary video source in a flow opposite to its appearance. In contrast, PIP displays must be turned on and off by separate actuations of the remote controller. The same drawbacks are generally true of picture-outside-picture (POP) functions.

A user interface in accordance with an arrangement of the present invention comprises: a touch-enabled screen; a touch screen processor that detects swiping motions over areas of the screen, including distinguishing at least one of different directions of the swiping motions, different lengths of the swiping motions, and different widths of the swiping motions; and a video signal processor for selectively supplying a first video image to the screen and for selectively supplying at least one of a plurality of other video images to the screen. The at least one of the plurality of other video images is selectively supplied to the screen for a given interval of time in response to a swiping motion across the screen occurring within a given range of directions, lengths, and widths, and the other video image supplied to the screen is displayed in place of a portion of the first video image.

The touch screen processor preferably detects swiping motions across regions of the screen, including distinguishing at least two of the different directions of the swiping motions, the different lengths of the swiping motions, and the different widths of the swiping motions. The touch screen processor may also preferably detect swiping motions across regions of the screen, including distinguishing each of the different directions, the different lengths, and the different widths of the swiping motions.

The other video image is generally supplied to the screen by displacing a portion of the first video image in a sweeping motion corresponding to the direction of the swiping motion. At the end of the interval of time, the other video image supplied to the screen is withdrawn from view in a sweeping motion generally corresponding to the direction opposite the original sweeping motion.
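One way to picture the sweep-in, hold, and withdraw sequence is as a coverage fraction over time, i.e., how much of the first image is displaced at each instant. The sweep and hold durations below are assumptions for illustration only:

```python
# Illustrative peek timeline: the other video sweeps in, holds for the
# given interval, then withdraws in the opposite direction. Durations
# (0.3 s sweep, 3.0 s hold) are assumed values, not from the disclosure.

def peek_coverage(t, sweep=0.3, hold=3.0):
    """Fraction of the first image displaced at time t (seconds)."""
    if t < 0:
        return 0.0
    if t < sweep:                      # sweeping in
        return t / sweep
    if t < sweep + hold:               # holding during the interval
        return 1.0
    if t < 2 * sweep + hold:           # withdrawing, opposite direction
        return 1.0 - (t - sweep - hold) / sweep
    return 0.0                         # peek is over
```

Coverage rises linearly during the sweep, stays at full peek coverage during the hold, and falls back to zero as the image withdraws, matching the appearance/disappearance flow described in the text.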

Most, if not all, display devices have discernible sides, and each side of the screen can be the starting point for one, two, three, or more of the swiping motions.

In one presently preferred embodiment taught herein, each swiping motion is characterized by at least one of a point of origin, a swiping width, a swiping direction, and a swiping length. Various combinations of these characteristics of the swiping motions can preferably invoke different video images from among the plurality of other video images selectively supplied to the screen. For example, if the screen is generally rectangular, different combinations of swiping widths and swiping directions can provide different video images selected from any of at least eight different video sources.
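With four sides of origin and swipe widths of one to four fingers, sixteen distinct secondary sources can be addressed, as the later description notes. The following mapping is an illustrative assumption, not a mandated assignment:

```python
# Hypothetical sketch: associate a swipe's side of origin and width
# (finger count) with one of 4 x 4 = 16 secondary video sources.

SIDES = ("left", "right", "top", "bottom")

def select_source(side, fingers):
    """Return a source index 0-15 for a swipe from `side` made with
    `fingers` (1-4) finger tips, or None if the swipe is unrecognized."""
    if side not in SIDES or not 1 <= fingers <= 4:
        return None
    return SIDES.index(side) * 4 + (fingers - 1)
```

Every (side, fingers) pair maps to a distinct index, so a viewer can reach any of sixteen sources with a single gesture.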

Additional control may be provided in accordance with the inventive arrangements. For example, maintaining pressure on the screen at the end of the swiping motion for a given time interval can result in the other video image being substituted for the first video image. Alternatively, a swiping motion longer than that necessary to initiate displaying one of the other video images on the screen can result in that other video image being substituted for the first video image.

Arrangements of the invention may also be implemented in a method for controlling a video display with a touch-activated screen, comprising: displaying a first video image on a screen; detecting a swiping motion over a first area of the screen; distinguishing different possible origins of the swiping motion, different possible directions of the swiping motion, and different possible widths of the swiping motion; selecting at least one of a plurality of different video images in response to the distinguishing step; supplying the selected video image to the screen for a given interval of time in place of a portion of the first video image; and terminating the supplying upon expiration of the interval of time.

Where at least one range of lengths of the swiping motion is distinguished, arrangements of the present invention include: initiating the supplying step in response to detecting a first length of the swiping motion occurring within a first range of lengths; and replacing the first video image with the selected video image in response to detecting a second length of the swiping motion occurring within a second range of lengths outside the first range. Alternatively, the method may include replacing the first video image with the selected video image in response to detecting a user input other than the swiping motion.
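The two length ranges can be sketched as a simple classifier: a swipe in the first range triggers a temporary peek, while a swipe in the second, longer range replaces the first video entirely. The threshold fractions are assumptions for illustration:

```python
# Illustrative sketch of the two swipe-length ranges. Thresholds
# (10% of the screen to peek, 50% to replace) are assumed values.

def swipe_action(length, screen_extent, peek_min=0.10, swap_min=0.50):
    """Classify a swipe by its length as a fraction of the screen extent."""
    frac = length / screen_extent
    if frac >= swap_min:
        return "replace"   # second range: exchange the videos
    if frac >= peek_min:
        return "peek"      # first range: temporary partial display
    return "ignore"        # too short to act on
```

This keeps the peek gesture cheap while reserving the more committal replacement for a deliberately longer swipe.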

In a presently preferred embodiment, the supplying step includes gradually replacing a portion of the first video image by moving the selected video image in a direction generally corresponding to the direction of the swiping motion, while the terminating step includes gradually replacing the selected video image with the portion of the first video image by moving the selected video image in a direction generally opposite to the direction of the swiping motion. The method preferably includes associating different ones of the plurality of other video images with different combinations of the different possible origins, directions, and widths of the swiping motion.

A new user interface and display system for a video display device with a touch screen makes it possible to peek at, i.e., briefly watch, second or other desired video content while viewing first video content.

FIG. 1 is a block diagram of an exemplary system for delivering video content in accordance with the present disclosure.
FIG. 2 is a block diagram of an exemplary receiving device in accordance with the present disclosure.
FIG. 3A is a perspective view of a touch panel in accordance with the present disclosure.
FIG. 3B is a perspective view of a wireless portable angle-sensing remote controller and illustrates a number of exemplary gestures performed with the remote controller.
FIG. 4 is a graphical flow diagram of the operation of an exemplary user interface in accordance with an exemplary embodiment of the present disclosure.
FIG. 5 illustrates an exemplary embodiment of a user interface of the present disclosure.
FIG. 6 illustrates user actions and navigation of a user interface in accordance with one exemplary embodiment of the present disclosure.
FIG. 7 illustrates a state control diagram for an exemplary embodiment of a user interface in accordance with the present disclosure.
FIG. 8 is a flow diagram of an exemplary process for optimizing a user interface in accordance with one embodiment of the present disclosure.
FIG. 9 illustrates two-dimensional indexing for each data element of the user interface.
FIG. 10 illustrates a visible-area window and a border area of generated graphical elements for a user interface, in accordance with an exemplary embodiment of the present disclosure.
FIG. 11 illustrates a view of a user interface in accordance with one exemplary embodiment of the present disclosure.
FIG. 12 is a flow diagram of an exemplary process for optimizing a user interface in accordance with another embodiment of the present disclosure.
FIG. 13 illustrates the movement of graphical elements in a grid of a user interface in accordance with one exemplary embodiment of the present disclosure.
FIG. 14 illustrates a video display useful for describing the user interface of one exemplary embodiment in accordance with the present disclosure.
FIG. 15 illustrates exemplary single- and multiple-finger swipes of respective widths corresponding to one, two, three, and four fingers in accordance with the inventive arrangements.
FIGS. 16A-16D illustrate various video display alternatives according to the flow chart shown in FIG. 14.
FIGS. 17A-17E sequentially illustrate video peeking in accordance with the present disclosure; for purposes of the present application, the color pictures are shown in gray scale.

FIGS. 1-3 and the accompanying description describe the general operation of a video tablet in the context of methods and apparatus for grid navigation of a video tablet comprising a touch-enabled user interface.

FIGS. 4-7 and the accompanying description describe methods and apparatus for implementing the video peeking feature described above.

In the drawings, like reference numerals refer to like elements throughout the drawings.

It is to be understood that the drawings are for the purpose of illustrating the concepts of the present disclosure and do not necessarily represent the only possible configuration for illustrating and implementing the present disclosure.

The arrangements of the present invention provide a user interface and display system for a video display device that make it possible to briefly view, or "peek" at, preferred and possibly pleasing video content while viewing other content, referred to herein as video peeking. Such video displays can receive video content from multiple video sources and can be placed close enough to a user for touch screen activation and control; examples include, for example and without limitation, video tablets, video pads, smart phones, e-books, laptop computers, computer monitors, and various television devices. Video content may be thought of as primary and secondary videos or video sources, or more generally as a first video source and a plurality of other video sources. Swipe commands on the touch screen interface may, for example, enable a user to peek at video from one of a plurality of different video sources for an interval of time.

According to the arrangements of the present invention, video from one of the other video sources will appear to partially displace and temporarily replace a portion of the video being watched. The selection among the other video sources can be controlled, for example, by swiping with one, two, three, or four fingertips. The selection can be further controlled, for example, by swiping in from any of the four sides of the rectangular video display. In this way, based on how many fingertips are swiped at the same time and on which side of the video display the swipe originates, a user watching video from a primary or first video source can easily peek at video from any of up to 16 different or secondary video sources. In addition, the touch screen may be further programmed to enable exchanging the "peeked" video from the secondary source with the video from what was previously the primary video source. Such an exchange may be implemented, for example and without limitation, with a longer swipe than that required for the peek. The video from what has become the secondary source can then itself be peeked at accordingly.

Video peeking is described in more detail in conjunction with FIGS. 14-17. In one basic embodiment of video peeking, a user watching a first video (or first video content) from a first video source sees video from one of a plurality of other video sources (or second video content) smoothly and partially displace and temporarily replace a portion of the watched video, followed by a smooth disappearance of the other selected video. The content of the first video source may also be considered the primary video source.

FIG. 14 is a flowchart 1400 illustrating the use of the swiping described above to implement and control video peeking of selected video sources in a video tablet or similarly capable device. In step 1402, the user selects a plurality of video sources that can be viewed in accordance with the inventive arrangements. These sources of video content can be considered "favorite" or "preferred" sources that the user expects to watch from time to time by video peeking, while still watching video content from another source. As noted in the flow chart, the viewer can assign a first video source, referred to as video source 1; four additional video sources can be assigned to vertical video peeking as video sources 2-5; and four additional video sources can be assigned to horizontal video peeking as video sources 6-9. Accordingly, when the user starts the display of video source 1 in accordance with step 1404, and video source 1 is displayed in accordance with step 1406, the user has available a set of eight different channels or sources of video content that can be viewed briefly by video peeking without completely disturbing the display of video source 1. The user can choose to implement horizontal video peeking by swiping from the left or right edges or sides of the video display in accordance with step 1408, or vertical video peeking by swiping from the top or bottom edges or sides in accordance with step 1412.

As represented by the blocks associated with reference numeral 1410, swiping one, two, three, or four fingers horizontally will initiate video peeks of video content or video sources 6, 7, 8, and 9, respectively. As represented by the blocks associated with reference numeral 1414, swiping one, two, three, or four fingers vertically will initiate video peeks of video content or video sources 2, 3, 4, and 5, respectively. It will be appreciated that many types of video displays are provided with the ability to rotate video content to compensate for rotation of the video display device, and that the arrangements of the present invention will be operable in conjunction with control of such rotation.
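The source assignment of flowchart 1400 can be summarized in code. The following is an illustrative sketch only; the function name and the use of direction strings are assumptions, not part of the disclosure:

```python
# Sketch of the FIG. 14 source assignment: vertical swipes with 1-4
# fingers peek video sources 2-5; horizontal swipes with 1-4 fingers
# peek video sources 6-9. Video source 1 is the primary video.
def peek_source(direction, finger_count):
    """Return the video source number to peek at, or None if unassigned."""
    if finger_count not in (1, 2, 3, 4):
        return None
    if direction in ("up", "down"):        # vertical video peeking
        return 1 + finger_count            # sources 2-5
    if direction in ("left", "right"):     # horizontal video peeking
        return 5 + finger_count            # sources 6-9
    return None
```

With this mapping, the one-finger downward swipe of FIG. 16A yields source 2, and the two-finger swipe to the left of FIG. 16B yields source 7, matching the examples discussed below.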

In the embodiment shown in FIG. 14, there are four different sources of video content that can be displayed in response to swiping in the up or down direction, and four sources that can be displayed in response to swiping in the left or right direction. The number of sources in FIG. 14 is not limiting for the purposes of the present invention. In fact, by distinguishing, for example, swiping in the left direction from swiping in the right direction, and swiping in the up direction from swiping in the down direction, one of 16 different sources of video content could be invoked for video peeking. In practice, however, the number of different sources of video content will be limited by certain practical and personal considerations. One such consideration is how many combinations of directions and numbers of fingers can be remembered by any given user. Another is the user's manual dexterity. Thus, with respect to flowchart 1400, it will be understood that selecting fewer than eight other sources of video content still falls within the scope of the inventive arrangements. Indeed, even video peeking at a single other source of video content is within the scope of the present invention.

FIG. 15 shows a video display device 1502. The display device has an upper edge or side 1504, a right side or edge 1506, a lower edge or side 1508, and a left side or edge 1510. Each of the dashed circles 1512, 1514, 1516, and 1518 corresponds to a location on the touch-activated display where a swiping motion can be initiated. The circles are not intended to represent exact size or shape. The dashed circles may be displayed as a temporary training measure, may be displayed permanently, or may remain undisplayed. The arrows indicate the swipe directions. The arrow associated with dashed circle 1512 indicates a one-finger swipe. The arrows associated with dashed circle 1514 indicate a two-finger swipe. The arrows associated with dashed circle 1516 indicate a three-finger swipe. The arrows associated with dashed circle 1518 indicate a four-finger swipe.
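The swipe-origin regions of FIG. 15 suggest a simple classification step: determine which edge region the initial touch points fall in, and count the fingertips. A minimal sketch follows; the 10% edge margin, the function name, and the coordinate convention (origin at top-left, y increasing downward) are assumptions for illustration, not details from the disclosure:

```python
# Hypothetical classifier for the swipe-origin regions of FIG. 15:
# given the starting points of the touches and the screen size, decide
# which edge the swipe originated from and how many fingertips touched.
EDGE_MARGIN = 0.1  # fraction of the screen treated as an edge region (assumed)

def classify_swipe_start(points, width, height):
    """points: list of (x, y) initial touch positions.
    Returns (edge, finger_count), where edge is 'top', 'bottom',
    'left', 'right', or None if the touches are not near an edge."""
    n = len(points)
    cx = sum(p[0] for p in points) / n   # centroid of the touch points
    cy = sum(p[1] for p in points) / n
    if cy < height * EDGE_MARGIN:
        return ("top", n)
    if cy > height * (1 - EDGE_MARGIN):
        return ("bottom", n)
    if cx < width * EDGE_MARGIN:
        return ("left", n)
    if cx > width * (1 - EDGE_MARGIN):
        return ("right", n)
    return (None, n)
```

A real touch screen processor would additionally track the subsequent motion of the points to obtain the swipe direction and length, as recited in the claims.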

The practical application of the flowchart 1400 of FIG. 14 is shown in FIGS. 16A-16D. Per the flowchart 1400 of FIG. 14 and FIG. 16A, a one-finger wide downward swipe will invoke video peeking of video source 2 into video source 1. Per the flowchart 1400 of FIG. 14 and FIG. 16B, a two-finger wide swipe to the left will invoke video peeking of video source 7 into video source 1. Per the flowchart 1400 of FIG. 14 and FIG. 16C, a three-finger wide swipe will invoke video peeking of video source 4 into video source 1. Per the flowchart 1400 of FIG. 14 and FIG. 16D, a four-finger wide swipe to the right will invoke video peeking of video source 9 into video source 1.

It will be noted that the association of swiping width and swiping direction can be made in a way that optimizes the value of video peeking. For example, FIG. 16A shows video peeking downward from the top side or edge. Tickers or banners are typically displayed underneath news and weather related video content. Thus, it is advantageous for the user to assign sources of news and weather related video content to swipes directed downward, so that the ticker or banner is fully visible. Similarly, for other sources of video content, status information such as sporting events and scores is often displayed at the top of the video content, or in its upper left corner. A user familiar with the display practices for broadcasts of the user's preferred teams can likewise select swiping options that provide immediate information about the status of sporting events.

FIGS. 17A-17E are a sequence 1700 of video frames 1702, 1704, 1706, 1708, and 1710 extracted from a video clip to visually illustrate video peeking. In each of the video frames, the content of the first video source (video source 1, or the primary video or primary video source) is referred to as 1712. In frame 1702, only video source 1, in this example a penguin, is displayed. As described in connection with FIGS. 15 and 16A-16D, a circle 1716 and an arrow 1718 are also shown in video frame 1702. In frame 1704, and in response to a swiping motion in the direction of arrow 1718, a portion of the primary video 1712 is partially displaced and temporarily replaced by another video content 1714, in this example a baseball game moving in from the right side. In frame 1706, the displacement and replacement of a portion of the primary video 1712 is complete. In video frame 1708, the other video 1714 recedes, its left edge moving back to the right so that less of it is shown. In frame 1710, only the primary video 1712 is once again displayed.
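The progression of frames 1702-1710 can be modeled as a timeline in which the secondary video slides in, remains on screen for the peek interval, and then recedes. The sketch below assumes linear motion and illustrative durations; none of these values come from the disclosure:

```python
# Sketch of the peek timeline illustrated in FIGS. 17A-17E: the
# secondary video slides in, holds for the peek interval, then recedes.
SLIDE_IN, HOLD, SLIDE_OUT = 0.5, 2.0, 0.5   # seconds (assumed values)

def peek_coverage(t, max_fraction=1.0):
    """Fraction of the screen width covered by the secondary video
    at time t seconds after the swipe is recognized."""
    if t < 0:
        return 0.0
    if t < SLIDE_IN:                         # frame 1704: sliding in
        return max_fraction * t / SLIDE_IN
    if t < SLIDE_IN + HOLD:                  # frame 1706: fully shown
        return max_fraction
    if t < SLIDE_IN + HOLD + SLIDE_OUT:      # frame 1708: receding
        return max_fraction * (1 - (t - SLIDE_IN - HOLD) / SLIDE_OUT)
    return 0.0                               # frame 1710: primary only
```

A display pipeline would sample this function each frame to position the left edge of the secondary video, yielding the smooth displacement and smooth disappearance described above.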

According to presently preferred embodiments, it is contemplated that a video peek may last for an interval of time, which may be adjustable. According to further presently preferred embodiments, the user can, for example, hold the video peek by continuing to press the touch screen at the end of the swipe, so that the video peek continues until the user stops pressing the screen. According to presently preferred embodiments, a user who decides to watch the peeked video in the full screen can do so, for example, by swiping across the entire screen, or by pressing at the end of the swiping motion. In this regard, it is also contemplated that the extent to which the secondary video displaces and replaces the primary video can be controlled by the length of the swipe.
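The behaviors described in this paragraph, a timed peek for a short swipe, a held peek while pressure is maintained, and a full exchange for a longer swipe, can be sketched as a small decision function. The threshold values and names below are assumptions for illustration, not values from the disclosure:

```python
# Illustrative decision logic: a short swipe peeks for a timed
# interval, holding the finger down prolongs the peek, and a swipe
# long enough (e.g. across most of the screen) swaps the secondary
# video in as the new primary video. Thresholds are assumed.
PEEK_THRESHOLD = 0.15   # minimum swipe length, as a fraction of the screen
SWAP_THRESHOLD = 0.75   # swipe at least this far to replace the primary video

def swipe_action(swipe_fraction, still_pressed):
    """swipe_fraction: swipe length / screen dimension, in [0, 1].
    still_pressed: True while the user keeps a finger on the screen."""
    if swipe_fraction >= SWAP_THRESHOLD:
        return "swap"            # secondary becomes the primary video
    if swipe_fraction >= PEEK_THRESHOLD:
        return "hold_peek" if still_pressed else "timed_peek"
    return "ignore"              # too short to count as a swipe
```

The same length measurement could also drive how far the secondary video is allowed to intrude, implementing the swipe-length control mentioned at the end of the paragraph.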

According to the arrangements of the invention, the user can advantageously watch, for example and without limitation, sports, news, or entertainment channels, and can peek at other videos to check scores and breaking news, or even to determine whether a commercial on an entertainment video has ended, returning to the primary video source accordingly.

102: content source 104: broadcast affiliate manager
106, 112: delivery network 108: receiving device
110: content manager 114: display device
202: input signal receiver 204: input stream processor
206: audio processor 208: audio interface
210: video processor 212: storage device
214: controller 216: user interface
320: bumping 330: checking

Claims (20)

  1. A user interface for a video display, comprising:
    a touch-activated screen;
    a touch screen processor capable of detecting swiping motions over regions of the screen, the touch screen processor distinguishing at least one of: different directions of the swiping motions, different lengths of the swiping motions, and different widths of the swiping motions; and
    a video signal processor for selectively supplying a first video image to the screen and selectively supplying at least one video image of a plurality of other video images to the screen,
    wherein the at least one video image of the plurality of other video images is selectively supplied to the screen for a given interval of time in response to a swiping motion across the screen occurring within a given range of the directions, the lengths, and the widths, and
    wherein the other video image supplied to the screen is displayed in place of a portion of the first video image.
  2. The user interface of claim 1, wherein the touch screen processor distinguishes at least two of: the different directions of the swiping motions, the different lengths of the swiping motions, and the different widths of the swiping motions.
  3. The user interface of claim 1, wherein the touch screen processor distinguishes each of: the different directions of the swiping motions, the different lengths of the swiping motions, and the different widths of the swiping motions.
  4. The user interface of claim 1, wherein the other video image supplied to the screen displaces the portion of the first video image with a sweeping motion generally corresponding to the direction of the swiping motion.
  5. The user interface of claim 2, wherein at the end of the interval of time, the other video image supplied to the screen recedes from view with a sweeping motion generally corresponding to the direction opposite the direction of the swiping motion.
  6. The user interface of claim 1, wherein the screen has discernible sides, each side of the screen being a starting point for at least one of the swiping motions.
  7. The user interface of claim 2, wherein:
    each side of the screen is a starting point for at least two of the swiping motions, and
    different combinations of the swiping widths and the swiping directions result in different ones of the plurality of other video images being selectively supplied to the screen.
  8. The user interface of claim 3, wherein:
    each side of the screen is a starting point for at least three of the swiping motions, and
    different combinations of the swiping widths and the swiping directions result in different ones of the plurality of other video images being selectively supplied to the screen.
  9. The user interface of claim 8, wherein:
    the screen is generally rectangular, and
    the different combinations of the swiping widths and the swiping directions can provide the other video images selected from any one of at least eight different video sources.
  10. The user interface of claim 8, wherein at least eight different video sources can be assigned to at least eight different combinations of the swiping widths and the swiping directions.
  11. The user interface of claim 1, wherein maintaining pressure on the screen at the end of a swiping motion for a given time interval results in the other video image being substituted for the first video image.
  12. The user interface of claim 3, wherein a swiping motion longer than that required to initiate displaying an image of one of the other video images on the screen results in the image of that one of the other video images being substituted for the first video image.
  13. A method for controlling a video display having a touch-enabled screen, the method comprising:
    displaying a first video image on the screen;
    detecting a swiping motion over a first area of the screen;
    distinguishing different possible origins of the swiping motion, different possible directions of the swiping motion, and different possible widths of the swiping motion;
    selecting at least one of a plurality of different video images in response to the distinguishing;
    supplying the selected video image to the screen for a given interval of time in place of a portion of the first video image; and
    terminating the supplying at the expiration of the interval of time.
  14. The method of claim 13, further comprising distinguishing at least one range of the length of the swiping motion.
  15. The method of claim 14, wherein distinguishing at least one range of the length of the swiping motion comprises:
    initiating the supplying in response to detecting a first length of the swiping motion occurring within a first range of lengths; and
    replacing the first video image with the selected video image in response to detecting a second length of the swiping motion occurring within a second range of lengths other than the first range.
  16. The method of claim 13, further comprising replacing the first video image with the selected video image in response to detecting a user input different from the swiping motion.
  17. The method of claim 13, wherein the supplying comprises gradually replacing the portion of the first video image by moving the selected video image in a direction generally corresponding to the direction of the swiping motion.
  18. The method of claim 13, wherein the terminating comprises progressively replacing the selected video image with the portion of the first video image by moving the selected video image generally in a direction opposite the direction of the swiping motion.
  19. The method of claim 13, further comprising associating different ones of the plurality of different video images with different combinations of the different possible origins, directions, and widths of the swiping motion.
  20. The method of claim 14, wherein:
    the supplying further comprises gradually replacing the portion of the first video image by moving the selected video image in a direction generally corresponding to the direction of the swiping motion, and
    the terminating comprises gradually replacing the selected video image with the portion of the first video image by moving the selected video image generally in a direction opposite the direction of the swiping motion.
KR1020147002241A 2011-08-05 2012-02-21 Video peeking KR20140044881A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US201161515578P true 2011-08-05 2011-08-05
US61/515,578 2011-08-05
PCT/US2012/025878 WO2013022486A1 (en) 2011-08-05 2012-02-21 Video peeking

Publications (1)

Publication Number Publication Date
KR20140044881A true KR20140044881A (en) 2014-04-15

Family

ID=45809665

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020147002241A KR20140044881A (en) 2011-08-05 2012-02-21 Video peeking

Country Status (7)

Country Link
US (1) US9152235B2 (en)
EP (1) EP2740264B1 (en)
JP (1) JP6050352B2 (en)
KR (1) KR20140044881A (en)
CN (1) CN103797784A (en)
BR (1) BR112014002039A2 (en)
WO (1) WO2013022486A1 (en)

Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9471145B2 (en) 2011-01-06 2016-10-18 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9423878B2 (en) * 2011-01-06 2016-08-23 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9465440B2 (en) 2011-01-06 2016-10-11 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
AU2013259613B2 (en) 2012-05-09 2016-07-21 Apple Inc. Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface
US20130300697A1 (en) * 2012-05-14 2013-11-14 Samsung Electronics Co. Ltd. Method and apparatus for operating functions of portable terminal having bended display
US9582157B1 (en) * 2012-08-03 2017-02-28 I4VU1, Inc. User interface and program guide for a multi-program video viewing apparatus
CN108845748A (en) 2012-12-29 2018-11-20 苹果公司 For abandoning generating equipment, method and the graphic user interface of tactile output for more contact gestures
US20140298245A1 (en) * 2013-03-28 2014-10-02 Microsoft Corporation Display Instance Management
KR101799294B1 (en) * 2013-05-10 2017-11-20 삼성전자주식회사 Display appratus and Method for controlling display apparatus thereof
US9071798B2 (en) 2013-06-17 2015-06-30 Spotify Ab System and method for switching between media streams for non-adjacent channels while providing a seamless user experience
US9516082B2 (en) 2013-08-01 2016-12-06 Spotify Ab System and method for advancing to a predefined portion of a decompressed media stream
KR20150024637A (en) * 2013-08-27 2015-03-09 삼성전자주식회사 Method for displaying data and an electronic device thereof
US9529888B2 (en) 2013-09-23 2016-12-27 Spotify Ab System and method for efficiently providing media and associated metadata
US9917869B2 (en) 2013-09-23 2018-03-13 Spotify Ab System and method for identifying a segment of a file that includes target content
US20150100885A1 (en) * 2013-10-04 2015-04-09 Morgan James Riley Video streaming on a mobile device
US9063640B2 (en) * 2013-10-17 2015-06-23 Spotify Ab System and method for switching between media items in a plurality of sequences of media items
US9686581B2 (en) * 2013-11-07 2017-06-20 Cisco Technology, Inc. Second-screen TV bridge
US9990125B2 (en) * 2013-11-15 2018-06-05 Lg Electronics Inc. Mobile terminal and method of controlling the same
KR101522468B1 (en) * 2013-12-05 2015-05-28 네이버 주식회사 Video transition method and video transition system
USD765139S1 (en) * 2013-12-24 2016-08-30 Tencent Technology (Shenzhen) Company Limited Portion of a display screen with graphical user interface
USD770325S1 (en) * 2013-12-24 2016-11-01 Tencent Technology (Shenzhen) Company Limited Penguin figurine
US10222935B2 (en) 2014-04-23 2019-03-05 Cisco Technology Inc. Treemap-type user interface
KR20150142347A (en) * 2014-06-11 2015-12-22 삼성전자주식회사 User terminal device, and Method for controlling for User terminal device, and multimedia system thereof
USD770492S1 (en) * 2014-08-22 2016-11-01 Google Inc. Portion of a display panel with a computer icon
KR20160028272A (en) * 2014-09-03 2016-03-11 삼성전자주식회사 Display apparatus and method for controlling the same
US10095396B2 (en) 2015-03-08 2018-10-09 Apple Inc. Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object
US9645732B2 (en) 2015-03-08 2017-05-09 Apple Inc. Devices, methods, and graphical user interfaces for displaying and using menus
US9632664B2 (en) 2015-03-08 2017-04-25 Apple Inc. Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
US9891811B2 (en) 2015-06-07 2018-02-13 Apple Inc. Devices and methods for navigating between user interfaces
US10200598B2 (en) 2015-06-07 2019-02-05 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US9860451B2 (en) * 2015-06-07 2018-01-02 Apple Inc. Devices and methods for capturing and interacting with enhanced digital images
US10346030B2 (en) 2015-06-07 2019-07-09 Apple Inc. Devices and methods for navigating between user interfaces
JP2017000480A (en) * 2015-06-11 2017-01-05 株式会社バンダイナムコエンターテインメント Terminal device and program
US9652125B2 (en) * 2015-06-18 2017-05-16 Apple Inc. Device, method, and graphical user interface for navigating media content
US10416800B2 (en) 2015-08-10 2019-09-17 Apple Inc. Devices, methods, and graphical user interfaces for adjusting user interface objects
US9990113B2 (en) 2015-09-08 2018-06-05 Apple Inc. Devices, methods, and graphical user interfaces for moving a current focus using a touch-sensitive remote control
US9928029B2 (en) 2015-09-08 2018-03-27 Apple Inc. Device, method, and graphical user interface for providing audiovisual feedback
USD811428S1 (en) * 2015-09-24 2018-02-27 4Thought Sa Display screen or portion thereof with transitional graphical user interface
CN106791352A (en) * 2015-11-25 2017-05-31 中兴通讯股份有限公司 A kind of photographic method, device and terminal
US10397632B2 (en) * 2016-02-16 2019-08-27 Google Llc Touch gesture control of video playback
USD833454S1 (en) * 2016-05-27 2018-11-13 Axis Ab Display screen or portion thereof with graphical user interface
US10372520B2 (en) 2016-11-22 2019-08-06 Cisco Technology, Inc. Graphical user interface for visualizing a plurality of issues with an infrastructure
US20180217719A1 (en) * 2017-02-01 2018-08-02 Open Tv, Inc. Menu modification based on controller manipulation data
USD831700S1 (en) * 2017-07-31 2018-10-23 Shenzhen Valuelink E-Commerce Co., Ltd. Display screen or portion thereof with graphical user interface

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60124196U (en) 1984-01-30 1985-08-21
JPH11103470A (en) * 1997-09-26 1999-04-13 Toshiba Ave Co Ltd Video changeover processing device
US20030126605A1 (en) 2001-12-28 2003-07-03 Betz Steve Craig Method for displaying EPG video-clip previews on demand
JP3925297B2 (en) * 2002-05-13 2007-06-06 ソニー株式会社 Video display system and video display control device
GB0303888D0 (en) 2003-02-19 2003-03-26 Sec Dep Acting Through Ordnanc Image streaming
US7975531B2 (en) 2005-03-18 2011-07-12 Nanyang Technological University Microfluidic sensor for interfacial tension measurement and method for measuring interfacial tension
US9041744B2 (en) 2005-07-14 2015-05-26 Telecommunication Systems, Inc. Tiled map display on a wireless device
US7532253B1 (en) * 2005-07-26 2009-05-12 Pixelworks, Inc. Television channel change picture-in-picture circuit and method
JP2007334525A (en) * 2006-06-14 2007-12-27 Sofny Group:Kk Computer, client/server computer group, server computer, display program, and display representation method
US7864163B2 (en) 2006-09-06 2011-01-04 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US8564543B2 (en) * 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
US7667719B2 (en) 2006-09-29 2010-02-23 Amazon Technologies, Inc. Image-based document display
JP4973245B2 (en) * 2007-03-08 2012-07-11 富士ゼロックス株式会社 Display device and program
US8194037B2 (en) 2007-12-14 2012-06-05 Apple Inc. Centering a 3D remote controller in a media system
US8250604B2 (en) 2008-02-05 2012-08-21 Sony Corporation Near real-time multiple thumbnail guide with single tuner
JP5039903B2 (en) 2008-02-18 2012-10-03 International Business Machines Corporation System, method and program for executing application
JP5016553B2 (en) * 2008-05-28 2012-09-05 京セラ株式会社 Mobile communication terminal and terminal operation method
US20090328101A1 (en) * 2008-06-30 2009-12-31 Nokia Corporation User interface for mobile tv interactive services
KR101526973B1 (en) * 2008-07-07 2015-06-11 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20100077433A1 (en) 2008-09-24 2010-03-25 Verizon Data Services Llc Multi-panel television browsing
KR101588660B1 (en) * 2008-09-30 2016-01-28 삼성전자주식회사 A display apparatus capable of moving image and the method thereof
JP4666053B2 (en) * 2008-10-28 2011-04-06 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5202425B2 (en) * 2009-04-27 2013-06-05 三菱電機株式会社 Video surveillance system
JP5179537B2 (en) * 2010-04-09 2013-04-10 株式会社ソニー・コンピュータエンタテインメント Information processing device
JP5541998B2 (en) * 2010-07-28 2014-07-09 株式会社ソニー・コンピュータエンタテインメント Information processing device
JP2012038271A (en) * 2010-08-11 2012-02-23 Kyocera Corp Electronic apparatus and method for controlling the same
US20120069055A1 (en) * 2010-09-22 2012-03-22 Nikon Corporation Image display apparatus
JP5678576B2 (en) * 2010-10-27 2015-03-04 ソニー株式会社 Information processing apparatus, information processing method, program, and monitoring system
US9471145B2 (en) * 2011-01-06 2016-10-18 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US9423878B2 (en) * 2011-01-06 2016-08-23 Blackberry Limited Electronic device and method of displaying information in response to a gesture
US20130141371A1 (en) * 2011-12-01 2013-06-06 Research In Motion Limited Electronic device and method of displaying information in response to a gesture
US20120262462A1 (en) * 2011-04-18 2012-10-18 Johan Montan Portable electronic device for displaying images and method of operation thereof
CN104040292B (en) * 2012-01-12 2016-05-04 三菱电机株式会社 Map display and map-indication method
KR20130090138A (en) * 2012-02-03 2013-08-13 삼성전자주식회사 Operation method for plural touch panel and portable device supporting the same
JP5882779B2 (en) * 2012-02-15 2016-03-09 キヤノン株式会社 Image processing apparatus, image processing apparatus control method, and program
JP5598737B2 (en) * 2012-02-27 2014-10-01 カシオ計算機株式会社 Image display device, image display method, and image display program
CN102929527A (en) * 2012-09-27 2013-02-13 鸿富锦精密工业(深圳)有限公司 Device with picture switching function and picture switching method
CN103902080A (en) * 2012-12-27 2014-07-02 华硕电脑股份有限公司 Touch device and touch processing method
KR102010955B1 (en) * 2013-01-07 2019-08-14 삼성전자 주식회사 Method for controlling preview of picture taken in camera and mobile terminal implementing the same
KR20150024637A (en) * 2013-08-27 2015-03-09 삼성전자주식회사 Method for displaying data and an electronic device thereof

Also Published As

Publication number Publication date
CN103797784A (en) 2014-05-14
JP2014529212A (en) 2014-10-30
US9152235B2 (en) 2015-10-06
WO2013022486A1 (en) 2013-02-14
EP2740264B1 (en) 2016-10-19
BR112014002039A2 (en) 2017-03-01
EP2740264A1 (en) 2014-06-11
JP6050352B2 (en) 2016-12-21
US20140176479A1 (en) 2014-06-26

Similar Documents

Publication Publication Date Title
KR101233562B1 (en) Gui applications for use with 3d remote controller
US8555315B2 (en) Systems and methods for navigating a media guidance application with multiple perspective views
US7159177B2 (en) System and method for focused navigation within a user interface
US8151215B2 (en) Favorite GUI for TV
US8504939B2 (en) Vertical click and drag to drill down into metadata on user interface for audio video display device such as TV
JP2013509008A (en) System and method for searching the internet on video devices
JP2007516496A (en) Control framework with zoomable graphical user interface for organizing, selecting and starting media items
KR101939316B1 (en) Systems and methods for navigating a three-dimensional media guidance application
KR20140121387A (en) Method and system for providing a display of social messages on a second screen which is synched to content on a first screen
US20080065989A1 (en) Playlist creation tools for television user interfaces
US7761812B2 (en) Media user interface gallery control
US20060248475A1 (en) Graphical user interface system
US20140059484A1 (en) Voice and video control of interactive electronically simulated environment
JP2007073053A (en) Method for selecting button in graphical bar and receiver implementing the same
US7810043B2 (en) Media user interface left/right navigation
JP2008541232A (en) Method and system for scrolling and pointing in a user interface
US8180672B2 (en) Systems and methods for placing advertisements
US20040123320A1 (en) Method and system for providing an interactive guide for multimedia selection
AU2006252194B2 (en) Scrolling Interface
JP2007108805A (en) Electronic equipment, display control method for electronic equipment, graphical user interface, and display control program
RU2530284C2 (en) User interface having zoom functionality
US20070067798A1 (en) Hover-buttons for user interfaces
US20110289458A1 (en) User interface animation for a content system
KR100904151B1 (en) Method of selecting a scheduled content item, method of accessing scheduled content data, method of displaying a hierarchical program guide, method of controlling a hierarchical program guide, computer readable medium, system for selecting a scheduled content item, system for accessing scheduled content data, and system for controlling a hierarchical program guide
KR101190462B1 (en) Scaling and layout methods and systems for handling one-to-many objects

Legal Events

Date Code Title Description
A201 Request for examination
WITB Written withdrawal of application