WO2016074120A1 - Scrolling animation or video playback controlled by touch user interface - Google Patents

Scrolling animation or video playback controlled by touch user interface

Info

Publication number
WO2016074120A1
WO2016074120A1 (PCT/CN2014/090689)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
moving
display area
display
response
Prior art date
Application number
PCT/CN2014/090689
Other languages
French (fr)
Inventor
Yu He
Yuan LIANG
Meng JIANG
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to PCT/CN2014/090689
Publication of WO2016074120A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

In a touchscreen approach to display of an animated sequence, a display area takes up less than all of the area of a display screen. The user touches the display area and moves the touch in a first direction. This causes the display area to move in a corresponding fashion, and during such motion, the animated sequence is displayed in a first time sequence within the display area. Movement by the user of the touch in the opposite direction may also cause the display area to move in a corresponding direction, and during such motion, the animated sequence is displayed in the opposite time sequence.

Description

Scrolling animation or video playback controlled by touch user interface Background
The recent great popularity of smart phones and tablets has prompted many investigators to try to devise better user interfaces for the display of animations and video. Most such user interfaces are little more than ports of existing user interfaces from desktop computer systems and notebook computer systems. A typical current user interface has the user start and stop the rendered animation or video by touching a play or pause icon on a touch screen; this is sometimes called “click to play”. Such an interface is not very flexible in terms of the types of user input and control to which it can respond. Another typical current interface is “auto play”, in which the animation or video plays automatically after the page (data file) has finished loading; this interface is often employed, for example, with animated GIF images. Still another current user interface, sometimes called “scroll to trigger automatic playback”, responds to the user picking a location on a timeline (either by tapping or scrolling), and the video starts playing automatically from that point in time.
These approaches all have disadvantages. With them the user cannot freely control the playback process (for example with a GIF animation), or else additional controls must be provided to control playback, for example a player progress bar; such controls take up display screen real estate and are not very intuitive for the user to use. In many current implementations, a user's attempt to touch and scroll on the player progress bar interrupts the displayed video or animation, which only resumes later in a discontinuous fashion.
Summary of the invention
In a touchscreen approach to display of an animated sequence, a display area takes up less than all of the area of a display screen. The user touches the display area and moves the touch in a first direction. This causes the display area to move in a corresponding fashion, and during such motion, the animated or video sequence is displayed in a first time sequence within the display area. Movement by the user of the touch in the opposite direction may also cause the display area to move in a corresponding direction, and during such motion, the animated or video sequence is displayed in the opposite time sequence.
Description of the drawing
The invention is described with respect to a drawing in several figures.
Figure 1a shows a first display according to the invention.
Figure 1b shows a second display according to the invention.
Figure 1c shows a third display according to the invention.
Figure 1d shows a fourth display according to the invention.
Figure 2 shows exemplary hardware with which the invention may be carried out.
Figures 3a and 3b show offset calculations according to the invention.
Figures 4a, 4b, 4c, and 4d show a sequence of events in a fifth embodiment of the invention.
Detailed description
Turning now to Figures 1a, 1b, 1c, and 1d, we see a sequence of events. We see a consumer electronics device 16 which may for example be a smart phone or a tablet. The device has a touch-sensitive display screen 17.
In Figure 1a we see a display area 12a in a first position on the display screen 17. It will be appreciated that the display area 12a comprises less than all of the area of the display screen 17. In Figure 1a the display area 12a displays a first frame of a multiframe animated sequence 11a.
Figure 1a shows at 13a a user touching the screen 17 at a touch point 15. (The touch point 15 is denoted in Figure 1d.) The user moves the touch point 15 upward in the figures, and as shown in Figure 1b the touch point has moved upwards as shown at 13b.
Responsive to the movement of the touch point due to the touch of the user at the display screen 17, the display area location 12b has moved upward correspondingly from the location of the display area 12a. The response further comprises displaying a second frame 11b of the multiframe animated sequence.
Figure 1c shows what happens when the touch point 15 has moved upwards even further than the movement shown in Figure 1b. The display area location 12c has moved upward correspondingly according to the moving of the touch point 15. The response further comprises displaying a third frame 11c of the multiframe animated sequence.
Figure 1d shows what happens when the touch point 15 has moved upwards even further than the movement shown in Figure 1c. The display area location 12d has moved upward correspondingly according to the moving of the touch point 15. The response further comprises displaying a fourth frame 11d of the multiframe animated sequence.
The multiframe animated sequence shown in Figures 1a, 1b, 1c and 1d is a sequence showing a bird in flight. It will be appreciated that the sequence may show other things such as actions of sports figures playing a sporting game.
The sequence of events shown in Figures 1a, 1b, 1c and 1d shows a user input in which the touch point moves upwards; the result is that the display area location likewise moves upward and the animated sequence moves forward in time. It is convenient to configure the device 16 so that the opposite is also possible, meaning that a sequence could run from Figure 1d to Figure 1c, then to Figure 1b, and finally to Figure 1a. With such a sequence the animated sequence moves backward in time. The user could thus move the touch point 15 upwards and downwards repeatedly and watch as the animated sequence moves forward and backward repeatedly in time.
Figures 1a, 1b, 1c and 1d thus show a first embodiment in which “upward” movement of the user touch point brings about “forward” movement in time for the displayed animated sequence and “downward” movement brings about “backward” movement. It will be appreciated that this is quite arbitrary in the sense that there could be a second embodiment in which “downward” movement of the user touch point brings about “forward” movement in time for the displayed animated sequence and “upward” movement brings about “backward” movement.
It will be further appreciated that there could be a third embodiment in which “rightward” movement of the user touch point brings about “forward” movement in time for the displayed animated sequence and “leftward” with “backward”. There could likewise be a fourth embodiment in which “leftward” movement of the user touch point brings about “forward” movement in time for the displayed animated sequence and “rightward” with “backward”.
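By way of a non-limiting illustration only, the following TypeScript sketch shows one plausible way to wire up such a touch-controlled scrub in a browser-style environment; the names (attachScrub, ScrubConfig, pixelsPerFrame) and the pre-loaded frame array are assumptions of the sketch and are not taken from the patent.

interface ScrubConfig {
  area: HTMLElement;       // the display area, smaller than the display screen
  img: HTMLImageElement;   // element inside the display area showing the current frame
  frames: string[];        // e.g. URLs of a PNG/JPG frame sequence (illustrative assumption)
  pixelsPerFrame: number;  // touch travel, in pixels, needed to advance one frame
}

// First embodiment: dragging upward moves the display area upward and plays the
// sequence forward in time; dragging back down reverses it. The other embodiments
// described above follow by flipping the sign of the vertical delta or by using
// the horizontal delta instead.
function attachScrub(cfg: ScrubConfig): void {
  let currentOffset = 0;   // accumulated upward drag, in pixels, across drags

  const apply = (offsetPx: number): void => {
    currentOffset = offsetPx;
    // Move the display area in step with the touch (positive offset = upward).
    cfg.area.style.transform = `translateY(${-offsetPx}px)`;
    // The frame index is a clamped, proportional function of the offset, so a
    // downward drag steps the sequence backward in time; no timer runs, so the
    // displayed frame simply freezes when the touch stops moving.
    const idx = Math.max(0, Math.min(cfg.frames.length - 1,
      Math.floor(offsetPx / cfg.pixelsPerFrame)));
    cfg.img.src = cfg.frames[idx];
  };

  cfg.area.addEventListener("pointerdown", (down: PointerEvent) => {
    const startY = down.clientY;
    const baseOffset = currentOffset;
    const move = (ev: PointerEvent): void => {
      apply(baseOffset + (startY - ev.clientY));  // upward movement increases the offset
    };
    const stop = (): void => {
      document.removeEventListener("pointermove", move);
      document.removeEventListener("pointerup", stop);
    };
    document.addEventListener("pointermove", move);
    document.addEventListener("pointerup", stop);
  });
}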
Figure 2 shows typical hardware 41 with which the claimed invention may be carried out. A processor 34 is connected by means of a display driver 33 to a display 31. Touch-sensitive circuitry 32 provides touch-screen inputs to the processor 34. Processor 34 contains a general-purpose processor or microprocessor 35, a digital signal processor 36, and a graphics processor or accelerator 37. Processor 34 is linked by a communications link with random access memory (RAM) 40. A radio frequency (RF) front end 38 connects the antenna 42 with the processor 34. An RF power circuit 39 permits the processor 34 to transmit RF energy at the antenna 42. Other hardware components including battery, charging circuit, and input-output devices are omitted for clarity in the view of Figure 2.
The method may be carried out by means of a suitable program loaded into RAM 40, making reference to an animated sequence which may be loaded into RAM 40 from local storage (e.g. read only memory (ROM)) or an online server (omitted for clarity in Figure 2). The stored program uses graphics processor or accelerator 37 to drive display driver 33 which puts the desired images onto the display 31. User touch inputs at the screen 31 are detected by circuitry 32 and provided to the processor 34 for use by the stored program. The hardware together with the stored program provides means which carry out the various steps.
The discussion above refers to an animated sequence, which is intended to encompass a video sequence as well. The form of storage may be, for example, PNG image sequences, JPG picture sequences, animated GIF images, or MP4 video files.
The touch by the user is described as being a touch at the display area, for example by means of touch location 15. It will be appreciated, however, that the touch that brings about the results described here may instead be a touch at a scroll bar 14. It will further be appreciated that the touch may be at a pointing device wheel such as a mouse wheel. In any of these approaches, one aspect of the user experience is that the display area for the animated sequence is less than all of the display screen and the display area moves in one direction or another corresponding to the display of the animated sequence forward or backward in time.
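The following is a minimal sketch, again with illustrative names, of how a mouse-wheel input could be routed into the same offset-to-frame mapping used for a touch drag, assuming a browser-style wheel event; the applyOffset callback stands for that mapping and is an assumption of the sketch.

function attachWheelScrub(
  area: HTMLElement,
  applyOffset: (offsetPx: number) => void,  // same offset-to-frame mapping used for touch drags
): void {
  let offset = 0;
  area.addEventListener(
    "wheel",
    (ev: WheelEvent) => {
      ev.preventDefault();   // keep the wheel from scrolling the page itself
      offset += ev.deltaY;   // wheel movement plays the sequence forward or backward
      applyOffset(offset);
    },
    { passive: false },      // required so preventDefault() is honored
  );
}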
Preferably the display of frames of the multiframe animated sequence is continuous in response to continuous motion by the user in the touch movement. Preferably this continuous display applies both to forward-in-time display and to backward-in-time display, and the progress and speed of the continuous display is monotonically related to the progress and speed of the user touch movement. Ideally the progress and speed of the continuous display is directly proportional to the progress and speed of the user touch movement. When the user touch input movement stops, the animated display pauses.
Figures 3a and 3b show offset calculations according to the invention. In these figures we can see the display screen 66 and within it a movable display area 67 which, as previously mentioned, comprises less than all of the area of the display screen 66.
A first frame of the animated sequence is displayed in Figure 3a. In this figure the display area 67 may be so low on the screen 66 that a bottom portion of the display area 67 is not visible to the user. Saying this in a different way, the start position 62 for the display area is less than or equal to the scroll position 61. The start position and scroll position may be conveniently measured as integer values of pixel positions on the screen.
In Figure 3b, the user has conducted a touch movement that has moved upwards. This defines an offset 65 which is determined by subtracting the scroll position 64 from the start position 63. The offset is used, as will now be described, to pick which frame of the animated sequence will be displayed.
The number of frames of animation is used to work out a step size 66. The offset, divided by the step size, is used as a frame number to determine which frame is being displayed.
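The sketch below illustrates this arithmetic, assuming integer pixel positions as described above and assuming, purely for illustration, that the step size is obtained by spreading the frame count over the screen height (the patent only states that the step size is worked out from the number of frames); the function and parameter names are hypothetical.

function frameForScroll(
  startPosition: number,   // start position of the display area (62 / 63)
  scrollPosition: number,  // current scroll position (61 / 64)
  screenHeight: number,    // height of the display screen in pixels (assumed basis for the step size)
  frameCount: number,      // number of frames in the animated sequence
): number {
  // Offset 65: how far the display area has moved from its start position.
  const offset = startPosition - scrollPosition;

  // Step size: pixels of movement per frame.
  const stepSize = screenHeight / frameCount;

  // Frame number = offset divided by step size, clamped so the first and last
  // frames are held at the ends of the movement.
  const frame = Math.floor(offset / stepSize);
  return Math.max(0, Math.min(frameCount - 1, frame));
}

// Example: a 30-frame sequence on a 1920-pixel-tall screen; a 640-pixel movement
// selects frame floor(640 / 64) = 10.
const exampleFrame = frameForScroll(1800, 1160, 1920, 30);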
Figures 4a, 4b, 4c, and 4d show a sequence of events in a fifth embodiment of the invention. In this embodiment, a page has multiple animations. The user can scroll up and down (or left and right in a related embodiment) and this controls the playback. A first animation 61a, 61b, 61c, and 61d may be seen as a result of the user scrolling. In addition, a second animation 62a, 62b, 62c, and 62d may also be seen as a result of the same user scrolling.
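A minimal sketch of such a multi-animation page follows, assuming browser-style page scrolling in which scrolling down plays each sequence forward; the ScrollAnimation structure, the triggerScrollY field, and the step-size choice are illustrative assumptions rather than features recited by the patent.

interface ScrollAnimation {
  img: HTMLImageElement;   // element showing the current frame of this animation
  frames: string[];        // e.g. a PNG sequence for this animation
  triggerScrollY: number;  // page scroll position at which this sequence starts playing
}

function attachPageScrub(animations: ScrollAnimation[], screenHeight: number): void {
  document.addEventListener("scroll", () => {
    const scrollY = window.scrollY;  // use scrollX instead for the left/right variant
    // One and the same user scroll drives every animation on the page; each
    // animation converts the shared scroll position into its own frame number.
    for (const a of animations) {
      const offset = scrollY - a.triggerScrollY;        // movement past this animation's start
      const stepSize = screenHeight / a.frames.length;  // pixels of scroll per frame
      const idx = Math.max(0, Math.min(a.frames.length - 1, Math.floor(offset / stepSize)));
      a.img.src = a.frames[idx];
    }
  });
}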
The alert reader will have no difficulty devising myriad obvious improvements and variations of the invention, all of which are intended to be encompassed within the claims which follow.

Claims (19)

  1. A method performed with respect to a consumer electronics device, the device having a display screen, the display screen being touch-sensitive, the display screen having an area, the method comprising the steps of:
    displaying within the display screen a first display area in a first position on the display screen, the first display area comprising less than all of the area of the display screen, the first display area displaying a first frame of a first multiframe animated sequence;
    responding to a touch of a user at the first display area, the touch moving a first distance in a direction, the response comprising moving the first display area according to the moving of the touch thereby defining a second position, the second position being non-identical to the first position, the response further comprising displaying a second frame of the first multiframe animated sequence within the first display area at its second position;
    responding to the touch moving a second distance in the direction, the response comprising moving the first display area according to the moving of the touch thereby defining a third position, the third position being non-identical to the first position and non-identical to the second position, the response further comprising displaying a third frame of the first multiframe animated sequence within the first display area at its third position;
    responding to the touch moving a third distance in the direction, the response comprising moving the first display area according to the moving of the touch thereby defining a fourth position, the fourth position being non-identical to the first position and non-identical to the second position and non-identical to the third position, the response further comprising displaying a fourth frame of the first multiframe animated sequence within the first display area at its fourth position.
  2. The method of claim 1 in which the direction is upward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  3. The method of claim 1 in which the direction is downward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  4. The method of claim 1 in which the direction is leftward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  5. The method of claim 1 wherein the display of frames of the first multiframe animated sequence is continuous in response to continuous motion by the user in the touch movement.
  6. The method of claim 5 wherein the continuous display in response to continuous motion by the user in the touch movement works both forward in time display and backwards in time display, and the progress and speed of the continuous display is monotonically related to the progress and speed of the user touch movement.
  7. The method of claim 6 wherein the progress and speed of the continuous display is directly proportional to the progress and speed of the user touch movement.
  8. The method of any of claims 1, 5, 6 or 7 wherein when the user touch input movement stops, the animated display pauses.
  9. The method of claim 1 further comprising:
    displaying within the display screen a second display area in a position on the display screen relative to the first display area, the second display area comprising less than all of the area of the display screen, the second display area displaying a first frame of a second multiframe animated sequence;
    responding to the touch of the user at the first display area, the response comprising moving the second display area according to the moving of the touch, the response further comprising displaying a second frame of the second multiframe animated sequence;
    responding to the touch moving a second distance in the direction, the response comprising moving the second display area according to the moving of the touch, the response further comprising displaying a third frame of the second multiframe animated sequence;
    responding to the touch moving a third distance in the direction, the response comprising moving the second display area according to the moving of the touch, the response further comprising displaying a fourth frame of the second multiframe animated sequence.
  10. Apparatus comprising a display screen and a means comprising hardware, the display screen being touch-sensitive, the display screen having an area, the apparatus further characterized in that:
    the means is responsive to a touch of a user for displaying within the display screen a first display area in a first position on the display screen, the first display area comprising less than all of the area of the display screen, the first display area displaying a first frame of a first multiframe animated sequence;
    the means is further responsive to the touch moving a first distance in a direction, the response comprising moving the first display area according to the moving of the touch, the response further comprising displaying a second frame of the first multiframe animated sequence;
    the means is further responsive to the touch moving a second distance in the direction, the response comprising moving the first display area according to the moving of the touch, the response further comprising displaying a third frame of the first multiframe animated sequence;
    the means is further responsive to the touch moving a third distance in the direction, the response comprising moving the first display area according to the moving of the touch, the response further comprising displaying a fourth frame of the first multiframe animated sequence.
  11. The apparatus of claim 10 in which the direction is upward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  12. The apparatus of claim 10 in which the direction is downward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  13. The apparatus of claim 10 in which the direction is leftward and the first, second, third, and fourth frames depict forward movement in time of the sequence.
  14. The apparatus of claim 10 wherein the display of frames of the first multiframe animated sequence is continuous in response to continuous motion by the user in the touch movement.
  15. The apparatus of claim 14 wherein the continuous display in response to continuous motion by the user in the touch movement works both forward in time display and backwards in time display, and the progress and speed of the continuous display is monotonically related to the progress and speed of the user touch movement.
  16. The apparatus of claim 15 wherein the progress and speed of the continuous display is directly proportional to the progress and speed of the user touch movement.
  17. The apparatus of any of claims 10, 14, 15 or 16 wherein when the user touch input movement stops, the animated display pauses.
  18. The apparatus of claim 13, the means further characterized in that the means displays within the display screen a second display area in a position on the display screen relative to the first display area, the second display area comprising less than all of the area of the display screen, the second display area displaying a first frame of a second multiframe animated sequence;
    the means further responsive to the touch of the user at the first display area by moving the second display area according to the moving of the touch, the response further comprising displaying a second frame of the second multiframe animated sequence;
    the means further responsive to the touch moving a second distance in the direction, the response comprising moving the second display area according to the moving of the touch, the response further comprising displaying a third frame of the second multiframe animated sequence;
    the means further responsive to the touch moving a third distance in the direction, the response comprising moving the second display area according to the moving of the touch, the response further comprising displaying a fourth frame of the second multiframe animated sequence.
  19. The apparatus of claim 10 wherein the apparatus is a consumer electronic device.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/090689 WO2016074120A1 (en) 2014-11-10 2014-11-10 Scrolling animation or video playback controlled by touch user interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/090689 WO2016074120A1 (en) 2014-11-10 2014-11-10 Scrolling animation or video playback controlled by touch user interface

Publications (1)

Publication Number Publication Date
WO2016074120A1 (en) 2016-05-19

Family

ID=55953529

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/090689 WO2016074120A1 (en) 2014-11-10 2014-11-10 Scrolling animation or video playback controlled by touch user interface

Country Status (1)

Country Link
WO (1) WO2016074120A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080034316A1 (en) * 2006-08-01 2008-02-07 Johan Thoresson Scalable scrollbar markers
US20090251423A1 (en) * 2008-04-04 2009-10-08 Lg Electronics Inc. Mobile terminal using proximity sensor and control method thereof
CN103473008A (en) * 2012-06-05 2013-12-25 Lg电子株式会社 Mobile terminal and method for controlling the same
CN102722590A (en) * 2012-06-25 2012-10-10 宇龙计算机通信科技(深圳)有限公司 Terminal and image acquisition method

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022183967A1 (en) * 2021-03-01 2022-09-09 腾讯科技(深圳)有限公司 Video picture display method and apparatus, and device, medium and program product


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14905825

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 021017)

122 Ep: pct application non-entry in european phase

Ref document number: 14905825

Country of ref document: EP

Kind code of ref document: A1