US20170177210A1 - Pattern based video frame navigation aid - Google Patents

Pattern based video frame navigation aid

Info

Publication number: US20170177210A1
Authority: US (United States)
Prior art keywords: data, navigation, data file, file, viewing
Legal status: Abandoned
Application number: US15/447,399
Inventors: Barry A. Kritt; Sarbajit K. Rakshit
Current Assignee: Maplebear Inc.
Original Assignee: International Business Machines Corp.
Application filed by International Business Machines Corp.
Priority to US15/447,399
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION. Assignors: KRITT, BARRY A.; RAKSHIT, SARBAJIT K.
Publication of US20170177210A1
Assigned to MAPLEBEAR INC. Assignors: INTERNATIONAL BUSINESS MACHINES CORPORATION

Classifications

    • G06F9/453 Help systems
    • G06F3/03545 Pens or stylus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F16/738 Presentation of query results (information retrieval of video data)
    • G06F16/7867 Retrieval of video data characterised by using manually generated metadata, e.g. tags, keywords, comments, title and artist information
    • G11B27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105 Programmed access in sequence to addressed parts of tracks of operating discs
    • G11B27/28 Indexing, addressing, timing or synchronising by using information signals recorded by the same method as the main recording
    • G11B27/322 Indexing, addressing, timing or synchronising using information signals recorded by the same method as the main recording on separate auxiliary tracks, the signal being digitally coded

Definitions

  • the present invention relates generally to capturing data associated with an accessed file. More specifically, the invention relates to detecting a pattern of the accessed file, and capturing the pattern in the form of a navigation profile.
  • data in various forms is generally stored in some form of an electronic storage device.
  • Examples of such data include image data, such as still images and video.
  • Data pertaining to viewing of such files may be tracked and/or recorded.
  • the captured data can be used to determine the popularity of the file, demographics associated with viewers of the file, and/or recommendations to viewers for viewing the file.
  • This invention comprises a system and computer program product for capturing a navigation profile of a viewed data file and use of the captured profile.
  • a system, computer program product, and method are provided for capturing a first and second set of data, wherein the first set comprises data corresponding to a viewed portion of a data file by a first entity, and the second set comprises data corresponding to the viewed portion of the data file by a second entity.
  • a first navigation profile is formed from the first set of data and a second navigation profile is formed from the second set of data.
  • the formed navigation profiles are aggregated and a combined viewing pattern is identified from the aggregation.
  • a portion of the data file is recommended for a second viewing based on the combined viewing pattern and the recommended portion of the data file is displayed on a visual display. Gestural navigation is supported within the displayed portion including navigating between graphical representations of data categories.
  • FIG. 1 is a flow chart illustrating a method for creating a navigation profile for a viewed data file.
  • FIG. 2 is a flow chart illustrating a method for graphically representing the navigation profile(s).
  • FIG. 3 is a flow chart illustrating a method for autonomous data file recommendation.
  • FIGS. 4A and 4B are block diagrams illustrating examples of graphical representations of viewed files and associated captured view patterns.
  • FIG. 5 depicts a block diagram illustrating a system for automated data file recommendation.
  • FIG. 6 depicts a block diagram showing a system for implementing an embodiment of the present invention.
  • the functional unit described in this specification has been labeled with tools, modules, and/or managers.
  • the functional unit may be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like.
  • the functional unit may also be implemented in software for execution by various types of processors.
  • An identified functional unit of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executable of an identified functional unit need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the functional unit and achieve the stated purpose of the functional unit.
  • a functional unit of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
  • operational data may be identified and illustrated herein within the functional unit, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
  • FIG. 1 is a flow chart ( 100 ) illustrating a method for creating the profile.
  • a file is selected ( 102 ) and a counting variable X is initialized ( 104 ).
  • the counting variable represents portions of the file.
  • portions may include but are not limited to segments, bytes, bits, ranges, etc.
  • the file is a video file and the counting variable pertains to individual frames that comprise the video.
  • the variable X_total is assigned the total number of portions in the selected file ( 106 ).
  • the file being described is a video file and the individual portions are individual frames of the video file.
  • One or more attributes associated with a selected frame, frame_X, are captured ( 108 ). These attributes may include whether frame_X was subject to viewing or omitted from viewing.
  • frame_X has an audio attribute capturing the decibel level at which frame_X was viewed.
  • frame_X has a speed attribute capturing the speed at which select frames were viewed. Accordingly, for each selected frame of the file, attribute data associated with the selection is collected.
  • a pattern associated with the collected attribute data may be formed from one or more selections of the file. Following the selected access of a data file, any pattern(s) associated with the captured attributes are identified ( 110 ). Examples of such patterns include, but are not limited to, a collection of selected portions having the greatest number of accesses or least number of accesses, and a collection of selected portions that were viewed at an increased viewing speed and/or volume etc. Following the capturing of attributes at step ( 108 ) and the pattern identification at step ( 110 ), the counting variable X is incremented ( 112 ). It is then determined if all of the portions of the selected file have been assessed ( 114 ). A negative response to the determination at step ( 114 ) is followed by a return to step ( 108 ).
  • a positive response to the determination at step ( 114 ) is followed by evaluating the captured data. More specifically, the patterns are compiled into a navigation profile ( 116 ); the profile is a collection of the acquired attributes of the individual frames of the file. The navigation profile(s) is stored in an electronic storage device ( 118 ). Accordingly, individual frames of a video are separately assessed for attributes, and the collected attributes are summarized in a navigation profile identifying any associated attribute patterns.
  • FIG. 2 is a flow chart ( 200 ) illustrating a method for representing the navigation profile for a select file.
  • the navigation profile is an aggregation of one or more file accesses and frame selections.
  • multiple selections are compiled into a single navigation profile.
  • the navigation profile is unique to a viewing pattern of a user, and the compilation of multiple navigation profiles is a joining of the viewing patterns of various users into a single combined view.
  • multiple navigation profiles are compiled such that the attribute data of two or more viewing navigation profiles are aggregated into a single presentation ( 202 ).
  • the aggregated data is organized for presentation ( 204 ). Examples of the presentation include, but are not limited to, a histogram, a bar graph, and a line graph. Accordingly, file access and frame selections are aggregated and graphically represented in a compilation to form a navigation profile.
  • the amount of information presented by the navigation profiles may be too great to reasonably represent graphically on a single display. Therefore, identified patterns within the graphical display may be limited to select portions of the data to support ease of navigation.
  • the graphical representation may display the data associated with every frame of the video, or only a select portion of the frames.
  • the navigation profile is available on a visual display and includes an interactive mode wherein a portion of data represented in the profile may be selected in response to a gesture, such as a finger gesture or a stylus gesture on a touch-sensitive display ( 206 ).
  • a select portion of the displayed data may be expanded with the gesture.
  • the magnitude of expansion is proportional to the distance of movement created by the gesture.
  • gesture(s) may be used to navigate between selected portions of displayed data, or between specific attributes associated with the frames. For example, a gesture may be used to navigate between a graphical representation of volume with respect to a frame number and a graphical representation of the quantity of frames viewed with respect to a frame number. Accordingly, on a graphical display, the navigation profiles can be effectively displayed and manipulated through selected portions.
  • FIG. 3 is a flow chart ( 300 ) illustrating a method for autonomous recommendation of one or more files.
  • Activity associated with a file is tracked and compiled into a navigation profile ( 302 ), and patterns are identified from the compiled navigation profiles ( 304 ).
  • Recommendations are determined based on the identified patterns ( 306 ), followed by communicating the recommendation of one or more portions of the data file ( 308 ). For example, a prior pattern assessment may yield recommendation of select frames for viewing.
  • where the data file is a video, if identified patterns show a high view rate for action scenes in the video, videos with action scenes may be recommended or portions of the video containing action scenes may be recommended.
  • recommendations may employ characteristics of a profile. For example, file portions containing highly viewed frames may be recommended from navigation profiles of other users who share a similar profile characteristic. Accordingly, data files are recommended based on various qualifiers, including acquired viewing patterns sensitive to frame rates and their associated attributes.
  • FIGS. 4A and 4B are two examples of these graphical representations.
  • FIG. 4A is a line graph ( 400 ) representing the number of views for a frame ( 402 ) with respect to a frame number ( 404 ). While this example depicts the views for a frame with respect to frame number, various other attributes may be plotted on the line graph in addition to or instead of frame views. For example, volume level or viewing rate may be plotted attributes. Additionally, and as described above, portions of the data may be expanded.
  • the portion containing frames 700-1,000 ( 406 ) and the portion containing frames 1,200-1,300 ( 408 ) may for instance be of particular interest due to the exceptionally high views of each frame in those portions. Those areas may therefore be expanded to be viewed in greater detail through the various methods as described in FIG. 2 . Accordingly, attributes compiled by frame number are displayed on a line graph for viewing and navigation.
  • FIG. 4B is a histogram ( 410 ) depicting an alternative layout for displaying the data as shown in FIG. 4A .
  • the histogram ( 410 ) summarizes the data of FIG. 4A by grouping frame numbers between various ranges to depict data spikes. For example, the data spikes between frames 700-1,000 ( 406 ) and frames 1,200-1,300 ( 408 ) are depicted in histogram bars ( 412 ) and ( 416 ), respectively.
  • While FIGS. 4A and 4B depict a line graph ( 400 ) and a histogram ( 410 ), respectively, various other graphical forms may be implemented for data depiction. Accordingly, data may be graphically displayed in different formats for identification of trends among various attributes.
  • aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware based embodiment, an entirely software based embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire line, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • FIG. 5 is a system ( 500 ) for creating a navigation profile, displaying the navigation profile, and tools to support manipulation of the data through the graphical representation.
  • a computer ( 502 ) is shown in communication with data storage ( 520 ), with the storage containing one or more navigation profile(s) ( 522 ).
  • the computer has a processing unit ( 504 ) in communication with memory ( 508 ) across a bus ( 506 ).
  • the computer ( 502 ) may further be in communication with any number of compute nodes across a network ( 505 ).
  • computer ( 502 ) is in communication with node 0 ( 530 ) and node 1 ( 540 ), both in communication with local data storage ( 532 ) and ( 542 ), respectively.
  • the computer ( 502 ) is in communication with a visual display ( 580 ).
  • computer ( 502 ) includes a functional unit ( 510 ) having tools embedded therein.
  • the tools include, but are not limited to, a profile manager ( 512 ), a storage manager ( 514 ), and a recommendation manager ( 516 ).
  • a graphics manager ( 518 ) is also provided as one of the tools supported by the functional unit ( 510 ).
  • the profile manager ( 512 ) functions to capture a navigation profile ( 522 ) of a viewed file.
  • the navigation profile ( 522 ), which is a collection of data with information regarding various attributes of viewed frames, portrays or is compiled to portray a viewing pattern of the data file. An example of such an attribute is whether a frame has been subject to viewing or omitted from viewing.
  • the navigation profile ( 522 ) may be a collection of data from a single viewing or multiple viewings of the file.
  • the navigation profile may be stored locally in data storage ( 520 ) or in remote data storage (not shown) across a network ( 505 ).
  • the storage manager ( 514 ) aggregates a plurality of stored navigation profiles ( 522 ) for one or more accessed or viewed files. Accordingly, the profile manager ( 512 ) captures any number of navigation profiles ( 522 ) which are subsequently stored by the storage manager ( 514 ).
  • the recommendation manager ( 516 ) is provided in communication with the storage manager ( 514 ).
  • the recommendation manager ( 516 ) functions to support selection of a portion of the data file for a second viewing based on the one or more captured and stored navigation profile(s) ( 522 ). Where multiple navigation profiles ( 522 ) are aggregated in storage, the recommendation manager ( 516 ) recommends one or more frames or selected portions within the file for viewing based on the aggregation.
  • a graphics manager ( 518 ) is further provided to graphically represent the aggregated profiles ( 522 ). In one embodiment, the graphics manager ( 518 ) displays a graphical representation that captures a frequency of frames viewed in a data file based upon the aggregation of captured profiles.
  • the recommendation manager ( 516 ) assesses select portions of a file based on a navigation profile ( 522 ).
  • the navigation profile ( 522 ) employs characteristics of users of the file. More specifically, the navigation profile ( 522 ) gathers and maintains characteristics of users, such as likes and dislikes.
  • the recommendation manager ( 516 ) may recommend one or more portions of the file based on common characteristics of a current user and a prior user.
  • the recommendation manager ( 516 ) provides a recommendation based on a combination of the navigation profile ( 522 ) and the common characteristics among users of the file. Accordingly, the recommendation manager ( 516 ) may employ various characteristics associated with the file as a basis for the recommendation.
  • the profile manager ( 512 ), storage manager ( 514 ), recommendation manager ( 516 ), and graphics manager ( 518 ) are shown residing in the functional unit ( 510 ).
  • the functional unit ( 510 ) and managers ( 512 )-( 518 ) may reside as hardware tools external to the memory ( 508 ).
  • the managers ( 512 )-( 518 ) may be implemented as a combination of hardware and software resources.
  • the managers ( 512 )-( 518 ) may be combined into a single functional item that incorporates the functionality of the separate items.
  • each of the manager(s) ( 512 )-( 518 ) is shown local to one client ( 502 ), e.g. a compute node. However, in one embodiment they may be collectively or individually distributed across a shared pool of configurable computer resources and function as a unit to enable pattern detection to support one or more recommendations. Accordingly, managers may be implemented as software tools, hardware tools, or a combination of software and hardware tools.
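  • As a minimal, non-authoritative sketch of how the FIG. 5 tools could be arranged in code, the following Python outline bundles a profile manager, storage manager, recommendation manager, and graphics manager into a single functional unit. The class names, method names, and data shapes are illustrative assumptions; the disclosure does not prescribe a particular implementation.

        from collections import Counter

        class ProfileManager:
            """Profile manager (512): captures a navigation profile (522) of a viewed file."""
            def capture(self, viewed_frame_numbers):
                # A navigation profile is reduced here to the list of viewed frames.
                return {"viewed_frames": list(viewed_frame_numbers)}

        class StorageManager:
            """Storage manager (514): stores and aggregates navigation profiles."""
            def __init__(self):
                self.profiles = []
            def store(self, profile):
                self.profiles.append(profile)
            def aggregate(self):
                views = Counter()
                for profile in self.profiles:
                    views.update(profile["viewed_frames"])
                return views                              # frame number -> number of views

        class RecommendationManager:
            """Recommendation manager (516): selects portions for a second viewing."""
            def recommend(self, aggregated_views, top_n=10):
                return [frame for frame, _ in aggregated_views.most_common(top_n)]

        class GraphicsManager:
            """Graphics manager (518): represents the frequency of frames viewed."""
            def render(self, aggregated_views):
                return sorted(aggregated_views.items())   # data series for a line graph

        class FunctionalUnit:
            """Functional unit (510) bundling the tools (512)-(518)."""
            def __init__(self):
                self.profile_manager = ProfileManager()
                self.storage_manager = StorageManager()
                self.recommendation_manager = RecommendationManager()
                self.graphics_manager = GraphicsManager()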
  • the functional unit described above in FIG. 5 has been labeled with managers.
  • the managers may be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like.
  • the manager(s) may also be implemented in software for processing by various types of processors.
  • An identified manager of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executable of an identified manager need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the managers and achieve the stated purpose of the managers.
  • a manager of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
  • operational data may be identified and illustrated herein within the manager, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
  • the computer system includes one or more processors, such as a processor ( 602 ).
  • the processor ( 602 ) is connected to a communication infrastructure ( 604 ) (e.g., a communications bus, cross-over bar, or network).
  • the computer system can include a display interface ( 606 ) that forwards graphics, text, and other data from the communication infrastructure ( 604 ) (or from a frame buffer not shown) for display on a display unit ( 608 ).
  • the computer system also includes a main memory ( 610 ), preferably random access memory (RAM), and may also include a secondary memory ( 612 ).
  • the secondary memory ( 612 ) may include, for example, a hard disk drive ( 614 ) (or alternative persistent storage device) and/or a removable storage drive ( 616 ), representing, for example, a floppy disk drive, a magnetic tape drive, or an optical disk drive.
  • the removable storage drive ( 616 ) reads from and/or writes to a removable storage unit ( 618 ) in a manner well known to those having ordinary skill in the art.
  • Removable storage unit ( 618 ) represents, for example, a floppy disk, a compact disc, a magnetic tape, or an optical disk, etc., which is read by and written to by a removable storage drive ( 616 ).
  • the removable storage unit ( 618 ) includes a computer readable medium having stored therein computer software and/or data.
  • the secondary memory ( 612 ) may include other similar means for allowing computer programs or other instructions to be loaded into the computer system.
  • Such means may include, for example, a removable storage unit ( 620 ) and an interface ( 622 ).
  • Examples of such means may include a program package and package interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units ( 620 ) and interfaces ( 622 ) which allow software and data to be transferred from the removable storage unit ( 620 ) to the computer system.
  • the computer system may also include a communications interface ( 624 ).
  • Communications interface ( 624 ) allows software and data to be transferred between the computer system and external devices. Examples of communications interface ( 624 ) may include a modem, a network interface (such as an Ethernet card), a communications port, or a PCMCIA slot and card, etc.
  • Software and data transferred via communications interface ( 624 ) are in the form of signals which may be, for example, electronic, electromagnetic, optical, or other signals capable of being received by communications interface ( 624 ). These signals are provided to communications interface ( 624 ) via a communications path (i.e., channel) ( 626 ).
  • This communications path ( 626 ) carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
  • the terms “computer program medium,” “computer usable medium,” and “computer readable medium” are used to generally refer to media such as main memory ( 610 ) and secondary memory ( 612 ), removable storage drive ( 616 ), and a hard disk installed in a hard disk drive or alternative persistent storage device ( 614 ).
  • Computer programs are stored in main memory ( 610 ) and/or secondary memory ( 612 ). Computer programs may also be received via a communication interface ( 624 ). Such computer programs, when run, enable the computer system to perform the features of the present invention as discussed herein. In particular, the computer programs, when run, enable the processor ( 602 ) to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A system, computer program product, and method are provided for capturing a first and second set of data. A first navigation profile is formed from the first set of data and a second navigation profile is formed from the second set of data. The formed navigation profiles are aggregated and a combined viewing pattern is identified from the aggregation. A portion of the data file is recommended for a second viewing based on the combined viewing pattern and the recommended portion of the data file is displayed on a visual display. Gestural navigation is supported within the displayed portion including navigating between graphical representations of data categories.

Description

    CROSS REFERENCE TO RELATED APPLICATION(S)
  • This application is a continuation patent application claiming the benefit of the filing date of U.S. patent application Ser. No. 14/068,323, filed on Oct. 31, 2013, and titled “Pattern Based Video Frame Navigation Aid”, now pending, which is a continuation of U.S. patent application Ser. No. 13/921,803, filed on Jun. 19, 2013, and titled “Pattern Based Video Frame Navigation Aid”, now abandoned, both of which are hereby incorporated by reference.
  • BACKGROUND
  • Technical Field
  • The present invention relates generally to capturing data associated with an accessed file. More specifically, the invention relates to detecting a pattern of the accessed file, and capturing the pattern in the form of a navigation profile.
  • Background
  • With the development of technology and electronic storage devices, data in various forms is generally stored in some form of an electronic storage device. Examples of such data include image data, such as still images and video. Data pertaining to viewing of such files may be tracked and/or recorded. In one embodiment, the captured data can be used to determine the popularity of the file, demographics associated with viewers of the file, and/or recommendations to viewers for viewing the file.
  • Data files, however, are not always viewed in their entirety. For example, a video file may be watched non-continuously and select segments of the file may be skipped while other segments may be viewed more than one time. Accordingly, there is a macroscopic manner of tracking file access, and there is a microscopic manner of ascertaining file access on a segment basis.
  • SUMMARY OF THE INVENTION
  • This invention comprises a system and computer program product for capturing a navigation profile of a viewed data file and use of the captured profile.
  • A system, computer program product, and method are provided for capturing a first and second set of data, wherein the first set comprises data corresponding to a viewed portion of a data file by a first entity, and the second set comprises data corresponding to the viewed portion of the data file by a second entity. A first navigation profile is formed from the first set of data and a second navigation profile is formed from the second set of data. The formed navigation profiles are aggregated and a combined viewing pattern is identified from the aggregation. A portion of the data file is recommended for a second viewing based on the combined viewing pattern and the recommended portion of the data file is displayed on a visual display. Gestural navigation is supported within the displayed portion including navigating between graphical representations of data categories.
  • Other features and advantages of this invention will become apparent from the following detailed description of the presently preferred embodiment of the invention, taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The drawings referenced herein form a part of the specification. Features shown in the drawings are meant as illustrative of only some embodiments of the invention, and not of all embodiments of the invention unless otherwise explicitly indicated. Implications to the contrary are otherwise not to be made.
  • FIG. 1 is a flow chart illustrating a method for creating a navigation profile for a viewed data file.
  • FIG. 2 is a flow chart illustrating a method for graphically representing the navigation profile(s).
  • FIG. 3 is a flow chart illustrating a method for autonomous data file recommendation.
  • FIGS. 4A and 4B are block diagrams illustrating examples of graphical representations of viewed files and associated captured view patterns.
  • FIG. 5 depicts a block diagram illustrating a system for automated data file recommendation.
  • FIG. 6 depicts a block diagram showing a system for implementing an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • It will be readily understood that the components of the present invention, as generally described and illustrated in the Figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the apparatus, system, and method of the present invention, as presented in the Figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention.
  • The functional unit described in this specification has been labeled with tools, modules, and/or managers. The functional unit may be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like. The functional unit may also be implemented in software for execution by various types of processors. An identified functional unit of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executable of an identified functional unit need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the functional unit and achieve the stated purpose of the functional unit.
  • Indeed, a functional unit of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices. Similarly, operational data may be identified and illustrated herein within the functional unit, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
  • Reference throughout this specification to “a select embodiment,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “a select embodiment,” “in one embodiment,” or “in an embodiment” in various places throughout this specification are not necessarily referring to the same embodiment.
  • Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of managers, to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
  • The illustrated embodiments of the invention will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and processes that are consistent with the invention as claimed herein.
  • In the following description of the embodiments, reference is made to the accompanying drawings that form a part hereof, and which show by way of illustration a specific embodiment in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
  • Data files come in different forms, including but not limited to an image format and a video format. Regardless of the format, the file is comprised of an aggregation of data. When the file is opened or otherwise accessed, different sections of the data aggregation may be selected and/or viewed. To determine detailed aspects of an accessed file, a navigation profile is created based on a summary of data pertaining to accessed portions of the file. FIG. 1 is a flow chart (100) illustrating a method for creating the profile. A file is selected (102) and a counting variable X is initialized (104). The counting variable represents portions of the file. In one embodiment, portions may include but are not limited to segments, bytes, bits, ranges, etc. In one embodiment, the file is a video file and the counting variable pertains to individual frames that comprise the video. The variable X_total is assigned the total number of portions in the selected file (106). For ease of description, the file being described is a video file and the individual portions are individual frames of the video file. One or more attributes associated with a selected frame, frame_X, are captured (108). These attributes may include whether frame_X was subject to viewing or omitted from viewing. In one embodiment, frame_X has an audio attribute capturing the decibel level at which frame_X was viewed. Similarly, in one embodiment, frame_X has a speed attribute capturing the speed at which select frames were viewed. Accordingly, for each selected frame of the file, attribute data associated with the selection is collected.
  • A pattern associated with the collected attribute data may be formed from one or more selections of the file. Following the selected access of a data file, any pattern(s) associated with the captured attributes are identified (110). Examples of such patterns include, but are not limited to, a collection of selected portions having the greatest number of accesses or least number of accesses, and a collection of selected portions that were viewed at an increased viewing speed and/or volume, etc. Following the capturing of attributes at step (108) and the pattern identification at step (110), the counting variable X is incremented (112). It is then determined if all of the portions of the selected file have been assessed (114). A negative response to the determination at step (114) is followed by a return to step (108). Conversely, a positive response to the determination at step (114) is followed by evaluating the captured data. More specifically, the patterns are compiled into a navigation profile (116); the profile is a collection of the acquired attributes of the individual frames of the file. The navigation profile(s) is stored in an electronic storage device (118). Accordingly, individual frames of a video are separately assessed for attributes, and the collected attributes are summarized in a navigation profile identifying any associated attribute patterns.
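  • The following is a minimal sketch, in Python, of the FIG. 1 flow: per-frame attributes are captured, simple patterns are identified, and the result is compiled into a navigation profile and stored. The names (FrameAttributes, build_navigation_profile), the chosen attribute fields, and the pattern thresholds are illustrative assumptions rather than part of the disclosure.

        from dataclasses import dataclass, asdict
        import json

        @dataclass
        class FrameAttributes:
            frame_number: int
            viewed: bool = False          # subject to viewing or omitted from viewing
            volume_db: float = 0.0        # decibel level at which the frame was viewed
            playback_speed: float = 1.0   # speed at which the frame was viewed

        def build_navigation_profile(frames, store_path=None):
            """Compile per-frame attributes into a navigation profile (steps 104-118)."""
            x_total = len(frames)                                  # step 106
            viewed = [f for f in frames if f.viewed]
            profile = {
                "frame_attributes": [asdict(f) for f in frames],   # step 108
                "patterns": {                                      # step 110
                    "viewed_frames": [f.frame_number for f in viewed],
                    "skipped_frames": [f.frame_number for f in frames if not f.viewed],
                    "fast_forwarded": [f.frame_number for f in viewed if f.playback_speed > 1.0],
                    "high_volume": [f.frame_number for f in viewed if f.volume_db > 70.0],
                    "total_frames": x_total,
                },
            }
            if store_path is not None:                             # step 118
                with open(store_path, "w") as fh:
                    json.dump(profile, fh)
            return profile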
  • The attributes of individual frames in the data file are compiled into a navigation profile as described in FIG. 1. FIG. 2 is a flow chart (200) illustrating a method for representing the navigation profile for a select file. The navigation profile is an aggregation of one or more file accesses and frame selections. In one embodiment, multiple selections are compiled into a single navigation profile. Similarly, in one embodiment, the navigation profile is unique to a viewing pattern of a user, and the compilation of multiple navigation profiles is a joining of the viewing patterns of various users into a single combined view. In one embodiment, multiple navigation profiles are compiled such that the attribute data of two or more viewing navigation profiles are aggregated into a single presentation (202). The aggregated data is organized for presentation (204). Examples of the presentation include, but are not limited to, a histogram, a bar graph, and a line graph. Accordingly, file access and frame selections are aggregated and graphically represented in a compilation to form a navigation profile.
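  • As one hedged illustration of steps (202) and (204), the sketch below aggregates the viewed-frame data of two or more navigation profiles into a single per-frame view count and orders it for plotting. It assumes the profile dictionary produced by the FIG. 1 sketch above; the function names are illustrative.

        from collections import Counter

        def aggregate_profiles(profiles):
            """Aggregate attribute data of two or more navigation profiles (step 202)."""
            views = Counter()
            for profile in profiles:
                views.update(profile["patterns"]["viewed_frames"])
            return views                                   # frame number -> total views

        def organize_for_presentation(views, total_frames):
            """Order the aggregated data by frame number for a line graph or
            histogram (step 204)."""
            return [(frame, views.get(frame, 0)) for frame in range(total_frames)]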
  • The amount of information presented by the navigation profiles may be too great to reasonably represent graphically on a single display. Therefore, identified patterns within the graphical display may be limited to select portions of the data to support ease of navigation. For example, the graphical representation may display the data associated with every frame of the video, or only a select portion of the frames. In one embodiment, the navigation profile is available on visual display and includes an interactive mode wherein a portion of data represented in the profile may be selected in response to a gesture, such as in a finger gesture or a stylus on a touch sensitive display (206). In one embodiment, a select portion of the displayed data may be expanded with the gesture. In one embodiment, the magnitude of expansion is proportional to the distance of movement created by the gesture. In another embodiment, gesture(s) may be used to navigate between selected portions of displayed data, or between specific attributes associated with the frames. For example, a gesture may be used to navigate between a graphical representation of volume with respect to a frame number and a graphical representation of the quantity of frames viewed with respect to a frame number. Accordingly, on a graphical display, the navigation profiles can be effectively displayed and manipulated through selected portions.
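  • A minimal sketch of the gesture handling described above follows, assuming a hypothetical mapping from gesture measurements to plot state: the zoom toward a selected portion grows with the distance of the drag, and a swipe cycles between graphical representations of different attributes. The parameter names and view labels are illustrative assumptions.

        def expand_portion(displayed, selected, drag_distance_px, px_to_fraction=0.005):
            """Zoom the displayed frame range toward the selected portion; the magnitude
            of the expansion is proportional to the distance moved by the gesture."""
            t = min(1.0, drag_distance_px * px_to_fraction)    # 0 = no zoom, 1 = full zoom
            (d0, d1), (s0, s1) = displayed, selected
            return (round(d0 + t * (s0 - d0)), round(d1 + t * (s1 - d1)))

        ATTRIBUTE_VIEWS = ["views_per_frame", "volume_per_frame", "speed_per_frame"]

        def next_view(current_view, swipe_direction):
            """Navigate between graphical representations of data categories."""
            i = ATTRIBUTE_VIEWS.index(current_view)
            step = 1 if swipe_direction == "left" else -1
            return ATTRIBUTE_VIEWS[(i + step) % len(ATTRIBUTE_VIEWS)]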
  • Navigation profiles are generated and stored sensitive to a viewing pattern as described in FIG. 1. With these navigation profiles, a file can be autonomously recommended based on previous navigation of the profiles. FIG. 3 is a flow chart (300) illustrating a method for autonomous recommendation of one or more files. Activity associated with a file is tracked and compiled into a navigation profile (302), and patterns are identified from the compiled navigation profiles (304). Recommendations are determined based on the identified patterns (306), followed by communicating the recommendation of one or more portions of the data file (308). For example, a prior pattern assessment may yield recommendation of select frames for viewing. In one embodiment, where the data file is a video, if identified patterns show a high view rate for action scenes in the video, videos with action scenes may be recommended or portions of the video containing action scenes may be recommended. In one embodiment, recommendations may employ characteristics of a profile. For example, file portions containing highly viewed frames may be recommended from navigation profiles of other users who share a similar profile characteristic. Accordingly, data files are recommended based on various qualifiers, including acquired viewing patterns sensitive to frame rates and their associated attributes.
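  • By way of a hedged example of steps (304)-(308), the sketch below recommends contiguous runs of highly viewed frames from the aggregated view counts; the threshold_ratio and min_run parameters are illustrative tuning values, not values taken from the disclosure.

        def recommend_portions(aggregated_views, threshold_ratio=0.8, min_run=24):
            """Recommend contiguous runs of highly viewed frames for a second viewing."""
            if not aggregated_views:
                return []
            cutoff = threshold_ratio * max(aggregated_views.values())
            hot = sorted(f for f, v in aggregated_views.items() if v >= cutoff)
            runs, run_start, prev = [], None, None
            for frame in hot:
                if prev is None or frame != prev + 1:          # start a new contiguous run
                    if prev is not None and prev - run_start + 1 >= min_run:
                        runs.append((run_start, prev))
                    run_start = frame
                prev = frame
            if prev is not None and prev - run_start + 1 >= min_run:
                runs.append((run_start, prev))
            return runs                                        # e.g. [(700, 999), (1200, 1299)]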
  • As shown and described in FIG. 2, the navigation profiles are graphically represented, and in one embodiment, may be manipulated for frame selection. FIGS. 4A and 4B are two examples of these graphical representations. FIG. 4A is a line graph (400) representing the number of views for a frame (402) with respect to a frame number (404). While this example depicts the views for a frame with respect to frame number, various other attributes may be plotted on the line graph in addition to or instead of frame views. For example, volume level or viewing rate may be plotted attributes. Additionally, and as described above, portions of the data may be expanded. For example, the portion containing frames 700-1,000 (406) and the portion containing frames 1,200-1,300 (408) may for instance be of particular interest due to the exceptionally high views of each frame in those portions. Those areas may therefore be expanded to be viewed in greater detail through the various methods as described in FIG. 2. Accordingly, attributes compiled by frame number are displayed on a line graph for viewing and navigation.
  • FIG. 4B is a histogram (410) depicting an alternative layout for displaying the data as shown in FIG. 4A. In this example, the histogram (410) summarizes the data of FIG. 4A by grouping frame numbers between various ranges to depict data spikes. For example, the data spikes between frames 700-1,000 (406) and frames 1,200-1,300 (408) are depicted in histogram bars (412) and (416), respectively. While FIGS. 4A and 4B depict a line graph (400) and a histogram (410) respectively, various other graphical forms may be implemented for data depiction. Accordingly, data may be graphically displayed in different formats for identification of trends among various attributes.
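  • A small sketch of the FIG. 4B grouping follows: the (frame, views) pairs from the aggregation sketch above are binned into frame-number ranges so that spikes such as frames 700-1,000 and 1,200-1,300 stand out as tall bars. The bin size of 100 frames is an illustrative choice.

        def bin_views(per_frame_views, bin_size=100):
            """Group (frame, views) pairs into frame-number ranges for a histogram."""
            bins = {}
            for frame, views in per_frame_views:
                low = (frame // bin_size) * bin_size
                key = (low, low + bin_size - 1)
                bins[key] = bins.get(key, 0) + views
            return sorted(bins.items())        # [((0, 99), total_views), ...]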
  • As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware based embodiment, an entirely software based embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wire line, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Aspects of the present invention are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The processes shown in FIGS. 1-3 and supported in FIGS. 4 and 5 may be embodied as hardware components. FIG. 5 is a block diagram of a system (500) for creating a navigation profile, displaying the navigation profile, and providing tools to support manipulation of the data through the graphical representation. A computer (502) is shown in communication with data storage (520), with the storage containing one or more navigation profile(s) (522). The computer has a processing unit (504) in communication with memory (508) across a bus (506). The computer (502) may further be in communication with any number of compute nodes across a network (505). In this instance, computer (502) is in communication with node0 (530) and node1 (540), both in communication with local data storage (532) and (542), respectively. The computer (502) is in communication with a visual display (580). In addition, computer (502) includes a functional unit (510) having tools embedded therewith. The tools include, but are not limited to, a profile manager (512), a storage manager (514), and a recommendation manager (516). In one embodiment, a graphics manager (518) is also provided as one of the tools supported by the functional unit (510).
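  • One hypothetical way to picture the functional unit (510) and its tools in code is the following minimal class sketch; the class names, the frame-count data model, and the method signatures are assumptions chosen to mirror the figure, not an implementation disclosed herein.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# A navigation profile (522) is modeled here as a frame -> view-count mapping.
NavigationProfile = Dict[int, int]

@dataclass
class StorageManager:                                   # (514)
    profiles: List[NavigationProfile] = field(default_factory=list)

    def store(self, profile: NavigationProfile) -> None:
        self.profiles.append(profile)

    def aggregate(self) -> NavigationProfile:
        combined: NavigationProfile = {}
        for profile in self.profiles:
            for frame, views in profile.items():
                combined[frame] = combined.get(frame, 0) + views
        return combined

@dataclass
class ProfileManager:                                   # (512)
    def capture(self, viewed_frames: List[int]) -> NavigationProfile:
        profile: NavigationProfile = {}
        for frame in viewed_frames:
            profile[frame] = profile.get(frame, 0) + 1
        return profile

@dataclass
class RecommendationManager:                            # (516)
    storage: StorageManager

    def recommend(self, top_n: int = 5) -> List[int]:
        combined = self.storage.aggregate()
        return sorted(combined, key=combined.get, reverse=True)[:top_n]

@dataclass
class FunctionalUnit:                                   # (510)
    storage: StorageManager = field(default_factory=StorageManager)
    profiles: ProfileManager = field(default_factory=ProfileManager)

    def __post_init__(self) -> None:
        self.recommender = RecommendationManager(self.storage)

unit = FunctionalUnit()
unit.storage.store(unit.profiles.capture([10, 11, 12]))
unit.storage.store(unit.profiles.capture([11, 12, 13]))
print(unit.recommender.recommend(top_n=2))              # e.g. [11, 12]
```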
  • The profile manager (512) functions to capture a navigation profile (522) of a viewed file. The navigation profile (522), which is a collection of data with information regarding various attributes of viewed frames, portrays or is compiled to portray a viewing pattern of the data file. An example of such an attribute includes whether a frame has been subject to viewing or omitted from viewing. The navigation profile (522) may be a collection of data from a single viewing or multiple viewings of the file. Once captured, the navigation profile (522) is stored in data storage (520) by a storage manager (514) in communication with the profile manager (512). The navigation profile may be stored locally in data storage (520) or in remote data storage (not shown) across a network (505). In one embodiment, the storage manager (514) aggregates a plurality of stored navigation profiles (522) for one or more accessed or viewed files. Accordingly, the profile manager (512) captures any number of navigation profiles (522) which are subsequently stored by the storage manager (514).
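  • A navigation profile may record more than raw view counts. The sketch below is a hypothetical data model, not the disclosed one, recording for each frame whether it was viewed, skipped, or re-played, together with a net volume change, in line with the attributes enumerated in the claims.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict

class FrameAction(Enum):
    VIEWED = "viewed"
    SKIPPED = "skipped"            # omitted from viewing
    REPLAYED = "replayed"          # viewed more than once

@dataclass
class FrameRecord:
    action: FrameAction = FrameAction.SKIPPED
    view_count: int = 0
    volume_delta: int = 0          # net volume change while the frame was shown

@dataclass
class NavigationProfile:
    """Collection of per-frame attributes portraying a viewing pattern (522)."""
    frames: Dict[int, FrameRecord] = field(default_factory=dict)

    def record_view(self, frame: int, volume_delta: int = 0) -> None:
        rec = self.frames.setdefault(frame, FrameRecord())
        rec.view_count += 1
        rec.action = FrameAction.REPLAYED if rec.view_count > 1 else FrameAction.VIEWED
        rec.volume_delta += volume_delta

# Example: the viewer watches frames 1-3, replays frame 2, and raises the volume.
profile = NavigationProfile()
for f in (1, 2, 3, 2):
    profile.record_view(f, volume_delta=1 if f == 2 else 0)
print(profile.frames[2].action)    # FrameAction.REPLAYED
```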
  • The recommendation manager (516) is provided in communication with the storage manager (514). The recommendation manager (516) functions to support selection of a portion of the data file for a second viewing based on the one or more captured and stored navigation profile(s) (522). Where multiple navigation profiles (522) are aggregated in storage, the recommendation manager (516) recommends one or more frames or selected portions within the file for viewing based on the aggregation. In one embodiment, a graphics manager (518) is further provided to graphically represent the aggregated profiles (522). In one embodiment, the graphics manager (518) displays a graphical representation that captures a frequency of frames viewed in a data file based upon the aggregation of captured profiles. The graphics manager (518), as described in further detail in FIG. 2, may display all or only a portion of the aggregated data. Where only a portion of the navigation profiles (522) is displayed, a gesture may be used for selection of a frame within the file. A gesture may also be implemented to expand a portion of the viewable area of the graphical representation, or to move between select portions of the file as reflected in the graphical representation(s). Accordingly, the recommendation manager (516) functions to recommend a portion of the file for a second viewing based on the navigation profile(s) (522), and where more than one navigation profile exists, the navigation profiles are graphically represented by the graphics manager (518).
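  • Displaying only a portion of the aggregated data, and moving or expanding that portion, can be pictured as operations on a simple frame window. The window arithmetic below is an illustrative assumption; in practice the gestures described would drive equivalent changes to the displayed axis range of the graphical representation.

```python
from typing import Dict, Tuple

def window_of(aggregate: Dict[int, int],
              window: Tuple[int, int]) -> Dict[int, int]:
    """Return only the frames inside the currently displayed portion."""
    lo, hi = window
    return {f: v for f, v in aggregate.items() if lo <= f <= hi}

def pan(window: Tuple[int, int], frames: int) -> Tuple[int, int]:
    """Gesture: move between select portions by shifting the window."""
    lo, hi = window
    return (lo + frames, hi + frames)

def expand(window: Tuple[int, int], factor: float) -> Tuple[int, int]:
    """Gesture: expand the viewable area around the window's center."""
    lo, hi = window
    center, half = (lo + hi) / 2, (hi - lo) / 2 * factor
    return (int(center - half), int(center + half))

aggregate = {f: (50 if 700 <= f <= 1000 else 5) for f in range(1, 1501)}
view = (600, 800)
view = expand(view, 2.0)                   # widen the window to (500, 900)
print(len(window_of(aggregate, view)))     # 401 frames now displayed
```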
  • As shown and described above, the recommendation manager (516) assesses select portions of a file based on a navigation profile (522). In one embodiment, the navigation profile (522) employs characteristics of users of the file. More specifically, the navigation profile (522) gathers and maintains characteristics of users, such as likes and dislikes. The recommendation manager (516) may recommend one or more portions of the file based on common characteristics of a current user and a prior user. In one embodiment, the recommendation manager (516) provides a recommendation based on a combination of the navigation profile (522) and the common characteristics among users of the file. Accordingly, the recommendation manager (516) may employ various characteristics associated with the file as a basis for the recommendation.
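  • Recommending portions based on characteristics shared between a current user and prior users might be sketched as follows. The characteristic sets, the Jaccard similarity measure, and the threshold are assumptions for illustration; the description does not specify how common characteristics are compared.

```python
from typing import Dict, List, Set

# Hypothetical per-user data: declared likes and a frame -> view-count profile.
likes: Dict[str, Set[str]] = {
    "current": {"action", "sci-fi"},
    "prior_a": {"action", "thriller"},
    "prior_b": {"romance"},
}
profiles: Dict[str, Dict[int, int]] = {
    "prior_a": {700: 6, 701: 5, 1250: 4},
    "prior_b": {100: 7, 101: 6},
}

def jaccard(a: Set[str], b: Set[str]) -> float:
    """Overlap of two characteristic sets, used here as the similarity measure."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_for(user: str, threshold: float = 0.25, top_n: int = 3) -> List[int]:
    """Recommend highly viewed frames drawn from profiles of similar prior users."""
    votes: Dict[int, int] = {}
    for other, profile in profiles.items():
        if jaccard(likes[user], likes[other]) >= threshold:
            for frame, views in profile.items():
                votes[frame] = votes.get(frame, 0) + views
    return sorted(votes, key=votes.get, reverse=True)[:top_n]

print(recommend_for("current"))   # frames shared with prior_a: [700, 701, 1250]
```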
  • As identified above, the profile manager (512), storage manager (514), recommendation manager (516), and graphics manager (518) are shown residing in the functional unit (510). In one embodiment, however, the functional unit (510) and managers (512)-(518) may reside as hardware tools external to the memory (508). In another embodiment, the managers (512)-(518) may be implemented as a combination of hardware and software resources. Similarly, in one embodiment, the managers (512)-(518) may be combined into a single functional item that incorporates the functionality of the separate items. As shown herein, each of the managers (512)-(518) is shown local to one client (502), e.g. compute node. However, in one embodiment they may be collectively or individually distributed across a shared pool of configurable computer resources and function as a unit to enable pattern detection to support one or more recommendations. Accordingly, the managers may be implemented as software tools, hardware tools, or a combination of software and hardware tools.
  • Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Examples of the managers have been provided to lend a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
  • The functional unit described above in FIG. 5 has been labeled with managers. The managers may be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like. The manager(s) may also be implemented in software for processing by various types of processors. An identified manager of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executable of an identified manager need not be physically located together, but may comprise disparate instructions stored in different locations which, when joined logically together, comprise the managers and achieve the stated purpose of the managers.
  • Indeed, a manager of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices. Similarly, operational data may be identified and illustrated herein within the manager, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
  • Referring now to the block diagram (600) of FIG. 6, additional details are described with respect to implementing an embodiment of the present invention. The computer system includes one or more processors, such as a processor (602). The processor (602) is connected to a communication infrastructure (604) (e.g., a communications bus, cross-over bar, or network).
  • The computer system can include a display interface (606) that forwards graphics, text, and other data from the communication infrastructure (604) (or from a frame buffer not shown) for display on a display unit (608). The computer system also includes a main memory (610), preferably random access memory (RAM), and may also include a secondary memory (612). The secondary memory (612) may include, for example, a hard disk drive (614) (or alternative persistent storage device) and/or a removable storage drive (616), representing, for example, a floppy disk drive, a magnetic tape drive, or an optical disk drive. The removable storage drive (616) reads from and/or writes to a removable storage unit (618) in a manner well known to those having ordinary skill in the art. Removable storage unit (618) represents, for example, a floppy disk, a compact disc, a magnetic tape, or an optical disk, etc., which is read by and written to by a removable storage drive (616). As will be appreciated, the removable storage unit (618) includes a computer readable medium having stored therein computer software and/or data.
  • In alternative embodiments, the secondary memory (612) may include other similar means for allowing computer programs or other instructions to be loaded into the computer system. Such means may include, for example, a removable storage unit (620) and an interface (622). Examples of such means may include a program package and package interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units (620) and interfaces (622) which allow software and data to be transferred from the removable storage unit (620) to the computer system.
  • The computer system may also include a communications interface (624). Communications interface (624) allows software and data to be transferred between the computer system and external devices. Examples of communications interface (624) may include a modem, a network interface (such as an Ethernet card), a communications port, or a PCMCIA slot and card, etc. Software and data transferred via communications interface (624) are in the form of signals which may be, for example, electronic, electromagnetic, optical, or other signals capable of being received by communications interface (624). These signals are provided to communications interface (624) via a communications path (i.e., channel) (626). This communications path (626) carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, a radio frequency (RF) link, and/or other communication channels.
  • In this document, the terms “computer program medium,” “computer usable medium,” and “computer readable medium” are used to generally refer to media such as main memory (610) and secondary memory (612), removable storage drive (616), and a hard disk installed in hard disk drive or alternative persistent storage device (614).
  • Computer programs (also called computer control logic) are stored in main memory (610) and/or secondary memory (612). Computer programs may also be received via a communication interface (624). Such computer programs, when run, enable the computer system to perform the features of the present invention as discussed herein. In particular, the computer programs, when run, enable the processor (602) to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed.
  • Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
  • It will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without departing from the spirit and scope of the invention. Specifically, the pattern based navigation aid is not limited to video data files. Accordingly, the scope of protection of this invention is limited only by the following claims and their equivalents.

Claims (20)

What is claimed is:
1. A system comprising:
a processing unit in communication with memory;
a functional unit in communication with the processing unit, the functional unit having tools to support file navigation, the tools comprising:
a profile manager to capture a first and second set of data, wherein the first set comprises data corresponding to a viewed portion of a data file by a first entity, and the second set comprises data corresponding to the viewed portion of the data file by a second entity;
the profile manager to form a first navigation profile from the first set of data and a second navigation profile from the second set of data;
a storage manager to aggregate the formed navigation profiles;
the profile manager to identify a combined viewing pattern from the aggregation; and
a recommendation manager in communication with the storage manager, the recommendation manager to recommend a portion of the data file for a second viewing based on the combined viewing pattern;
a visual display to display the recommended portion of the data file; and
the visual display to support gestural navigation within the displayed portion including navigation between graphical representations of data categories.
2. The system of claim 1, further comprising the visual display to graphically represent the aggregation of navigation profiles, including a graphics manager to capture a viewing frequency of data within the data file based upon the aggregation of captured viewed portions of the data file.
3. The system of claim 2, further comprising the graphics manager to support gestural selection of at least one subset within the data file from the graphical representations.
4. The system of claim 3, wherein the gestural selection is selected from the group consisting of: physical contact between the user and the visual display, and physical contact between a stylus and the visual display.
5. A computer program product comprising a computer readable storage device having program code embodied therewith, the program code executable by a processor to:
capture a first and second set of data, wherein the first set comprises data corresponding to a viewed portion of a data file by a first entity, and the second set comprises data corresponding to the viewed portion of the data file by a second entity;
form a first navigation profile from the first set of data and a second navigation profile from the second set of data;
aggregate the formed navigation profiles;
identify a combined viewing pattern from the aggregation;
recommend a portion of the data file for a second viewing based on the combined viewing pattern;
display the recommended portion of the data file on a visual display; and
support gestural navigation within the displayed portion including navigation between graphical representations of data categories.
6. The computer program product of claim 5, further comprising program code to graphically represent the aggregation of navigation profiles, including capture a viewing frequency of data within the data file based upon the aggregation of captured viewed portions of the data file.
7. The computer program product of claim 6, wherein the gestural navigation further comprises gestural selection of at least one subset within the data file from the graphical representations.
8. The computer program product of claim 7, wherein the gestural selection is selected from the group consisting of: physical contact between the user and the visual display, and physical contact between a stylus and the visual display.
9. The computer program product of claim 5, further comprising program code to maintain a profile of one or more viewing preferences based on view characteristics, and recommend at least one portion of the file for viewing based on a common characteristic.
10. The computer program product of claim 5, wherein the data file is a video file and the navigation profile includes data selected from the group consisting of: a skipped portion, a fast-forwarded portion, a re-played portion, a volume increase, a volume decrease, and combinations thereof.
11. A method comprising:
capturing a first and second set of data, wherein the first set comprises data corresponding to a viewed portion of a data file by a first entity, and the second set comprises data corresponding to the viewed portion of the data file by a second entity;
forming a first navigation profile from the first set of data and a second navigation profile from the second set of data;
aggregating the formed navigation profiles;
identifying a combined viewing pattern from the aggregation;
recommending a portion of the data file for a second viewing based on the combined viewing pattern;
displaying the recommended portion of the data file on a visual display; and
supporting gestural navigation within the displayed portion including navigating between graphical representations of data categories.
12. The method of claim 11, further comprising graphically representing the aggregation of navigation profiles, including capturing a viewing frequency of data within the data file based upon the aggregation of captured viewed portions of the data file.
13. The method of claim 12, further comprising gesturally selecting at least one subset within the data file from the graphical representations.
14. The method of claim 13, wherein the gesture is selected from the group consisting of: physical contact between the user and the visual display, and physical contact between a stylus and the visual display.
15. The method of claim 11, further comprising maintaining a profile of one or more viewing preferences based on view characteristics, and recommending at least one portion of the file for viewing based on a common characteristic.
16. The method of claim 11, wherein the data file is a video file and the navigation profile includes data selected from the group consisting of: a skipped portion, a fast-forwarded portion, a re-played portion, a volume increase, a volume decrease, and combinations thereof.
17. The method of claim 11, wherein the gestural navigation further comprises a selection selected from the group consisting of: selecting separate portions of a recommended portion of the data file and selecting separate attributes associated with a recommended portion of the data file.
18. The method of claim 11, further comprising capturing viewing frequency of data within the file based upon the aggregation of captured viewed portions of the data file.
19. The method of claim 11, wherein the formed first navigation profile comprises a viewing pattern of viewed portions of the first set of data and at least one portion omitted from viewing, and the formed second navigation profile comprises a pattern of viewed portions of the second set of data and at least one portion omitted from viewing.
20. The method of claim 11, wherein the graphical representations are selected from the group consisting of:
a first category with respect to a second category; and
a third category with respect to the second category.
US15/447,399 2013-06-19 2017-03-02 Pattern based video frame navigation aid Abandoned US20170177210A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/447,399 US20170177210A1 (en) 2013-06-19 2017-03-02 Pattern based video frame navigation aid

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/921,803 US20140379710A1 (en) 2013-06-19 2013-06-19 Pattern based video frame navigation aid
US14/068,323 US9645713B2 (en) 2013-06-19 2013-10-31 Pattern based video frame navigation aid
US15/447,399 US20170177210A1 (en) 2013-06-19 2017-03-02 Pattern based video frame navigation aid

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/068,323 Continuation US9645713B2 (en) 2013-06-19 2013-10-31 Pattern based video frame navigation aid

Publications (1)

Publication Number Publication Date
US20170177210A1 true US20170177210A1 (en) 2017-06-22

Family

ID=52111822

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/921,803 Abandoned US20140379710A1 (en) 2013-06-19 2013-06-19 Pattern based video frame navigation aid
US14/068,323 Expired - Fee Related US9645713B2 (en) 2013-06-19 2013-10-31 Pattern based video frame navigation aid
US15/445,405 Abandoned US20170168858A1 (en) 2013-06-19 2017-02-28 Pattern based video frame navigation aid
US15/447,399 Abandoned US20170177210A1 (en) 2013-06-19 2017-03-02 Pattern based video frame navigation aid

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US13/921,803 Abandoned US20140379710A1 (en) 2013-06-19 2013-06-19 Pattern based video frame navigation aid
US14/068,323 Expired - Fee Related US9645713B2 (en) 2013-06-19 2013-10-31 Pattern based video frame navigation aid
US15/445,405 Abandoned US20170168858A1 (en) 2013-06-19 2017-02-28 Pattern based video frame navigation aid

Country Status (1)

Country Link
US (4) US20140379710A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9635096B2 (en) * 2013-11-26 2017-04-25 Google Inc. Selecting a content item based on a view profile
US10878030B1 (en) * 2018-06-18 2020-12-29 Lytx, Inc. Efficient video review modes

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060041902A1 (en) * 2004-08-23 2006-02-23 Microsoft Corporation Determining program boundaries through viewing behavior
US20100199295A1 (en) * 2009-02-02 2010-08-05 Napo Enterprises Dynamic video segment recommendation based on video playback location
US20110194839A1 (en) * 2010-02-05 2011-08-11 Gebert Robert R Mass Participation Movies
US8132200B1 (en) * 2009-03-30 2012-03-06 Google Inc. Intra-video ratings
US8176191B2 (en) * 2006-11-30 2012-05-08 Red Hat, Inc. Automated identification of high/low value content based on social feedback
US20130086509A1 (en) * 2011-09-29 2013-04-04 Microsoft Corporation Alternative query suggestions by dropping query terms
US20130259399A1 (en) * 2012-03-30 2013-10-03 Cheng-Yuan Ho Video recommendation system and method thereof
US20140074866A1 (en) * 2012-09-10 2014-03-13 Cisco Technology, Inc. System and method for enhancing metadata in a video processing environment
US8762867B1 (en) * 2011-05-16 2014-06-24 Mellmo Inc. Presentation of multi-category graphical reports
US8942542B1 (en) * 2012-09-12 2015-01-27 Google Inc. Video segment identification and organization based on dynamic characterizations

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4195317A (en) 1977-10-14 1980-03-25 Arvin Industries, Inc. Video recording and playback editing system with displayed cue signals
US5920842A (en) 1994-10-12 1999-07-06 Pixel Instruments Signal synchronization
WO2007128003A2 (en) 2006-03-28 2007-11-08 Motionbox, Inc. System and method for enabling social browsing of networked time-based media
CN102487456B (en) 2009-11-30 2015-06-17 国际商业机器公司 Method for providing visit rate of online video and device thereof
US8392450B2 (en) 2011-02-08 2013-03-05 Autonomy Corporation Ltd. System to augment a visual data stream with user-specific content

Also Published As

Publication number Publication date
US20140379710A1 (en) 2014-12-25
US20140380160A1 (en) 2014-12-25
US20170168858A1 (en) 2017-06-15
US9645713B2 (en) 2017-05-09

Similar Documents

Publication Publication Date Title
CN114071179B (en) Live broadcast preview method, device, equipment and medium
CN1538351B (en) Method and computer for generating visually representative video thumbnails
US8990727B2 (en) Fisheye-based presentation of information for mobile devices
CN102750076B (en) Information processing apparatus, and control method thereof
US9560415B2 (en) Method and system for interactive selection of items for purchase from a video
US9009750B2 (en) Post processing video to identify interests based on clustered user interactions
US8990701B2 (en) Gathering and organizing content distributed via social media
CN110851712B (en) Recommended methods, devices, and computer-readable media for book information
US11627362B2 (en) Touch gesture control of video playback
CN102043825B (en) Moving image providing device, moving image providing method, and program
KR101507272B1 (en) Interface and method for semantic annotation system for moving objects in the interactive video
JP6064815B2 (en) Method for temporarily stopping video presentation, calculation processing system and program
TW201222326A (en) Presentation of advertisements based on user interactivity with a web page
US10055099B2 (en) User-programmable channel store for video
EP4543010A1 (en) Push processing method and apparatus, and device and medium
US20130212039A1 (en) Review timeline for ownership lifecycle experience
US20180220199A1 (en) Identifying skipped offers of interest
KR20110073539A (en) Interest Manager
CN110505502A (en) A video processing method, device and computer-readable storage medium
US20140075384A1 (en) Context Aware Non-Linear Task Bar Orientation
US20170177210A1 (en) Pattern based video frame navigation aid
US20250240491A1 (en) Comment presentation method and apparatus, electronic device, and computer readable medium
KR101833806B1 (en) Method for registering advertising product at video contents and server implementing the same
CN113709565B (en) Method and device for recording facial expression of watching video
WO2024012059A1 (en) Object display method and apparatus, and electronic device and computer-readable medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW Y

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KRITT, BARRY A.;RAKSHIT, SARBAJIT K.;REEL/FRAME:041438/0622

Effective date: 20130618

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: MAPLEBEAR INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:055155/0943

Effective date: 20210126