WO2019106658A1 - System and method of generating media content and related data structures


Info

Publication number
WO2019106658A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
user
event
degree
media content
Application number
PCT/IL2018/051285
Other languages
French (fr)
Inventor
Alexander Wiesmaier
Konstantinos MATHIOUDAKIS
Alessandro Leonardi
Original Assignee
Agt Global Media Gmbh
Reinhold Cohn And Partners
Application filed by Agt Global Media Gmbh, Reinhold Cohn And Partners
Publication of WO2019106658A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 7/00 Methods or arrangements for processing data by operating upon the order or content of the data handled
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N 21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N 21/25866 Management of end-user data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42201 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788 Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60 Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/65 Transmission of management data between client and server
    • H04N 21/658 Transmission by the client directed to the server
    • H04N 21/6587 Control parameters, e.g. trick play commands, viewpoint selection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N 21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N 21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N 21/8456 Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/85 Assembly of content; Generation of multimedia applications
    • H04N 21/854 Content authoring
    • H04N 21/8549 Creating video summaries, e.g. movie trailer

Definitions

  • Embodiments described herein generally relate to systems and methods for media content creation or generation.
  • Events may be documented by the use of media capture device(s), such as cellphones, cameras, video camcorders and the like.
  • the media capture device(s) may capture media at the time of an event to provide a captured media.
  • the captured media content may be viewed and shared by a user to his/her friends through social networks and other sharing services such as, for example, an Email service, a land mail service, cellphones, the Internet and/or the like.
  • a content creation system to generate one or more output media contents associated to an event of interest of a user, the content creation system comprising: one or more media databases to store one or more input media content and media metadata related to a media associated with the event and received from one or more media sources;
  • an event database to store data associated with the event
  • a physiological database to store user physiological data collected from one or more sensors
  • a user profile database to store social and personal data of the user and one or more user preferences comprising a type of media, the user physiological data, a degree of user social relationships parameter and social network data; and hardware processing circuitry to receive an event indication to indicate the event and to generate the one or more output media contents by processing at least one of the input media content, the user physiological data, social and personal data and the event metadata, based on a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of creation of the one or more media contents, and to provide the user with the one or more output media contents.
  • system can include one or more of features (i) to (xv) listed below, in any desired combination or permutation which is technically possible:
  • the hardware processing circuitry is configured to set a degree of relationship of the one or more user preferences with the time of creation of the user physiological data and a time of creation of the one or more media contents by attaching to the one or more media contents one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
  • one or more engine modules and one or more handler modules to generate one or more data structures based on data received from the at least one of the one or more media databases, the event database, the physiological database or the user profile database;
  • an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules, the orchestrator module is configured to call the one or more engine modules and the one or more handler modules according to a predetermined sequence and to generate the one or more output media contents based on a degree of relationship between one or more data portions of data generated by the one or more engine modules and the one or more handler modules.
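As an illustration of the predetermined-sequence orchestration described above, the following is a minimal sketch, not the patent's implementation: all class, function and module names are our own assumptions, since the patent specifies no API. An orchestrator passes a shared state through engine and handler modules in a fixed order, each later module consuming the data structures produced by earlier ones:

```python
# Hypothetical sketch of an orchestrator calling handler/engine
# modules in a predetermined sequence; names are illustrative only.

class Module:
    """A handler or engine module: reads shared state, returns a data structure."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def run(self, state):
        return self.fn(state)


class Orchestrator:
    def __init__(self, modules):
        # the predetermined sequence of engine and handler modules
        self.modules = modules

    def generate(self, event_indication):
        state = {"event": event_indication}
        for module in self.modules:
            # each module's output is stored so later modules can use it
            state[module.name] = module.run(state)
        return state


# Toy modules: a media handler lists available media, a moments engine
# selects from that list (here by a trivial name-based rule).
media_handler = Module("media", lambda s: ["clip1.mp4", "clip2.mp4"])
moments_engine = Module("moments", lambda s: [m for m in s["media"] if "1" in m])

orchestrator = Orchestrator([media_handler, moments_engine])
result = orchestrator.generate({"name": "street concert"})
print(result["moments"])  # ['clip1.mp4']
```

The design point is that the sequence is fixed ahead of time, while the data each module receives depends on what the earlier modules produced.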
  • the one or more output media contents comprise selected moments of the event data structure, which include one or more moments of the event selected according to a degree of relationship of the user physiological data generated at the time of the event with the one or more media contents.
  • the input media contents comprise one or more media contents gathered from at least one of one or more social networks, one or more content platforms and one or more websites by the one or more media databases.
  • the hardware processing circuitry is configured to generate the one or more output media content according to one or more data structures generated by at least one of the one or more engine modules and the one or more handler modules.
  • the one or more engine modules comprise a first engine module to generate a data structure comprising information of one or more users' association with the event, prioritized by a degree of social relationship between the user and one or more other users, and a degree of relationship between the user and the one or more other users to the event metadata, according to the event indication, an indication on a user name, an indication on one or more other users' names, and the event metadata.
  • the one or more engine modules comprise a second engine module to generate a data structure comprising one or more media types and one or more media sizes associated with the one or more media types, the one or more media types are prioritized according to a degree of relationship of the one or more users to the event metadata, and the data structure is generated according to the event indication and a user indication.
  • the one or more engine modules comprise a third engine module to generate a data structure comprising one or more available media contents generated at one or more points of time during the event according to data applied by an available media data structure and a friends physiological data structure, wherein the one or more available media contents are prioritized according to the physiological data of the user and physiological data of friends of the user, which are generated at the one or more points of time during the event.
  • the one or more engine modules comprise a fourth engine module to generate a data structure comprising moments of the event according to one or more users' physiological data sharing the event, prioritized according to a degree of relation of the one or more users' physiological data to the moments of the event, wherein the data structure is generated according to data applied by a friends physiological data structure.
  • the one or more engine modules comprise a fifth engine module to generate a data structure comprising an indication on media generated at one or more moments of the event, wherein the media is associated to the user physiological data and friends of the user physiological data, and prioritized according to a degree of relationship of friends' available media, a preferred media and the moments to the event and the user.
  • the one or more engine modules comprise a sixth engine module to generate a data structure comprising a content personalized to the user associated to the event, according to user preferences, moments of the event of a user interest, physiological data of the user and a media generated at the moments of the event of the user interest.
  • the one or more handler modules comprise a first handler module to generate a data structure comprising media related to the event and the user, prioritized according to a degree of relationship between the event and the user.
  • the one or more handler modules comprise a second handler module to generate a data structure comprising the physiological data of the user associated to the event metadata and prioritized according to a degree of relationship between the event and the user.
  • the hardware processing circuitry is configured to set the degree of relationship of the one or more user preferences with the time of creation of the user physiological data and the time of viewing of the one or more input media contents by determining one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
  • one or more engine modules and one or more handler modules configured to generate one or more data structures based on data received from the at least one data entity of the plurality of data entities;
  • an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules, the orchestrator module is configured to coordinate an operation of the one or more engine modules and the one or more handler modules according to a predetermined sequence.
  • the orchestrator module is configured to: provide to the user the one or more output media contents generated based on one or more data portions of data generated by the one or more engine modules and the one or more handler modules.
  • the one or more output media contents comprises one or more media of selected moments of the event of common interest
  • the one or more media is selected according to a degree of relationship of the user physiological data generated at a time of the event of common interest with the one or more media contents and the degree of the relation of the user to the other user and to the event of common interest.
  • the one or more engine modules comprises a social links engine module configured to generate a data entity comprising information on the user and one or more other users' association with the event of common interest, prioritized by a degree of social relationship between the user and the one or more other users, according to the event indication, a user name indication, an indication on one or more other users' profiles, and the event metadata.
  • the one or more engine modules comprises a content preferences engine module to generate a data entity comprising one or more media types and one or more media sizes associated with the one or more media types, the data entity is prioritized according to a degree of relation to the event of common interest and the user.
  • the one or more engine modules comprise an internet of things (IoT)-media association engine module to generate a data entity comprising one or more available media contents generated at one or more points of time during the event of the common interest prioritized according to the physiological data of the user and physiological data of the one or more other users.
  • the one or more engine modules comprise a moments determination engine module to generate a data entity comprising points in a time of interest of the event of the common interest according to one or more users' physiological data sharing the event of the common interest, prioritized according to a degree of relation of input data to the event of the common interest and the user.
  • the one or more engine modules comprises a media selection engine module to generate a data entity comprising an indication on a preselected media generated at one or more points of time of the event of the common interest corresponding to the user physiological data, and prioritized according to a degree of relation of input data to the event of the common interest and the user.
  • the one or more engine modules comprise a content creation engine module to generate a data entity comprising a content personalized to the user in the context of the event of the common interest, according to user preferences, detected points of time of the event of the common interest, physiological data of the user and a media generated at the points of time of the event of the common interest.
  • the one or more handler modules comprise a media handler module to generate a data entity comprising media related to the event of the common interest and the user, prioritized according to a degree of relation between the event of the common interest and the user.
  • the one or more handler modules comprise an Internet of things (IoT) data handler module to generate a data entity comprising the physiological data of the user related to the event of the common interest and the user, prioritized according to a degree of relation of the event of the common interest and the user.
  • a new data entity comprising a plurality of fields according to a predetermined criteria, wherein the plurality of fields comprising at least one of a keyword field, a value field and a weight field;
  • the generating comprises generating by at least one of one or more engine modules and one or more handler modules one or more new data entities based on data received from the at least one data entity of the plurality of data entities;
  • an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules.
  • the generating comprises providing to the user, by the orchestrator module, the one or more output media contents generated based on a degree of relationship between one or more data portions of data generated by the one or more engine modules and the one or more handler modules.
  • weight value is normalized at least by one of fusing the weight values, multiplying the weight values, merging weight values by addition or merging weight values by weighted averaging.
  • a method for generating one or more output media contents associated with an event selected by a user by a content creation system comprising: storing one or more input media content and media metadata related to a media associated with the event and received from one or more media sources at one or more media databases;
  • social and personal data of the user and one or more user preferences comprising a type of media, the user physiological data, a degree of user social relationships parameter and social network data at a user profile database; receiving an event indication to indicate the event and generating the one or more output media contents by processing at least one of the input media content, the user physiological data, social and personal data and the event metadata, based on a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of creation of the one or more media contents, and to provide the user with the one or more output media contents; and
  • This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
  • a method for generating one or more output media contents associated with an event of the common interest selected by a user by a content creation system comprising: receiving from a plurality of data entities, data associated with the event of the common interest of the user and the one or more other users, the data comprising a first indication on a degree of a relation of the user to the event of the common interest and a second indication of a degree of relation of the other users to the event of the common interest;
  • an event indication to indicate the event of the common interest and generating one or more output media contents by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication on the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, wherein the one or more output media contents includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference; and
  • This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
  • a computing system configured to generate one or more output media contents associated to an event of a common interest of a user and one or more other users
  • the computing system comprising hardware processing circuitry configured to: receive from a plurality of data entities, data associated with the event of the common interest of the user and the one or more other users, the data comprising a first indication on a degree of a relation of the user to the event of the common interest and a second indication of a degree of relation of the other users to the event of the common interest;
  • an event indication to indicate the event of the common interest and to generate one or more output media contents by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication on the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, wherein the one or more output media contents includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference; and
  • This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
  • a content creation system to generate one or more output media contents associated with an event selected by a user
  • the content creation system comprising: one or more media databases to store one or more data structures associated with the event, a data structure of the one or more data structures comprising a keyword field to establish relationships between one or more entities of the content creation system, a value field to provide a value to the keyword and a weight field to weight the relationships between the one or more entities; and hardware processing circuitry to generate a data structure based on portions of data of the one or more data structures and an event selected by a user by using one or more data aggregation methods to manipulate the data of the keyword field and the value field and the event field by applying a weight value to the weight field based on the relationship between the one or more entities.
  • system can include one or more of features (xxxiii) to (xxxvib) listed below, in any desired combination or permutation which is technically possible:
  • weight value is configured to indicate a strength of relationships between two or more fields of the data structure.
  • the keyword to establish a relationship between one or more entities which provided input data to the content creation system.
  • a row of the one or more rows comprises:
  • the hardware processor circuitry is configured to:
  • set the weight value to a row of the data structure by summing weight values of rows of the data structure and normalizing the weight value, wherein the sum of the normalized weight values of the rows of the data structure is equal to 1.
  • weight value is normalized at least by one of fusing the weight values, multiplying the weight values, merging weight values by addition, or merging weight values by weighted averaging.
  • weight value comprises 0 for no influence or relation of the data applied by the row, 0.5 for medium influence or relation of the data applied by the row, and 1 for high influence or relation of the data applied by the row.
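The weight handling described in the preceding items can be sketched as follows. This is our own illustrative reading, not the patent's code: weights of 0 / 0.5 / 1 denote no / medium / high influence, row weights are normalized so they sum to 1, and the merge strategies follow the options the text lists (addition, weighted averaging); the function names are assumptions.

```python
# Hedged sketch of the weight operations named above; all function
# names are ours, chosen only for illustration.

def merge_by_addition(weights_a, weights_b):
    """Merge two weight rows element-wise by addition."""
    return [a + b for a, b in zip(weights_a, weights_b)]

def merge_by_weighted_average(weights_a, weights_b, alpha=0.5):
    """Merge two weight rows by a weighted average with mixing factor alpha."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(weights_a, weights_b)]

def normalize(weights):
    """Scale row weights so they sum to 1; an all-zero row stays all-zero."""
    total = sum(weights)
    if total == 0:
        return list(weights)
    return [w / total for w in weights]

# Rows with no / medium / high / medium influence:
rows = [0.0, 0.5, 1.0, 0.5]
norm = normalize(rows)
print(norm)       # [0.0, 0.25, 0.5, 0.25]
print(sum(norm))  # 1.0
```

A row that contributed nothing (weight 0) stays at 0 after normalization, so normalization only rescales the relative influence of the rows that did contribute.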
  • a computing system for generating a data structure, the computing system comprising: an input/output (I/O) interface configured to gather and distribute data to one or more data processing devices;
  • the hardware processing circuitry to generate a data structure based on portions of the data according to one or more data aggregation methods, the data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field;
  • a memory configured to store the one or more data structures.
  • This aspect of the disclosed subject matter can optionally include one or more of features (xxxiii) to (xxxvib) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
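The entity/keyword/value/weight rows that the preceding items describe could be modeled roughly as below. This is a sketch under our own naming assumptions; the patent defines the fields but no concrete representation.

```python
# Illustrative model of a Keyword-Value-Weight (KVW) row: entity
# fields tie a row to a source entity, the keyword names a
# relationship, the value carries the data, and the weight (0..1)
# expresses the strength of the relationship. Field names are ours.

from dataclasses import dataclass

@dataclass
class KVWRow:
    entity: str    # entity that supplied the data (e.g. a user or sensor)
    keyword: str   # relationship the row establishes
    value: object  # value attached to the keyword
    weight: float  # strength of the relationship, 0 (none) .. 1 (high)

structure = [
    KVWRow("user:alice", "heart_rate", 128, 1.0),
    KVWRow("user:alice", "media_type", "video", 0.5),
    KVWRow("sensor:cam1", "location", "stage-left", 0.0),
]

# Keep only rows that actually influence the output (weight > 0).
influential = [row for row in structure if row.weight > 0]
print([row.keyword for row in influential])  # ['heart_rate', 'media_type']
```

Representing the weight as an explicit field per row is what lets the aggregation methods described above fuse, sum or average relationships without inspecting the values themselves.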
  • a product comprising one or more tangible computer-readable non-transitory storage media comprising computer-executable instructions operable, when executed by hardware processing circuitry, to cause a content creation system to: store at one or more media databases one or more data structures associated with the event, a data structure of the one or more data structures comprising a keyword field to establish relationships between one or more entities of the content creation system, a value field to provide a value to the keyword and a weight field to weight the relationships between the one or more entities; and
  • This aspect of the disclosed subject matter can optionally include one or more of features (xxxiii) to (xxxvib) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
  • a product comprising one or more tangible computer-readable non-transitory storage media comprising computer-executable instructions operable, when executed by hardware processing circuitry, to cause a computing system to: gather data from one or more data processing devices and distribute data to the one or more data processing devices;
  • the data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field;
  • a computing system configured to generate one or more output media contents for a user, the computing system comprising processing circuitry and configured to: receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
  • user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
  • the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
  • the combining and reducing data is based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the at least one candidate output media content includes at least a portion of the at least one input media content;
  • At least one output media content includes at least a portion of the at least one candidate output media content
  • system according to this aspect of the presently disclosed subject matter can include one or more of features (xxxvii) to (lxviii) listed below, in any desired combination or permutation which is technically possible:
  • the identifying at least one candidate output media content further comprises normalizing data.
  • the generating of the at least one output media content comprises selecting the at least one output media content from the at least one candidate output media content using at least one output media selection criterion
  • the event metadata comprises a third indication of a degree of relation of the event to the at least one area of interest
  • the media metadata related to the at least one input media content comprises a fourth indication of a degree of relation of the at least one input media content to the at least one area of interest
  • the degree of excitement metadata of the at least one other user comprises a fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest
  • the combining and reducing data is further based on a degree of relationship of the at least one area of interest of the user indicative of the event of interest, the first indication of a degree of relation of the user to the at least one area of interest, the second indication of a degree of a relation of the at least one other user to the at least one area of interest, the third indication of a degree of relation of the event to the at least one area of interest, the fourth indication of a degree of relation of the at least one input media content to the at least one area of interest, and the fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest.
  • (xl) further configured to receive, from the at least one data source, data indicative of a degree of relationship of the user to the at least one other user, wherein the identifying at least one candidate output media content further comprises combining and reducing the data indicative of a degree of relationship of the user to the at least one other user and the first data.
  • the degree of excitement data of the at least one other user is based on physiological data of the at least one other user.
  • the identifying at least one candidate output media content further comprises combining and reducing the data of the degree of excitement metadata of the user and the first data.
  • the user preference data associated with the user further comprises user media preference data, the user media preference data comprising at least one of media type preference and media size preference,
  • the media metadata related to the at least one input media content comprises at least one of input media content type and input media content size
  • identifying at least one candidate output media content further comprises combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size.
  • the at least one input media content comprise at least one media content gathered by the at least one data source from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
  • At least part of the first data comprises at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure, and wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure.
  • (xlvii) the combining and reducing data comprising setting the weight value by at least one of fusing the weight values of records, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging.
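By way of non-limiting illustration, a KVW record and the weight-combination methods described above (multiplying, addition, weighted averaging) can be sketched as follows. This is a minimal sketch, not the claimed implementation; the class name `KVWRecord`, the `combine` function and the entity-concatenation convention are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class KVWRecord:
    """One record of a hypothetical Keyword-Value-Weight (KVW) data structure."""
    entity: str    # entity field (e.g. a user, event, or media item identifier)
    keyword: str   # keyword field (e.g. "team")
    value: str     # value field (e.g. "Moscow")
    weight: float  # degree of relation of the entity to the keyword/value pair

def combine(a: KVWRecord, b: KVWRecord, method: str = "multiply") -> KVWRecord:
    """Combine two records whose Keyword-Value pairs match, merging the weights."""
    if (a.keyword, a.value) != (b.keyword, b.value):
        raise ValueError("Keyword-Value pairs do not match")
    if method == "multiply":
        w = a.weight * b.weight
    elif method == "add":
        w = a.weight + b.weight
    elif method == "average":   # weighted averaging with equal coefficients
        w = (a.weight + b.weight) / 2
    else:
        raise ValueError(f"unknown method: {method}")
    return KVWRecord(f"{a.entity}+{b.entity}", a.keyword, a.value, w)
```

The choice of combination method may, in practice, depend on the aggregation step being performed; the sketch merely makes the three alternatives named above concrete.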
  • (xlix) the identifying comprises generating a data entity indicative of the association of the user and the at least one other user with the event of interest to the user, based on the degree of relationship of the user to the at least one other user.
  • the generating of the data entity indicative of the association of the user and the at least one other user with the event of interest to the user comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the at least one other user, and the data indicative of the degree of relationship of the user to the at least one other user.
  • the identifying comprises generating a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, according to user media preference data.
  • the identifying comprises generating a first data entity indicative of the at least one input media content, based on at least one of: association of the one or more available media contents with the event, and association of the one or more available media contents with the user preference data associated with the user and with the user preference data associated with the at least one other user.
  • the generating of the first data entity indicative of one or more available media contents comprises combining and reducing data of at least the media metadata related to the at least one input media content, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
  • the identifying comprises generating a data entity indicative of the degree of excitement data of at least one other user, based on at least one of: association of the degree of excitement data with the event, and association of the degree of excitement data with at least one of the user and the at least one other user.
  • the generating of the data entity indicative of the degree of excitement data of at least one other user comprises combining and reducing data of at least the degree of excitement metadata of the at least one other user, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
  • the identifying comprises generating a data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user generated based on one or more points in time during the event of the interest to the user.
  • the generating of the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user comprises combining and reducing the first data entity indicative of the at least one input media content and the data entity indicative of association between the at least one input media content and the degree of excitement data of the at least one other user.
  • the identifying comprises generating a data entity indicative of points in time within at least one time of interest of the event of interest to the user, according to the degree of excitement data.
  • (lix) the generating the data entity indicative of points in time is according to an indication of relevance that is based on the degree of excitement data of the at least one other user.
  • (lx) for each time of interest of the at least one time of interest generating the data entity indicative of points in time is based on the most highly weighted degree of excitement data of the degree of excitement data corresponding to the each time of interest.
  • the identifying comprises generating a second data entity indicative of the one or more available media contents, wherein the one or more available media contents correspond to the points in time within the at least one time of interest of the event of interest to the user.
  • the second data entity is weighted based on the preferences of the user.
  • the generating of the second data entity indicative of the one or more available media contents comprises combining and reducing the data entity indicative of points in time within at least one time of interest, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
  • (lxiv) generating the at least one output media content comprises generating a third data entity indicative of the one or more available media contents, wherein the generation of the third data entity is based on a relevance field associated with the one or more available media contents and on a weight field associated with the one or more available media contents.
  • the third data entity is indicative of the one or more media sizes associated with the one or more media types.
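By way of non-limiting illustration, a selection based on a relevance field and a weight field, as described above, can be sketched as follows. The function name `select_output_media`, the dictionary keys and the product-based ranking score are hypothetical assumptions and not the claimed method.

```python
def select_output_media(candidates, count=3):
    """Rank candidate media contents by the product of their relevance and
    weight fields and keep the `count` highest-scoring media identifiers."""
    ranked = sorted(candidates,
                    key=lambda c: c["relevance"] * c["weight"],
                    reverse=True)
    return [c["media_id"] for c in ranked[:count]]
```

Other scoring functions (e.g. weighted sums) could serve as the output media selection criterion equally well; the product is used here only to make the relevance/weight interaction concrete.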
  • the event of interest to the user comprises at least one of a sports match, a contest, a concert, a show, a happening, a vacation or a trip.
  • the physiological data of the at least one other user comprise at least one of a blood pressure, a heartbeat, a heart rate, a breathing rate, a step counter and a body temperature.
  • the output media content comprises at least one of a video clip, an audio recording, an image and a text captured at the time of the event of interest to the user, according to the user media preference data.
  • a non-transitory program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform the method of the above computing system for generating one or more output media contents for a user.
  • a computing system configured to create a compilation output from a first data set, the computing system comprising a processing circuitry and configured to: receive an indication to generate a compilation;
  • first data set comprising a time associated with first data of the first data set, wherein the first data set is relation-aware
  • the identifying comprises combining and reducing data of at least the first data set and the second data set
  • the combining and reducing data is based on a degree of relationship of the time associated with the first data and the time associated with the second data, wherein the at least one candidate compilation output includes at least a portion of the first data set;
  • At least one compilation output media content includes at least a portion of the at least one candidate compilation output
  • the computerized method, the non-transitory program storage device and the computing system disclosed herein according to the above five aspects, can optionally further comprise one or more of features (xxxvii) to (lxviii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
  • a computing system for generating a data structure comprising: an input/output (I/O) interface configured to gather from and distribute data to one or more data processing devices;
  • a processor to generate at least one data structure based on portions of the data according to one or more data aggregation methods, the at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure; and a memory configured to store the at least one data structure, wherein the at least one data structure is capable of being utilized by the processing circuitry to generate at least one of: at least one new data structure, at least one modified data structure, by combining the at least one data structure with at least one other data structure.
  • the computerized method, the non-transitory program storage device and the computing system disclosed herein according to the above three aspects, can optionally further comprise one or more of features (lxix) to (lxxviii) listed below, mutatis mutandis, in any desired combination or permutation which is technically possible:
  • the KVW data structure comprises at least one record, a record of the at least one record comprises:
  • a value in the value field to be associated to the keyword value
  • the weight value set by the processor to indicate the degree of relation of the one or more entity fields to the keyword field and the value field.
  • the value in the value field can be equal to an entity value of an entity field of the one or more entity fields
  • the one or more data aggregation methods comprise:
  • setting a weight value to a record of the data structure by at least one of: multiplying weight values of records of the at least two source KVW data structures, summing the weight values, merging the weight values by weighted averaging.
  • the one or more data aggregation methods further comprise: merging two or more data fields of the at least two source KVW data structures.
  • the processing circuitry is further configured to: merge two or more records of the KVW data structure by summing weight values of the two or more records, wherein the combination of the keyword field and the value field in the two or more records match.
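By way of non-limiting illustration, merging records whose keyword/value combination matches by summing their weight values can be sketched as follows. The tuple representation of a record and the function name are hypothetical assumptions, not the claimed implementation.

```python
from collections import defaultdict

def merge_matching_records(records):
    """Merge records whose (keyword, value) combination matches by summing
    their weights. Records are (entity, keyword, value, weight) tuples;
    the entities of merged records are collected together."""
    merged = defaultdict(lambda: (set(), 0.0))
    for entity, keyword, value, weight in records:
        entities, total = merged[(keyword, value)]
        merged[(keyword, value)] = (entities | {entity}, total + weight)
    # Return a plain dict keyed by (keyword, value) with sorted entity lists.
    return {kv: (sorted(e), w) for kv, (e, w) in merged.items()}
```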
  • the generation comprises reducing data.
  • the weight value comprises 0 for no relation of the data comprised in the record, 0.5 for medium relation of the data comprised in the record, and 1 for high relation of the data comprised in the record.
  • the at least one data structure is capable of being utilized by the processing circuitry to generate one or more output media contents for a user, the computing system further configured to:
  • an indication to generate one or more output media contents for the user wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
  • the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
  • the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
  • the combining and reducing data is based on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure,
  • the at least one candidate output media content includes at least a portion of the at least one input media content; generate the at least one output media content, wherein at least one output media content includes at least a portion of the at least one candidate output media content;
  • FIG. 1 is a schematic illustration of a block diagram of a content creation system to generate a user based media content, in accordance with some demonstrative embodiments.
  • FIG. 2 is a schematic illustration of a flow chart of a method of generating a user based media content associate with an event of common interest, in accordance with some demonstrative embodiments.
  • FIGs. 3a and 3b are schematic illustrations of a flow chart of a method of generating a friends-event key data entity by a social links engine module, in accordance with some demonstrative embodiments.
  • FIG. 4 is a schematic illustration of a flow chart of a method of generating a preferred media data entity by a content preference engine module, in accordance with some demonstrative embodiments.
  • FIG. 5 is a schematic illustration of a flow chart of a method of generating an available media data entity by a media handler module, in accordance with some demonstrative embodiments.
  • Fig. 6 is a schematic illustration of a flow chart of a method of generating a friends data entity based on data captured from sensors by an Internet of things (IoT) data handler module, in accordance with some demonstrative embodiments.
  • FIG. 7 is a schematic illustration of a flow chart of a method of generating a friends available media data entity based on data captured from sensors by an IoT-media association engine module, in accordance with some demonstrative embodiments.
  • Fig. 8 is a schematic illustration of a flow chart of a method of generating a merged friends physiological data entity based on the friend data entity by a moments determination engine module, in accordance with some demonstrative embodiments.
  • Fig. 9 is a schematic illustration of a flow chart of a method of generating a user selected media data entity based on a preferred media data entity and a selected media data entity by a media selection engine module, in accordance with some demonstrative embodiments.
  • FIG. 10 is a schematic illustration of a flow chart of a method of generating a selected media data entity based on the preferred media data entity by a content creation engine module, in accordance with some demonstrative embodiments.
  • FIG. 11 is a schematic illustration of a block diagram of a computing system for generating one or more data structures, in accordance with some demonstrative embodiments.
  • Fig. 12 is a schematic illustration of a flow chart of a method of generating a data structure by a computing system of Fig. 11, in accordance with some demonstrative embodiments.
  • FIG. 13 is a schematic illustration of a block diagram of a machine in the form of a computing system, in accordance with some demonstrative embodiments.
  • FIGs. 14A-C are a schematic illustration of a data flow for generating output media, in accordance with certain demonstrative embodiments.
  • Fig. 15 is a schematic illustration of a flow chart of a method of generating output media, in accordance with certain demonstrative embodiments.
  • references to "demonstrative embodiment" indicate that the embodiment(s) so described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Further, repeated use of the phrase "in some demonstrative embodiments" does not necessarily refer to the same embodiments, although it may.
  • this term may refer to a personal computer, a server, a computing system, a communication device, a processor or processing unit (e.g. a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC))
  • "non-transitory memory" and "non-transitory storage medium" as used herein, include, for example, any volatile or non-volatile computer memory suitable to the presently disclosed subject matter.
  • signal may exclude, for example, transitory propagating signals, but may include, for example, any other signal suitable to some demonstrative embodiments.
  • data aggregation includes, for example, a process in which information is gathered and expressed in a summary form such as data structures.
  • the term "database (DB)" as used herein, includes, for example, an organized collection of data stored, for example, at a memory of a DB server, which may be a dedicated computer to hold the DB and to run dedicated software to, for example, collect, store and manage the data of the DB
  • the DB may store one or more data structures, which include a collection of schemas, tables, queries, reports, views, and other elements.
  • Databases are presented here as a non-limiting example of data stores. To the extent that systems described in the presently disclosed subject matter receive information from the databases as input for their processing activities, the databases may in some cases be non-limiting examples of data sources.
  • module includes, for example, any hardware component, software or firmware, and/or any combination of software, firmware and hardware, which may be embedded on a computing system, such as, for example, on a processor of a computing system.
  • databases may include, for example, a cloud DB, a DB server, an in-memory DB, an active DB, a distributed DB, an embedded DB, an end-user DB, an SQL DB or the like.
  • a large amount of media content (video, audio, text etc.) related to this game is generated by various platforms. It may be required to send to Bob portions of a small number of media content items concerning the game, e.g. only the items that are expected to be the most interesting and relevant to Bob. In one non-limiting example, this requirement is due to the fact that Bob will not be able to watch the entire game. In another example, Bob is not able to watch this particular game at all. In other examples, Bob may have pre-enrolled for a service which sends him highlights of the most interesting moments of the game. In still other examples, after the game is over he may initiate a request to a content creation system to send him highlights.
  • Bob has enrolled to receive, and/or initiates a request to receive, media content related to the team Moscow in general, and not necessarily for the specific game event; but the team Moscow is indicative of the Final Four game, since Moscow is playing in the game.
  • the request can be made, in some examples, via a user interface (e.g. web site, mobile application, dial-in to a service).
  • This enrolment, or this request are examples of a user indication sent to a content creation system.
  • This enrolment, or this request are also examples of an event indication sent to a content creation system.
  • the enrolment and requests are also examples of an indication to generate one or more output media contents for the user, where the indication is associated with an event of interest to the user, and/or with area(s) of interest of the user which are indicative of an event of interest.
  • In choosing which content to send Bob, in some examples it is advantageous for the content creation system to take into consideration at least some of the following considerations or criteria: which media items are relevant to the game event, which friends of Bob have interests in common with Bob's as relates to this Final Four event, and what levels or degrees of excitement are measured by the IoT data of each friend at different points in time during the game. In some examples, the times of the event, the media and the IoT data are considered.
  • degrees of relationship of some or all of: the event, the input media, the IoT data, Bob and/or his friends, to a particular area of interest (e.g. the team Moscow), are considered.
  • the system may balance the fact that Alice is a closer friend of Bob's, but that Carl has a higher interest in the particular event, and that each may exhibit high levels of excitement during different points of the game.
  • Bob's preferences for media types should be weighed.
  • Bob may be considered the user, and his friends considered the other users.
  • an event of interest to a user may also be an event of a common interest of the user and other users.
  • a content creation system and a method thereof are presented.
  • the content creation system may generate, for example, a media content of an event of interest of the user and some other users according to a degree of relationship between the user and the other users to moments of the event of interest, user preferences, and a degree of excitement of the user at some moments of the event.
  • a content creation system may receive information from one or more data sources.
  • the system may receive an indication to generate one or more output media contents for the user (e.g. Bob), where the indication is associated with an event of interest to the user (e.g. the Final Four game), and/or one or more areas of interest of the user that are indicative of the event of interest (e.g. the team Moscow).
  • the system may receive input media content items, along with media metadata characterizing them, e.g. time of capture, media type, information about topic of the media, and in some cases the event associated with the media content item
  • the system may receive degree of excitement data of at least one other user, as well as degree of excitement metadata that characterize this degree of excitement data. The degree of excitement metadata may include a time associated with the degree of excitement data.
  • the system may in some cases receive user preference data relating to preferences of the user Bob, providing a first indication of a degree of relation of the user to the area(s) of interest (e.g. sports, basketball, the team Moscow).
  • the system may in some cases receive user preference data relating to preferences of the other users (e.g. Alice and Carl).
  • the media metadata, and/or degree of excitement metadata may also include fourth and fifth indications, respectively, of the degree of relation of these data items to the area(s) of interest. Some or all of this information input to the system may include weight values.
  • the system may receive other data.
  • the system may also receive data indicative of the degree or strength of the relationship between people, e.g. between the user and some or all of the other users (e.g. Bob and Alice, Bob and Carl).
  • the system may receive degree of excitement data of the user, as well as degree of excitement metadata that characterize this degree of excitement data.
  • the degree of exci tement metadata may include a time associated with the degree of excitement data.
  • the system may in some cases receive user media preference data, including at least media type preference and /or media size preference (e.g. Bob likes short video).
  • the media metadata may include input media content type and input media content size
  • time data disclosed herein (e.g. associated with a record) may include one or more time ranges ("between t1 and t2"). In that sense, "time associated with an event", "time of capture of media", "time of degree of excitement data", etc. may in some examples refer to time ranges.
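By way of non-limiting illustration, one way to derive a degree of relationship between two time ranges (e.g. the time associated with the event and the time of capture of a media item) is the fraction of one range that falls within the other. The functions below are a hypothetical sketch under that assumption, not the claimed method.

```python
def range_overlap(a, b):
    """Overlap between two (start, end) time ranges; 0 if they are disjoint."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def time_relationship(event_range, media_range):
    """Hypothetical degree-of-relationship score in [0, 1]: the fraction of
    the media's capture range that falls within the event's time range."""
    duration = media_range[1] - media_range[0]
    if duration <= 0:
        return 0.0
    return range_overlap(event_range, media_range) / duration
```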
  • the process to generate the output media content(s) may be as follows: the system may identify at least one candidate output media content.
  • the identification process may involve combining and reducing one or more of these data: the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the other user(s), the user preference data associated with the user, and the user preference data associated with the at least one other user(s).
  • the combining and reducing data can be based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data.
  • the combining and reducing data can be based at least partly on a degree of relationship of the area(s) of interest of the user indicative of the event of interest, the first indication of a degree of relation of the user to the area(s) of interest, the second indication of a degree of a relation of the other user(s) to the area(s) of interest, the third indication of a degree of relation of the event to the area(s) of interest, the fourth indication of a degree of relation of the input media content to the area(s) of interest, and/or the fifth indication of a degree of relation of the degree of excitement data to the area(s) of interest.
  • the identification process may involve combining and reducing the data indicative of a degree of relationship of the user and the other user(s) with the first data, and/or with other input data.
  • identifying the candidate output media content comprises combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size.
  • identifying the candidate output media content(s) involves normalizing data
  • the system can in some examples generate the output media content(s). In some examples, the output media content(s) include at least a portion of the candidate output media content(s). In some examples, the system selects the output media content(s) from the candidate output media content(s) using one or more output media selection criteria.
  • the system can in some examples provide the user with the output media content(s).
  • some or all of the input data may exist in data entities.
  • the above processing of data, to identify candidate output media content(s), may involve:
  • At least part of the above data includes at least one data structure that includes one or more entity fields, a keyword field, a value field and a weight field.
  • the weight field can in some examples include a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field.
  • the combining and reducing of data, and the generation of new data entities, may include matching Keyword-Value pairs of the KVW data structures.
  • the combining and reducing data involves setting a weight value for a record of the new or modified data entities, by at least one of fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition or merging weight values by weighted averaging.
  • the setting a weight value for a record of a data entity may also involve normalizing the weight value, where the sum of normalized weight values is equal to 1.
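By way of non-limiting illustration, normalizing weight values so that their sum is equal to 1, as described above, can be sketched as follows. The uniform fallback for all-zero weights is a hypothetical assumption, not part of the claimed method.

```python
def normalize_weights(weights):
    """Normalize a list of non-negative weight values so they sum to 1.
    If all weights are zero, distribute the total uniformly."""
    total = sum(weights)
    if total == 0:
        return [1.0 / len(weights)] * len(weights)
    return [w / total for w in weights]
```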
  • the processing of the data to generate output media content is a complex process, involving multiple stages of generating data entities, including fusing and merging.
  • Non-limiting examples of such generation of data entities, so as to generate output media content are disclosed with reference to Figs. 3-10 further herein. More details on KVW data structures and possible methods utilizing them are discussed, for example, with reference to Figs. 11 and 12.
  • the input media content may include one or more media contents gathered by the plurality of data entities from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
  • automatic content may be created from captured media and physiological data.
  • the physiological data may provide an indication of excitement of the user and/or of a degree of excitement of the user.
  • such physiological data can be created from measurement of one or more physiological parameters, which may be measured, for example, by one or more sensors, for example embedded in a wearable Internet of Things (IoT) device.
  • physiological parameters may include heartbeats, sweat, body temperature, blood pressure or the like.
  • the automatic content may be created based on physiological data of the user. In some example cases, the automatic content may be created based on physiological data of other users, for example the friends of the user. In some example cases, the automatic content may be created based on physiological data of both the user and of other users.
  • degree of excitement data may in some cases be used to measure the degree of excitement of a user who, for example, is consuming and/or generating content.
  • degree of excitement data may be data based on image, video or audio recordings of a user or other users, recorded for example while the user is consuming content, or content generated by a user or other users, where the recordings have been analyzed to detect excitement based on user movement, gestures, expression, voice etc., for example using known techniques.
  • Still another example of measuring degree of excitement may be analysis of text generated by a user or other users, e.g.
  • Some additional non-limiting example methods of determining or populating the data that is input to the method of the presently disclosed subject matter are disclosed.
  • Data that is not specifically provided for the present system to work, such as video recordings, twitter feeds, associated metadata, etc., can be gathered and stored using known techniques in the field of media content.
  • Example methods are also disclosed herein for generation of keywords, values, and weights (KVW data) corresponding to entity field values.
  • the KVW data can be filled in manually, preferably keeping the number of possible and assigned keywords as small as possible.
  • the weight of the user in the data structure may be an input value and/or may be part of the user profile, rather than a configuration parameter. For example, users enrolling or registering for a media content service may check off, and/or type in as text, their areas of interest, such as their favorite sports, teams and pastimes, and maybe also enter a degree of interest ("I am a big fan / I am a moderate fan / I have a low interest"). Similarly, personnel of the content generator organization can in some cases manually enter KVW data for an event or for a media content item.
  • keywords may in some examples help ensure the consistent use of keywords throughout the system.
  • weight values, e.g. setting the weight of a user's friendship to 100, and all other weights to 1, in an automated fashion.
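The automated weight-assignment rule above can be sketched as follows. This is a hypothetical illustration, not the exact implementation: the `(user, keyword, value)` record layout and the function name are assumptions made for the example.

```python
# Hypothetical sketch of the automated weight-assignment rule described
# above: friendship entries receive weight 100, all other entries weight 1.
# The quadruple layout mirrors the KVW-style records discussed herein;
# names and values are illustrative assumptions.

def assign_default_weights(records):
    """records: list of (user, keyword, value) -> KVW-style quadruples."""
    weighted = []
    for user, keyword, value in records:
        weight = 100 if keyword == "friend" else 1
        weighted.append((user, keyword, value, weight))
    return weighted

profile = [("Bob", "friend", "Alice"), ("Bob", "sport", "basketball")]
print(assign_default_weights(profile))
# [('Bob', 'friend', 'Alice', 100), ('Bob', 'sport', 'basketball', 1)]
```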
  • content analytics, e.g. natural language processing of text media such as twitter feeds, to derive the areas of interest and weights.
  • Another example is recognition on video or audio.
  • IoT MetaData - can, for example, assign default values to all users.
  • the values can be established by organizational means, e.g. handing out sensors only to people present at the event.
  • a video associated with event "E" can inherit KVW data from "E".
  • assigned weight values may be dynamic values. For example, the weight for IoT data associated with certain keywords may increase and/or decrease during the event of common interest.
  • data such as the preferred media sizes of users can have a certain range (e.g. +/- 3 sec), which can be a system parameter or part of the user profile.
  • preferences such as media type and size can in some examples also be influenced or even overwritten by system preferences. This can be implemented, for example, as configuration parameters.
  • data that is initially entered manually may be modified automatically, in some cases making use of real time data.
  • Bob manually entered that his level of interest in Team Moscow is 0.2.
  • the raw IoT data (degree of excitement, time of capture, etc.) is populated using state of the art data capturing and processing technologies.
  • the keywords-values-weights of media and IoT data are known, and their relation to, for example, people can be derived from this knowledge.
  • the people's association with areas of interest, and with certain media and IoT data could be known, and the KVW of the media and IoT data could be derived at least in part from data of the people's associations.
  • the content may be adjusted to a given person and/or a user, taking into account the person's and/or user's preferences, interests, physiological data, social relationships, and the strength of the relationship of the content to the user.
  • the person and/or the user preference may include the media type, e.g., an image, a text, a video, an audio or the like.
  • the person and/or the user interests may include, for example, a favourite team, sport, music, art, movies or the like.
  • the person and/or the user social relationships may include, for example, family, work friends, social network friends, and/or any other type of social relationships.
  • the strength of relationships may be measured, for example, by parameters such as, for example, sharing of the same interest, fans of the same team, only friends in the social network, and/or any other parameter that may measure the strength of the social relationships.
  • the creation of the content may be done according to a given topic, e.g., an event of common interest with other users, selected by the user, thus the parameters and the data which may be used to create the content may be context sensitive to the given topic.
  • a content creation system may generate an output content based on an input media.
  • the input media and the generated output may consist of an arbitrary number of media contents and/or media types according to the person and/or user preferences.
  • an event of common interest may include a sports match, a contest, a concert, a show, a happening, a vacation or a trip.
  • content creation may be defined as contribution of information to any media, and more specifically to a digital media for an end-user and/or an audience and/or a person in specific contexts.
  • the content may be anything expressed through some medium, e.g. speech, text writing, a video clip, a photograph and/or any of various arts, for self-expression, distribution, marketing and/or publication.
  • some forms of content creation may include, for example, maintaining and updating of web sites, blogging, photography, videography, online commentary, maintenance of social media accounts, editing and/or distribution of digital media and/or any other tools and/or services suitable for content creation.
  • content creation may include an intelligent content, which may include, for example, a structured reusable content enriched with metadata and supported by intelligent content technologies.
  • the intelligent content may address the ever-changing content needs of users, e.g., customers, and the proliferation of channels and/or devices which the users may use to consume it.
  • the example application disclosed herein is generation of one or more output media contents of a user. It can be seen that alternate services can be provided, based on for example data structures disclosed herein, by changing the algorithm that the system runs on the data.
  • the system can analyze data of events, friends, interests etc., and make recommendations - e.g. the system can alert Bob that there is a Moscow game tomorrow at 6 PM, and that he may want to watch it on Channel 2. The system may also ask if he also wants to receive the most exciting scenes.
  • data of the areas of interest of users, and of degree of excitement of users, can be analyzed by a system such as disclosed with regard to Fig. 1. The system can identify a large overlap of interests (e.g. a group of people have similar interests in sports and teams, as well as similar tastes in cooking and political parties), and can take an action to create a social-network community and suggest to the users to join the community.
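The overlap-of-interests analysis above can be sketched as follows. This is an illustrative sketch only, not the patented method: the use of Jaccard similarity over keyword sets, the threshold value, and all names are assumptions for the example.

```python
# Illustrative sketch of identifying a large overlap of interests, as
# described above: measure overlap of users' areas of interest as Jaccard
# similarity of keyword sets, and flag pairs whose overlap exceeds a
# threshold as candidates for a suggested social-network community.
# The similarity measure and threshold are assumptions.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest_community(interests, threshold=0.5):
    users = list(interests)
    return [
        (u, v)
        for i, u in enumerate(users)
        for v in users[i + 1:]
        if jaccard(interests[u], interests[v]) >= threshold
    ]

interests = {
    "Bob": {"basketball", "Team Moscow", "cooking"},
    "Alice": {"basketball", "Team Moscow", "politics"},
    "Carol": {"opera"},
}
print(suggest_community(interests))  # [('Bob', 'Alice')]
```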
  • Fig. 2 provides an example generalized flow for generating output media content.
  • Figs. 3-10 describe various detailed steps for an example implementation of a method for generating output media content.
  • Fig. 14 describes an example data flow for methods exemplified by Figs 3-10. The example methods disclosed with respect to Figs. 3-10 and Fig. 14 make use of 4 example databases described with respect to Tables 1-4 herein.
  • Fig. 15 illustrates one example of a process for generating output media, in accordance with certain demonstrative embodiments. In some cases, this process may make use of the methods described with reference to Figs. 3 to 10, based on a data flow such as described further herein with reference to Fig. 14. In some cases, the methods of Fig. 15 may be implemented with use of the systems 100 and/or 1300, which are described further herein with reference to Figs. 1 and 13.
  • the system may generate a data entity indicative of the association of the user, and of other user(s), with the event of interest to the user, based on the degree of relationship between the user and the other user(s).
  • this step may include combining and reducing data of at least the following: the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the other user(s), and the data indicative of the degree of relationship of the user to other users. Example methods for performing this step are disclosed further herein with regard to Figs. 3a and 3b.
  • the system may generate a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, according to user media preference data.
  • Example methods for performing this step are disclosed further herein with regard to Fig. 4.
  • the system may generate a first data entity indicative of the input media content(s), based on at least one of: association of the available media contents with the event, and association of the available media contents with the user preference data associated with the user and with the user preference data associated with other user(s).
  • this step may include combining and reducing data of at least the following: the media metadata related to the input media content, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the other user(s) with the event of interest to the user. Example methods for performing this step are disclosed herein with regard to Fig. 5.
  • the system may generate a data entity indicative of the degree of excitement data of the other user(s), based on at least one of the following: association of the degree of excitement data with the event, and association of the degree of excitement data with at least the user and/or with the other user(s).
  • this step may include combining and reducing data of at least the following: the degree of excitement metadata of the other user(s), the event metadata associated with the event of interest to the user, and the data entity indicative of the association of the user and other user(s) with the event of interest to the user. Example methods for performing this step are disclosed further herein with regard to Fig. 6.
  • the system may generate a data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). In some examples, this generation may be performed based on one or more points of time during the event of the interest to the user. In some examples, several of these points may define intervals of time during the event. In some examples, this step may include combining and reducing the first data entity indicative of the input media content(s) and the data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). Example methods for performing this step are disclosed further herein with regard to Fig. 7.
  • in step 1580, the system may generate a data entity indicative of points in time within one or more times of interest of the event of interest to the user, according to the degree of excitement data. In some examples, this step can be performed according to an indication of relevance that is based on the degree of excitement data of the other user(s). In some examples, for each such time of interest, generating this data entity can be based on the most highly weighted degree of excitement corresponding to that time of interest. Example methods for performing this step are disclosed further herein with regard to Fig. 8.
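The selection of the most highly weighted degree of excitement per time of interest can be sketched as follows. This is a minimal illustration under assumed record layouts (`(interval_id, time, excitement_weight)` samples); it is not the method of Fig. 8 itself.

```python
# Minimal sketch of step 1580 as described above: for each time of
# interest, keep the point in time whose degree-of-excitement sample
# carries the highest weight. The sample layout is an assumption.

def select_moments(samples):
    """samples: list of (interval_id, time, excitement_weight)."""
    best = {}
    for interval_id, time, weight in samples:
        if interval_id not in best or weight > best[interval_id][1]:
            best[interval_id] = (time, weight)
    return best

samples = [("I1", 10, 0.2), ("I1", 12, 0.9), ("I2", 40, 0.5)]
print(select_moments(samples))  # {'I1': (12, 0.9), 'I2': (40, 0.5)}
```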
  • in step 1590, the system may generate a second data entity that is indicative of the available media content(s), where the available media contents correspond to the points in time within the time(s) of interest of the event of interest to the user.
  • the second data entity is weighted based on the preferences of the user.
  • this step may include combining and reducing at least the following: the data entity indicative of points in time within the time(s) of interest, the data entity indicative of association between input media content(s) and the degree of excitement data of other user(s), and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
  • Example methods for performing this step are disclosed further herein with regard to Fig. 9.
  • the system may generate a third data entity indicative of the available media content(s), based on a relevance field associated with the available media content(s) and on a weight field associated with them. This step may be part of generating the one or more output media contents.
  • this third data entity can be indicative of the one or more media sizes associated with the one or more media types.
  • Example methods for performing this step are disclosed herein with regard to Fig. 10.
  • Fig. 1 presents a schematic illustration of a block diagram of a content creation system 100 to generate user-based media content, in accordance with some demonstrative embodiments.
  • content creation system 100 may be configured to generate one or more output media contents 114 associated with an event of interest of a user, and in some cases also of interest to one or more other users.
  • Content creation system 100 may include a computing system 105 configured to generate the one or more output media contents 114.
  • computing system 105 may include a desktop computer, a laptop computer, a tablet computing device, a mobile computing device, a cellphone, a smartphone, a gaming console, a smart television or any other computing platform which includes capabilities of data processing.
  • Computing system 105 may, by way of non-limiting example, comprise one or more processing circuitries 107.
  • Processing circuitry 107 may comprise a processor 110 and memory 108. Processing circuitry 107 is shown in Fig. 1 as a broken line.
  • computing system 105 may include a processor (also referred to herein as hardware processing circuitry) 110, which may be configured to generate the one or more output media contents based on an event indication 112 and a user indication 113.
  • event indication 112 may indicate the event "E" of interest to the user, which may be watched by the user, and/or shared by the user with the other users, and/or an event in which the user may be a participant.
  • event "E” is of common interest to the user and to other users. Note that watching or viewing an event are presented here as non-limiting examples of consumption of media of an event by the user. Other examples of such consumption include listening to audio, reading text on a computer etc.
  • event indication 112 may include any other indication to indicate a user field or area of interest. This area of interest may in some examples be indicative of an event of interest to the user.
  • user indication 113 may be configured according to a user profile and may include, for example, a user name, a user nick name, a user email address, and/or any other indication of the user.
  • hardware processing circuitry 110 may include a general-purpose processor, coprocessor or special-purpose processor, such as, for example, a network or communication processor, compression engine, graphics processor, a general purpose graphics processing unit (GPGPU), a high-throughput many integrated core (MIC) coprocessor (including 30 or more cores), multi-core processor, embedded processor, or the like.
  • Hardware processing circuitry 110 may thus be referred to herein interchangeably as processor 110.
  • the processing circuitry 107 may also include, in some examples, one or more memories 108. According to some examples of the presently disclosed subject matter, the memory 108 can be configured to hold data used by some or all of modules 150, 155, 160, 165, 170, 175, 180, 190, 195.
  • the databases 120, 122, 124, 126, 128, 130 are shown as external to memory 108.
  • data stores such as for example some or all of those DBs shown in Fig. 1, may reside on memory 108. These are non-limiting examples of data items that may make use of memory 108.
  • Examples of details of system components, and of system architecture, are described with reference to Fig. 13 further herein.
  • the hardware processing circuitry 110 may be configured to receive from the plurality of data stores, e.g., databases, data associated with the event of the common interest of the user and the one or more other users.
  • the data may include a first indication of a degree of relation of the user to the event of the common interest, and a second indication of a degree of relation of the other users to the event of the common interest.
  • the first indication and/or the second indication may include a weight value to indicate the degree of the relation.
  • a high value, e.g., 1, may indicate a strong relation,
  • a low value, e.g., 0, may indicate no relation, e.g., as described below.
  • the hardware processing circuitry 110 may receive an event indication 112 to indicate the event of the common interest, and generate one or more output media contents 114 by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication of the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest, and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, where the output media content(s) includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference, and to provide the user with the one or more output media contents 114, e.g. as described below with reference to Figs. 2-10.
  • hardware processing circuitry 110 may be configured to set the degree of relationship of the one or more user preferences with the time of creation of the user physiological data and the time of viewing of the one or more input media contents by determining one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences, e.g. as described below with reference to Figs. 2-10.
  • hardware processing circuitry 110 may include one or more engine modules 165, 170, 175, 180, 190 and 195 and one or more handler modules 155 and 160.
  • the one or more engine modules 165, 170, 175, 180, 190 and 195 and one or more handler modules 155 and 160 may be configured to generate one or more data entities, e.g. data structures, based on data received from the at least one data entity of the plurality of data entities, e.g. as described below.
  • hardware processing circuitry 110 may include an orchestrator module 150 operably coupled to the one or more engine modules 165, 170, 175, 180, 190 and 195 and to the one or more handler modules 155 and 160.
  • the orchestrator module 150 may be configured to coordinate an operation of the one or more engine modules 165, 170, 175, 180, 190 and 195 and the one or more handler modules 155 and 160 according to a predetermined sequence, e.g. as described below.
  • orchestrator module 150 may be configured to: provide to the user the one or more output media contents 114 generated based on one or more data portions generated by the one or more engine modules 165, 170, 175, 180, 190 and 195 and the one or more handler modules 155 and 160, e.g. as described below.
  • the one or more output media contents 114 may include one or more media of selected moments of the event of common interest.
  • the one or more media of selected moments may be selected, for example, according to the degree of relationship of the user physiological data generated at a time of the event of common interest with the one or more media contents and the degree of the relation of the user to the other user and to the event of common interest.
  • an engine of the one or more engine modules 165, 170, 175, 180, 190 and 195 may be configured to receive data from the plurality of data entities and to generate a new data entity using, for example, data aggregation methods and according to predetermined criteria, e.g. as described below.
  • the new generated data entity may include a plurality of fields.
  • the plurality of fields may include at least one of a keyword field, a value field and a weight field.
  • the new data entity may be provided to the orchestrator module 150 for generating the one or more output media content 114.
  • the engine of the one or more engine modules 165, 170, 175, 180, 190 and 195 may be further configured to generate the new data entity by combining two or more data entities, to set a weight value for a row of the new data entity by summing weight values of rows of the combined data entities, and to normalize the weight value, e.g. each weight value of each row of the new data entity.
  • normalizing the weight value may include setting values at the weight field which, when summed, result in a sum equal to 1, e.g. as described below.
  • the weight value may be normalized at least by one of fusing the weight values, multiplying the weight values, merging weight values by addition or merging weight values by weighted averaging, e.g. as described below. Example cases of this may be found in the description of Figs. 3 to 10, further herein.
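Two of the weight operations described above, merging by addition and normalization so that weights sum to 1, can be sketched as follows. The row layout `(keyword, value, weight)` is an assumption for illustration, not the exact table format of the disclosure.

```python
# Hedged sketch of the weight handling described above: merging weight
# values by addition, then normalizing so the weights of a data entity
# sum to 1. Row layout and sample values are illustrative assumptions.

def merge_by_addition(rows):
    """rows: list of (keyword, value, weight); sums weights of equal keys."""
    merged = {}
    for keyword, value, weight in rows:
        merged[(keyword, value)] = merged.get((keyword, value), 0.0) + weight
    return [(k, v, w) for (k, v), w in merged.items()]

def normalize(rows):
    """Scale weights so they sum to 1."""
    total = sum(w for _, _, w in rows)
    return [(k, v, w / total) for k, v, w in rows]

rows = [("sport", "basketball", 2.0), ("team", "Moscow", 1.0),
        ("sport", "basketball", 1.0)]
print(normalize(merge_by_addition(rows)))
# [('sport', 'basketball', 0.75), ('team', 'Moscow', 0.25)]
```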
  • content creation system 100 may include a plurality of databases, e.g., as described below.
  • content creation system 100 may include a media metadata DB 120, operably coupled to computing system 105
  • media metadata DB 120 may be configured to provide media metadata 121 to computing system 105.
  • media metadata 121 may include a media identifier (ID), a time when the content was encoded, a media type, such as, for example, video, audio, text, image, photograph, a feed from social networks, or the like.
  • ID media identifier
  • media type such as, for example, video, audio, text, image, photograph, a feed from social networks, or the like.
  • the metadata at media metadata DB 120 may be arranged in a data structure, for example, Table 1, as described herein below.
  • Table 1 may be generated according to concepts of keywords and weights and may hold information of the media in the form of 6-tuples of MediaID, Type, Time, Keyword, Value and Weight.
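The Table 1 layout above can be sketched as follows. The field values and the lookup helper are illustrative assumptions; only the 6-tuple shape (MediaID, Type, Time, Keyword, Value, Weight) comes from the description.

```python
# Sketch of the Table 1 layout described above: media metadata held as
# 6-tuples of MediaID, Type, Time, Keyword, Value and Weight. Sample
# values and the lookup function are assumptions for illustration.

from collections import namedtuple

MediaRow = namedtuple("MediaRow", "media_id type time keyword value weight")

table1 = [
    MediaRow("M1", "video", "2016-05-13T18:00", "event", "ELFF'16", 1.0),
    MediaRow("M1", "video", "2016-05-13T18:00", "team", "Moscow", 0.8),
]

def rows_for_keyword(table, keyword):
    """Simple keyword lookup, as an engine module might perform."""
    return [r for r in table if r.keyword == keyword]

print(rows_for_keyword(table1, "team")[0].value)  # Moscow
```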
  • content creation system 100 may include a raw Internet of Things (IoT) data DB 122, operably coupled to computing system 105.
  • the raw IoT data DB 122 may be operably coupled to one or more sensors 145, and may be configured to provide raw IoT data 123 to computing system 105.
  • raw IoT data may include physiological data of the user, measured during the event of common interest.
  • the physiological data may include a heart rate (HR), a skin response (SR), a body temperature (BT), a breathing rate (BR), step counter (SC) or the like.
  • content creation system 100 may include an IoT metadata DB 124, operably coupled to computing system 105, and may be configured to provide IoT metadata 125 to computing system 105.
  • IoT metadata 125 may include a sensor identifier (ID), a sensor type, a sensor model, a sensor location, sensor capabilities or the like.
  • data of IoT metadata DB 124 may be arranged in a data structure, for example, Table 2, which may include portions of raw IoT data 123, e.g. as described below.
  • Table 2 may be generated according to concepts of keywords and weights and may hold information of the IoT data in the form of 6-tuples of IoTID, User, Time, Keyword, Value and Weight.
  • content creation system 100 may include an event metadata DB 126, operably coupled to computing system 105.
  • event metadata DB 126 may be configured to provide event metadata 127 to computing system 105.
  • event metadata 127 may include an event ID, an event location, event type, event date, event time, a list of participants or the like.
  • data of event metadata DB 126 may be arranged in a data structure, for example Table 3, e.g. as described below.
  • Table 3 may be generated according to concepts of keywords and weights, and may hold information of the event in the form of 4-tuples of Event, Keyword, Value and Weight.
  • the Event field may include the name of the event of common interest, for example, Europe League Final Four 2016 (ELFF’16).
  • content creation system 100 may include a raw media DB 130, operably coupled to computing system 105.
  • raw media DB 130 may be operably coupled to one or more media sources 140, and may be configured to provide one or more input media contents 131 to computing system 105.
  • input media contents 131 may include a video clip, an audio recording of the event, images, photographs, text messages, feeds from the Internet, feeds from social networks or the like.
  • one or more media sources 140 may include, for example, a video camcorder (Cam), a microphone (Mic), an audio recorder, a text messaging application, the Internet, social networks, e.g., a Twitter Feed, television broadcasting or the like.
  • content creation system 100 may include a user profile DB 128, operably coupled to computing system 105.
  • user profile DB 128 may be configured to provide user profile data 129 to computing system 105.
  • user profile data 129 may include a user name, user friends, user hobbies, e.g. sport, user preferences, e.g. desired media length for audio, video and text, or the like.
  • data of user profile DB 128 may be arranged in a data structure, for example Table 4, e.g. as described below.
  • Table 4 may be generated according to concepts of keywords and weights as described above, and may hold information on users in the form of, for example, quadruples of User, Keyword, Value, Weight.
  • the user profile may indicate a relationship between a user, e.g. Bob, and another user, e.g. Alice, e.g. as defined in the Value field, e.g. Value column and/or Value attribute.
  • the strength of the relationship between Bob and Alice may be provided at the Weight field, e.g. Weight column and/or Weight attribute.
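The Table 4 quadruples and the Bob/Alice relationship above can be sketched as follows. The numeric weights and the lookup helper are illustrative assumptions; only the (User, Keyword, Value, Weight) shape comes from the description.

```python
# Illustrative Table 4 quadruples (User, Keyword, Value, Weight): the
# Value field names the related user or topic, and the Weight field
# carries the strength of the relationship. Numbers are assumed.

table4 = [
    ("Bob", "friend", "Alice", 0.9),         # strong relationship
    ("Bob", "interest", "Team Moscow", 0.2), # manually entered interest
]

def relationship_strength(table, user, other):
    """Return the Weight of a 'friend' quadruple linking user to other."""
    for u, keyword, value, weight in table:
        if u == user and keyword == "friend" and value == other:
            return weight
    return 0.0

print(relationship_strength(table4, "Bob", "Alice"))  # 0.9
```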
  • the DB_USERPROFILE includes at least three general categories of information about a user: their relationship to other users, their interest in certain topics (for example, related to hobbies and to entertainment choices), and their preferences regarding media formats and related characteristics such as media length.
  • each category of information can reside in a separate DB or other data structure.
  • the plurality of databases may be configured to provide different types of data from a plurality of different sources.
  • the different types of data from the plurality of databases may be provided to the hardware processing circuitry 110 to be processed by a plurality of engines, e.g. engines 165, 170, 175, 180, 190 and 195, and handlers, e.g. handlers 155, 160, of hardware processing circuitry 110 to generate one or more output media content 114, e.g. as described below.
  • each of the databases may be implemented by hardware and/or by software and/or by any combination of hardware and software.
  • hardware processing circuitry 110 may include an Orchestrator module 150, configured to process data received, for example, from the plurality of databases, the plurality of engine modules and the plurality of handler modules, and to generate, for example, the one or more output media content 114, e.g. a media content associated with the event of common interest, based on event indication 112 and user indication 113, e.g. as described below.
  • Orchestrator module 150 may generate any other output media content 114, based on other indications rather than event indication 112 and user indication 113.
  • Orchestrator module 150 may control a workflow for generating a content according to a user’s preferences, for example, an event of common interest.
  • Orchestrator module 150 may be an entry point and/or an exit point to at least some engine modules and handler modules of hardware processing circuitry 110, e.g. as described below.
  • Orchestrator module 150 may call engine modules 165, 170, 175, 180, 190 and 195 and handler modules 155 and 160, one by one in a predetermined sequence, sending them the required input and storing their output.
  • the Orchestrator module 150 may generate the one or more output media contents 114 based on data generated by the one or more engine modules and the one or more handler modules when called, according to a predetermined sequence.
  • Orchestrator module 150 may call the one or more engine modules 165, 170, 175, 180, 190 and 195 and/or handler modules 155 and 160 in any desired order, e.g. in a serial order or in a parallel order.
  • Orchestrator module 150 may call first, the social links engine module 170 to generate a first data entity, e.g. a first data structure, a first table (e.g. Table 12 below), which may include a list of persons linked to the user and related to the event.
  • the Orchestrator module 150 may call second, the content preferences engine module 165 to generate a second data entity, e.g. a second data structure, a second table (e.g. Table 13-2), which may include a list of media types and sizes preferred by the user.
  • the Orchestrator module 150 may call third, the media handler module 155 to generate a third data entity, e.g. a third data structure, a third table (e.g.
  • the Orchestrator module 150 may call fourth, the IoT data handler module 160 to generate a fourth data entity, e.g. a fourth data structure, a fourth table (e.g. Table 20), which may include a list of IoT Data and intervals associated with the users and the event.
  • the Orchestrator module 150 may call fifth, the IoT-media association engine module 175 to generate a fifth data entity, e.g. a fifth data structure, a fifth table (e.g. Table 21), which may include a list of IoT data, media, media type, and time intervals where IoT data and media may be both available.
  • the Orchestrator module 150 may call sixth, the moments determination engine module 180 to generate a sixth data entity, e.g. a sixth data structure, a sixth table (e.g. Table 23), which may include a list of moments in time created for different combinations of IoT Data, together with time intervals of moments detected and relevance, e.g. level or degree of excitement, of the moments.
  • the Orchestrator module 150 may call seventh, the media selection engine module 190 to generate a seventh data entity, e.g., a seventh data structure, a seventh table (e.g. Table 25), which may include a list of IoT Data, relevant time intervals, corresponding media, media type and relevance, e.g. level or degree of excitement of the moments.
  • the Orchestrator module 150 may call eighth, the content creation engine module 195 to generate an eighth data entity, e.g. an eighth data structure, an eighth table (e.g. Table 26), which may include a list of IoT Data, relevant time intervals, corresponding media, media type and weight.
  • this sequence may be performed in a manner such as that described with reference to Fig. 15, and to Fig. 14, further herein. This sequence may in some example cases make use of the methods described with reference to Figs. 3 to 10 further herein.
  • Figs. 3-10 describe some methods that may be performed in each module, in some example cases.
  • Orchestrator module 150 may call any other engine module and/or any other handler module in a different order and/or in parallel to generate the one or more data structures, e.g. the tables.
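The predetermined call sequence described above can be sketched as follows. This is a minimal illustrative sketch only: the module names, function signatures and the representation of data entities as lists of dicts are assumptions, not the actual implementation.

```python
# Hypothetical sketch of the Orchestrator's predetermined call sequence.
# Each module receives the event, the user and the data entities produced
# so far, and returns its own data entity (e.g. a table).

def orchestrate(event, user, modules):
    """Call each engine/handler module in a fixed order and collect the
    data entities they generate."""
    tables = {}
    for name in ("social_links", "content_preferences", "media_handler",
                 "iot_data_handler", "iot_media_association",
                 "moments_determination", "media_selection",
                 "content_creation"):
        tables[name] = modules[name](event, user, tables)
    # the eighth data entity is the basis for the output media content
    return tables["content_creation"]
```

As the text notes, the modules may equally be called in a different order or in parallel; the fixed tuple above only illustrates the example sequence.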
  • Orchestrator module 150 may include, e.g. a standard workflow engine, and/or any other engine, and/or any other software module.
  • Example operations of the plurality of engine modules 165, 170, 175, 180, 190 and 195 and the plurality of handler modules 155 and 160, in accordance with some demonstrative embodiments, will now be described in detail.
  • hardware processing circuitry 110 may include a media handler module 155, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table.
  • the data entity includes media related to the event of the common interest and to the user, prioritized according to a degree of relation between the event of the common interest and the user, e.g. as described below.
  • the media may include media (Mi) of given types (Ti) and intervals (Ii) associated with the user, e.g. a person P1 as indicated by user indication 113, and with an event (E), e.g. as indicated by event indication 112, based on data received from one or more databases, e.g. user profile DB 128, event metadata DB 126 and/or media metadata DB 120, and/or from the one or more engine modules and/or handler modules, e.g. as described below.
  • media handler module 155 may perform the method of Fig. 5, described further herein.
  • hardware processing circuitry 110 may include an IoT data handler module 160, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table.
  • the data entity may include the physiological data of the user related to the event of the common interest and to the user, prioritized according to a degree of relation of the event of the common interest and the user, e.g. as described below.
  • IoT data handler module 160 may perform the method of Fig. 6, described further herein.
  • IoT data handler module 160 may generate the data entity of physiological data, e.g. IoT Data (Di), and intervals (Ii) associated with user P1 and event E, based on the data received from user profile DB 128 and/or IoT metadata DB 124 and/or any other databases and/or engines.
  • hardware processing circuitry 110 may include a content preferences engine module 165, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of preferred media types and sizes based on the data received from, for example, user profile DB 128, e.g. as described below.
  • the data entity may include one or more media types, and one or more media sizes associated with the one or more media types.
  • The data entity may be prioritized according to a degree of relation to the event of common interest and the user, e.g. as described below.
  • content preferences engine module 165 may perform the method of Fig. 4, described further herein.
  • hardware processing circuitry 110 may include a social links engine module 170, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of friends of the user associated with the event, based on the data received from, for example, user profile DB 128 and/or event metadata DB 126.
  • the data entity may include one or more users.
  • The data entity may be prioritized according to a degree of relation to the event of common interest and to the user, e.g. as described below.
  • social links engine module 170 may perform the method of Figs. 3a and 3b, described further herein.
  • hardware processing circuitry 110 may include an IoT-media association engine module 175, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of media preferred by friends of the user associated with the event, based on the data generated and/or received from, for example, other engine modules.
  • the data entity may include one or more available media contents generated at one or more points of time during the event of the common interest and may be prioritized according to the physiological data of the user and physiological data of the one or more other users, e.g. as described below.
  • IoT-media association engine module 175 may perform the method of Fig. 7, described further herein.
  • hardware processing circuitry 110 may include a moments determination engine module 180, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of moments in time created for different combinations of IoT Data (Di), together with intervals (Ii) of moments detected and relevance, e.g. degree or level of excitement, of the moments (Ri), based on the data generated and/or received from, for example, other engine modules.
  • the data entity may include points in time of common interest of the event of the common interest, according to the physiological data of one or more users sharing the event of the common interest, e.g. as described below.
  • moments determination engine module 180 may perform the method of Fig. 8, described further herein.
  • hardware processing circuitry 110 may include a media selection engine module 190, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of IoT Data (Di), relevant intervals (Ii), corresponding media (Mi), media types (Ti) and relevance, e.g. level of excitement, of the moments (Ri), based on data generated and/or received from, for example, other engine modules.
  • the data entity may include an indication of preselected media generated at one or more points of time of the event of the common interest corresponding to the user physiological data, and prioritized according to a degree of relation of input data to the moments and to the user preferences, e.g. as described below.
  • media selection engine module 190 may perform the method of Fig. 9, described further herein.
  • hardware processing circuitry 110 may include a content creation engine module 195, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of selected media content created during the event, based on data generated and/or received from, for example, other engine modules.
  • the data entity may include content personalized to the user in a context of the event of the common interest, according to user preferences, detected points of time of the event of the common interest, physiological data of the user and media generated at the points of time of the event of the common interest, e.g. as described below.
  • content creation engine module 195 may perform the method of Fig. 10, described further herein.
  • content creation system 100 may be configured to generate one or more output media contents 114 associated with the event.
  • content creation system 100 may include one or more media databases, e.g. media metadata DB 120 and raw media DB 130, to store one or more input media contents 131, and media metadata 121 related to a media associated with the event, which may be received from one or more media sources 140.
  • content creation system 100 may include an event DB, e.g. event metadata DB 126, which may be configured to store data associated with the event, e.g. event metadata 127.
  • content creation system 100 may include a physiological database, e.g. raw IoT DB 122, which may be configured to store user physiological data, e.g. raw IoT data 123, collected from one or more sensors 145.
  • content creation system 100 may include an IoT metadata DB 124, which may be configured to store IoT metadata, e.g. data 125.
  • content creation system 100 may include a user profile DB, e.g. user profile DB 128, which may be configured to store data related to the user and user preferences, including a type of media, the user physiological data, a degree of user social relationships parameter and social networks data.
  • any or all of the modules may be configured to generate the relevant data structures, e.g. list and/or table entities, based on the data received from any other databases and/or other data sources, if desired, e.g. as described below.
  • any or all of the modules may be configured to generate any other data entity from any other databases and/or data structure using data aggregation techniques and/or any other data manipulating techniques.
  • each module and database may be implemented by hardware and/or software and/or any combination of hardware and software.
  • modules and DBs described with reference to Fig. 1 are non-limiting examples. Some or all of the modules and DBs may be combined, other modules and DBs may exist, and functions described with reference to one module or DB may be split among more than one module or DB.
  • Fig. 2 presents a schematic illustration of an example flow chart of a method 200 of generating a user based media content associated with an event, in accordance with some demonstrative embodiments.
  • the method 200 may be implemented by the hardware processing circuitry 110 and may include, for example, receiving from the plurality of databases, e.g. databases 120, 122, 124, 126, 128 and 130, data associated with the event of the common interest of the user and the one or more other users.
  • the hardware processing circuitry 110 may receive an indication to generate one or more output media contents for the user (text box 205). This indication may be associated with an event of interest to the user (e.g. event indication 112) and/or with at least one area of interest of the user indicative of the event of interest. The event of interest to the user may be an event of common interest to the user and other user(s).
  • Hardware processing circuitry 110 may receive data.
  • the data received may include one or more input media content(s), and degree of excitement data (e.g. physiological data) of other user(s) (text box 210).
  • the input media content(s) include media content gathered by the at least one data source from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
  • First data may be received.
  • the first data includes: event metadata associated with the event of interest to the user, media metadata related to the input media content(s), degree of excitement metadata of other user(s), user preference data (e.g. personal data) associated with the user (a first indication of a degree of relation), and user preference data (e.g. personal data) associated with the other user(s) (a second indication of a degree of relation) (text box 220).
  • the event metadata, the media metadata and/or the degree of excitement data may include time information (e.g. timestamps or time ranges) associated with the event, the input media content(s) and/or the degree of excitement.
  • the event metadata, the media metadata and/or the degree of excitement data may include degree of relation data.
  • the event metadata may comprise a third indication of a degree of relation, of the event to the area(s) of interest;
  • the media metadata may comprise a fourth indication of a degree of relation, of the at least one input media content to the area(s) of interest;
  • the degree of excitement metadata of the other user(s) may comprise a fifth indication of a degree of relation, of the degree of excitement data to the area(s) of interest.
  • one or more of the following data may also be received: data indicative of a degree of relationship of the user to other user(s) (e.g. social data), degree of excitement data of the user, degree of excitement metadata of the user, and/or user media preference data (media type preference and/or media size preference) (text box 225).
  • degree of excitement data of the user and degree of excitement data of the other user(s) can be based on physiological data of the relevant user.
  • user media preference data includes media type preference and/or media size preference.
  • the additional data may also be referred to herein as second data.
  • the hardware processing circuitry 110 may, in some examples, identify one or more candidate output media content(s).
  • the candidate output media contents may include at least a portion of the input media content(s).
  • the identification process may include processing such as combining and reducing at least the First Data.
  • the combining and reducing of data is based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data. In some examples, combining and reducing the data is based on the degree of relationship of the area(s) of interest of the user indicative of the event of interest.
  • the identification process may include combining and reducing first data and the data indicative of a degree of relationship of the user to the at least one other user. In some examples, the identification process may include combining and reducing the first data and the data of the degree of excitement metadata of the user. In some examples, the identification process may include combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size. In some examples, the identification process may include combining and reducing the second data.
  • the combining and reducing may be done by matching Keyword-Value pairs of Keyword-Value-Weight data structure(s), and by setting weight values within these data structures.
  • setting the weight value can be done by at least one of: fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition, and merging weight values by weighted averaging.
  • setting a weight value may include normalizing the weight value, where the sum of normalized weight values is equal to 1. Examples of these processes may be found in the description of Figs. 3 to 10, further herein.
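The weight-setting operations just described, matching Keyword-Value pairs across Keyword-Value-Weight data structures, multiplying the matched weights and normalizing so that they sum to 1, can be sketched as follows. The dict-based row representation and the helper names are illustrative assumptions, not the actual implementation.

```python
def normalize(rows):
    """Scale Weight values so their total sum equals 1."""
    total = sum(r["Weight"] for r in rows)
    return [{**r, "Weight": r["Weight"] / total} for r in rows]

def fuse(a, b):
    """Match rows of two Keyword-Value-Weight entities on their
    (Keyword, Value) pair, multiply the matched weights, and
    normalize the fused result."""
    b_index = {(r["Keyword"], r["Value"]): r["Weight"] for r in b}
    fused = [{**r, "Weight": r["Weight"] * b_index[(r["Keyword"], r["Value"])]}
             for r in a if (r["Keyword"], r["Value"]) in b_index]
    return normalize(fused)
```

This mirrors, under the stated assumptions, the fusing-by-multiplication and normalization steps used by method 300 further herein.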
  • one or more output media contents 114 may be generated (text box 240).
  • the generated output media contents may in some cases include at least a portion of the candidate output media content(s).
  • the generating includes selecting the output media content(s) from candidate output media content(s) using one or more output media selection criteria. Non-limiting examples of such selection criteria are disclosed further herein with reference to Fig. 10.
  • the hardware processing circuitry 110 may provide the user with the one or more media output content(s) 114 (text box 250).
  • the user preferences may include the user's specific field of interest associated with the event.
  • the user preferences may include, for example, the highlight moments of the user's favourite player and/or favourite team.
  • the content creation system 100 may process any data associated with the event and any data that may be generated at the time of the event to provide the output media content 114, e.g. video, text messages, images, voice, etc., based on the user preferences, a time of creation of the user physiological data 123 and the media content 131.
  • data processing may include weighting the strength of the relationship of the user preferences with the user physiological data, the time of creation of the user physiological data 123 and/or the media content 131 associated with the event.
  • the content creation system 100 may provide the user the output media content 114 according to the weights of the parameters defined by the user preferences, for example, the media content with the highest weight.
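Selecting the media content with the highest weight, as described above, can be sketched minimally as follows; the function name and the representation of candidates as weighted dicts are hypothetical.

```python
def select_output_media(candidates, k=1):
    """Return the k candidate media contents with the highest Weight."""
    return sorted(candidates, key=lambda r: r["Weight"], reverse=True)[:k]
```

In practice the system may apply further output media selection criteria, such as those disclosed with reference to Fig. 10.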
  • generating the one or more output media contents 114 may be done by calling, e.g. by Orchestrator module 150, in a predetermined sequence, one or more engine modules, e.g. engine modules 165, 170, 175, 180, 190 and 195, and/or one or more handler modules, e.g. handler modules 155 and 160, and generating the one or more output media contents 114 based on data generated by the one or more engine modules and/or the one or more handler modules.
  • the content creation system 100 may generate the output media content 114 based on the user physiological data, social and personal data and the event metadata, based on the user preferences, a time of creation of the user physiological data 123 and the media content 131, by processing at least one of the input media content 131, the user physiological data, social and personal data and the event metadata according to any other algorithms, data processing methods and by using any other hardware structure and/or any other modules.
  • generating the one or more output media contents 114 may be done, for example, by selecting moments in time of the event according to the user physiological data generated at the time of the event.
  • providing the input media contents 131 may be done by gathering one or more media contents from at least one or more social networks and generating the input media contents 131 based on the one or more media contents.
  • the hardware processing circuitry 110 may generate the one or more output media content 114, according to one or more data entities, e.g. one or more data structures, generated by at least one of the one or more engine modules, e.g. engine modules 165, 170, 175, 180, 190 and 195, and/or by one or more handler modules, e.g. handler modules 155 and 160.
  • the hardware processing circuitry 110 may be configured to set a degree of relationship of the one or more user preferences with the time of creation of the user physiological data and a time of creation of the one or more media contents, by attaching to the one or more media contents one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
  • One non-limiting example of the operation of the plurality of engine modules 165, 170, 175, 180, 190 and 195, and the one or more handler modules, e.g. handler modules 155 and 160, will now be described with reference to Figs. 3-10.
  • the figures describe some example methods that may be performed by each module. In some examples, the methods described with reference to Figs. 3-10 may be performed in a sequence such as that described with reference to Figs. 14 and 15 further herein.
  • Figs. 3a and 3b present schematic illustrations of a flow chart of a method 300 of generating a friends-event key data entity, e.g. a data structure, by a social links engine module 170, in accordance with some demonstrative embodiments.
  • social links engine 170 may be called first by Orchestrator module 150 in order to generate output media content 114.
  • social links engine 170 may tap into social relationships of a given person, e.g. the user, and may generate a data structure, e.g. a list and/or a table and/or the like, of the user friends.
  • the data entity, e.g. the data structure, the list and/or the table, may be prioritized by the degree of social relationships and the friends' association to the event, which may be indicated by event indication 112.
  • social links engine 170 may receive an event indication, e.g. event indication 112, a user name, data 129, e.g. DB_USERPROFILE, from user profile DB 128, and data 127, e.g. DB_EVENTMETADATA, from event metadata DB 126, and may output a data structure, e.g. a list and/or a table, of persons linked to the user and related to the event.
  • the data structure, e.g. the list and/or the table, may include weights (Wi), which may be used to derive the priority of the data entity elements, e.g. the list elements and/or the table elements.
  • event indication 112 can include the event (E).
  • the user name may be received as part of user indication 113.
  • the event (E) may be ELFF'16
  • the user name (PI) may be Bob
  • the data may include, for example, DB_USERPROFILE and/or DB_EVENTMETADATA
  • the output may include FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS data entity (Table 12), if desired.
  • the method 300 may include, generating an Event keys data entity (Table 5), e.g., an Event keys table, including a plurality of keywords associated with an event, a plurality of values associated with the event, and a plurality of weights associated with the plurality of keywords and the plurality of values (text box 310).
  • Table 5: Event keys data entity
  • social links engine 170 may receive data including a plurality of keywords associated with the event from event metadata DB 126, e.g. DB_EVENTMETADATA (Table 3), and may generate, for example, the Event keys data entity (Table 5), e.g. an EVENTKEYS table, which may be associated with the event "ELFF'16" and may include Keyword, Value and Weight fields, as is shown below in the example of Table 5.
  • Table 3 can be a non-limiting example of event metadata associated with the event of interest to the user.
  • the method 300 may include generating a Friends keys data entity (Table 6), e.g. a friends keys table, including a plurality of users associated with the event, a plurality of keywords associated with users, a plurality of values associated with the users, and a plurality of weights associated with the users, the plurality of keywords and the plurality of values (text box 320).
  • social links engine 170 may receive data 129 related to a plurality of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE (Table 4), and from the Event keys data entity (Table 5).
  • social links engine 170 may select from the received data, data fields such as, for example, user, keyword, value, and weight, and may generate, for example, a Friends keys data entity (Table 6), e.g. a FRIENDSKEYS table, as is shown below in the example of Table 6.
  • Table 4 is, in some cases, a non-limiting example of a data entity that includes user preference data associated with the user, user preference data associated with the other user(s) (the friends) and data indicative of the degree of relationship of the user to the other user(s).
  • the method 300 may include generating a Friends weights data entity (Table 7) including a plurality of users associated with the event and a plurality of weights associated with the users, the plurality of keywords and the plurality of values (text box 330).
  • social links engine 170 may receive data including a plurality of weights of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE, and may select the Value field, e.g. Value column and/or Value attribute, and the Weight field, e.g. Weight column and/or Weight attribute, from user profile DB 128 to generate a Friends weights data entity (Table 7), e.g. a FRIENDSWEIGHTS table, as is shown below in the example of Table 7. In some example cases, this process can derive a degree of relationship between the user and the other user(s).
  • the method 300 may include cleaning up unused keywords at an Event keys data entity (Table 5), e.g. an Event keys table, to include keywords, e.g. only keywords, and values that are in the Friends keys data entity (Table 6) (text box 340).
  • social links engine 170 may receive data including a plurality of weights of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE, and may select the Keyword field, e.g. Keyword column and/or Keyword attribute, the Value field, e.g. Value column and/or Value attribute, and the Weight field, e.g. Weight column and/or Weight attribute, from the Friends keys data entity (Table 6), e.g. the FRIENDSKEYS data entity, to generate the Event keys data entity (Table 8), e.g. an EVENTKEYS data entity, as is shown below in the example of Table 8.
  • the method 300 may include normalizing the weight values at the Event keys data entity (Table 8), the Friends keys data entity (Table 6) and the Friends weights data entity (Table 7), by setting the total sum of the weights in the data entity to 1 (text box 350).
  • the Event keys data entity (Table 8), the Friends keys data entity (Table 6) and the Friends weights data entity (Table 7) may be as in the example below:
  • the method 300 may include fusing the weights with the same keywords of the Event keys data entity (Table 8') with the Friends keys data entity (Table 6') to generate a new data entity, for example, a Friends_Event_Keys_Weights data entity (Table 9), by multiplying the weights of the Event keys data entity (Table 8') with the Friends keys data entity (Table 6') and normalizing the weights by setting the total sum of the weights of the Friends_Event_Keys_Weights data entity (Table 9) to 1 (text box 360), as is shown below in the example of Table 9.
  • the method 300 may include fusing the weights with the same keywords of the Friends_Event_Keys_Weights data entity (Table 9) with the Friends weights data entity (Table 7') to generate a new data entity, for example, a Friends_Event_Keys_Weights_Friendweights data entity (Table 10).
  • generating the Friends_Event_Keys_Weights_Friendweights data entity may be done by multiplying the weights of the Friends_Event_Keys_Weights data entity (Table 9) with the Friends weights data entity (Table 7') and normalizing the weights by setting the total sum of the weights of the Friends_Event_Keys_Weights_Friendweights data entity (Table 10) to 1 (text box 365), as is shown below in the example of Table 10.
  • the method 300 may include creating a new data entity, for example, an Event_Keys_Weights_Friendsweights data entity (Table 11), by merging rows with the same keyword-value pairs from the Friends_Event_Keys_Weights_Friendweights data entity (Table 10), summing the respective weights and dropping the user fields (text box 370), as is shown below in the example of Table 11.
  • the method 300 may include creating a new data entity, for example, a Friends_Event_Weights_Friendsweights data entity (Table 12), by merging rows with the same user from the Friends_Event_Keys_Weights_Friendweights data entity (Table 10), summing the respective weights and dropping the Keyword and Value fields (text box 380), as is shown below in the example of Table 12.
  • Table 12 is a non-limiting example of a data entity indicative of the association of the user, and of other user(s), with the event of interest to the user, based on the degree of relationship between the user and the other user(s).
  • this step may include combining and reducing data of at least the following: the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the users, and the data indicative of the degree of relationship of the user to other users.
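The merge-and-sum steps of text boxes 370-380 can be sketched as follows, assuming rows are dicts with a Weight field; the helper name and the group-key parameter are hypothetical illustrations.

```python
def merge_rows(rows, group_key):
    """Merge rows sharing the same values in the group_key fields,
    summing their weights; all other fields are dropped (e.g. merge on
    User and drop Keyword/Value to obtain a per-user weight table)."""
    merged = {}
    for r in rows:
        key = tuple(r[k] for k in group_key)
        merged[key] = merged.get(key, 0.0) + r["Weight"]
    return [{**dict(zip(group_key, key)), "Weight": w}
            for key, w in merged.items()]
```

Merging on the keyword-value pair while dropping the user field (text box 370), or on the user while dropping the Keyword and Value fields (text box 380), are then just different choices of `group_key`.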
  • the method 300 may include providing the data entity, e.g. the Friends_Event_Weights_Friendsweights table (Table 12), to Orchestrator module 150 (text box 380).
  • Fig. 4 presents a schematic illustration of a flow chart of a method 400 of generating a preferred media data entity by the content preference engine module 165, in accordance with some demonstrative embodiments.
  • content preference engine module 165 may tap into the profile of the user (P1) and may generate a data entity, e.g., a list and/or a table and/or the like, of preferred media types and sizes.
  • content preference engine module 165 may be configured to generate the data entity having one or more media types and one or more media sizes associated with the one or more media types.
  • content preference engine module 165 may be called by Orchestrator module 150 to generate a preferred media data entity, e.g., PREFERRED_MEDIA (Table 9), by receiving from the event indication 112 an indication of the event, for example ELFF’16, from the user indication 113 an indication of a user name, for example, Bob, and data 129 from user profile DB 128, e.g., DB_USERPROFILE.
  • content preference engine module 165 may receive from user profile DB 128 (Table 4) a plurality of media types, a plurality of media sizes and a plurality of media weights, associated with the user, and generate the preferred media data entity, e.g., PREFERRED_MEDIA (Table 13).
  • the preferred media data entity, e.g., PREFERRED_MEDIA (Table 13), may include a media type field, a media size field and a Weight field, e.g., Weight column and/or Weight attribute (text box 410).
  • Table 13 is a non-limiting example of user media preference data.
  • content preference engine module 165 may merge media sizes of media of the same type, based on a weight of the media type and averaging the media weight (text box 420). For example, content preference engine module 165 may merge double audio sizes by building the weighted average of the size values and averaging the weight values to generate, for example, PREFERRED_MEDIA (Table 13-1).
  • content preference engine module 165 may normalize the weight values by setting the total sum of the weights to 1 (text box 430), as is shown in Table 13-2 below.
  • Table 13-2 is a non-limiting example of a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
  • content preference engine module 165 may be configured to provide the data entity, for example, the Preferred Media table (Table 13-2) to Orchestrator module 150, (text box 440).
  • content preference engine module 165 may be configured to provide a default media type and size for a new user. For example, if for the user Bob no preferred media is defined, in some examples a default preference for media type and size can be used in this method. Similarly, in some cases this default preference could be entered into DB_USERPROFILE for the user, and perhaps be modified dynamically as the user's preferences are learned based on other behavior of theirs.
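The size-merging and weight normalization of text boxes 420 and 430 can be illustrated with a short sketch. This is a hedged reconstruction; the field names and example values are assumptions, not taken from Tables 13 through 13-2:

```python
from collections import defaultdict

def merge_same_type(prefs):
    """Merge preference rows of the same media type: build the weighted
    average of the size values and average the weight values."""
    groups = defaultdict(list)
    for p in prefs:
        groups[p["type"]].append(p)
    merged = []
    for mtype, entries in groups.items():
        total_w = sum(e["weight"] for e in entries)
        size = sum(e["size"] * e["weight"] for e in entries) / total_w
        merged.append({"type": mtype, "size": size,
                       "weight": total_w / len(entries)})  # average of weights
    return merged

def normalize(prefs):
    """Set the total sum of the weights to 1 (text box 430)."""
    total = sum(p["weight"] for p in prefs)
    return [{**p, "weight": p["weight"] / total} for p in prefs]
```

For instance, two audio rows of sizes 30 and 60 with weights 0.2 and 0.6 would merge into one row of size (30×0.2 + 60×0.6)/0.8 = 52.5 and weight 0.4.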
  • FIG. 5 presents a schematic illustration of a flow chart of a method 500 of generating an available media data entity by a media handler module 155, in accordance with some demonstrative embodiments.
  • media handler module 155 may tap into the media, e.g. existing media, and the user profiles to generate a data entity, e.g. a list and/or a table, of media related to the event indicated by event indication 112, e.g. ELFF’16, and/or to a user indicated by user indication 113, e.g. Bob.
  • media handler 155 may associate the media directly, for example, via the user, e.g. Bob, and/or indirectly, for example, via the user friends, e.g. Alice and Carl.
  • media handler module 155 may be configured to generate a data entity, e.g. AVAILABLE_MEDIA (Table 16), including media related to the event and the user.
  • the media may be prioritized according to a degree of relationship between the event, e.g. ELFF’16, and the user, e.g. Bob.
  • media handler module 155 may be configured to generate a data entity, e.g. AVAILABLE_MEDIA (Table 16), by receiving data from at least one of user profile DB 128, event metadata DB 124, media metadata DB 120 and/or a data structure, for example, FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS (Table 12).
  • media handler module 155 may be configured to generate a USERS_MEDIAMETADATA table (Table 14) by joining, for example, the User_Profile_DB table (Table 4) of user profile DB 128 and the Media_Metadata_DB table (Table 1) of media metadata DB 120, selecting media metadata related to users according to keyword value pairs, e.g. the same keyword value pairs, multiplying weights corresponding to the selected keyword, e.g., multiplying weights of the User_Profile_DB table (Table 4) with weights of the Media_Metadata_DB table (Table 1) and by 10, and dropping lines with non-corresponding keyword value pairs (text box 510).
  • media handler module 155 may be configured to generate, for example, Table 15, USERS_MEDIAMETADATA_EVENTMETADATA, by joining Table 14 USERS_MEDIAMETADATA with the Event_Metadata_DB table (Table 3) of event metadata DB 124 filtered on a selected event, selecting the media metadata related to the selected event according to keyword value pairs, e.g., same keyword value pairs, multiplying the weights corresponding to the selected keyword, e.g., multiplying weights of the Event_Metadata_DB table (Table 3) with weights of Table 14 USERS_MEDIAMETADATA and multiplying by 10, and dropping lines with non-corresponding keyword value pairs (text box 520). In the case of Table 15-1, for example, the lines corresponding to the Vertical "Fashion" have been dropped. This is a non-limiting example of association of the available media contents with the event, by combining and reducing data of the media metadata related to the input media content, and the event metadata associated with the event of interest to the user.
  • media handler module 155 may be configured to join, for example, the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-1) with, for example, the Friends_Event_Weights_Friendsweights table (Table 12), by multiplying the weights, removing the column "User", and merging rows with the same attribute at the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-2), summing up the weights (text box 530).
  • media handler module 155 may be configured to generate an Available Media data entity (Table 16) by merging rows with related Media ID values, e.g., the same Media ID values, at the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-2), by summing the weights, dropping the Keyword field, e.g., Keyword column and/or Keyword attribute, and the Value field, e.g., Value column and/or Value attribute, and normalizing the weights (text box 540).
  • Table 16 may in some examples be referred to also as a first data entity indicative of the input media content(s).
  • media handler module 155 may be configured to provide the data entity, e.g., the Available Media table (Table 16), to the Orchestrator module 150 (text box 550).
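The keyword-value join of text box 510 can be sketched as follows. The nested-loop join, the field names, and the placement of the ×10 scale factor are illustrative assumptions, not the patented implementation:

```python
def join_on_keyword_value(profile_rows, media_rows, scale=10):
    """Inner-join two KVW tables on matching (keyword, value) pairs,
    multiplying the two weights (and by `scale`); rows whose pairs
    do not correspond are dropped, as in text box 510."""
    joined = []
    for p in profile_rows:
        for m in media_rows:
            if (p["keyword"], p["value"]) == (m["keyword"], m["value"]):
                joined.append({
                    "user": p["user"],
                    "media_id": m["media_id"],
                    "keyword": p["keyword"],
                    "value": p["value"],
                    "weight": p["weight"] * m["weight"] * scale,
                })
    return joined
```

A profile row (Bob, Vertical, Soccer, 0.5) joined with a media row of matching keyword value pair and weight 0.4 would yield a joined weight of 0.5 × 0.4 × 10 = 2.0, while non-matching rows are dropped.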
  • FIG. 6 presents a schematic illustration of a flow chart of a method 600 of generating a friends physiological (or IoT) data entity based on data captured from sensors by a physiological data handler module, e.g., IoT data handler 160, in accordance with some demonstrative embodiments.
  • IoT data handler 160 may be configured to tap into the IoT data, e.g., physiological data, and the user profiles, to generate a data entity, e.g., a list and/or a table, such as the FRIENDSIOT table (Table 20), based on IoT data, e.g., existing IoT Data, related to the event as indicated by event indicator 112, and the user as indicated by user indication 113.
  • an association of the IoT data may be direct, e.g., via the user, and/or indirect, e.g., by friends of the user.
  • the data entity may be prioritized by the degree of relation to the event and the user and/or the friends of the user.
  • IoT Data Handler module 160 may be configured to generate the data entity including, for example, the physiological data of the user associated to the event metadata.
  • the physiological data may be prioritized according to a degree of relationship between the event, e.g., ELFF’16, and the user, e.g., Bob.
  • IoT Data Handler module 160 may be configured to generate the data entity, e.g., FRIENDSIOT table (Table 20), by receiving data from at least one of, for example, the DB_USERPROFILES table (Table 4) of user profile DB 128, Event Metadata DB 126 (Table 3), and/or the DB_IOTMETADATA table (Table 2) of IoT metadata DB 124.
  • IoT Data Handler module 160 may be configured to generate, for example, an IOTMETADATA_FRIENDS table (Table 17) by removing from the DB_IOTMETADATA table (Table 2) entries that are not associated with a person in the friend list, e.g. Table 4 (text box 610). For example, in the case of Table 17, the entry associated with the user "Dora" has been removed.
  • The generation of Table 17 may be referred to also as a non-limiting example of generation based on association of the degree of excitement data with the user and/or with the other user(s), including combining and reducing data of the degree of excitement metadata of the other user(s) and the data entity indicative of the association of the user and other user(s) with the event of interest to the user.
  • IoT Data Handler module 160 may be configured to generate, for example, an IOTMETADATA_FRIENDS table (Table 18) from the IOTMETADATA_FRIENDS table (Table 17) by removing all entries that are not associated with the event, e.g., ELFF’16, as indicated for example by Table 3 (text box 620).
  • the generation of Table 18 may be referred to also as a non-limiting example of generation based on association of the degree of excitement data with the event, including combining and reducing data of the degree of excitement metadata of the other user(s) and the event metadata associated with the event of interest to the user.
  • IoT Data Handler module 160 may be configured to join, for example, the IOTMETADATA_FRIENDS table (Table 18) with the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS table (Table 12) to generate an IOTMETADATA_FRIENDS table (Table 19) (text box 630). Note that though Alice is in Table 12, she does not appear in Table 19, as she has no record in Table 18.
  • IoT Data Handler module 160 may be configured to generate the FRIENDSIOT table (Table 20) from, for example, the merged IOTMETADATA_FRIENDS table (Table 19), by summing up the weights, dropping the User field, Keyword field and Value field, e.g., Value column and/or Value attribute, and normalizing the weights by setting the sum of all weight values in the weight field to 1 (text box 640).
  • Table 20 is a non-limiting example of a data entity indicative of the degree of excitement data of the other user(s).
  • IoT Data Handler module 160 may be configured to provide the data entity, for example, the FRIENDSIOT table (Table 20), for example, to the Orchestrator module 150 (text box 650).
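The friend filtering of text box 610 and the friendship-weight join of text box 630 might look roughly like the sketch below. The row shapes and example values are hypothetical, chosen only to mirror the Dora/Alice examples above:

```python
def filter_to_friends(iot_rows, friends):
    """Drop IoT metadata entries whose user is not in the friend list
    (text box 610) -- e.g. an entry for a user like "Dora" is removed."""
    return [r for r in iot_rows if r["user"] in friends]

def weight_by_friendship(iot_rows, friend_weights):
    """Join IoT rows with per-friend weights, multiplying the weights
    (text box 630); a friend with no IoT record (like Alice in the
    example) simply contributes no row to the result."""
    return [{**r, "weight": r["weight"] * friend_weights[r["user"]]}
            for r in iot_rows if r["user"] in friend_weights]
```

In this sketch, an IoT row for Carl with weight 0.4 and a friendship weight of 0.2 yields a joined weight of 0.08, while the row for a non-friend is dropped entirely.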
  • FIG. 7 presents a schematic illustration of a flow chart of a method 700 of generating a friends available media data entity based on data captured from sensors by an IoT-media association engine module 175, in accordance with some demonstrative embodiments.
  • the IoT-Media association engine 175 may be configured to provide available IoT data.
  • the available IoT data of the user and/or the friends of the user may be associated with the available media.
  • the engine may take into consideration that the media and IoT data should cover the same time.
  • the available media may be generated at overlapping time of generation of the IoT data.
  • the available IoT data, e.g., physiological data, may be prioritized by the degree of the relation of the media with respect to the IoT data associated with the event, e.g. ELFF’16.
  • the IoT-Media association engine 175 may be configured to generate a data entity, for example, a FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), which may include one or more available media contents generated at one or more points of time during the event, e.g., ELFF’16.
  • the one or more available media contents may be prioritized according to the physiological data of the user, e.g. IoT data, and/or user friends’ physiological data, e.g. IoT data, which may be generated at the one or more points of time during the event.
  • the IoT-Media association engine 175 may generate, for example, a FRIENDSIOT_AVAILABLE_MEDIA table (Table 21) by joining the FRIENDSIOT table (Table 20) with the AVAILABLE_MEDIA table (Table 16) according to join criteria, for example, overlapping time frames. For example, for each pair of Media and IoT records with overlapping time frames, multiplying the weights, dropping lines with non-corresponding time frames, and re-normalizing the weights, for example, by setting the sum of all weight values in the weight field to 1 (text box 710).
  • Table 21 is one non-limiting example of a data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s).
  • the generation of Table 21 may also be referred to as one non-limiting example of generation based on one or more points of time during the event of the interest to the user, including combining and reducing the first data entity indicative of the input media content(s) and the data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). Note that in the creation of Table 21, several of these points in time define intervals of time during the event.
  • the IoT-Media association engine 175 may be configured to provide the data entity, e.g., FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), to Orchestrator module 150 (text box 720).
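The overlapping-time-frame join of text box 710 can be sketched as below. Representing time frames as (start, end) pairs, and the field names, are assumptions made for illustration:

```python
def overlap(a, b):
    """Return the common interval of two (start, end) time frames,
    or None if they are disjoint."""
    start, end = max(a[0], b[0]), min(a[1], b[1])
    return (start, end) if start < end else None

def join_on_time(media_rows, iot_rows):
    """Join media and IoT records on overlapping time frames,
    multiplying the weights; pairs with non-corresponding time frames
    are dropped, and the weights are re-normalized to sum to 1
    (text box 710)."""
    out = []
    for m in media_rows:
        for i in iot_rows:
            o = overlap(m["time"], i["time"])
            if o:
                out.append({"media_id": m["media_id"], "iot_id": i["iot_id"],
                            "overlap": o,
                            "weight": m["weight"] * i["weight"]})
    total = sum(r["weight"] for r in out)
    return [{**r, "weight": r["weight"] / total} for r in out]
```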
  • FIG. 8 presents a schematic illustration of a flow chart of a method 800 of generating a merged friends physiological data entity based on the friends physiological (or IoT) data entity by a moments determination engine module 180, in accordance with some demonstrative embodiments.
  • moments determination engine module 180 may be configured to determine moments in time, e.g. exciting moments.
  • the moments may include points in time of interest from relevant IoT data of the user, e.g. Bob, and friends of the user.
  • moments determination engine module 180 may be configured to generate a data entity, e.g. MERGED_FRIENDSIOT table (Table 22), which may include a list of moments and a relevancy of the moments, e.g., strong moment and/or weak moment, and may be prioritized by the degree of relation of the respective input data to the event, e.g. ELFF’16, and the user, e.g., Bob.
  • moments determination engine module 180 may be configured to generate the data entity, e.g., MOMENTS table (Table 23), including media generated at moments in time of the event, e.g., ELFF’16, associated with the user’s physiological data and friends of the user’s physiological data, e.g. Bob and/or friends IoT data.
  • the physiological data, e.g. IoT data, may be prioritized, for example, according to a degree of relationship of friends available media, a preferred media and the moments to the event and the user.
  • moments determination engine module 180 may be configured to generate the data entity, e.g. MOMENTS table (Table 23) based on FRIENDSIOT table (Table 20).
  • moments determination engine module 180 may be configured to generate a data entity, e.g. a MERGED_FRIENDSIOT table (Table 22), from the FRIENDSIOT table (Table 20) by taking, for a covered time interval, e.g., each covered time interval, for example, the row with the highest weight (text box 810).
  • moments determination engine module 180 may be configured to generate the MOMENTS table (Table 23), for a row, e.g., each row, of the MERGED_FRIENDSIOT table (Table 22), by finding, for example, the times of the 3 highest peaks of physiological measurements of the user in the corresponding IoT Data, together with the peaks of physiological measurements of the user which may be relatively high compared to the other physiological measurements, expanding the peak times, for example, by two seconds in each temporal direction, putting the expanded peak times together with the corresponding row's weight into the MOMENTS table (Table 23), and normalizing the weights at the weight field and relevancy at a relevance field (text box 820).
  • High peaks of physiological measurements may be an example indication of high levels or degree of excitement, strong moments, and thus of relatively high relevancy.
  • determination of the highest peaks can he done using analytics tools known in the art.
  • In case a peak is a plateau, the time of the middle of the plateau can be selected. Note that the choice of the three highest peaks of physiological measurements, as well as the choice of two seconds for expanding peak times, are non-limiting examples.
  • Table 23 is a non-limiting example of a data entity indicative of points in time within one or more times of interest of an event of interest to the user, generated according to the degree of excitement data.
  • the data entity was generated according to an indication of relevance that is based on the degree of excitement data of the other user(s), e.g. based on the most highly weighted degree of excitement corresponding to that time of interest.
  • moments determination engine module 180 may be configured to provide the data entity, e.g., MOMENTS table (Table 23), to Orchestrator module 150 (text box 830).
  • data entity e.g., MOMENTS table (Table 23)
  • Orchestrator module 150 text box 830.
  • moments determination engine module 180 may be configured to generate any other data entity according to any other selected criteria.
  • the Moments are not time instances, but rather time intervals of different durations.
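The peak detection of text box 820 could be approximated as in the sketch below. Real implementations would use analytics tools known in the art, as noted above; this sketch assumes strict local peaks rather than plateaus, and all names and the sample values are illustrative:

```python
def top_peak_moments(samples, n_peaks=3, expand=2.0):
    """Find the times of the n highest local peaks in a series of
    (time, value) physiological samples, and expand each peak time by
    `expand` seconds in both temporal directions (text box 820)."""
    peaks = []
    for i in range(1, len(samples) - 1):
        t, v = samples[i]
        # a strict local peak: higher than both neighbors
        if samples[i - 1][1] < v and v > samples[i + 1][1]:
            peaks.append((v, t))
    peaks.sort(reverse=True)  # highest measurement values first
    return [(t - expand, t + expand) for _, t in peaks[:n_peaks]]
```

For samples peaking at t=4 (value 7) and t=1 (value 5), the sketch returns the expanded intervals (2.0, 6.0) and (-1.0, 3.0), in descending order of peak height.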
  • FIG. 9 presents a schematic illustration of a flow chart of a method 900 of generating a selected media data entity based on the preferred media data entity and friends available media data entity by a media selection engine module 190, in accordance with some demonstrative embodiments.
  • media selection engine module 190 may be configured to determine which of a preselected media and the IoT data may correspond to detected moments and to generate a data entity, e.g. a list and/or a table based on the determination.
  • media selection engine module 190 may be configured to generate a SELECTED_MEDIA table (Table 25).
  • the SELECTED_MEDIA table (Table 25) may include the relevancy of the moment, e.g., a “strong” moment and/or a “weak” moment, and the moments may be prioritized, for example, by a degree of relation of the respective input data, e.g., physiological measurement, to the event of common interest, e.g. ELFF’16, and the user, e.g. Bob.
  • media selection engine module 190 may be configured to generate the data entity, e.g. the SELECTED_MEDIA table (Table 25).
  • the data entity may include an indication of media generated at one or more moments of the event of interest, e.g. ELFF’16.
  • the indication of the media may indicate media associated with the physiological data of the user and that of friends of the user. The media may be prioritized according to a degree of relationship of friends available media, a preferred media, the moments of the event, and the user.
  • media selection engine module 190 may be configured to receive a first data entity, e.g., the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), a second data entity, e.g., the PREFERRED_MEDIA table (Table 13-2), a third data entity, e.g., the MOMENTS table (Table 23), and may generate a fourth data entity, e.g., the SELECTED_MEDIA table (Table 25), based on data of the first, the second and the third data entity.
  • media selection engine module 190 may be configured to generate the SELECTED_MEDIA table (Table 25) by first generating a SELECTED_FRIENDSIOT_MEDIA table (Table 24).
  • the media selection engine module 190 may be configured to generate the SELECTED_FRIENDSIOT_MEDIA table (Table 24) by joining the PREFERRED_MEDIA table (Table 13-2) with the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21) on a Type field, removing a Size field from the joined table, multiplying respective weights from the PREFERRED_MEDIA table (Table 13-2) and the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), and re-normalizing the weights of the joined table, e.g., SELECTED_FRIENDSIOT_MEDIA table (Table 24) (text block 910).
  • the generation of Table 24 includes combining and reducing the data entity indicative of association between input media content(s) and the degree of excitement data of the other user(s), and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
  • media selection engine module 190 may be configured to generate the SELECTED_MEDIA table (Table 25), as is shown in text box 920, for example, by:
  • a. Joining the tables SELECTED_FRIENDSIOT_MEDIA (Table 24) and MOMENTS (Table 23) on the IoTID field and providing a joint table.
  • b. Deleting from the joint table a row where an Overlapping Time field of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) is disjoint from a Time field of the MOMENTS table (Table 23).
  • c. Keeping common intervals of the Overlapping Time field of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) and the Time field of the MOMENTS table (Table 23) in an Overlapping Time field of the joint table.
  • d. Merging weights at the joint table by multiplying weights of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) with weights of the MOMENTS table (Table 23) and re-normalizing.
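Steps a through d above can be sketched as a single function. Representing intervals as (start, end) pairs, and the field names, are illustrative assumptions:

```python
def select_media(friendsiot_media, moments):
    """Steps a-d: join on IoTID, drop rows with disjoint times, keep
    the common interval, multiply weights and re-normalize."""
    joined = []
    for m in friendsiot_media:
        for mo in moments:
            if m["iot_id"] != mo["iot_id"]:
                continue                        # step a: join on IoTID
            start = max(m["overlap_time"][0], mo["time"][0])
            end = min(m["overlap_time"][1], mo["time"][1])
            if start >= end:
                continue                        # step b: drop disjoint rows
            joined.append({
                "media_id": m["media_id"],
                "overlap_time": (start, end),   # step c: common interval
                "weight": m["weight"] * mo["weight"],  # step d: multiply
            })
    total = sum(r["weight"] for r in joined)    # step d: re-normalize
    return [{**r, "weight": r["weight"] / total} for r in joined]
```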
  • Table 25 is a non-limiting example of a second data entity, that is indicative of the available media content(s), where available media contents correspond to the points in time within the time(s) of interest of the event of interest to the user.
  • the second data entity is weighted based on the preferences of the user.
  • the generation of Table 25 includes combining and reducing the data entity indicative of points in time within the time(s) of interest, the data entity indicative of association between input media content(s), the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, and the degree of excitement data of other user(s).
  • media selection engine module 190 may provide the SELECTED_MEDIA table (Table 25) to Orchestrator module 150 (text box 930).
  • the content creation engine module 195 may be configured to generate, for example, a content personalized to the user, e.g., Bob, in the context of the event of common interest, e.g., ELFF’16, according to the user preferences, detected moments, and available IoT data, e.g., physiological data of the user, and media.
  • the content creation engine module 195 may be configured to generate a data entity, e.g., a MEDIA_SELECTED_NORM table (Table 26).
  • the data entity may include, for example, a content personalized to the user, e.g., Bob, associated to the event of common interest, e.g., ELFF’16, according to user preferences, moments of the event of a user interest, physiological data of the user, and a media generated at the moments of the event of the user interest.
  • the content creation engine module 195 may be configured to generate the data entity, e.g., the MEDIA_SELECTED_NORM table (Table 26), from, for example, a first data entity, e.g., the SELECTED_MEDIA table (Table 25), and a second data entity, e.g., the PREFERRED_MEDIA table (Table 13-2).
  • the content creation engine module 195 may be configured to generate the data entity, e.g., a MEDIA_SELECTED_NORM table (Table 26), by multiplying a Relevance field and a Weight field of the SELECTED_MEDIA table (Table 25) to create a Weight field of the MEDIA_SELECTED_NORM table (Table 26), and, for a media type, e.g., each media type, keeping the 3 rows with the highest weights (text box 1010).
  • the creation of Table 26 may thus be based, in some example cases, on Relevance Fields and Weight Fields of Table 25. Note also that the choice of three rows is a non-limiting example.
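The Relevance-times-Weight selection of text box 1010 can be sketched as follows; the field names and the per-type cutoff are assumptions, with 3 rows per type being the non-limiting example noted above:

```python
def select_top_rows(selected_media, per_type=3):
    """Multiply the Relevance and Weight fields into a new weight, then
    keep, per media type, the rows with the highest weights."""
    rows = [{**r, "weight": r["relevance"] * r["weight"]}
            for r in selected_media]
    out = []
    for mtype in {r["type"] for r in rows}:
        of_type = sorted((r for r in rows if r["type"] == mtype),
                         key=lambda r: r["weight"], reverse=True)
        out.extend(of_type[:per_type])  # top rows for this media type
    return out
```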
  • the content creation engine module 195 may be configured to do the following for the media type, e.g., each media type (text box 1020):
  • the content creation engine module 195 may be configured to provide the MEDIA_SELECTED_NORM table (Table 26) to the Orchestrator module 150 (text box 1030).
  • Table 26 is one non-limiting example of a third data entity indicative of the available media content(s), created as part of generating one or more output media contents to be provided to the user.
  • this third data entity can be indicative of the one or more media sizes associated with the one or more media types.
  • this third data entity is generated based on a relevance field associated with the available media content(s) and on a weight field associated with them.
  • the methods described herein, in some cases making use of KVW data structures and their manipulation, may in some examples provide a smaller volume of media to the user, while meeting the user's needs of consuming relevant content.
  • This may be achieved by identifying those media content items, and those portions of them, that are most strongly related to one or more of the event, the areas of interest of the user and his or her friends, the strength of relationship with each friend, the degree of excitement associated with certain moments as measured for example in various IoT sensors, and the user's preference for media types and sizes. This can be performed, according to some examples, by combining and reducing data of multiple inputs and of different characteristics (e.g. media, user profiles, excitement etc.), to derive degrees of relationship of comparatively high values and thus high relevance.
  • the less relevant media content items, and/or the less relevant portions of those items, need not be sent. For one non-limiting illustrative example, video of an entire 90-minute football match may not be sent, but rather only the 3 goals made, constituting only 2-3 minutes of video.
  • the method also enables combining various pieces of data to derive a more accurate measure of relevance. For example, in Table 4 we see that, although Carl has a very high level of interest of 0.8 for Basketball, measurements related to Carl regarding Basketball may have comparatively less relevance in choosing media for Bob, since Bob's friendship with Carl is at a level of 0.2, considerably less than Bob's friendship with Alice (level of 0.5). Another example is giving more weight to Bob's relationship to himself than to his relationship with friends such as Alice. As exemplified above, this in some examples results in more weight given to Bob's areas of interest, Bob's excitement levels etc., as compared to the areas of interest, and excitement levels etc., of a friend such as Alice.
  • An additional example advantage is deriving the most relevant output media content while minimizing use of processing and storage resources.
  • the data entities used and/or created at various steps of the process of identifying candidate output media contents can be kept relatively small in size.
  • the use of relatively smaller data entities at a next step of the process can in turn reduce, in a relative way, the processing resources required for the next step.
  • the presently disclosed subject matter discusses generation of output media content(s), based on, for example, input media content, degree of excitement data and moments in time.
  • the concepts and methods disclosed can be generalized. Similar methods can accept and/or function on, for example, any kind of time-bound data sets.
  • the invention can perform any type of detection on a second data set ("data set 2", exemplified herein by degree of excitement data from IoT) and create a respective compilation from a first data set ("data set 1") with a multiplicity of potential material or information from which to select, originating from multiple sources (exemplified herein by input media contents).
  • the system can be used for arbitrary context-sensitive multi-instance relation- aware data compilation.
  • Context-sensitive may in some examples refer to sensitivity to the environment of the data, e.g. date and time of day, as exemplified herein.
  • the sensitivity can be, as non-limiting examples, to parameters other than time, e.g. to the season, weather, location, state of health of an individual, etc.
  • Multi-instance may in some examples refer to the presence of multiple instances of an entity. An example from the present disclosure is the multiple users, and the ability to create a relevant compilation for Bob based on information associated with his friends and other contact people.
  • the sources of material of the first data set, the items of material themselves, and the portions of the items, are selected based on the second data set, and its sensitivity to contexts derived in some cases from additional data (exemplified herein by event data and user profile data). Relations between all types of entities (exemplified herein by events, media IDs, IoTID, users etc.) can be derived (for example using KVW data structures) to enable the selection based on relevance.
  • Some demonstrative implementations may be used, for example, to provide a core engine for user profiling, targeted advertisements/campaigns and promotions, orchestrate an integrated consumer experience, increase consumer engagement, promote tailored content through heterogeneous distribution channels, turn intelligence into action, and action into results by embedding advanced IoT analytics in media, facilitate the expansion of the diversified user base and the online community, and any other use.
  • Some demonstrative embodiments may be used, for example, in areas benefitting from context-sensitive multi-instance relation-aware data compilation, and may be geared to other scenarios where personalized content for a user may be created, for context targeting user groups (instead of individual users) and/or any other areas of use cases.
  • the system disclosed e.g. with regard to Fig. 1 may be referred to as a computing system configured to create a compilation output from a first data set, the computing system comprising a processing circuitry and configured to: receive an indication to generate a compilation;
  • the first data set comprising a time associated with first data of the first data set, wherein the first data set is relation-aware
  • the identifying comprises combining and reducing data of at least the first data set and the second data set
  • the combining and reducing data is based at least partly on a degree of relationship of the time associated with the first data and the time associated with the second data
  • the at least one candidate compilation output includes at least a portion of the first data set
  • At least one compilation output media content includes at least a portion of the at least one candidate compilation output
  • Such a system can in some examples utilize methods disclosed herein, including but not limited to the use of KVW data structures such as disclosed herein; setting weight values by at least one of fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging; and normalizing the weight value, wherein the sum of normalized weight values is equal to 1.
  • the computing system may be configured to generate one or more output media contents associated with an event of interest to the user.
  • in some examples, in order to provide the output media content 114, the computing system is configured to generate a data structure, e.g. a data entity, which may be used to generate the output media content.
  • a method and a system for generation of the data structure will be described below with reference to Figs. 11 and 12.
  • Non-limiting examples of this data structure, and its uses, include Tables 1-26 described above, relating to the example application of generating output media content.
  • the data structure(s) can include some or all of the following fields:
  • this data structure may also be referred to herein as a Keyword-Value-Weight (KVW) data structure.
  • a system may include one or more of such data structures. In the example of Tables 1-26, numerous instances of such a data structure exist.
  • These fields may be grouped together within a record or entry of the data structure.
  • the combination of the keyword and value fields may be referred to herein also as the KV fields or a Keyword-Value pair.
  • the combination of the keyword, value and weight fields may be referred to herein also as the KVW fields.
  • records in the data structure are rows, such as exemplified in Tables 1-26.
  • fields in the data structure are columns, such as exemplified in Tables 1-26.
  • Two non-limiting examples of a KVW data structure with one entity type are Table 3, DB_EVENTMETADATA, where the entity value is "Event", and Table 4, DB_USERPROFILE, where the entity value is "User".
  • a non-limiting example of a KVW data structure containing more than one entity field is DB_IOTMETADATA, Table 2, where at least the two fields IoTID and User may be entity fields.
  • entity fields of different KVW data structures may be, in a particular system, of different entity types, and not of the same entity type.
  • One non-limiting example of this is user, event, IoTID, MediaID etc. that appear in different tables of Tables 1-26 that support output media content generation.
  • a keyword value in the keyword field can indicate a feature (e.g. Team) associated with the one or more entity values, and a value in the value field can be associated with the keyword value (e.g. Team: Moscow, Istanbul).
  • Non-limiting example keyword and value pairs are those of Table 4: keywords include Vertical (with values Fashion and Sport), Audiolength (with values 20 sec, 40 sec), and Friends (with values Alice, Bob, Carl).
  • the weight field can include a weight value, or weight metric, to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, e.g. as described below.
  • the weight value may indicate the degree of relation of the one or more entity fields to the combination of the keyword field and the value field. For example, in Table 4, the degree of relation between the entity "Bob" and the keyword-value pair "Team + Moscow" is indicated by weight 0.4.
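To make the record just described concrete, it can be sketched as follows in Python. This is an illustrative sketch only; the class and function names (`KVWRecord`, `kv_pair`) and field names are assumptions for illustration and do not appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KVWRecord:
    """One record (row) of a KVW data structure: an entity field,
    a Keyword-Value pair, and a weight expressing the degree of relation."""
    entity: str      # e.g. a user name; some KVW structures omit entity fields
    keyword: str     # feature name, e.g. "Team"
    value: str       # feature value, e.g. "Moscow"
    weight: float    # degree of relation of the entity to the KV pair

# The Table 4 example from the text: entity "Bob" relates to the
# keyword-value pair Team-Moscow with weight 0.4.
bob_moscow = KVWRecord(entity="Bob", keyword="Team", value="Moscow", weight=0.4)

def kv_pair(record: KVWRecord) -> tuple:
    """The Keyword-Value pair used when matching records across structures."""
    return (record.keyword, record.value)
```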
  • a KVW data structure may be initialized with only three weight values, for example for simplicity reasons.
  • the possible weight values may include 0 for no relation of the data applied by the row, 0.5 for medium relation of the data applied by the row, and 1 for high relation of the data applied by the row.
  • other or additional values may be applied to the weight value in order to indicate more degrees of relation of the data associated with a record and/or with a plurality of records.
  • a KVW data structure may not contain an entity field.
  • One non-limiting example is EVENTKEYS (Table 5).
  • keywords can be used to establish relationships between the entity fields of different KVW data structures.
  • Each entity, such as a person or event, can be associated with a set of keywords, where multiple associations of the same keyword (possibly with different values and/or weights) to one entity are explicitly allowed.
  • One example feature of a KVW data structure is that, in some examples, it can be generated from portions of input data such as that in other existing data structures. This generation may be done according to one or more data aggregation methods.
  • data aggregation may include one or more of normalization, fusion and merging. As disclosed further herein, normalization may involve summing weights, fusing may involve multiplying weights, and merging may involve addition of weights or weighted averaging of values.
  • One example of such a data aggregation method is combining at least two other data structures, also referred to herein as source (or input) data structures.
  • one or both of the source data structures are themselves KVW data structures.
  • Table 14 USERS_MEDIAMETADATA is generated using Table 4 and Table 1.
  • Table 14 is in turn used, together with Table 3, to generate new Table 15-1 USERS_MEDIAMETADATA_EVENTMETADATA.
  • all of the tables are KVW data structures - both the input Tables 1, 3 and 4, and also the output tables generated, Tables 14 and 15-1.
  • Table 15-1 may in some examples be generated by a combination of the three Tables 1, 3 and 4, and thus is an example of generating a KVW data structure by combining more than two source data structures. Note also that both Tables 14 and 15-1 are created by joining the relevant source data structures.
  • Another example feature of a KVW data structure is that, in some examples, it is capable of being utilized to generate a new data structure, by combining the KVW data structure with at least one other source data structure.
  • the new data structure may also be referred to herein as a result (or output) data structure.
  • the result of this combination is itself a KVW data structure.
  • both source data structures are KVW data structures, and the combination is performed on the basis of the combination of keyword fields and value fields of the source data structures.
  • One example of this is combining Tables 1 and 4 to generate resulting Table 14.
  • a KVW data structure may be combined with at least one other source data structure to generate a modified version of a data structure.
  • both source data structures are KVW data structures, and the combination is performed on the basis of the combination of keyword fields and value fields of the two or more source data structures.
  • the modified data structure is itself a KVW data structure.
  • Table 18 IOTMETADATA_FRIENDS is a modified version of the same IOTMETADATA_FRIENDS (Table 17), generated by combining Table 17 with Table 3.
  • the other source data structure, the new data structure, or the modified data structure is not a KVW data structure.
  • Such a structure may be referred to herein also as a "non-KVW data structure".
  • Non-limiting examples of non-KVW data structures include Table 7 and Table 12.
  • the KVW data structure can be generated by merging two or more records of other source KVW data structures. In some example cases, the merging may be performed on the basis of the combination of the keyword field and the value field. In some examples the weight value for the record of the result KVW data structure can be set by multiplying weight values of records of the other KVW data structures.
  • the KVW data structure USERS_MEDIAMETADATA (Table 14) is generated by merging Table 1 and Table 4.
  • the resulting Table 14 contains entity fields of both source data structures - both the User field of Table 4, and the MediaID, Type and Time fields of Table 1 - in addition to the KVW fields.
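The merge just described (Table 1 and Table 4 into Table 14) can be sketched as a join on matching Keyword-Value pairs, with the result weight fused by multiplying the two source weights. The table contents and field names below are assumptions for illustration, not the actual Tables 1, 4 and 14.

```python
def join_kvw(left, right):
    """Join two KVW tables (lists of dicts) on matching Keyword-Value
    pairs; each result record carries the entity fields of both sources
    and a weight fused by multiplying the two source weights."""
    result = []
    for l in left:
        for r in right:
            if (l["keyword"], l["value"]) == (r["keyword"], r["value"]):
                merged = {**l, **r}  # entity fields of both sources
                merged["weight"] = l["weight"] * r["weight"]
                result.append(merged)
    return result

# Assumed single-record analogues of a user-profile table and a media table
users = [{"user": "Bob", "keyword": "Team", "value": "Moscow", "weight": 0.4}]
media = [{"media_id": "M1", "keyword": "Team", "value": "Moscow", "weight": 0.5}]
joined = join_kvw(users, media)
# the joined record holds both the User and MediaID entity fields,
# with fused weight 0.4 * 0.5 = 0.2
```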
  • the entity types of the entity fields of the other, source, KVW data structures, which were used to generate a third KVW data structure, are not the same.
  • the abovementioned Tables 4 and 1, used to generate Table 14, are a non-limiting example of this.
  • the weight value of a result KVW data structure may be set by merging the weight values by weighted averaging.
  • the weight value of a KVW data structure may be set by merging the weight values by averaging.
  • Tables 13-1 and 13 also exemplify another example feature of KVW data structures - the possibility of setting a Value Field value by merging the values of a Value Field (in this case the field "Size"), by weighted averaging of the Value Field values corresponding to Keyword (Type) = "Audio" - where 20 sec and 40 sec gave a weighted average value of 28 sec, based on the Weight values.
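That weighted-averaging step can be sketched as below. The weights 0.6 and 0.4 are assumptions for illustration, chosen so that the weighted average of 20 sec and 40 sec reproduces the 28 sec figure mentioned above; the actual weights of Table 13 may differ.

```python
def weighted_average_size(records, keyword):
    """Merge the numeric "Size" values of records whose keyword matches,
    by weighted averaging based on the Weight field."""
    matching = [r for r in records if r["keyword"] == keyword]
    total_weight = sum(r["weight"] for r in matching)
    return sum(r["size"] * r["weight"] for r in matching) / total_weight

audio_records = [
    {"keyword": "Audio", "size": 20, "weight": 0.6},  # 20 sec audio clip
    {"keyword": "Audio", "size": 40, "weight": 0.4},  # 40 sec audio clip
]
# 0.6 * 20 + 0.4 * 40 = 28, matching the 28 sec weighted average in the text
```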
  • the weight value may be set, when merging two or more records of the KVW data structure, by summing or adding the weight values of two or more records of the other KVW data structures.
  • One non-limiting example is FRIENDS_EVENT_KEYS_WEIGHTS_FRIENDSWEIGHTS (Table 11).
  • the KVW data structure can be generated, by merging two or more data fields of the source data structure or structures, and setting the weight value by multiplying weight values of the data fields.
  • MEDIA_SELECTED_NORM (Table 26) is generated by merging data fields (columns) Relevance and Weight of SELECTED_MEDIA (Table 25), by multiplying the values of the two data fields.
  • the generation of the KVW data structure can also involve reducing data.
  • An additional example of reducing data is the generation of Table 11 from Table 10, discussed above, where records with matching keyword-value pairs were merged, and in addition the entity field "User" was then removed, leaving a data structure containing only the KVW fields.
  • Table 12 FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS involved a different merging method that involved reducing data, one based on matching values of the entity field "User".
  • Table 12 contains only the entity and weight fields, and is a non-limiting example of generating a data structure that is not a KVW data structure, based on a KVW data structure.
  • the weight value may then be normalized. For example, the sum of the normalized weight values of the rows may be equal to 1, e.g. as described below. In some demonstrative embodiments, the weight value may be normalized by at least one of fusing the weight values, multiplying the weight values, merging weight values by addition, or merging weight values by weighted averaging, e.g., as described below.
  • One non-limiting example of the utilization of normalization is the generation of Table 9, FRIENDS_EVENT_KEYS_WEIGHTS.
  • normalization is performed once, after the desired KVW data structures have been generated. In other examples, normalization is performed at each stage, after the generation of each intermediate KVW data structure, in an example process that yields one or more ultimate desired KVW data structures as the final output.
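Normalization as described, scaling each record's weight so the weights of a data structure sum to 1, can be sketched as follows. The table contents and field names are assumptions for illustration.

```python
def normalize_weights(table):
    """Return a copy of the table with each weight divided by the total,
    so that the normalized weights sum to 1."""
    total = sum(record["weight"] for record in table)
    return [dict(record, weight=record["weight"] / total) for record in table]

friends = [
    {"user": "Alice", "weight": 2.0},
    {"user": "Bob", "weight": 1.0},
    {"user": "Carl", "weight": 1.0},
]
normalized = normalize_weights(friends)
# weights become 0.5, 0.25 and 0.25, summing to 1
```

Whether this step runs once at the end, or after each intermediate KVW data structure as described above, the function itself is the same; only its position in the workflow changes.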
  • new data structures, or modified data structures may be generated by processing one source KVW data structure, for example by merging records and, in some cases, reducing data - without performing combination with another source data structure.
  • records with the same value of a KV- pair may be merged, and the entity fields removed from the data structure.
  • one or more of the methods disclosed herein may be used for generating a KVW data structure based on one or more other data structures.
  • One or more of the methods disclosed herein may also be used for generating new data structures, or modified data structures, based on the KVW data structure, in some cases by combining the KVW data structure with at least one other data structure.
  • FIG. 11 is a schematic illustration of a block diagram of a computing system 1100 for generating one or more data structures, in accordance with some demonstrative embodiments.
  • the computing system 1100 may include and/or be included in the content creation system 100 and/or in computing system 1300, e.g. as described below.
  • computing system 1100 may include an I/O (Input/Output) interface 1110 configured to gather and distribute data from and to one or more data processing devices such as, for example, sensors, databases, social networks, the Internet or the like.
  • the interfaces to the Databases shown in Fig. 1, and/or the interfaces to the inputs and outputs 112, 113 and 114, may be an example of I/O interface 1110.
  • computing system 1100 may comprise processing circuitry 1115.
  • Processing circuitry 1115 may comprise a processor 1120 (also referred to herein as a hardware processing circuitry) and memory 1140.
  • Processing circuitry 1115 is shown in Fig. 11 as a broken line. In some example cases, processing circuitry 1115 may be processing circuitry 107.
  • processing circuitry 1115 may comprise a hardware processing circuitry 1120.
  • Hardware processing circuitry 1120 may be referred to herein interchangeably as processor 1120.
  • the hardware processing circuitry may be hardware processing circuitry 110 and/or hardware processing circuitry 1310 (see Fig. 13).
  • Hardware processing circuitry 1120 may be configured to generate a data structure 1130, e.g. a data entity, based on portions of the data according to one or more data aggregation methods. Non-limiting examples of such methods are described further herein.
  • computing system 1100 may include a memory 1140, which may be configured to store the one or more data structures 1130, for example KVW data structures, as described herein.
  • memory 1140 may be memory 108, and/or may be memory 1308 of Fig. 13, e.g. as described below.
  • the processor 1120 may be configured to generate the KVW data structure 1130, by one or more of the methods disclosed herein.
  • computing system 1100 may be configured to generate one or more data structures, e.g. data entities, including corresponding Keywords, Values, and Weights, which may be referred to as KVW data.
  • KVW data may be filled in manually.
  • Other example methods of populating KVW fields, including initially assigning associated weight, are disclosed above regarding generation of an output media content.
  • Fixed weight values may provide better than state of the art performance of the system, e.g. by setting the weight of the user to 100 and all the other weights (Wi) to one.
  • the respective KVW data may be filled in manually when the number of users, the number of events and the respective media is relatively small at a time and/or may be limited.
  • semi-automatic and/or automatic KVW data generation may be achieved by applying content analytics (e.g. Twitter feeds), transitive relations (e.g. a video on the event may inherit KVW data from the event), conversion of available "regular" metadata, social analytics on the user profiles, etc.
  • Referring to FIG. 12, there is presented a schematic illustration of a flow chart of a method of generating a data structure, in accordance with some demonstrative embodiments. In some demonstrative embodiments, these functions may be performed by components of Fig. 11.
  • computing system 1100 may gather from, and distribute data to, one or more data processing devices (text block 1210), e.g., as described herein.
  • the one or more data processing devices may include the content creation system 100, databases, e.g., databases 120, 122, 124, 126, 128 and 130, servers, mobile devices, computers, cars, cell phones or the like. In some examples, this may be done by I/O interface 1110.
  • the one or more data structures may include, for example, tables, lists, arrays, or the like, which may be generated by the processor 1120 of computing system 1100.
  • the processor 1120 may be configured to generate the data structure based on portions of the gathered data according to one or more data aggregation methods.
  • the data structure, e.g., Tables 1-26, may include one or more records, and one or more data fields.
  • the one or more data fields include at least one of the following fields: Keyword, Value and Weight, and the data of the fields may be stored at the one or more records (text block 1220).
  • One non-limiting example of such generation may be method 300, where Table 12 was generated based on data gathered from Tables 3 and 4.
  • the generated data structure may be stored in memory 1140.
  • the processor 1120 may set, at each record, the weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field (text box 1230). Example methods of setting the weight value are disclosed herein.
  • hardware processing circuitry 1120 may normalize the weight value at each record by equalizing a sum of the weight values of records, e.g., all the records, of the data structure to 1 (text box 1240).
  • the weight field may include a weight metric.
  • the weight metric may be applied to determine values for different weights.
  • the weight metric may apply 3 values to a weight value: "0" may represent no influence, "0.5" may represent medium influence, and "1" may represent strong influence.
  • the weights may be influenced by factors applied to various data entities in the data aggregation processes throughout a workflow of generating the data structure.
  • the weights may be used to weight the degree of relation between the one or more entities of the content creation system 100.
  • the weights may be influenced by social relationships between the user and his friends in generation of the content. For example, a close friend may have a higher influence than a not-so-close friend and/or a friend of a friend.
  • the weights may be influenced by factors such as, for example, IoT data, e.g., physiological data and/or the like.
  • some other weight metrics which may include more than 3 values and/or less than 3 values, may be applied in order to determine the values to the Weight field, e.g., Weight column and/or Weight attribute.
  • keywords may be used to establish relationships between one or more entities.
  • an entity of the one or more entities e.g., each entity, may include a person or an event, and may be associated with a set of keywords.
  • a keyword with multiple associations, e.g., the same keyword with different values and/or weights, related to one entity, may be used.
  • an instance of a keyword e.g., each instance of the keyword, may be associated with an entity.
  • the keyword may include a value related to the entity, and a respective weight may relate to the entity.
  • KVW data structures appear in Tables 1-26, in the context of the generation of the output media content.
  • Example entity fields described in those tables include events, users, MediaID, IoTID, Type and Time.
  • Example Keywords described in those tables involve subjects such as: event location (City); areas of interest of a user (Vertical, Discipline, Team, Designer); media preferences (Mediatype, Audiolength, Textlength); social relationships of users (Friend). Values include, for example, Fashion and Sport for the keyword Vertical, Basketball and Volleyball for the keyword Discipline, and Madrid and Moscow for the keyword Teams. These examples are fields which are relevant to the application of generating the output media content.
  • two or more tables may contain different information, and may be based on different types of entities.
  • one table may relate to people, e.g. or users of a service, while another may relate to media content items, such as video and audio clips generated for example by different content providers.
  • It may still be possible to manipulate or process such tables, which contain different information, e.g. by comparison, merging and other methods, by making use of keywords and values and possibly their combinations to match between records of dissimilar entities (e.g. user and media content item) on different tables.
  • KVW data structures in computerized systems may in some examples allow inference and deduction of relationships that were not obvious.
  • use of a KVW data structure may allow easy detection, by straightforward manipulation based on KV fields, that if Bob is related to Alice, and Alice to Carl, Bob has a relationship to Carl of a certain degree (weight).
  • This structure thus enables comparative weighting of relationships of an entity to itself, as well as to other entities, when determining the weights to assign various attributes of each.
  • Another relationship that can be inferred in a simple manner using a KVW data structure is that between two different entities.
  • the inference and deduction of relationships may allow providing useful outputs (e.g. providing Bob with the most appropriate media content item, based on interests of a wider circle of friends and friends of friends, and weighting in the correct proportion Bob's own interests and that of his friends) that in some cases may not have been possible using other data structures.
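The Bob-Alice-Carl inference described above can be sketched as a chain of direct relations whose weights are fused by multiplication. The specific weights (0.8, 0.5) and the keep-the-strongest-chain policy are assumptions for illustration, not taken from the disclosure.

```python
def inferred_relation(direct, source, target):
    """Infer a degree of relation between source and target through a
    shared intermediate entity, fusing weights by multiplication and
    keeping the strongest one-hop chain."""
    best = 0.0
    for (a, b), weight_ab in direct.items():
        if a == source:
            weight_bt = direct.get((b, target), 0.0)
            best = max(best, weight_ab * weight_bt)
    return best

# Direct "Friend" relations with assumed weights
direct = {
    ("Bob", "Alice"): 0.8,
    ("Alice", "Carl"): 0.5,
}
# Bob relates to Carl through Alice with weight 0.8 * 0.5 = 0.4
```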
  • new data structures may be created, data structures may be combined etc., new relationships may be recorded as data, and weights of relationships may be modified, all based on the use of keyword and value fields and weights, while using as input existing database tables (for example).
  • this provides an example advantage, in that instead of having to develop or obtain sophisticated algorithms or analytics, the desired result may be provided by simple combinations of structures such as database tables, using for example standard SQL queries to perform standard actions such as e.g. joins of tables, merger of records and duplication removal, based on keyword and value fields. This in turn may, in some examples, enable simpler and thus more robust computer programs, with a lower chance of bugs.
  • KVW data structures can enable storage space savings. This may in some examples apply both to data structures that are input to a process (e.g. Tables 1-4), as well as to those data structures generated while performing an application task such as output media content creation (for example Tables 5-26).
  • the savings in space can be achieved as a result of combination.
  • the data structure IOTMETADATA_FRIENDS, Table 17, is decreased in size to Table 18, on the basis of a combination with Table 3 which removes a record that is not of interest.
  • the same data structure is then converted to Table 19, on the basis of a combination with Table 3, and again the superfluous record corresponding to "Team-Madrid" is removed.
  • the ability to merge records, and/or to merge data fields, described herein can decrease the storage space of a data structure.
  • One non-limiting example is merging the records of different entities based on identical KV-pair values, to eliminate unneeded duplication.
  • One example of this is the conversion of Table 15-1 to a smaller version Table 15-2 of the same data structure, where three records for "Carl" that share KV-pair "Team-Moscow" were compressed into one record, by eliminating the field User which is not needed for the remainder of the particular task.
  • the weight value set by these methods reflects the weight values of the source records and/or data structures, so sufficient information remains in the smaller resultant data structure.
  • the storage space occupied by a particular data structure is decreased, while maintaining in that data structure all information that is relevant to performance of the relevant task.
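The record-merging compression described above (e.g. Table 15-1 to Table 15-2) can be sketched as follows. Summing the weights is one of the merging policies described herein; the records and weights shown are assumptions for illustration, not the actual Table 15-1 contents.

```python
def compress_records(records):
    """Merge records that share a Keyword-Value pair, summing their
    weights; entity fields not copied into the result (here, "user")
    are thereby removed, shrinking the data structure."""
    merged = {}
    for record in records:
        key = (record["keyword"], record["value"])
        merged[key] = merged.get(key, 0.0) + record["weight"]
    return [{"keyword": k, "value": v, "weight": w}
            for (k, v), w in merged.items()]

rows = [  # three records for Carl sharing the KV-pair Team-Moscow
    {"user": "Carl", "keyword": "Team", "value": "Moscow", "weight": 0.1},
    {"user": "Carl", "keyword": "Team", "value": "Moscow", "weight": 0.2},
    {"user": "Carl", "keyword": "Team", "value": "Moscow", "weight": 0.1},
]
compact = compress_records(rows)
# the three records collapse into one, with the User field removed
```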
  • larger data structures may have to be maintained for more stages of performance of a task, by comparison to a case of use of a KVW data structure.
  • An additional advantage, in some examples, is that a KVW-based solution such as disclosed here may in some cases be stateless, involving only combinations of data structures and merging. This may in some examples reduce the inter- and intra-component constraints to the minimally required output-input-chaining dependencies. This can in some examples allow the maximum degree of scalability and execution re-ordering to optimize the workflow.
  • An additional advantage, in some examples, is that merging of records and fields, and/or use of normalization, at various stages of manipulating KVW data structures, and not only after all calculations have been performed, may simplify the calculation load, and may require the multiplication of values associated with a relatively smaller number of records.
  • instead of using keywords in relational databases it is possible to use other means such as, for example, semantic databases and respective inference methods or any other means.
  • FIG. 13 presents an illustration of a block diagram of a machine in the form of a computing system 1300 which includes a set of instructions that, when executed, can cause the machine to perform any one or more of the methodologies discussed hereinabove, in accordance with some demonstrative embodiments.
  • the machine may operate as a standalone device.
  • computing system 1300 may include and/or be included in computing system 105 (Fig. 1) and/or computing system 1100 (Fig. 11).
  • Fig. 13 may in some example cases provide more detail on some possible implementations of components described with regard to Figs. 1 and 11.
  • the processor 110 and memory 108 of Fig. 1 are shown at a high level.
  • Fig. 1 provides details on example modules that may in some cases reside in processor 110.
  • Fig. 1 also shows some specific example database functions, that in some example cases may interact with processor 110.
  • the machine may be connected, e.g. using a network, to one or more other machines.
  • the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, and/or as a peer machine in a peer-to-peer and/or distributed network environment.
  • the machine may include a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • a device may include any electronic device that provides voice, video or data communication.
  • the term "machine" shall also be understood to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
  • computing system 1300 may include a processing circuitry 1307.
  • Processing circuitry 1307 may comprise a processor 1310, and memory 1308.
  • Processing circuitry 1307 is shown in Fig. 13 as a broken line.
  • Processing circuitry 1307 may include a processor 1310.
  • processor 1310 may include a central processing unit (CPU), a graphics processing unit (GPU), and/or any other processing unit.
  • processor 1310 may be referred to herein also as hardware processing circuitry 1310. In some demonstrative embodiments, this may be hardware processing circuitry 110 or hardware processing circuitry 1120.
  • the processing circuitry 1307 of computing system 1300 may further include a memory 1308 (shown as a broken line). In some demonstrative embodiments, this may be memory 108 or memory 1140.
  • Memory 1308 may include one or more of the following: a main memory 1320, a static memory 1330 and a machine-readable medium 1380, which communicate with each other via a bus 1395. Note that a bus such as bus 1395 may exist also in the systems of Figs. 1 and 11, although they are not shown in those figures.
  • computing system 1300 may include a machine-readable medium 1380, which may be configured to store one or more sets of instructions 1305, e.g. software, embodying any one or more of the methodologies or functions described herein, including those methods illustrated above.
  • the instructions 1305 may also reside, completely or at least partially, within the main memory 1320, the static memory 1330, and/or within the processor 1310 during execution thereof by the computing system 1300.
  • the main memory 1320 and the processor 1310 also may constitute machine-readable media.
  • the computing system 1300 may further include a video display unit 1350 (e.g. a liquid crystal display (LCD), a flat panel, a solid state display, a cathode ray tube (CRT)) and/or any type of display.
  • computing system 1300 may include an input device 1360, e.g. a keyboard, a touch pad or the like, a cursor control device 1370, e.g. a mouse, a signal generation device 1390, e.g. a speaker and/or a remote control and/or the like, and a network interface device 1340 which may be operably coupled to a network 1345, e.g. a server, the Internet, a cloud and/or the like.
  • dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices, can likewise be constructed to implement the methods described herein.
  • Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computing systems. Some embodiments may implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit.
  • the example system is applicable to software, firmware, and hardware implementations.
  • the methods described herein are intended for operation as software programs running on a computer processor.
  • software implementations can include, but are not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing, and may also be constructed to implement the methods described herein.
  • the machine-readable medium 1380 contains instructions 1305, and/or receives and executes instructions 1305 from a propagated signal, so that a device connected to a network environment 1345 may send and/or receive voice, video or data, and communicate over the network 1345 using the instructions 1305.
  • the instructions 1305 may be transmitted and/or received over network 1345 via the network interface device 1340.
  • while the machine-readable medium 1380 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be understood to include a single medium or multiple media, e.g. a centralized or distributed database, and/or associated caches and servers, that may store one or more sets of instructions.
  • the term “machine-readable medium” may also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • machine-readable medium may accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical medium such as a disk or tape; and carrier wave signals such as a signal embodying computer instructions in a transmission medium; and/or a digital file attachment to e-mail or other self-contained information archive or set of archives considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
  • Figs. 1, 11 and 13 illustrate only generalized schematics of the system architecture, describing, by way of non-limiting example, some aspects of the presently disclosed subject matter only in an informative manner, for purposes of clarity of explanation. Only certain components are shown, as needed to exemplify the presently disclosed subject matter. Other components and sub-components, not shown, may exist.
  • Each system component in Figs. 1, 11 and 13 can be made up of any combination of software, hardware and/or firmware, executed on a suitable device or devices, that perform the functions as defined and explained herein. Equivalent and/or modified functionality, as described with respect to each system component, can be consolidated or divided in another manner. Thus, in some examples of the presently disclosed subject matter, the system may include fewer, more, modified and/or different components, modules and functions than those shown in Figs. 1, 11 and 13. One or more of these components can be centralized in one location or dispersed and distributed over more than one location. For example, the present disclosure describes the Orchestrator calling all the engines, for simplicity of exposition. However, in other implementations, components can call each other directly.
  • Each component in Figs. 1, 11 and 13 may represent a plurality of the particular component, possibly in a distributed architecture, which are adapted to independently and/or cooperatively operate to process various data and electrical inputs, and for enabling operations related to signal detection.
  • multiple instances of a component may be utilized for reasons of performance, redundancy and/or availability.
  • multiple instances of a component may be utilized for reasons of functionality or application. For example, different portions of the particular functionality may be placed in different instances of the component.
  • Figs. 14A-C illustrate one example of a data flow for generating output media, in accordance with certain demonstrative embodiments.
  • this data flow may make use of the methods exemplified with reference to Figs. 3 to 10.
  • the data flow of Fig. 14 may be implemented with use of the systems 100 and/or 1300.
  • The blocks of Fig. 14 refer to procedures or methods, for example the methods described with reference to Figs. 3 to 10.
  • Legend 1405 shows that each rectangle (block), marked with a reference numeral of the block, also lists a reference numeral of an example method, the relevant example figure, and the module that may in some examples perform the method.
  • the figure also shows Tables, which may in some example cases be tables described herein with reference to Figs. 3 to 10. A brief description of each example table, as well as the table name from the above exposition, is also provided.
  • These example tables represent data that may in some cases be output from one method, and may in turn be input to another method.
  • the figure also shows databases, which may for example be those of Fig. 1.
  • the arrows indicate example data flows, whereby data from databases and/or input tables are input to the methods, and whereby output tables are output by the methods.
  • The order presented in Fig. 14 is a non-limiting example, presented for purposes of clarity of exposition. It will be readily apparent that other methods may be used, that methods may be combined and/or separated, that methods may be performed in an order different from that shown, and that tables other than those shown may be used to implement the overall method exemplified by Fig. 14. As one non-limiting example, in some implementations content preferences engine 165 and media selection engine 190 can be combined into one engine, and their methods combined.
  • The example data flow of Fig. 14 may in some examples start with block 1430.
  • This block may correspond to method 300 of Figs. 3a and 3b, exemplified by block 1530 of Fig. 5, and implemented in some examples by social links engine 170.
  • Example inputs are event indication 112 and user indication 113, the Event Metadata Database 126 (Table 3) and User Profile Database 128 (Table 4).
  • An example output is the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS data entity (Table 12).
  • The example data flow of Fig. 14 may in some other examples start with block 1440.
  • This block may correspond to method 400 of Fig. 4, exemplified by block 1540 and implemented in some examples by content preferences engine 165.
  • Example inputs are user indication 113 and User Profile Database 128 (Table 4).
  • An example output is the PREFERRED_MEDIA data entity (Table 13-2).
  • Block 1450 may correspond to method 500 of Fig. 5, exemplified by block 1550 and implemented in some examples by media handler 155.
  • Example inputs are indication 112, indication 113, Table 12, Event Metadata Database 126 (Table 3), Media Metadata Database 120 (Table 1) and User Profile Database 128 (Table 4).
  • An example output is the AVAILABLE_MEDIA data entity (Table 16).
  • Block 1460 may correspond to method 600 of Fig. 6, exemplified by block 1560 and implemented in some examples by IoT data handler 160. In some examples, it can be performed before or in parallel with block 1450.
  • Example inputs are indication 112, indication 113, Event Metadata Database 126 (Table 3), IoT Metadata Database 124 (Table 2) and User Profile Database 128 (Table 4).
  • An example output is the FRIENDSIOT data entity (Table 20).
  • Block 1470 may correspond to method 700 of Fig. 7, exemplified by block 1570 and implemented in some examples by IoT-media association engine 175.
  • Example inputs are Table 6 and Table 20.
  • An example output is the data entity FRIENDSIOT_AVAILABLE_MEDIA (Table 21).
  • Block 1480 may correspond to method 800 of Fig. 8, exemplified by block 1580 and implemented in some examples by moments determination engine 180. In some examples, it can be performed before or in parallel with block 1470.
  • Example inputs are Table 20 and Raw IoT database 122.
  • An example output is the MOMENTS data entity (Table 23).
  • Block 1490 may correspond to method 900 of Fig. 9, exemplified by block 1590 and implemented in some examples by media selection engine 190.
  • Example inputs are Table 21 , Table 23 and Table 13-2.
  • An example output is the SELECTED_MEDIA data entity (Table 25).
  • Block 1495 may correspond to method 1000 of Fig. 10, exemplified by block 1595 and implemented in some examples by content creation engine 195.
  • Example inputs are Table 25 and Table 13-2.
  • An example output is the MEDIA_SELECTED_NORM data entity (Table 26).
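  • The block sequence above can be sketched as a sequential pipeline, in which each stage consumes previously produced tables and contributes its own output table. The following is only an illustrative sketch: the function names, table keys and stub contents are hypothetical stand-ins for the engines and data entities of Figs. 3 to 10, not the actual implementation.

```python
# Illustrative sketch of the Fig. 14 data flow: each stage reads earlier
# tables from a shared dict and adds its own output table. All names and
# stub values are hypothetical stand-ins for the engines/tables in the text.

def social_links(tables):            # block 1430 -> Table 12
    tables["table_12"] = {"friends_weights": [("friend_a", 0.7)]}

def content_preferences(tables):     # block 1440 -> Table 13-2
    tables["table_13_2"] = {"preferred_media": ["video"]}

def media_handler(tables):           # block 1450 -> Table 16
    tables["table_16"] = {"available_media": ["clip_1"]}

def iot_data_handler(tables):        # block 1460 -> Table 20
    tables["table_20"] = {"friends_iot": ["sensor_a"]}

def iot_media_association(tables):   # block 1470 -> Table 21
    tables["table_21"] = {"friends_iot_available_media": ["clip_1"]}

def moments_determination(tables):   # block 1480 -> Table 23
    tables["table_23"] = {"moments": ["t=42s"]}

def media_selection(tables):         # block 1490 -> Table 25
    tables["table_25"] = {"selected_media": ["clip_1"]}

def content_creation(tables):        # block 1495 -> Table 26
    tables["table_26"] = {"media_selected_norm": ["clip_1"]}

def orchestrate(event_indication, user_indication):
    """Run the stages in the example order of Fig. 14 and return Table 26."""
    tables = {"event": event_indication, "user": user_indication}
    for stage in (social_links, content_preferences, media_handler,
                  iot_data_handler, iot_media_association,
                  moments_determination, media_selection, content_creation):
        stage(tables)
    return tables["table_26"]
```

As noted above, this order is non-limiting: stages that share no intermediate tables (e.g. blocks 1450 and 1460) could be reordered or run in parallel.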
  • Fig. 15, discussed further above, illustrates one example of a process for generating output media, in accordance with certain demonstrative embodiments. In some example cases, this process may make use of the methods described with reference to Figs. 3 to 10, based on data flows such as described with reference to Fig. 14. In some cases, the methods of Fig. 15 may be implemented with use of the systems 100 and/or 1300.
  • conditional language such as “may”, “might”, or variants thereof should be construed as conveying that one or more examples of the subject matter may include, while one or more other examples of the subject matter may not necessarily include, certain methods, procedures, components and features.
  • conditional language is not generally intended to imply that a particular described method, procedure, component or circuit is necessarily included in all examples of the subject matter.
  • usage of non-conditional language does not necessarily imply that a particular described method, procedure, component or circuit is necessarily included in all examples of the subject matter.
  • the Orchestrator can call the social links engine 170 (method 300) and the content preferences engine 165 (method 400) in the opposite order, or in parallel.
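  • Because method 300 and method 400 share no intermediate data, the Orchestrator's freedom to run them in either order or in parallel can be sketched as below. This is a hypothetical sketch: the stub functions and their return values are illustrative stand-ins for the social links engine 170 and content preferences engine 165, not the actual engines.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stubs standing in for method 300 (social links engine 170)
# and method 400 (content preferences engine 165) of Figs. 3 and 4.
def method_300(event, user):
    return {"friends_weights": [(user, 1.0)]}

def method_400(user):
    return {"preferred_media": ["video"]}

def orchestrate_parallel(event, user):
    # The two methods are independent, so the Orchestrator may invoke
    # them concurrently (or sequentially, in either order).
    with ThreadPoolExecutor(max_workers=2) as pool:
        f300 = pool.submit(method_300, event, user)
        f400 = pool.submit(method_400, user)
        return f300.result(), f400.result()
```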
  • system according to the invention may be, at least partly, implemented on a suitably programmed computer.
  • the invention contemplates a computer program being readable by a machine or computer for executing the method of the invention.
  • the invention further contemplates a non-transitory computer-readable memory tangibly embodying a program of instructions executable by the computer for executing the method of the invention.
  • the presently disclosed subject matter further contemplates a non-transitory computer readable storage medium having a computer readable program code embodied therein, configured to be executed so as to perform the method of the presently disclosed subject matter.


Abstract

A computing system to generate output media content(s) for a user. The system receives an indication to generate the output media, the indication associated with an event of interest to the user and/or area(s) of interest of the user indicative of the event of interest. In some examples, the system combines and reduces received data, to identify candidate output media content(s), based at least partly on a degree of relationship of times associated with: the event, input media content and degree of excitement data of other user(s). The data includes user preference data associated with the user, and with the other user(s), including a degree of their relation to the area(s) of interest. The system generates, and provides the user with, the output media content, including at least a portion of candidate output media content.

Description

SYSTEM AND METHOD OF GENERATING MEDIA CONTENT AND RELATED DATA STRUCTURES
CROSS-REFERENCES TO RELATED APPLICATIONS
[001] The present application claims benefit from United States Provisional Patent Application No. 62/591,990 filed on November 29, 2017, incorporated herein by reference in its entirety.
TECHNICAL FIELD
[002] Embodiments described herein generally relate to systems and methods for media content creation or generation.
BACKGROUND
[003] Events may be documented by the use of media capture device(s), such as cellphones, cameras, video camcorders and the like. The media capture device(s) may capture media at the time of an event to provide a captured media.
[004] The captured media content may be viewed by a user and shared with his/her friends through social networks and other sharing services such as, for example, an Email service, a land mail service, cellphones, the Internet and/or the like.
SUMMARY OF THE INVENTION
[005] In accordance with another aspect of the presently disclosed subject matter, there is presented a content creation system to generate one or more output media contents associated with an event of interest of a user, the content creation system comprising: one or more media databases to store one or more input media contents and media metadata related to media associated with the event and received from one or more media sources;
an event database to store data associated with the event; a physiological database to store user physiological data collected from one or more sensors;
a user profile database to store social and personal data of the user and one or more user preferences comprising a type of media, the user physiological data, a degree of user social relationships parameter and social network data; and hardware processing circuitry to receive an event indication to indicate the event and to generate the one or more output media contents by processing at least one of the input media content, the user physiological data, social and personal data and the event metadata, based on a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of creation of the one or more media contents, and to provide the user with the one or more output media contents.
[006] In addition to the above features, the system according to this aspect of the presently disclosed subject matter can include one or more of features (i) to (xv) listed below, in any desired combination or permutation which is technically possible:
(i) wherein the hardware processing circuitry is configured to set a degree of relationship of the one or more user preferences with the time of creation of the user physiological data and a time of creation of the one or more media contents by attaching to the one or more media contents one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
(ii) wherein the hardware processing circuitry comprises:
one or more engine modules and one or more handler modules to generate one or more data structures based on data received from the at least one of the one or more media databases, the event database, the physiological database or the user profile database; and
an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules, the orchestrator module is configured to call the one or more engine modules and the one or more handler modules according to a predetermined sequence and to generate the one or more output media contents based on a degree of relationship between one or more data portions of data generated by the one or more engine modules and the one or more handler modules. (iii) wherein the one or more output media contents comprise selected moments of the event data structure, which include one or more moments of the event selected according to a degree of relationship of the user physiological data generated at the time of the event with the one or more media contents.
(iv) wherein the input media contents comprise one or more media contents gathered from at least one of one or more social networks, one or more content platforms and one or more websites by the one or more media databases.
(v) wherein the hardware processing circuitry is configured to generate the one or more output media content according to one or more data structures generated by at least one of the one or more engine modules and the one or more handler modules.
(vi) wherein the one or more engine modules comprise a first engine module to generate a data structure comprising information of one or more users' association with the event, prioritized by a degree of social relationship between the user and one or more other users, and a degree of relationship between the user and the one or more other users to the event metadata, according to the event indication, an indication on a user name, an indication on one or more other users' names, and the event metadata.
(vii) wherein the one or more engine modules comprise a second engine module to generate a data structure comprising one or more media types and one or more media sizes associated with the one or more media types, the one or more media types are prioritized according to a degree of relationship of the one or more users to the event metadata, and the data structure is generated according to the event indication and a user indication. (viii) wherein the one or more engine modules comprise a third engine module to generate a data structure comprising one or more available media contents generated at one or more points of time during the event according to data applied by an available media data structure and a friends physiological data structure, wherein the one or more available media contents are prioritized according to the physiological data of the user and physiological data of friends of the user, which are generated at the one or more points of time during the event.
(ix) wherein the one or more engine modules comprise a fourth engine module to generate a data structure comprising moments of the event according to one or more users' physiological data sharing the event prioritized according to a degree of relation of the one or more users' physiological data to the moments of the event, wherein the data structure is generated according to data applied by a friends physiological data structure. (x) wherein the one or more engine modules comprise a fifth engine module to generate a data structure comprising indication on media generated at one or more moments of the event, wherein the media is associated with the user physiological data and friends of the user physiological data, and prioritized according to a degree of relationship of friends available media, a preferred media and the moments to the event and the user.
(xi) wherein the one or more engine modules comprise a sixth engine module to generate a data structure comprising a content personalized to the user associated to the event, according to user preferences, moments of the event of a user interest, physiological data of the user and a media generated at the moments of the event of the user interest.
(xii) wherein the one or more handler modules comprise a first handler module to generate a data structure comprising media related to the event and the user prioritized according to a degree of relationship between the event and the user.
(xiii) wherein the one or more handler modules comprise a second handler module to generate a data structure comprising the physiological data of the user associated with the event metadata and prioritized according to a degree of relationship between the event and the user.
(xiv) wherein the event data comprise metadata of the event.
(xv) wherein generating the one or more output media contents is by selecting moments of the event according to the user physiological data generated at the time of the event.
(xvi) wherein the hardware processing circuitry is configured to set the degree of relationship of the one or more user preferences with the time of creation of the user physiological data and the time of viewing of the one or more input media contents by determining one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
(xvii) wherein the hardware processing circuitry comprises:
one or more engine modules and one or more handler modules configured to generate one or more data structures based on data received from the at least one data entity of the plurality of data entities; and
an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules, the orchestrator module is configured to coordinate an operation of the one or more engine modules and the one or more handler modules according to a predetermined sequence.
(xvii) wherein the orchestrator module is configured to: provide to the user the one or more output media contents generated based on one or more data portions of data generated by the one or more engine modules and the one or more handler modules.
(xviii) wherein the one or more output media contents comprises one or more media of selected moments of the event of common interest, the one or more media is selected according to a degree of relationship of the user physiological data generated at a time of the event of common interest with the one or more media contents and the degree of the relation of the user to the other user and to the event of common interest.
(xix) wherein the one or more engine modules comprises a social links engine module configured to generate a data entity comprising information on the user's and one or more users' association with the event of common interest, prioritized by a degree of social relationship between the user and the one or more other users, according to the event indication, a user name indication, an indication on one or more other users' profiles, and the event metadata.
(xx) wherein the one or more engine modules comprises a content preferences engine module to generate a data entity comprising one or more media types and one or more media sizes associated with the one or more media types, the data entity is prioritized according to a degree of relation to the event of common interest and the user.
(xxi) wherein the one or more engine modules comprise an internet of things (IoT)-media association engine module to generate a data entity comprising one or more available media contents generated at one or more points of time during the event of the common interest prioritized according to the physiological data of the user and physiological data of the one or more other users.
(xxii) wherein the one or more engine modules comprise a moments determination engine module to generate a data entity comprising points in a time of interest of the event of the common interest according to one or more users physiological data sharing the event of the common interest prioritized according to a degree of relation of input data to the event of the common interest and the user.
(xxiii) wherein the one or more engine modules comprises a media selection engine module to generate a data entity comprising an indication on a preselected media generated at one or more points of time of the event of the common interest corresponding to the user physiological data, and prioritized according to a degree of relation of input data to the event of the common interest and the user.
(xxiv) wherein the one or more engine modules comprise a content creation engine module to generate a data entity comprising a content personalized to the user in the context of the event of the common interest, according to user preferences, detected points of time of the event of the common interest, physiological data of the user and media generated at the points of time of the event of the common interest.
(xxv) wherein the one or more handler modules comprise a media handler module to generate a data entity comprising media related to the event of the common interest and the user, prioritized according to a degree of relation between the event of the common interest and the user.
(xxvi) wherein the one or more handler modules comprise an Internet of things (IoT) data handler module to generate a data entity comprising the physiological data of the user related to the event of the common interest and the user, prioritized according to a degree of relation of the event of the common interest and the user.
(xxvii) wherein the generating comprises gathering from at least one or more social networks one or more media contents; and
generating the input media contents based on the one or more media contents.
(xxviii) wherein an engine of the one or more engine modules is configured to:
receive data from the plurality of data entities;
generate a new data entity comprising a plurality of fields according to a predetermined criteria, wherein the plurality of fields comprising at least one of a keyword field, a value field and a weight field; and
provide the new data entity to the orchestrator module for generating the one or more output media content.
(xxix) wherein the generating comprises generating, by at least one of one or more engine modules and one or more handler modules, one or more new data entities based on data received from the at least one data entity of the plurality of data entities; and
coordinating an operation of the one or more engine modules and the one or more handler modules according to a predetermined sequence by an orchestrator module operably coupled to the one or more engine modules and to the one or more handler modules.
(xxx) wherein the generating comprises providing to the user, by the orchestrator module, the one or more output media contents generated based on a degree of relationship between one or more data portions of data generated by the one or more engine modules and the one or more handler modules.
(xxxi) wherein the engine of the one or more engine modules is configured to:
generate the new data entity by combining two or more data entities; and set a weight value to a row of the new data entity by summing weight values of rows of the new data entities and normalizing the weight value, wherein the sum of normalized weight values is equal to 1.
(xxxii) wherein the weight value is normalized by at least one of fusing the weight values, multiplying the weight values, merging weight values by addition or merging weight values by weighted averaging.
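Features (xxviii), (xxxi) and (xxxii) above describe data-entity rows holding keyword, value and weight fields, combined by merging weights (here, by addition) and normalizing them so they sum to 1. The following is a minimal sketch under assumed representations: a row is modeled as a hypothetical (keyword, value, weight) tuple, and the function names are illustrative, not the system's actual entities.

```python
# Sketch of combining two keyword/value/weight data entities as in
# features (xxviii), (xxxi) and (xxxii): weights of rows sharing a
# keyword are merged by addition, then rescaled to sum to 1.
# Row layout and function names are illustrative assumptions.

def combine_entities(entity_a, entity_b):
    """Merge two lists of (keyword, value, weight) rows, adding weights."""
    merged = {}
    for keyword, value, weight in entity_a + entity_b:
        if keyword in merged:
            value_kept, weight_sum = merged[keyword]
            merged[keyword] = (value_kept, weight_sum + weight)
        else:
            merged[keyword] = (value, weight)
    return [(k, v, w) for k, (v, w) in merged.items()]

def normalize_weights(entity):
    """Rescale row weights so that the normalized weights sum to 1."""
    total = sum(w for _, _, w in entity)
    return [(k, v, w / total) for k, v, w in entity]
```

For example, merging rows ("friend:a", "media_1", 0.6) and ("friend:a", "media_1", 0.8) yields a combined weight of 1.4, which normalization then rescales against the total weight of all rows.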
[007] In accordance with another aspect of the presently disclosed subject matter, there is presented a method for generating, by a content creation system, one or more output media contents associated with an event selected by a user, the method comprising: storing one or more input media contents and media metadata related to media associated with the event and received from one or more media sources at one or more media databases;
storing data associated with the event at an event database;
storing user physiological data collected from one or more sensors at a physiological database;
storing social and personal data of the user and one or more user preferences comprising a type of media, the user physiological data, a degree of user social relationships parameter and social network data at a user profile database; receiving an event indication to indicate the event and generating the one or more output media contents by processing at least one of the input media content, the user physiological data, social and personal data and the event metadata, based on a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of creation of the one or more media contents; and
providing the user with the one or more output media contents.
[008] This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
[009] In accordance with another aspect of the presently disclosed subject matter, there is presented a method for generating, by a content creation system, one or more output media contents associated with an event of the common interest selected by a user, the method comprising: receiving, from a plurality of data entities, data associated with the event of the common interest of the user and the one or more other users, the data comprising a first indication on a degree of relation of the user to the event of the common interest and a second indication of a degree of relation of the other users to the event of the common interest;
receiving an event indication to indicate the event of the common interest and generating one or more output media contents by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication on the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, wherein the one or more output media contents includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference; and
providing the user with the one or more output media contents.
[010] This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
[011] In accordance with another aspect of the presently disclosed subject matter, there is presented a computing system configured to generate one or more output media contents associated with an event of a common interest of a user and one or more other users, the computing system comprising hardware processing circuitry configured to: receive, from a plurality of data entities, data associated with the event of the common interest of the user and the one or more other users, the data comprising a first indication on a degree of relation of the user to the event of the common interest and a second indication of a degree of relation of the other users to the event of the common interest;
receive an event indication to indicate the event of the common interest and to generate one or more output media contents by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication on the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, wherein the one or more output media contents includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference; and
provide the user with the one or more output media contents.
[012] This aspect of the disclosed subject matter can optionally include one or more of features (i) to (xxxii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
[013] In accordance with another aspect of the presently disclosed subject matter, there is presented a content creation system to generate one or more output media contents associated with an event selected by a user, the content creation system comprising: one or more media databases to store one or more data structures associated with the event, a data structure of the one or more data structures comprising a keyword field to establish relationships between one or more entities of the content creation system, a value field to provide a value to the keyword and a weight field to weight the relationships between the one or more entities; and hardware processing circuitry to generate a data structure based on portions of data of the one or more data structures and an event selected by a user by using one or more data aggregation methods to manipulate the data of the keyword field and the value field and the event field by applying a weight value to the weight field based on the relationship between the one or more entities.
[014] In addition to the above features, the system according to this aspect of the presently disclosed subject matter can include one or more of features (xxxiii) to (xxxvib) listed below, in any desired combination or permutation which is technically possible:
(xxxiii) wherein the weight value is configured to indicate a strength of relationships between two or more fields of the data structure.
(xxxiv) wherein the data structure comprises one or more rows and a row of the one or more rows comprises:
the keyword to establish a relationship between one or more entities which provided input data to the content creation system.
(xxxiv) wherein the data structure comprises one or more rows, a row of the one or more rows comprises:
one or more entity values in the one or more entity fields;
a keyword value in the keyword field to indicate on a feature associated with the one or more entity values;
a value in the value field to be associated to the keyword value; and the weight value set by the hardware processor circuitry to indicate the degree of relation of the one or more entity fields to the keyword field and the value field.
(xxxv) wherein the hardware processor circuitry is configured to:
generate the data structure by merging two or more data fields of other data structures; and
set the weight value to a row of the data structure by summing weight values of rows of the data structure and normalizing the weight value, wherein the sum of the normalized weight values of the rows of the data structure is equal to 1.
(xxxvi) wherein the weight value is normalized at least by one of fusing the weight values, multiplying the weight values, merging weight values by addition, or merging weight values by weighted averaging.
(xxxvib) wherein the weight value comprises 0 for no influence or relation of the data applied by the row, 0.5 for medium influence or relation of the data applied by the row, and 1 for high influence or relation of the data applied by the row.
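To make the Keyword-Value-Weight (KVW) row layout and the normalization rule above concrete, here is a minimal Python sketch. The class and field names are illustrative assumptions; the disclosure specifies only entity, keyword, value and weight fields, with no concrete identifiers:

```python
from dataclasses import dataclass

# Hypothetical field names; the disclosure names the fields but gives
# no concrete identifiers or types.
@dataclass
class KVWRow:
    entity: str    # entity that provided the input data
    keyword: str   # feature associated with the entity value
    value: str     # value associated with the keyword
    weight: float  # strength of the entity/keyword/value relation

def normalize(rows):
    """Scale row weights so they sum to 1, per the normalization above."""
    total = sum(r.weight for r in rows)
    return [KVWRow(r.entity, r.keyword, r.value, r.weight / total)
            for r in rows]

rows = [
    KVWRow("user_a", "sport", "football", 1.0),  # high relation
    KVWRow("user_a", "sport", "tennis", 0.5),    # medium relation
]
normalized = normalize(rows)
assert abs(sum(r.weight for r in normalized) - 1.0) < 1e-9
```

With the example rows, the weights 1.0 and 0.5 normalize to 2/3 and 1/3, so the row weights of the structure sum to 1 as described above.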
[015] In accordance with another aspect of the presently disclosed subject matter, there is presented a computing system for generating a data structure, the computing system comprising: an input/output (I/O) interface configured to gather and distribute data to one or more data processing devices;
hardware processing circuitry to generate a data structure based on portions of the data according to one or more data aggregation methods, the data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field; and
a memory configured to store the one or more data structures.
[016] This aspect of the disclosed subject matter can optionally include one or more of features (xxxiii) to (xxxvib) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
[017] In accordance with another aspect of the presently disclosed subject matter, there is presented a product comprising one or more tangible computer-readable non-transitory storage media comprising computer-executable instructions operable, when executed by hardware processing circuitry, to cause a content creation system to: store at one or more media databases one or more data structures associated with the event, a data structure of the one or more data structures comprising a keyword field to establish relationships between one or more entities of the content creation system, a value field to provide a value to the keyword and a weight field to weight the relationships between the one or more entities; and
to generate a data structure by the hardware processing circuitry based on portions of data of the one or more data structures and an event selected by a user by using one or more data aggregation methods to manipulate the data of the keyword field, the value field and the event field by applying a weight value to the weight field based on the relationship between the one or more entities of the data structure.
[018] This aspect of the disclosed subject matter can optionally include one or more of features (xxxiii) to (xxxvib) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.
[019] In accordance with another aspect of the presently disclosed subject matter, there is presented a product comprising one or more tangible computer-readable non-transitory storage media comprising computer-executable instructions operable, when executed by hardware processing circuitry, to cause a computing system to: gather data from one or more data processing devices and distribute data to the one or more data processing devices;
generate a data structure based on portions of gathered data according to one or more data aggregation methods, the data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field; and
store the data structure at a memory.
[020] This aspect of the disclosed subject matter can optionally include one or more of features (xxxiii) to (xxxvib) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.

[021] In accordance with another aspect of the presently disclosed subject matter, there is presented a computing system configured to generate one or more output media contents for a user, the computing system comprising a processing circuitry and configured to: receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receive, from the at least one data source, degree of excitement data of at least one other user;
receive, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receive, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receive, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identify at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the at least one candidate output media content includes at least a portion of the at least one input media content;
generate the at least one output media content, wherein at least one output media content includes at least a portion of the at least one candidate output media content; and
provide the user with the at least one output media content.
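The claim above keys candidate selection on a degree of relationship between three times: the event time, the media capture time, and the time of the degree-of-excitement data. A minimal sketch of one way such time-based association could work; the tolerance window and the record shapes are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical time-based association: an input media item becomes a
# candidate when its timestamp falls within a tolerance window of both
# the event time and at least one degree-of-excitement sample.

TOLERANCE = 60  # seconds; illustrative value

def candidates(event_time, media, excitement_times):
    """media: list of (media_id, timestamp) pairs.
    Returns ids of media captured near the event time and near at least
    one excitement sample."""
    out = []
    for media_id, t in media:
        near_event = abs(t - event_time) <= TOLERANCE
        near_excitement = any(abs(t - e) <= TOLERANCE
                              for e in excitement_times)
        if near_event and near_excitement:
            out.append(media_id)
    return out

media = [("clip1", 1000), ("clip2", 1500), ("clip3", 1040)]
print(candidates(1030, media, [1010, 1600]))  # ['clip1', 'clip3']
```

Here "clip2" is dropped because its timestamp lies far from the event time, while "clip1" and "clip3" are close to both the event and an excitement sample.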
[022] In addition to the above features, the system according to this aspect of the presently disclosed subject matter can include one or more of features (xxxvii) to (lxviii) listed below, in any desired combination or permutation which is technically possible:
(xxxvii) the identifying at least one candidate output media content further comprises normalizing data.
(xxxviii) the generating of the at least one output media content comprises selecting the at least one output media content from the at least one candidate output media content using at least one output media selection criterion.
(xxxix) the event metadata comprises a third indication of a degree of relation of the event to the at least one area of interest, the media metadata related to the at least one input media content comprises a fourth indication of a degree of relation of the at least one input media content to the at least one area of interest, and the degree of excitement metadata of the at least one other user comprises a fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest, wherein the combining and reducing data is further based on a degree of relationship of the at least one area of interest of the user indicative of the event of interest, the first indication of a degree of relation of the user to the at least one area of interest, the second indication of a degree of a relation of the at least one other user to the at least one area of interest, the third indication of a degree of relation of the event to the at least one area of interest, the fourth indication of a degree of relation of the at least one input media content to the at least one area of interest, and the fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest.
(xl) further configured to receive, from the at least one data source, data indicative of a degree of relationship of the user to the at least one other user, wherein the identifying at least one candidate output media content further comprises combining and reducing the data indicative of a degree of relationship of the user to the at least one other user and the first data.
(xli) the degree of excitement data of the at least one other user is based on physiological data of the at least one other user.
(xlii) further configured
to receive, from the at least one data source, degree of excitement data of the user; and
to receive, from the at least one data source, degree of excitement metadata of the user;
wherein the identifying at least one candidate output media content further comprises combining and reducing the data of the degree of excitement metadata of the user and the first data.
(xliii) the degree of excitement data of the user is based on physiological data of the user.
(xliv) the user preference data associated with the user further comprises user media preference data, the user media preference data comprising at least one of media type preference and media size preference,
wherein the media metadata related to the at least one input media content comprises at least one of input media content type and input media content size, wherein identifying at least one candidate output media content further comprises combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size.
(xlv) the at least one input media content comprises at least one media content gathered by the at least one data source from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
(xlvi) at least part of the first data comprises at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure, and wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure.
(xlvii) the combining and reducing data comprising setting the weight value by at least one of fusing the weight values of records, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging.
(xlviii) the setting a weight value for a record of a data entity comprising normalizing the weight value, wherein the sum of the normalized weight values is equal to 1.
(xlix) the identifying comprises generating a data entity indicative of the association of the user and the at least one other user with the event of interest to the user, based on the degree of relationship between the user and the at least one other user.
(l) the generating of the data entity indicative of the association of the user and the at least one other user with the event of interest to the user comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the at least one other user, and the data indicative of the degree of relationship of the user to the at least one other user.
(li) the identifying comprises generating a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, according to user media preference data.
(lii) the identifying comprises generating a first data entity indicative of the at least one input media content, based on at least one of: association of the one or more available media contents with the event, and association of the one or more available media contents with the user preference data associated with the user and with the user preference data associated with the at least one other user.
(liii) the generating of the first data entity indicative of one or more available media contents comprises combining and reducing data of at least the media metadata related to the at least one input media content, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
(liv) the identifying comprises generating a data entity indicative of the degree of excitement data of at least one other user, based on at least one of: association of the degree of excitement data with the event, and association of the degree of excitement data with at least one of the user and the at least one other user.
(lv) the generating of the data entity indicative of the degree of excitement data of at least one other user comprises combining and reducing data of at least the degree of excitement metadata of the at least one other user, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
(lvi) the identifying comprises generating a data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user generated based on one or more points in time during the event of the interest to the user.
(lvii) the generating of the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user comprises combining and reducing the first data entity indicative of the at least one input media content and the data entity indicative of association between the at least one input media content and the degree of excitement data of the at least one other user.
(lviii) the identifying comprises generating a data entity indicative of points in time within at least one time of interest of the event of interest to the user, according to the degree of excitement data.
(lix) the generating the data entity indicative of points in time is according to an indication of relevance that is based on the degree of excitement data of the at least one other user.
(lx) for each time of interest of the at least one time of interest, generating the data entity indicative of points in time is based on the most highly weighted degree of excitement data of the degree of excitement data corresponding to the each time of interest.
(lxi) the identifying comprises generating a second data entity indicative of the one or more available media contents, wherein the one or more available media contents correspond to the points in time within the at least one time of interest of the event of interest to the user.
(lxii) the second data entity is weighted based on the preferences of the user.
(lxiii) the generating of the second data entity indicative of the one or more available media contents comprises combining and reducing the data entity indicative of points in time within at least one time of interest, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
(lxiv) generating the at least one output media content comprises generating a third data entity indicative of the one or more available media contents, wherein the generation of the third data entity is based on a relevance field associated with the one or more available media contents and on a weight field associated with the one or more available media contents.
(lxv) the third data entity is indicative of the one or more media sizes associated with the one or more media types.
(lxvi) the event of interest to the user comprises at least one of a sports match, a contest, a concert, a show, a happening, a vacation or a trip.
(lxvii) the physiological data of the at least one other user comprise at least one of a blood pressure, a heartbeat, a heart rate, a breathing rate, a step counter and a body temperature.
(lxviii) the output media content comprises at least one of a video clip, an audio recording, an image and a text captured at the time of the event of interest to the user, according to the user media preference data.
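Features (xlvi)-(xlviii) describe combining the first data by matching Keyword-Value pairs of KVW structures and merging their weights. A minimal sketch of one such combination, using weighted averaging as the merge rule; the tuple layout and the equal 0.5 mixing weights are assumptions, not taken from the disclosure:

```python
# Hypothetical KVW combination: records are (keyword, value, weight)
# tuples; rows are joined on matching (keyword, value) pairs and their
# weights merged by weighted averaging, one of the methods named in
# feature (xlvii).

def combine_kvw(a, b, w_a=0.5, w_b=0.5):
    """Join rows of a and b on matching Keyword-Value pairs and merge
    the weights of matching rows by weighted averaging."""
    index_b = {(k, v): w for (k, v, w) in b}
    out = []
    for (k, v, w) in a:
        if (k, v) in index_b:
            # Matching Keyword-Value pair: merge weights
            out.append((k, v, w_a * w + w_b * index_b[(k, v)]))
    return out

user_prefs = [("sport", "football", 0.8), ("sport", "tennis", 0.2)]
media_meta = [("sport", "football", 0.6), ("genre", "comedy", 0.4)]
print(combine_kvw(user_prefs, media_meta))
```

Only the ("sport", "football") pair appears in both inputs, so the combined structure reduces to that single record with a merged weight of about 0.7.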
[023] In accordance with another aspect of the presently disclosed subject matter, there is presented a computerized method, performed by the above computing system, for generating one or more output media contents for a user.
[024] In accordance with another aspect of the presently disclosed subject matter there is presented a non-transitory program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform the method of the above computing system for generating one or more output media contents for a user.

[025] In accordance with another aspect of the presently disclosed subject matter, there is presented a computing system configured to create a compilation output from a first data set, the computing system comprising a processing circuitry and configured to: receive an indication to generate a compilation;
receive, from the at least one data source, the first data set, comprising a time associated with first data of the first data set, wherein the first data set is relation-aware; receive, from the at least one data source, a second data set, comprising a time associated with second data of the second data set, wherein the second data set is relation-aware;
identify at least one candidate compilation output, wherein the identifying comprises combining and reducing data of at least the first data set and the second data set,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the first data and the time associated with the second data, wherein the at least one candidate compilation output includes at least a portion of the first data set;
generate the at least one compilation output, wherein at least one compilation output media content includes at least a portion of the at least one candidate compilation output; and
provide the user with the at least one compilation output.
[026] In accordance with another aspect of the presently disclosed subject matter, there is presented a computerized method, performed by the above computing system, for creating a compilation output from a first data set.
[027] In accordance with another aspect of the presently disclosed subject matter there is presented a non-transitory program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform the method of the above computing system for creating a compilation output from a first data set.
[028] The computerized method, the non-transitory program storage device and the computing system disclosed herein according to the above five aspects, can optionally further comprise one or more of features (xxxvii) to (lxviii) listed above, mutatis mutandis, in any desired combination or permutation which is technically possible.

[029] In accordance with another aspect of the presently disclosed subject matter, there is presented a computing system for generating a data structure, the computing system comprising: an input/output (I/O) interface configured to gather data from and distribute data to one or more data processing devices;
a processor to generate at least one data structure based on portions of the data according to one or more data aggregation methods, the at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure; and a memory configured to store the at least one data structure, wherein the at least one data structure is capable of being utilized by the processing circuitry to generate at least one of: at least one new data structure, at least one modified data structure, by combining the at least one data structure with at least one other data structure.
[030] In accordance with another aspect of the presently disclosed subject matter there is presented a non-transitory program storage device readable by machine, tangibly embodying a program of instructions executable by the machine to perform the method of the above computing system for generating a data structure.
[031] In accordance with another aspect of the presently disclosed subject matter, there is presented a computerized method, performed by the above computing system, for generating a data structure.
[032] The computerized method, the non-transitory program storage device and the computing system disclosed herein according to the above three aspects, can optionally further comprise one or more of features (lxix) to (lxxviii) listed below, mutatis mutandis, in any desired combination or permutation which is technically possible:
(lxix) the KVW data structure comprises at least one record, a record of the at least one record comprises:
one or more entity values in the one or more entity fields; a keyword value in the keyword field to indicate on a feature associated with the one or more entity values; and
a value in the value field to be associated to the keyword value; and the weight value set by the processor to indicate the degree of relation of the one or more entity fields to the keyword field and the value field.
(lxx) the value in the value field can be equal to an entity value of an entity field of the one or more entity fields.
(lxxi) the one or more data aggregation methods comprise:
combining at least two source KVW data structures on the basis of the combination of keyword fields and value fields of the at least two source data structures; and
setting the weight value to a record of the data structure by at least one of: multiplying weight values of records of the at least two source KVW data structures, summing the weight values, merging the weight values by weighted averaging.
(lxxii) the one or more data aggregation methods further comprise: merging two or more data fields of the at least two source KVW data structures.
(lxxiii) the processor circuitry is further configured to: merge two or more records of the KVW data structure by summing weight values of the two or more records, wherein the combination of the keyword field and the value field in the two or more records match.
(lxxiv) the generation comprises reducing data.
(lxxv) the generating of the KVW data structure further comprising: normalizing the weight values, wherein the sum of the normalized weight values of the records of the KVW data structure is equal to 1.
(lxxvi) the weight value comprises 0 for no relation of the data comprised in the record, 0.5 for medium relation of the data comprised in the record, and 1 for high relation of the data comprised in the record.
(lxxvii) entity types of the entity fields of the source KVW data structures are not the same.
(lxxviii) the at least one data structure is capable of being utilized by the processing circuitry to generate one or more output media contents for a user, the computing system further configured to:
receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receive, from the at least one data source, degree of excitement data of at least one other user;
receive, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receive, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receive, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identify at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure,
wherein the at least one candidate output media content includes at least a portion of the at least one input media content; generate the at least one output media content, wherein at least one output media content includes at least a portion of the at least one candidate output media content; and
provide the user with the at least one output media content.
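Features (lxxiii) and (lxxv) describe merging records whose keyword and value combination matches by summing their weights, then normalizing so the record weights sum to 1. A minimal sketch under assumed record shapes (the tuple layout is illustrative, not from the disclosure):

```python
from collections import defaultdict

# Hypothetical record layout: (keyword, value, weight) tuples.

def merge_and_normalize(records):
    """Merge records whose (keyword, value) combination matches by
    summing their weights, then normalize so the weights sum to 1."""
    summed = defaultdict(float)
    for keyword, value, weight in records:
        summed[(keyword, value)] += weight
    total = sum(summed.values())
    return {kv: w / total for kv, w in summed.items()}

records = [
    ("team", "blue", 1.0),
    ("team", "blue", 0.5),  # matching combination: weights are summed
    ("team", "red", 0.5),
]
merged = merge_and_normalize(records)
# ("team", "blue") -> 1.5/2.0 = 0.75; ("team", "red") -> 0.5/2.0 = 0.25
```

The two "blue" records collapse into one with weight 1.5, and normalization then scales the merged weights to 0.75 and 0.25, which sum to 1 as feature (lxxv) requires.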
BRIEF DESCRIPTION OF THE DRAWINGS
[033] In order to understand the invention and to see how it can be carried out in practice, embodiments will be described, by way of non-limiting examples, with reference to the accompanying drawings, in which:
[034] Fig. 1 is a schematic illustration of a block diagram of a content creation system to generate a user based media content, in accordance with some demonstrative embodiments.
[035] Fig. 2 is a schematic illustration of a flow chart of a method of generating a user based media content associated with an event of common interest, in accordance with some demonstrative embodiments.
[036] Figs. 3a and 3b are schematic illustrations of a flow chart of a method of generating a friends-event key data entity by a social links engine module, in accordance with some demonstrative embodiments.
[037] Fig. 4 is a schematic illustration of a flow chart of a method of generating a preferred media data entity by a content preference engine module, in accordance with some demonstrative embodiments.
[038] Fig. 5 is a schematic illustration of a flow chart of a method of generating an available media data entity by a media handler module, in accordance with some demonstrative embodiments.
[039] Fig. 6 is a schematic illustration of a flow chart of a method of generating a friends data entity based on data captured from sensors by an Internet of Things (IoT) data handler module, in accordance with some demonstrative embodiments.
[040] Fig. 7 is a schematic illustration of a flow chart of a method of generating a friends available media data entity based on data captured from sensors by an IoT-media association engine module, in accordance with some demonstrative embodiments.
[041] Fig. 8 is a schematic illustration of a flow chart of a method of generating a merged friends physiological data entity based on the friend data entity by a moments determination engine module, in accordance with some demonstrative embodiments.
[042] Fig. 9 is a schematic illustration of a flow chart of a method of generating a user selected media data entity based on a preferred media data entity and a selected media data entity by a media selection engine module, in accordance with some demonstrative embodiments.
[043] Fig. 10 is a schematic illustration of a flow chart of a method of generating a selected media data entity based on the preferred media data entity by a content creation engine module, in accordance with some demonstrative embodiments.
[044] Fig. 11 is a schematic illustration of a block diagram of a computing system for generating one or more data structures, in accordance with some demonstrative embodiments.
[045] Fig. 12 is a schematic illustration of a flow chart of a method of generating a data structure by a computing system of Fig. 11, in accordance with some demonstrative embodiments.
[046] Fig. 13 is a schematic illustration of a block diagram of a machine in the form of a computing system, in accordance with some demonstrative embodiments.
[047] Figs. 14A-C are a schematic illustration of a data flow for generating output media, in accordance with certain demonstrative embodiments.
[048] Fig. 15 is a schematic illustration of a flow chart of a method of generating output media, in accordance with certain demonstrative embodiments.
DETAILED DESCRIPTION
[049] In the drawings and descriptions set forth, identical reference numerals indicate those components that are common to different embodiments or configurations.
[050] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the presently disclosed subject matter.
[051] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "receiving", "generating", "identifying", "combining", "reducing",“providing”, "selecting", "setting", "normalizing" or the like, refer to the action(s) and/or process(es) of a computer that manipulate and/or transform data into other data, said data represented as physical, such as electronic, quantities and/or said data representing the physical objects.
[052] The terms“plurality” and“a plurality”, as used herein, include, for example, “multiple” or“two or more”. For example,“a plurality of items” includes two or more items.
[053] References to“demonstrative embodiment” indicate that the embodiment(s) so described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Further, repeated use of the phrase“in some demonstrative embodiments” does not necessarily refer to the same embodiments, although it may.
[054] As used herein, unless otherwise specified, the use of the ordinal adjectives “first”, “second”, “third” etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[055] The term“computer” and/or“computing system” as used herein, includes, for example, any kind of hardware-based electronic device with data processing capabilities including, by way of non-limiting example, a hardware processing circuitry disclosed in the present application. By way of non-limiting example, this term may refer to a personal computer, a server, a computing system, a communication device, a processor or processing unit (e.g. digital signal processor (DSP), a microcontroller, a microprocessor, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), any other electronic computing device, including, by way of non-limiting example, the processing circuitry therein, such as for example the processing circuitries 1107, 1115 and 1307 (further detailed herein with regard to Figs. 1, 11 and 13), disclosed in the present application.
[056] The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes, or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer-readable storage medium.
[057] Embodiments of the presently disclosed subject matter are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the presently disclosed subject matter as described herein.
[058] The terms "non-transitory memory" and“non-transitory storage medium” as used herein, include, for example, any volatile or non-volatile computer memory suitable to the presently disclosed subject matter.
[059] It is to be understood that the term "signal" as used herein, may exclude, for example, transitory propagating signals, but may include, for example, any other signal suitable to some demonstrative embodiments.
[060] The term“data aggregation” as used herein, includes, for example, a process in which information is gathered and expressed in a summary form such as data structures.
[061] The term“database (DB)” as used herein, includes, for example, an organized collection of data stored, for example, at a memory of a DB server, which may be a dedicated computer to hold the DB and to run dedicated software to, for example, collect, store and manage the data of the DB. For example, the DB may store one or more data structures, which include a collection of schemas, tables, queries, reports, views, and other elements.
[062] Databases are presented here as a non-limiting example of data stores. To the extent that systems described in the presently disclosed subject matter receive information from the databases as input for their processing activities, the databases may in some cases be non-limiting examples of data sources.
[063] The term“module” as used herein, includes, for example, any hardware component, a software or firmware, and/or any combination of software, firmware and hardware, which may be embedded on a computing system, such as, for example, on a processor of a computing system.
[064] In some demonstrative embodiments, databases may include, for example, a cloud DB, a DB server, an in-memory DB, an active DB, a distributed DB, an embedded DB, an end-user DB, an SQL DB or the like.
[065] One non-limiting example of a possible service scenario of media output generation is now presented, for clarity of exposition. Bob is a great fan of basketball, and a moderate fan of the Moscow basketball team. He has a close friend Alice, and a less close friend Carl. Bob likes to listen to audio of short duration, and dislikes reading text content. Bob is listening over the internet to a broadcast of the Europe League Final Four 2016 basketball game, played in Berlin between Istanbul and Moscow. This game is one example of an event of interest to the user (Bob).
[066] Carl and Alice may be in other locations. Carl is watching the same game, is wearing an IoT heart rate sensor, and his heart rate increases when particularly exciting plays are made, or certain baskets are scored, in the game. He is a strong fan of Moscow, and a strong fan of basketball in general. Alice is also listening to the game, and she tweets her text comments to her followers during the game. She is wearing a body temperature sensor, and she too is more excited by some parts of the game than by others. She is a strong fan of Moscow.
[067] A large amount of media content (video, audio, text etc.) related to this game is generated by various platforms. It may be required to send to Bob portions of a small number of media content items concerning the game, e.g. only the items that are expected to be the most interesting and relevant to Bob. In one non-limiting example, this requirement is due to the fact that Bob will not be able to watch the entire game; in another example, Bob is not able to watch this particular game at all. In other examples, Bob may have pre-enrolled for a service which sends him highlights of the most interesting moments of the game. In still other examples, after the game is over he may initiate a request to a content creation system that it send him highlights. In some examples, Bob has enrolled to receive, and/or initiates a request to receive, media content related to the team Moscow in general, and not necessarily for the specific game event, but the team Moscow is indicative of the Final Four game, since Moscow is playing in the game. The request can be made, in some examples, via a user interface (e.g. web site, mobile application, dial-in to a service). This enrolment, or this request, are examples of a user indication sent to a content creation system. This enrolment, or this request, are also examples of an event indication sent to a content creation system. The enrolment and requests are also examples of an indication to generate one or more output media contents for the user, where the indication is associated with an event of interest to the user, and/or with area(s) of interest of the user which are indicative of an event of interest.
In choosing which content to send Bob, in some examples it is advantageous for the content creation system to take into consideration at least some of the following considerations or criteria: which media items are relevant to the game event, which friends of Bob have interests in common with Bob's as relates to this Final Four event, and what levels or degrees of excitement are measured by the IoT data of each friend at different points in time during the game. In some examples, the times of the event, the media and the IoT data are considered. In some examples, degrees of relationship of some or all of: the event, the input media, the IoT data, Bob and/or his friends, to a particular area of interest (e.g. the team Moscow), are considered. In some examples, it is advantageous to take into consideration: how close is Bob's relationship with each friend or acquaintance, and what are Bob's consumption preferences regarding media types and sizes. Different relative weightings may be given to each parameter, and a process that accounts for these different weightings and combines them in some way may provide an optimum set of output content for Bob.
[068] One example of this is that the system may balance the fact that Alice is a closer friend of Bob's, but that Carl has a higher interest in the particular event, and that each may exhibit high levels of excitement during different points of the game. Similarly, in choosing between sending Bob text content associated with times of high friend excitement and video content from moments in time of low excitement, Bob's preferences for media types should be weighed. In the context of this scenario, Bob may be considered the user, and his friends considered the other users. Note that in some examples, an event of interest to a user may also be an event of a common interest of the user and other users.
[069] In some demonstrative embodiments, a content creation system and a method thereof are presented. The content creation system may generate, for example, a media content of an event of interest of the user and some other users according to a degree of relationship between the user and the other users to moments of the event of interest, user preferences, and a degree of excitement of the user at some moments of the event. In some example cases, a content creation system may receive information from one or more data sources. For example, the system may receive an indication to generate one or more output media contents for the user (e.g. Bob), where the indication is associated with an event of interest to the user (e.g. the Final Four game), and/or one or more areas of interest of the user that are indicative of the event of interest (e.g. Team=Moscow, City=Berlin etc.). The system may receive input media content items, along with media metadata characterizing them, e.g. time of capture, media type, information about the topic of the media, and in some cases the event associated with the media content item. The system may receive degree of excitement data of at least one other user, as well as degree of excitement metadata that characterize this degree of excitement data. The degree of excitement metadata may include a time associated with the degree of excitement data. The system may in some cases receive user preference data relating to preferences of the user Bob, providing a first indication of a degree of relation of the user to the area(s) of interest (e.g. sports, basketball, the team Moscow). The system may in some cases receive user preference data relating to preferences of the other users (e.g. Alice), providing a second indication of a degree of relation of the other users to the area(s) of interest (e.g. Team Moscow). The above-listed set of data may also be referred to herein as first data.
In some examples the media metadata, and/or degree of excitement metadata, may also include third and fourth indications, respectively, of the degree of relation of these data items to the area(s) of interest. Some or all of this information input to the system may include weight values.
[070] In some examples, the system may receive other data. The system may also receive data indicative of the degree or strength of the relationship between people, e.g. between the user and some or all of the other users (e.g. Bob and Alice, Bob and Carl). The system may receive degree of excitement data of the user, as well as degree of excitement metadata that characterize this degree of excitement data. The degree of excitement metadata may include a time associated with the degree of excitement data. The system may in some cases receive user media preference data, including at least media type preference and/or media size preference (e.g. Bob likes short video). In such a case, the media metadata may include input media content type and input media content size.
[071] Some or all of this information input to the system may include weight values. Note also that in some examples time data disclosed here, e.g. associated with a record, may be timestamps, and in some examples this time data may include one or more time ranges ("between t1 and t2"). In that sense, "time associated with an event", "time of capture of media", "time of degree of excitement data", etc. may in some examples refer to time ranges.
[072] The process to generate the output media content(s) may be as follows: the system may identify at least one candidate output media content. In some examples, the identification process may involve combining and reducing one or more of these data: the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the other user(s), the user preference data associated with the user, and the user preference data associated with the other user(s). In some examples, the combining and reducing data can be based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data.
[073] In some examples, the combining and reducing data can be based at least partly on a degree of relationship of the area(s) of interest of the user indicative of the event of interest, the first indication of a degree of relation of the user to the area(s) of interest, the second indication of a degree of relation of the other user(s) to the area(s) of interest, the third indication of a degree of relation of the event to the area(s) of interest, the fourth indication of a degree of relation of the input media content to the area(s) of interest, and/or the fifth indication of a degree of relation of the degree of excitement data to the area(s) of interest. In some examples, the identification process may involve combining and reducing the data indicative of a degree of relationship of the user and the other user(s) with the first data, and/or with other input data. In some examples, identifying the candidate output media content comprises combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size.
In some examples, identifying the candidate output media content(s) involves normalizing data.
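The time-based combining described above can be sketched as follows. This is an illustrative example only, not the claimed method: all function names, the scoring rule (taking the strongest nearby excitement reading), and the 30-second window are assumptions made for exposition.

```python
def score_candidates(media_items, excitement_samples, event_range, window=30):
    """Score candidate media items by time proximity.

    media_items: list of (media_id, capture_time) tuples.
    excitement_samples: list of (time, degree_of_excitement) tuples.
    event_range: (start, end) time range of the event of interest.
    All times are in seconds; the window and scoring rule are assumed.
    """
    start, end = event_range
    scores = {}
    for media_id, t in media_items:
        if not (start <= t <= end):
            continue  # capture time falls outside the event of interest
        # take the strongest excitement reading within `window` seconds
        nearby = [d for (ts, d) in excitement_samples if abs(ts - t) <= window]
        scores[media_id] = max(nearby, default=0.0)
    return scores

media = [("clip1", 100), ("clip2", 500), ("clip3", 5000)]
excitement = [(95, 0.9), (510, 0.3)]
print(score_candidates(media, excitement, event_range=(0, 3600)))
# clip3 is dropped (outside the event); clip1 scores 0.9, clip2 scores 0.3
```

A candidate whose capture time coincides with a moment of high friend excitement inside the event's time range thus receives a higher score, reflecting the "degree of relationship" of the three times discussed above.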
[074] After performing the identifying, the system can in some examples generate the output media content(s). In some examples, the output media content(s) include at least a portion of the candidate output media content(s). In some examples, the system selects the output media content(s) from the candidate output media content(s) using one or more output media selection criteria.
[075] The system can in some examples provide the user with the output media content(s).
[076] In some cases, some or all of the input data may exist in data entities. In some cases, the above processing of data, to identify candidate output media content(s), may involve:
• receive data from at least one data entity;
• generate one or more new data entities and/or modified data entities;
• perform the processing based on the at least one new data entity.
[077] In some cases, at least part of the above data includes at least one data structure that includes one or more entity fields, a keyword field, a value field and a weight field. The weight field can in some examples include a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field. Such a data structure may also be referred to herein as a Keyword-Value-Weight (KVW) data structure. The combining and reducing of data, and the generation of new data entities, may include matching Keyword-Value pairs of the KVW data structures.
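A minimal sketch of a KVW record and of matching Keyword-Value pairs between two KVW collections might look as follows. The record layout (a single entity field) and all names are illustrative assumptions, and the fused weight here uses one of the options mentioned below (multiplication).

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KVWRecord:
    entity: str    # e.g. a user, event, or media item identifier
    keyword: str   # e.g. "Team"
    value: str     # e.g. "Moscow"
    weight: float  # degree of relation of the entity to the (keyword, value) pair

def match_kv_pairs(records_a, records_b):
    """Combine two KVW collections by matching Keyword-Value pairs.

    For each (keyword, value) pair present in both collections, emit the
    entity pair together with a fused weight (here: the product of the
    two weights, an assumed choice among the fusing options).
    """
    index = {}
    for r in records_b:
        index.setdefault((r.keyword, r.value), []).append(r)
    matches = []
    for a in records_a:
        for b in index.get((a.keyword, a.value), []):
            matches.append((a.entity, b.entity, a.keyword, a.value,
                            a.weight * b.weight))
    return matches

# Example: relate a user's interests to an event's characterization.
user = [KVWRecord("Bob", "Team", "Moscow", 0.2)]
event = [KVWRecord("FinalFour2016", "Team", "Moscow", 1.0),
         KVWRecord("FinalFour2016", "Team", "Istanbul", 1.0)]
print(match_kv_pairs(user, event))  # one match, on ("Team", "Moscow")
```

Only the ("Team", "Moscow") pair matches, relating Bob to the event with a weight derived from both records.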
[078] In some cases, the combining and reducing data involves setting a weight value for a record of the new or modified data entities, by at least one of: fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition, or merging weight values by weighted averaging.
[079] In some cases, the setting a weight value for a record of a data entity may also involve normalizing the weight value, where the sum of normalized weight values is equal to 1.
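The weight-merging options and the normalization described above can be sketched as follows. The function names and the equal coefficients used for weighted averaging are illustrative assumptions.

```python
def merge_weights(weights, method="product"):
    """Fuse several weight values into one, per the options listed above."""
    if method == "product":
        result = 1.0
        for w in weights:
            result *= w
        return result
    if method == "sum":
        return sum(weights)
    if method == "weighted_average":
        # equal coefficients are an assumed choice; any coefficients summing
        # to 1 could be used
        coeffs = [1.0 / len(weights)] * len(weights)
        return sum(c * w for c, w in zip(coeffs, weights))
    raise ValueError(f"unknown method: {method}")

def normalize(record_weights):
    """Normalize a mapping of record -> weight so the weights sum to 1."""
    total = sum(record_weights.values())
    if total == 0:
        return dict(record_weights)
    return {k: w / total for k, w in record_weights.items()}

fused = {"rec1": merge_weights([0.2, 1.0]),   # 0.2
         "rec2": merge_weights([0.6, 0.5])}   # 0.3
print(normalize(fused))  # weights rescaled to sum to 1
```

After normalization, rec1 carries weight 0.4 and rec2 weight 0.6, satisfying the sum-to-1 property stated above.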
[080] In some cases, the processing of the data to generate output media content is a complex process, involving multiple stages of generating data entities, including fusing and merging. Non-limiting examples of such generation of data entities, so as to generate output media content, are disclosed with reference to Figs. 3-10 further herein. More details on KVW data structures and possible methods utilizing them are discussed, for example, with reference to Figs. 11 and 12.
[081] For example, the input media content may include one or more media contents gathered by the plurality of data entities from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
[082] In some demonstrative embodiments, automatic content may be created from captured media and physiological data. For example, the physiological data may provide an indication of excitement of the user and/or of a degree of excitement of the user. In some examples, such physiological data can be created from measurement of one or more physiological parameters, which may be measured, for example, by one or more sensors, for example embedded in a wearable Internet of Things (IoT) device. For example, physiological parameters may include heartbeats, sweat, body temperature, blood pressure or the like.
[083] In some example cases, the automatic content may be created based on physiological data of the user. In some example cases, the automatic content may be created based on physiological data of other users, for example the friends of the user. In some example cases, the automatic content may be created based on physiological data of both the user and of other users.
[084] Note that physiological data is described herein as one non-limiting example of degree of excitement data. Such degree of excitement data may in some cases be used to measure the degree of excitement of a user who, for example, is consuming and/or generating content. Another non-limiting example of degree of excitement data may be data based on image, video or audio recordings of a user or other users, recorded for example while the user is consuming content, or content generated by a user or other users, where the recordings have been analyzed to detect excitement based on user movement, gestures, expression, voice etc., for example using known techniques. Still another example of measuring degree of excitement may be analysis of text generated by a user or other users, e.g. Twitter tweets that they sent at a particular moment in time, that relate to certain events (e.g. Final Four 2016) and/or to certain topics (e.g. Moscow team). Words can be detected that express different levels of excitement. This may be performed in some cases using known techniques.
[085] Some additional non-limiting example methods of determining or populating the data that is input to the method of the presently disclosed subject matter are disclosed. Data that is not especially provided for the present system to work, such as video recordings, twitter feeds, associated metadata, etc. can be gathered and stored using known techniques in the field of media content. Example methods are also disclosed herein for generation of keywords, values, and weights (KVW data) corresponding to entity field values.
[086] In one example, the KVW data can be filled in manually, preferably keeping the number of possible and assigned keywords as small as possible. In some demonstrative embodiments, the weight of the user in the data structure may be an input value and/or may be part of the user profile, rather than a configuration parameter. For example, users enrolling or registering for a media content service may check off, and/or type in as text, their areas of interest, such as their favorite sports, teams and pastimes, and maybe also enter a degree of interest ("I am a big fan / I am a moderate fan / I have a low interest"). Similarly, personnel of the content generator organization can in some cases manually enter KVW data for an event or for a media content item. For example, staff of a TV station recording a game may manually enter data for the game, to be associated with the video recording: "vertical=Sport, discipline=Basketball, city=Berlin, Team=Moscow, Team=Istanbul", with a weight of 1 (representing a high value) for each such KV pair. Note that having a given (but editable) list of possible keywords may in some examples help ensure the consistent use of keywords throughout the system.
[087] In other examples, automated or semi-automated methods can be used. Some non-limiting examples include:
- use fixed weight values, e.g. setting the weight of a user's friendship to 100, and all other weights to 1, in an automated fashion.
- analytics to characterize events - analyze websites, announcements, etc. to obtain available information on the event (e.g. general topic (basketball, final four), specific topics (Moscow, Istanbul), date and time, location).
- apply content analytics - e.g. natural language processing of text media such as twitter feeds to derive the areas of interest and weights. A feed that mentions "Basketball" six times may receive a high weight for "Discipline = Basketball" etc. Another example is recognition on video or audio.
- social analytics on user profiles. E.g. if Bob's social networking page mentions Team Moscow a lot, and his banner contains the team logo, he can be assigned the KV pair "Team=Moscow" with a relatively high weight.
- for IoT metadata - can, for example, assign default values to all users. The values can be established by organizational means, e.g. handing out sensors only to people present at the event.
- the weight for IoT data associated with certain keywords may increase / decrease during a game. Certain portions of the game may be more highly related to e.g. a Keyword-Value pair "Position=Goalie", or "Game Event = Penalty" than others.
- apply transitive relations. For example, a video associated with event "E" can inherit KVW data from "E".
- conversion of available metadata of other types for use with the data structures used in the method.
- the system notices that Alice watched three Moscow games, sees no associated KVW data, and asks her / prompts her to enter data on the topic (an example of a semi-automatic method) - methods known in the art.
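The "content analytics" option above - deriving Keyword-Value weights from how often terms appear in a text feed - can be sketched as follows. The term-to-KV lookup table, the tokenization, and the normalization of mention counts into weights are all assumptions made for illustration.

```python
import re
from collections import Counter

# An assumed, hand-built mapping from text terms to Keyword-Value pairs.
TERM_TO_KV = {
    "basketball": ("Discipline", "Basketball"),
    "moscow": ("Team", "Moscow"),
}

def weights_from_text(text):
    """Derive KV weights from mention frequency in a text feed.

    Counts occurrences of known terms and normalizes the counts into
    weights that sum to 1 (an assumed weighting scheme).
    """
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(t for t in tokens if t in TERM_TO_KV)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {TERM_TO_KV[t]: c / total for t, c in counts.items()}

feed = "Basketball tonight! Moscow looked sharp, basketball at its best."
print(weights_from_text(feed))
# "basketball" mentioned twice, "moscow" once, so the Basketball pair
# receives roughly twice the weight of the Moscow pair
```

A feed that mentions "Basketball" repeatedly thus ends up with a high weight for the "Discipline = Basketball" pair, as in the example given above.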
[088] In some other demonstrative embodiments, assigned weight values may be dynamic values. For example, the weight for IoT data associated with certain keywords may increase and/or decrease during the event of common interest.
[089] Note that in some examples data such as the preferred media sizes of users can have a certain range (e.g. +/- 3 sec), which can be a system parameter or part of the user profile. In addition, preferences such as media type and size can in some examples also be influenced or even overwritten by system preferences. This can be implemented, for example, as configuration parameters.
[090] Note that in some examples, data that is initially entered manually may be modified automatically, in some cases making use of real time data. As one example, Bob manually entered that his level of interest in Team Moscow is 0.2. Later, systems monitor the media that he generates and consumes, e.g. TV shows that he watches and comments that he makes on social media. The systems notice his very intense involvement with Moscow, and over time change the weight in the relevant record for "Bob: Team=Moscow" from 0.2 to 0.6. As another example, the system notices Bob's high degree of excitement when watching Moscow-related events (games), and increases the relevant record for "Bob: Team=Moscow". In some examples, the raw IoT data (degree of excitement, time of capture etc.) is populated using state of the art data capturing and processing technologies.
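One possible way to revise a manually entered weight from observed engagement, as in the Bob example above, is an incremental update toward each observation. The exponential-moving-average rule and the rate `alpha` are illustrative assumptions, not taken from the text.

```python
def update_interest_weight(current, observed_engagement, alpha=0.25):
    """Move the stored interest weight toward the engagement level just
    observed; both values are on a 0..1 scale. `alpha` controls how fast
    the manual value is overridden by observations (assumed rate)."""
    return (1 - alpha) * current + alpha * observed_engagement

w = 0.2  # Bob's manually entered weight for "Team=Moscow"
for engagement in (0.9, 0.8, 0.95, 0.9):  # repeated intense involvement
    w = update_interest_weight(w, engagement)
print(round(w, 2))  # the weight has drifted upward from 0.2
```

With this rule, sustained high engagement gradually raises the weight, while the original manual entry still influences the result until enough observations accumulate.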
[091] Note that the example flow below does not include user preference data regarding IoT data type. In other examples, this could be included the same way as is done for media type, and the data processing flow adjusted accordingly.
[092] Note also that, in the example flow below, the keywords-values-weights of media and IoT data (for example) are known, and their relation to, for example, people can be derived from this knowledge. In other examples, the people's association with areas of interest, and with certain media and IoT data, could be known, and the KVW of the media and IoT data could be derived at least in part from data of the people's associations.
[093] In some demonstrative embodiments, the content may be adjusted to a given person and/or a user, taking into account the person and/or user preferences, interests, physiological data, social relationships, and strength of relationships of the content to the user. For example, the person and/or the user preference may include the media type, e.g., an image, a text, a video, an audio or the like. The person and/or the user interests may include, for example, a favourite team, sport, music, art, movies or the like. The person and/or the user social relationships may include, for example, family, work friends, social network friends, and/or any other type of social relationships. The strength of relationships may be measured, for example, by parameters such as, for example, sharing of the same interest, fans of the same team, only friends in the social network, and/or any other parameter that may measure the strength of the social relationships.
[094] In some demonstrative embodiments, the creation of the content may be done according to a given topic, e.g., an event of common interest with other users, selected by the user, thus the parameters and the data which may be used to create the content may be context sensitive to the given topic. For example, a content creation system may generate an output content based on an input media. The input media and the generated output may consist of an arbitrary number of media contents and/or media types according to the person and/or user preferences. Non-limiting examples of an event of common interest may include a sports match, a contest, a concert, a show, a happening, a vacation or a trip.
[095] In some demonstrative embodiments, content creation may be defined as contribution of information to any media, and more specifically to a digital media for an end-user and/or an audience and/or a person in specific contexts. For example, the content may be anything expressed through some medium, speech, text writing, video clip, a photograph and/or any of various arts for self-expression, distribution, marketing and/or publication.
[096] In some demonstrative embodiments, some forms of content creation may include, for example, maintaining and updating of web sites, blogging, photography, videography, online commentary, maintenance of social media accounts, editing and/or distribution of digital media and/or any other tools and/or services suitable for content creation.
[097] In some demonstrative embodiments, content creation may include an intelligent content, which may include, for example, a structured reusable content enriched with metadata and supported by intelligent content technologies. The intelligent content may address the ever-changing content needs of users, e.g., customers, and the proliferation of channels and/or devices which the users may use to consume it.
[098] Note that the example application disclosed herein is generation of one or more output media contents of a user. It can be seen that alternate services can be provided, based on for example data structures disclosed herein, by changing the algorithm that the system runs on the data. As one non-limiting example, rather than sending a media content item to Bob for a game that he missed, the system can analyze data of events, friends, interests etc., and make recommendations - e.g. the system can alert Bob that there is a Moscow game tomorrow at 6 PM, and that he may want to watch it on Channel 2. The system may also ask if he also wants to receive the most exciting scenes. As another non-limiting example, data of the areas of interest of users, and of degree of excitement of users, can be analyzed by a system such as disclosed with regard to Fig. 1. The system can identify a large overlap of interests (e.g. a group of people have similar interests in sports and teams, as well as similar tastes in cooking and political parties), and can take an action to create a social-network community and suggest to the users to join the community.
[099] Before turning to a description of an example system and its components in Fig. 1, the arrangement of example flow charts of the presently disclosed subject matter is described. Fig. 2 provides an example generalized flow for generating output media content. Figs. 3-10 describe various detailed steps for an example implementation of a method for generating output media content. Fig. 14 describes an example data flow for methods exemplified by Figs. 3-10. The example methods disclosed with respect to Figs. 3-10 and Fig. 14 make use of four example databases described with respect to Tables 1-4 herein.
[100] Turning first to Fig. 15, it illustrates one example of a process for generating output media, in accordance with certain demonstrative embodiments. In some cases, this process may make use of the methods described with reference to Figs. 3 to 10, based on a data flow such as described further herein with reference to Fig. 14. In some cases, the methods of Fig. 15 may be implemented with use of the systems 100 and/or 1300, which are described further herein with reference to Figs. 1 and 13.
[101] In step 1530, the system may generate a data entity indicative of the association of the user, and of other user(s), with the event of interest to the user, based on the degree of relationship between the user and the other user(s). In some examples, this step may include combining and reducing data of at least the following: the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the other user(s), and the data indicative of the degree of relationship of the user to other users. Example methods for performing this step are disclosed further herein with regard to Figs. 3a and 3b.
[102] In step 1540, the system may generate a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, according to user media preference data. Example methods for performing this step are disclosed further herein with regard to Fig. 4.
[103] In step 1550, the system may generate a first data entity indicative of the input media content(s), based on at least one of: association of the available media contents with the event, and association of the available media contents with the user preference data associated with the user and with the user preference data associated with other user(s). In some examples, this step may include combining and reducing data of at least the following: the media metadata related to the input media content, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the other user(s) with the event of interest to the user. Example methods for performing this step are disclosed herein with regard to Fig. 5.
[104] In step 1560, the system may generate a data entity indicative of the degree of excitement data of the other user(s), based on at least one of the following: association of the degree of excitement data with the event, and association of the degree of excitement data with at least the user and/or with the other user(s). In some examples, this step may include combining and reducing data of at least the following: the degree of excitement metadata of the other user(s), the event metadata associated with the event of interest to the user, and the data entity indicative of the association of the user and other user(s) with the event of interest to the user. Example methods for performing this step are disclosed further herein with regard to Fig. 6.
[105] In step 1570, the system may generate a data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). In some examples, this generation may be performed based on one or more points of time during the event of interest to the user. In some examples, several of these points may define intervals of time during the event. In some examples, this step may include combining and reducing the first data entity indicative of the input media content(s) and the data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). Example methods for performing this step are disclosed further herein with regard to Fig. 7. [106] In step 1580, the system may generate a data entity indicative of points in time within one or more times of interest of the event of interest to the user, according to the degree of excitement data. In some examples, this step can be performed according to an indication of relevance that is based on the degree of excitement data of the other user(s). In some examples, for each such time of interest, generating this data entity can be based on the most highly weighted degree of excitement corresponding to that time of interest. Example methods for performing this step are disclosed further herein with regard to Fig. 8. In step 1590, the system may generate a second data entity, that is indicative of the available media content(s), where available media contents correspond to the points in time within the time(s) of interest of the event of interest to the user. In some examples, the second data entity is weighted based on the preferences of the user.
In some examples, this step may include combining and reducing at least the following: the data entity indicative of points in time within the time(s) of interest, the data entity indicative of association between input media content(s) and the degree of excitement data of other user(s), and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types. Example methods for performing this step are disclosed further herein with regard to Fig. 9. In step 1595, the system may generate a third data entity indicative of the available media content(s), based on a relevance field associated with the available media content(s) and on a weight field associated with them. This step may be part of generating the one or more output media contents. In some examples, this third data entity can be indicative of the one or more media sizes associated with the one or more media types. Example methods for performing this step are disclosed herein with regard to Fig. 10.
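The combining-and-reducing operations recited in steps 1530-1595 can be illustrated with a minimal sketch. This is a non-authoritative illustration, assuming only that a data entity is a list of rows carrying Keyword, Value and Weight fields, as in Tables 1-4; all function names and sample values below are invented for illustration and are not part of the disclosure:

```python
def combine(entity_a, entity_b):
    """Join rows of two data entities on their Keyword field, multiplying weights."""
    combined = []
    for a in entity_a:
        for b in entity_b:
            if a["Keyword"] == b["Keyword"]:
                combined.append({"Keyword": a["Keyword"],
                                 "Value": a["Value"],
                                 "Weight": a["Weight"] * b["Weight"]})
    return combined

def reduce_normalize(entity):
    """Renormalize the weights of a combined entity so that they sum to 1."""
    total = sum(row["Weight"] for row in entity)
    if total == 0:
        return entity
    return [{**row, "Weight": row["Weight"] / total} for row in entity]

# Invented sample data: user preferences combined with event metadata.
user_prefs = [{"Keyword": "Basketball", "Value": "CSKA", "Weight": 0.8},
              {"Keyword": "Cooking", "Value": "Italian", "Weight": 0.2}]
event_meta = [{"Keyword": "Basketball", "Value": "ELFF16", "Weight": 1.0}]

entity = reduce_normalize(combine(user_prefs, event_meta))
# Only the preference row whose keyword matches the event survives.
```

In this sketch, rows unrelated to the event are dropped during combination, and the surviving weights are rescaled, which is one plausible reading of "combining and reducing" the listed data.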
[107] Turning to Fig. 1, it presents a schematic illustration of a block diagram of a content creation system 100 to generate a user based media content, in accordance with some demonstrative embodiments. In some demonstrative embodiments, content creation system 100 may be configured to generate one or more output media contents 114 associated with an event of interest to a user, and in some cases also of interest to one or more other users. Content creation system may include a computing system 105 configured to generate the one or more output media contents 114. For example, computing system 105 may include a desktop computer, a laptop computer, a tablet computing device, a mobile computing device, a cellphone, a smartphone, a gaming console, a smart television or any other computing platform which includes capabilities of data processing. [108] Computing system 105 may, by way of non-limiting example, comprise one or more processing circuitries 107. Processing circuitry 107 may comprise a processor 110 and memory 108. Processing circuitry 107 is shown in Fig. 1 as a broken line.
[109] In some demonstrative embodiments, computing system 105 may include a processor (also referred to herein as hardware processing circuitry) 110, which may be configured to generate the one or more output media contents based on an event indication 112 and a user indication 113.
[110] In some demonstrative embodiments, event indication 112 may indicate the event "E" of interest to the user, which may be watched by a user, and/or shared by the user with the other users, and/or an event in which the user may be a participant. In some examples, event "E" is of common interest to the user and to other users. Note that watching or viewing an event are presented here as non-limiting examples of consumption of media of an event by the user. Other examples of such consumption include listening to audio, reading text on a computer etc.
[111] In other demonstrative embodiments, event indication 112 may include any other indication to indicate a user field or area of interest. This area of interest may in some examples be indicative of an event of interest to the user.
[112] In some demonstrative embodiments, user indication 113 may be configured according to a user profile and may include, for example, a user name, a user nickname, a user email address, and/or any other indication of the user.
[113] In some embodiments, for example, hardware processing circuitry 110 may include a general-purpose processor, coprocessor or special-purpose processor, such as, for example, a network or communication processor, compression engine, graphics processor, a general purpose graphics processing unit (GPGPU), a high-throughput many integrated core (MIC) coprocessor (including 30 or more cores), multi-core processor, embedded processor, or the like. Hardware processing circuitry 110 may thus be referred to herein interchangeably as processor 110.
[114] The processing circuitry 107 may also include, in some examples, one or more memories 108. According to some examples of the presently disclosed subject matter, the memory 108 can be configured to hold data used by some or all of modules 150, 155, 160, 165, 170, 175, 180, 190, 195. In Fig. 1, the databases 120, 122, 124, 126, 128, 130, are shown as external to memory 108. In other example cases, data stores, such as for example some or all of those DBs shown in Fig. 1, may reside on memory 108. These are non-limiting examples of data items that may make use of memory 108. [115] Examples of details of system components, and of system architecture, are described with reference to Fig. 13 further herein.
[116] Examples of the functions of processor 110 will now be further elaborated. Additional examples of functions are described with reference to other figures.
[117] In some demonstrative embodiments, the hardware processing circuitry 110 may be configured to receive from the plurality of data stores, e.g., databases, data associated with the event of the common interest of the user and the one or more other users. For example, the data may include a first indication of a degree of relation of the user to the event of the common interest and a second indication of a degree of relation of the other users to the event of the common interest. The first indication and/or the second indication may include a weight value to indicate the degree of the relation. For example, a high value, e.g., 1, may indicate a strong relation and a low value, e.g., 0, may indicate no relation, e.g., as described below.
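As a minimal illustrative sketch of this weight convention (the helper name and the raw-score scheme are assumptions for illustration, not part of the disclosure), a raw relation score can be mapped onto the [0, 1] degree-of-relation scale:

```python
def relation_weight(raw_score, max_score):
    """Map a raw relation score onto the [0, 1] degree-of-relation scale,
    where 1.0 denotes a strong relation and 0.0 denotes no relation."""
    if max_score <= 0:
        return 0.0
    # Clamp so the degree never leaves the [0, 1] range.
    return max(0.0, min(1.0, raw_score / max_score))

# E.g. a user who watched 3 of a team's 4 games could get degree 0.75.
degree = relation_weight(3, 4)
```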
[118] In some examples, the hardware processing circuitry 110 may receive an event indication 112 to indicate the event of the common interest, and to generate one or more output media contents 114 by processing at least one of: an input media content, user physiological data, social and personal data, event metadata, at least one of user preferences and media metadata related to the input media content, based on the first indication of the degree of relation of the user to the event of the common interest and the second indication of the degree of relation of the other users to the event of the common interest, and a degree of relationship of the one or more user preferences with a time of creation of the user physiological data and a time of viewing of the one or more input media contents, where the output media content(s) includes at least a portion of the input media content whose associated metadata corresponds at least partially to the user preference, and to provide the user with the one or more output media contents 114, e.g. as described below with reference to Figs. 2-10.
[119] In some demonstrative embodiments, hardware processing circuitry 110 may be configured to set the degree of relationship of the one or more user preferences with the time of creation of the user physiological data and the time of viewing of the one or more input media contents by determining one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences, e.g. as described below with reference to Figs. 2-10.
[120] In some demonstrative embodiments, hardware processing circuitry 110 may include one or more engine modules 165, 170, 175, 180, 190 and 195 and one or more handler modules 155 and 160. The one or more engine modules 165, 170, 175, 180, 190 and 195 and one or more handler modules 155 and 160 may be configured to generate one or more data entities, e.g. data structures, based on data received from the at least one data entity of the plurality of data entities, e.g. as described below.
[121] In some embodiments, hardware processing circuitry 110 may include an orchestrator module 150 operably coupled to the one or more engine modules 165, 170, 175, 180, 190 and 195 and to the one or more handler modules 155 and 160. The orchestrator module 150 may be configured to coordinate an operation of the one or more engine modules 165, 170, 175, 180, 190 and 195 and the one or more handler modules 155 and 160 according to a predetermined sequence, e.g. as described below.
[122] In some demonstrative embodiments, orchestrator module 150 may be configured to: provide to the user the one or more output media contents 114 generated based on one or more data portions generated by the one or more engine modules 165, 170, 175, 180, 190 and 195 and the one or more handler modules 155 and 160, e.g. as described below. For example, the one or more output media contents 114 may include one or more media of selected moments of the event of common interest. The one or more media of selected moments may be selected, for example, according to the degree of relationship of the user physiological data generated at a time of the event of common interest with the one or more media contents and the degree of the relation of the user to the other user and to the event of common interest.
[123] In some demonstrative embodiments, an engine of the one or more engine modules 165, 170, 175, 180, 190 and 195 may be configured to receive data from the plurality of data entities and to generate a new data entity using, for example, data aggregation methods and according to predetermined criteria, e.g. as described below. For example, the newly generated data entity may include a plurality of fields. The plurality of fields may include at least one of a keyword field, a value field and a weight field. The new data entity may be provided to the orchestrator module 150 for generating the one or more output media contents 114.
[124] In some demonstrative embodiments, the engine of the one or more engine modules 165, 170, 175, 180, 190 and 195 may be further configured to generate the new data entity by combining two or more data entities, to set a weight value of a row of the new data entity by summing weight values of rows of the combined data entities, and to normalize the weight value, e.g. each weight value of each row of the new data entity. For example, normalizing the weight values may include setting values in the weight field which result, when they are summed, in a sum equal to 1, e.g. as described below.
[125] In some examples, the weight value may be normalized by at least one of fusing the weight values, multiplying the weight values, merging weight values by addition or merging weight values by weighted averaging, e.g. as described below. Example cases of this may be found in the description of Figs. 3 to 10, further herein.
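These merging and normalization variants can be sketched as follows; the function names, and the assumption that weights are plain floating-point values, are illustrative only:

```python
def merge_by_multiplication(w1, w2):
    """Fuse two weights by multiplying them."""
    return w1 * w2

def merge_by_addition(w1, w2):
    """Merge two weights by addition."""
    return w1 + w2

def merge_by_weighted_average(w1, w2, alpha=0.5):
    """Merge two weights by weighted averaging with mixing factor alpha."""
    return alpha * w1 + (1 - alpha) * w2

def normalize(weights):
    """Rescale a list of weights so that they sum to 1."""
    total = sum(weights)
    return [w / total for w in weights] if total else list(weights)

# E.g. two merged rows with equal weight each end up with weight 0.5.
normalized = normalize([merge_by_addition(0.2, 0.3), 0.5])
```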
[126] In some demonstrative embodiments, content creation system 100 may include a plurality of databases, e.g., as described below.
[127] In some demonstrative embodiments, content creation system 100 may include a media metadata DB 120, operably coupled to computing system 105. For example, media metadata DB 120 may be configured to provide media metadata 121 to computing system 105. For example, media metadata 121 may include a media identifier (ID), a time when the content was encoded, a media type, such as, for example, video, audio, text, image, photograph, a feed from social networks, or the like.
[128] In some demonstrative embodiments, the metadata at media metadata DB 120 may be arranged in a data structure, for example, Table 1, as described herein below.
[129] For example, Table 1 may be generated according to concepts of keywords and weights and may hold information of the media in the form of 6-tuples of MediaID, Type, Time, Keyword, Value and Weight.
[Table 1 content appears as an image in the original publication.]
Table 1
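A 6-tuple row of Table 1 can be modeled, for illustration only, as a named tuple; the sample values below are invented, and the field names merely mirror the column names given above:

```python
from typing import NamedTuple

class MediaRow(NamedTuple):
    """One 6-tuple row of the media metadata table (illustrative sketch)."""
    media_id: str
    type: str        # e.g. "video", "audio", "text", "photo"
    time: str        # time when the content was encoded
    keyword: str
    value: str
    weight: float

# Invented sample rows for a hypothetical event "ELFF16".
table1 = [
    MediaRow("M1", "video", "2016-05-13T18:00", "Event", "ELFF16", 1.0),
    MediaRow("M1", "video", "2016-05-13T18:00", "Team", "CSKA", 0.7),
    MediaRow("M2", "photo", "2016-05-13T18:05", "Event", "ELFF16", 1.0),
]

# Filtering by media type, as a handler module might do.
videos = [row for row in table1 if row.type == "video"]
```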
[130] In some demonstrative embodiments, content creation system 100 may include a raw Internet of Things (IoT) data DB 122, operably coupled to computing system 105. For example, the raw IoT data DB 122 may be operably coupled to one or more sensors 145, and may be configured to provide raw IoT data 123 to computing system 105. For example, raw IoT data may include physiological data of the user, measured during the event of common interest. For example, the physiological data may include a heart rate (HR), a skin response (SR), a body temperature (BT), a breathing rate (BR), a step counter (SC) or the like.
[131] In some demonstrative embodiments, content creation system 100 may include an IoT metadata DB 124, operably coupled to computing system 105, and may be configured to provide IoT metadata 125 to computing system 105. For example, IoT metadata 125 may include a sensor identifier (ID), a sensor type, a sensor model, a sensor location, sensor capabilities or the like.
[132] In some demonstrative embodiments, data of IoT metadata DB 124 may be arranged in a data structure, for example, Table 2, which may include portions of raw IoT data 123, e.g. as described below.
[133] For example, Table 2 may be generated according to concepts of keywords and weights and may hold information of the IoT data in the form of 6-tuples of IoTID, User, Time, Keyword, Value and Weight.
[Table 2 content appears as an image in the original publication.]
Table 2
[134] In some demonstrative embodiments, content creation system 100 may include an event metadata DB 126, operably coupled to computing system 105. For example, event metadata DB 126 may be configured to provide event metadata 127 to computing system 105. For example, event metadata 127 may include an event ID, an event location, event type, event date, event time, a list of participants or the like.
[135] In some demonstrative embodiments, data of event metadata DB 126 may be arranged in a data structure, for example Table 3, e.g. as described below.
[136] For example, Table 3 may be generated according to concepts of keywords and weights, and may hold information of the event in the form of 4-tuples of Event, Keyword, Value and Weight. For example, the Event field may include the name of the event of common interest, for example, Europe League Final Four 2016 (ELFF'16).
[Table 3 content appears as an image in the original publication.]
Table 3
[137] In some demonstrative embodiments, content creation system 100 may include a raw media DB 130, operably coupled to computing system 105. For example, raw media DB 130 may be operably coupled to one or more media sources 140, and may be configured to provide one or more input media contents 131 to computing system 105. For example, input media contents 131 may include a video clip, an audio recording of the event, images, photographs, text messages, feeds from the Internet, feeds from social networks or the like.
[138] In some demonstrative embodiments, one or more media sources 140 may include, for example, a video camcorder (Cam), a microphone (Mic), an audio recorder, a text messaging application, the Internet, social networks, e.g., a Twitter Feed, television broadcasting or the like.
[139] In some examples, content creation system 100 may include a user profile DB 128, operably coupled to computing system 105. For example, user profile DB 128 may be configured to provide user profile data 129 to computing system 105. For example, user profile data 129 may include a user name, user friends, user hobbies, e.g. sport, user preferences, e.g. desired media length for audio, video and text, or the like.
[140] In some demonstrative embodiments, data of user profile DB 128 may be arranged in a data structure, for example Table 4, e.g. as described below.
[141] For example, Table 4 may be generated according to concepts of keywords and weights as described above, and may hold information on users in the form of, for example, quadruples of User, Keyword, Value, Weight. For example, a user, e.g. Bob, may be a friend, e.g. as defined in the Keyword column, of Alice, e.g. as defined in the Value field, e.g. Value column and/or Value attribute, and the strength of the relationship between Bob and Alice may be provided at the Weight field, e.g. Weight column and/or Weight attribute. Note that in the non-limiting example of Table 4, the DB_USERPROFILE includes at least three general categories of information about a user: their relationship to other users, their interest in certain topics (for example, related to hobbies and to entertainment choices), and their preferences regarding media formats and related characteristics such as media length. In other examples, each category of information can reside in a separate DB or other data structure.
[Table 4 content appears as an image in the original publication.]
Table 4
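The quadruple structure of Table 4 can be sketched, for illustration, as plain (User, Keyword, Value, Weight) tuples; the helper below, with invented sample data, shows how the Weight field can serve as the strength of a friendship relation:

```python
# Invented sample quadruples mirroring the Bob/Alice example above: a
# relationship row, an interest row, and a media-preference row.
table4 = [
    ("Bob", "Friend", "Alice", 0.9),
    ("Bob", "Hobby", "Basketball", 0.6),
    ("Bob", "MediaLength", "video:120s", 0.8),
]

def friends_of(user, rows, min_weight=0.0):
    """Return a mapping of friend name to relationship strength for `user`,
    keeping only friendships at least as strong as `min_weight`."""
    return {value: weight
            for (u, keyword, value, weight) in rows
            if u == user and keyword == "Friend" and weight >= min_weight}
```

For instance, `friends_of("Bob", table4)` yields Alice with strength 0.9, while a higher `min_weight` threshold filters weaker friendships out.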
[142] As described above, the plurality of databases may be configured to provide different types of data from a plurality of different sources. The different types of data from the plurality of databases may be provided to the hardware processing circuitry 110 to be processed by a plurality of engines, e.g. engines 165, 170, 175, 180, 190 and 195, and handlers, e.g. handlers 155, 160, of hardware processing circuitry 110 to generate one or more output media contents 114, e.g. as described below.
[143] In some examples, each of the databases may be implemented by hardware and/or by software and/or by any combination of hardware and software.
[144] In some demonstrative embodiments, hardware processing circuitry 110 may include an Orchestrator module 150, configured to process data received, for example, from the plurality of databases, the plurality of engine modules and the plurality of handler modules, and to generate, for example, the one or more output media contents 114, e.g. a media content associated with the event of common interest, based on event indication 112 and user indication 113, e.g. as described below.
[145] In some other demonstrative embodiments, Orchestrator module 150 may generate any other output media content 114, based on indications other than event indication 112 and user indication 113.
[146] In some demonstrative embodiments, Orchestrator module 150 may control a workflow for generating a content according to a user's preferences, for example, an event of common interest. For example, Orchestrator module 150 may be an entry point and/or an exit point to at least some engine modules and handler modules of hardware processing circuitry 110, e.g. as described below.
[147] In some demonstrative embodiments, Orchestrator module 150 may call engine modules 165, 170, 175, 180, 190 and 195 and handler modules 155 and 160, one by one in a predetermined sequence, sending them the required input and storing their output. The Orchestrator module 150 may generate the one or more output media contents 114 based on data generated by the one or more engine modules and the one or more handler modules when called, according to a predetermined sequence.
[148] In some demonstrative embodiments, Orchestrator module 150 may call the one or more engine modules 165, 170, 175, 180, 190 and 195 and/or handler modules 155 and 160 in any desired order, e.g. in a serial order or in a parallel order.
[149] In some demonstrative embodiments, Orchestrator module 150 may call first, the social links engine module 170 to generate a first data entity, e.g. a first data structure, a first table (e.g. Table 12 below), which may include a list of persons linked to the user and related to the event. The Orchestrator module 150 may call second, the content preferences engine module 165 to generate a second data entity, e.g. a second data structure, a second table (e.g. Table 13-2), which may include a list of media types and sizes preferred by the user. The Orchestrator module 150 may call third, the media handler module 155 to generate a third data entity, e.g. a third data structure, a third table (e.g. Table 16), which may include a list of media of given types and time intervals associated with the users and the event. The Orchestrator module 150 may call fourth, the IoT data handler module 160 to generate a fourth data entity, e.g. a fourth data structure, a fourth table (e.g. Table 20), which may include a list of IoT Data and intervals associated with the users and the event. The Orchestrator module 150 may call fifth, the IoT-media association engine module 175 to generate a fifth data entity, e.g. a fifth data structure, a fifth table (e.g. Table 21), which may include a list of IoT data, media, media type, and time intervals where IoT data and media may be both available. The Orchestrator module 150 may call sixth, the moments determination engine module 180 to generate a sixth data entity, e.g. a sixth data structure, a sixth table (e.g. Table 23), which may include a list of moments in time created for different combinations of IoT Data, together with time intervals of moments detected and relevance, e.g. level or degree of excitement, of the moments. The Orchestrator module 150 may call seventh, the media selection engine module 190 to generate a seventh data entity, e.g. a seventh data structure, a seventh table (e.g. Table 25), which may include a list of IoT Data, relevant time intervals, corresponding media, media type and relevance, e.g. level or degree of excitement of the moments. The Orchestrator module 150 may call eighth, the content creation engine module 195 to generate an eighth data entity, e.g. an eighth data structure, an eighth table (e.g. Table 26), which may include a list of IoT Data, relevant time intervals, corresponding media, media type and weight. In some demonstrative embodiments, this sequence may be performed in a manner such as that described with reference to Fig. 15, and to Fig. 14 further herein. This sequence may in some example cases make use of the methods described with reference to Figs. 3 to 10 further herein. Figs. 3-10 describe some methods that may be performed in each module, in some example cases.
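The predetermined call sequence described above can be sketched under the assumption that each module is a callable receiving a shared context of previously stored outputs; the class shape and the stub engines below are illustrative assumptions, not the disclosed implementation:

```python
class Orchestrator:
    """Call modules one by one in a predetermined sequence, sending each the
    required input (the shared context) and storing its output."""

    def __init__(self, modules):
        self.modules = modules   # list of (name, callable) pairs, in call order

    def run(self, event_id, user_id):
        context = {"event": event_id, "user": user_id}
        for name, module in self.modules:
            # Each module sees all previously stored outputs and contributes
            # its own data entity to the context under its name.
            context[name] = module(context)
        return context

# Stub engines standing in for the eight modules named in the text.
pipeline = Orchestrator([
    ("social_links", lambda ctx: ["Alice"]),           # persons linked to user
    ("content_prefs", lambda ctx: {"video": "120s"}),  # preferred types/sizes
    ("media", lambda ctx: ["M1"]),                     # media for user + event
])
result = pipeline.run("ELFF16", "Bob")
```

Serial execution as shown matches paragraph [147]; a parallel variant as in paragraph [148] would instead dispatch independent modules concurrently and merge their outputs into the context.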
[150] In other demonstrative embodiments, Orchestrator module 150 may call any other engine module and/or any other handler module in a different order and/or in parallel to generate the one or more data structures, e.g. the tables.
[151] In some example embodiments, Orchestrator module 150 may include, e.g., a standard workflow engine, and/or any other engine, and/or any other software module. [152] Example operations of the plurality of engine modules 165, 170, 175, 180, 190 and 195 and the plurality of handler modules 155 and 160, in accordance with some demonstrative embodiments, will now be described in detail.
[153] In some demonstrative embodiments, hardware processing circuitry 110 may include a media handler module 155, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table. For example, the data entity includes media related to the event of the common interest and to the user, prioritized according to a degree of relation between the event of the common interest and the user, e.g. as described below.
[154] For example, the media may include media (Mi) of given types (Ti) and interval (Ii) associated to the user, e.g. a person P1 as indicated by user indication 113, and to an event (E), e.g. as indicated by event indication 112, based on data received from one or more databases, e.g. user profile DB 128, event metadata DB 126 and/or media metadata DB 120, and/or from the one or more engine modules and/or handler modules, e.g. as described below. In some demonstrative embodiments, media handler module 155 may perform the method of Fig. 5, described further herein.
[155] In some demonstrative embodiments, hardware processing circuitry 110 may include an IoT data handler module 160, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table. The data entity may include the physiological data of the user related to the event of the common interest and to the user, prioritized according to a degree of relation of the event of the common interest and the user, e.g. as described below. In some demonstrative embodiments, IoT data handler module 160 may perform the method of Fig. 6, described further herein.
[156] In some demonstrative embodiments, IoT data handler module 160 may generate the data entity of physiological data, e.g. IoT Data (Di), and interval (Ii) associated to user P1 and event E based on the data received from user profile DB 128 and/or IoT metadata DB 124 and/or any other databases and/or engine.
[157] In some demonstrative embodiments, hardware processing circuitry 110 may include a content preferences engine module 165, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of preferred media types and sizes based on the data received from, for example, user profile DB 128, e.g. as described below. The data entity may include one or more media types, and one or more media sizes associated with the one or more media types. The data entity may be prioritized according to a degree of relation to the event of common interest and the user, e.g. as described below. In some demonstrative embodiments, content preferences engine module 165 may perform the method of Fig. 4, described further herein.
[158] In some demonstrative embodiments, hardware processing circuitry 110 may include a social links engine module 170, which may be operably coupled to Orchestrator module 150 and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of friends of the user associated with the event, based on the data received from, for example, user profile DB 128 and/or event metadata DB 126. The data entity may include one or more users. The data entity may be prioritized according to a degree of relation to the event of common interest and to the user, e.g. as described below. In some demonstrative embodiments, social links engine module 170 may perform the method of Figs. 3a and 3b, described further herein.
[159] In some demonstrative embodiments, hardware processing circuitry 110 may include an IoT-media association engine module 175, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of media preferred by friends of the user associated with the event, based on the data generated and/or received from, for example, other engine modules. The data entity may include one or more available media contents generated at one or more points of time during the event of the common interest and may be prioritized according to the physiological data of the user and physiological data of the one or more other users, e.g. as described below. In some demonstrative embodiments, IoT-media association engine module 175 may perform the method of Fig. 7, described further herein.
[160] In some demonstrative embodiments, hardware processing circuitry 110 may include a moments determination engine module 180, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of moments in time created for different combinations of IoT Data (Di), together with the interval (Ii) of the detected moments and the relevance, e.g. degree or level of excitement, of the moments (Ri), based on the data generated and/or received from, for example, other engine modules. The data entity may include points in time of common interest within the event of the common interest, according to the physiological data of one or more users sharing the event of the common interest, e.g. as described below. In some demonstrative embodiments, moments determination engine module 180 may perform the method of Fig. 8, described further herein.
[161] In some examples, hardware processing circuitry 110 may include a media selection engine module 190, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of IoT Data (Di), relevant interval (Ii), corresponding media (Mi), media type (Ti) and relevancy, e.g. level of excitement, of the moments (Ri), based on data generated and/or received from, for example, other engine modules. The data entity may include an indication of a preselected media generated at one or more points of time of the event of the common interest corresponding to the user physiological data, and prioritized according to a degree of relation of input data to the moments and to the user preferences, e.g. as described below. In some examples, media selection engine module 190 may perform the method of Fig. 9, described further herein.
[162] In some demonstrative embodiments, hardware processing circuitry 110 may include a content creation engine module 195, which may be operably coupled to Orchestrator module 150, and may be configured to generate a data entity, e.g. a data structure, a list and/or a table, of selected media content created during the event, based on data generated and/or received from, for example, other engine modules. The data entity may include a content personalized to the user in a context of the event of the common interest, according to user preferences, detected points of time of the event of the common interest, physiological data of the user and a media generated at the points of time of the event of the common interest, e.g. as described below. In some demonstrative embodiments, content creation engine module 195 may perform the method of Fig. 10, described further herein.
[163] In some demonstrative embodiments, content creation system 100 may be configured to generate one or more output media contents 114 associated with the event. For example, content creation system 100 may include one or more media databases, e.g. media metadata DB 120 and raw media DB 130, to store one or more input media contents 131, and media metadata 121 related to a media associated with the event, which may be received from one or more media sources 140.
[164] In some demonstrative embodiments, content creation system 100 may include an event DB, e.g. event metadata DB 126, which may be configured to store data associated with the event, e.g. event metadata 127.

[165] In some demonstrative embodiments, content creation system 100 may include a physiological database, e.g. raw IoT DB 122, which may be configured to store user physiological data, e.g. raw IoT data 123, collected from one or more sensors 145. In some demonstrative embodiments, content creation system 100 may include an IoT media metadata DB 124, which may be configured to store IoT media, e.g. data 125.
[166] In some examples, content creation system 100 may include a user profile DB 128, which may be configured to store social and personal data, e.g., user profile data 129, of the user and user preferences including a type of media, the user physiological data, a degree of user social relationships parameter and social networks data.
[167] In some other examples, any or all of the modules may be configured to generate the relevant data structures, e.g. list and/or table entities, based on the data received from any other databases and/or other data sources, if desired, e.g. as described below.
[168] The examples disclosed with regard to Figs. 3-10 disclose the creation, by various example modules, of certain example data entities. In some other demonstrative embodiments, any or all of the modules may be configured to generate any other data entity from any other databases and/or data structure using data aggregation techniques and/or any other data manipulating techniques. In some demonstrative embodiments, each module and database may be implemented by hardware and/or software and/or any combination of hardware and software.
[169] Note that all the modules and DBs described with reference to Fig. 1 are non-limiting examples. Some or all of the modules and DBs may be combined, other modules and DBs may exist, and functions described with reference to one module or DB may be split among more than one module or DB.
[170] Turning to Fig. 2, it presents a schematic illustration of an example flow chart of a method 200 of generating a user based media content associated with an event, in accordance with some demonstrative embodiments.
[171] In some demonstrative embodiments, the method 200 may be implemented by the hardware processing circuitry 110 and may include, for example, receiving from the plurality of databases, e.g. databases 120, 122, 124, 126, 128 and 130, data associated with the event of the common interest of the user and the one or more other users.
[172] In some demonstrative embodiments, the hardware processing circuitry 110 may receive an indication to generate one or more output media contents for the user (text box 205). This indication may be associated with an event of interest to the user (e.g. event indication 112) and/or with at least one area of interest of the user indicative of the event of interest. The event of interest to the user may be an event of common interest to the user and other user(s).
[173] Hardware processing circuitry 110 may receive data. The data received may include one or more input media content(s), and degree of excitement data (e.g. physiological data) of other user(s) (text box 210). In some examples the input media content(s) include media content gathered by the at least one data source from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
[174] First data may be received. In some demonstrative embodiments, the first data includes: event metadata associated with the event of interest to the user, media metadata related to the input media content(s), degree of excitement metadata of other user(s), user preference data (e.g. personal data) associated with the user (a first indication of a degree of relation), user preference data (e.g. personal data) associated with the other user(s) (a second indication of a degree of a relation) (text box 220). In some examples, the event metadata, the media metadata and/or the degree of excitement data may include time information (e.g. timestamps or time ranges) associated with the event, the input media content(s) and/or the degree of excitement.
[175] In some examples, the event metadata, the media metadata and/or the degree of excitement data may include degree of relation data. The event metadata may comprise a third indication of a degree of relation, of the event to the area(s) of interest; the media metadata may comprise a fourth indication of a degree of relation, of the at least one input media content to the area(s) of interest; and the degree of excitement metadata of the other user(s) may comprise a fifth indication of a degree of relation, of the degree of excitement data to the area(s) of interest.
[176] In some demonstrative embodiments, one or more of the following data may also be received: data indicative of a degree of relationship of the user to other user(s) (e.g. social data), degree of excitement data of the user, degree of excitement metadata of the user, and/or user media preference data (media type preference and/or media size preference) (text box 225).
[179] In some demonstrative embodiments, one or more of the following data may also be received: data indicative of a degree of relationship of the user to other user(s) (e.g. social data), degree of excitement data of the user, degree of excitement metadata of the user, and/or user media preference data (text box 225). In some examples, degree of excitement data of the user and degree of excitement data of the other user(s) can be based on physiological data of the relevant user. In some examples, user media preference data includes media type preference and/or media size preference. In some examples, the additional data may also be referred to herein as second data.
[180] The hardware processing circuitry 110 may, in some examples, identify one or more candidate output media content(s). The candidate output media contents may include at least a portion of the input media content(s). (Non-limiting examples of this are disclosed with reference to Figs. 5 and 9.) The identification process may include processing such as combining and reducing at least the first data. In some examples, the combining and reducing of data is based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data. In some examples, combining and reducing the data is based on the degree of relationship of the area(s) of interest of the user indicative of the event of interest (e.g. as indicated in event indication 112), and the first indication, the second indication, the third indication, the fourth indication and the fifth indication. In some examples, the identification process may include combining and reducing the first data and the data indicative of a degree of relationship of the user to the at least one other user. In some examples, the identification process may include combining and reducing the first data and the data of the degree of excitement metadata of the user. In some examples, the identification process may include combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size. In some examples, the identification process may include combining and reducing the second data.
[181] In some examples, the combining and reducing may be done by matching Keyword-Value pairs of Keyword-Value-Weight data structure(s), and by setting weight values within these data structures. In some examples, setting the weight value can be done by at least one of fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging. In some examples, setting a weight value may include normalizing the weight value, where the sum of normalized weight values is equal to 1. Examples of these processes may be found in the description of Figs. 3 to 10, further herein.
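As an illustration only, the Keyword-Value-Weight fusing and normalization described in paragraph [181] can be sketched as follows. The tuple layout and function names are assumptions, not the patented implementation, and multiplication is used here as the weight-setting rule.

```python
# Minimal sketch: records are (keyword, value, weight) tuples. Fusing
# multiplies the weights of records whose (keyword, value) pairs match
# in both entities; non-matching records are dropped. Normalization
# rescales the surviving weights so that they sum to 1.

def normalize(records):
    """Rescale weights so their total is 1."""
    total = sum(w for _, _, w in records)
    return [(k, v, w / total) for k, v, w in records]

def fuse_by_keyword(a, b):
    """Fuse two Keyword-Value-Weight entities by multiplying matching weights."""
    index = {(k, v): w for k, v, w in b}
    fused = [(k, v, w * index[(k, v)]) for k, v, w in a if (k, v) in index]
    return normalize(fused)

event_keys = [("Discipline", "Football", 0.6), ("Venue", "Berlin", 0.4)]
friend_keys = [("Discipline", "Football", 0.5), ("Team", "FC X", 0.5)]
print(fuse_by_keyword(event_keys, friend_keys))
# only the matching ("Discipline", "Football") pair survives, weight 1.0
```

The same helper can model the normalization of text box 350 and the multiplicative fusion of text boxes 360 and 365, since both reduce to pairwise multiplication followed by rescaling.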
[182] In some demonstrative embodiments, one or more output media contents 114 may be generated (text box 240). The generated output media contents may in some cases include at least a portion of the candidate output media content(s). In some examples, the generating includes selecting the output media content(s) from candidate output media content(s) using one or more output media selection criteria. Non-limiting examples of such selection criteria are disclosed further herein with reference to Fig. 10. The hardware processing circuitry 110 may provide the user with the one or more media output content(s) 114 (text box 250).
[183] For example, the user preferences may include the user's specific field of interest associated with the event. For example, in a football match the user preferences may include the highlight moments of his favourite player and/or his favourite team.
[184] In some demonstrative embodiments, the content creation system 100 may process any data associated with the event and any data that may be generated at the time of the event to provide the output media content 114, e.g. video, text messages, images, voice, etc., based on the user preferences, a time of creation of the user physiological data 123 and the media content 131. For example, data processing may include weighting the strength of relationship of the user preferences with the user physiological data, the time of creation of the user physiological data 123 and/or the media content 131 associated with the event.
[185] In some example embodiments, the content creation system 100 may provide the user the output media content 114 according to the weight of the parameters defined by the user preferences, for example, the media content with the highest weight.
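For illustration, the "highest weight" selection mentioned in paragraph [185] might look like the following sketch; the candidate records and the helper name are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch: choose the output media content(s) carrying the
# largest weights among the weighted candidates.
candidates = [
    {"media": "goal_clip.mp4", "weight": 0.5},
    {"media": "interview.txt", "weight": 0.2},
    {"media": "crowd_photo.jpg", "weight": 0.3},
]

def select_top(cands, n=1):
    """Return the n candidates with the highest weights."""
    return sorted(cands, key=lambda c: c["weight"], reverse=True)[:n]

print(select_top(candidates))  # the 0.5-weight goal clip comes first
```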
[186] For example, generating the one or more output media contents 114 may be done by calling, e.g. by Orchestrator module 150, in a predetermined sequence, one or more engine modules, e.g. engine modules 165, 170, 175, 180, 190 and 195, and/or one or more handler modules, e.g. handler modules 155 and 160, and generating the one or more output media contents 114 based on data generated by the one or more engine modules and/or the one or more handler modules.
[187] In some other demonstrative embodiments, the content creation system 100 may generate the output media content 114 based on the user physiological data, the social and personal data and the event metadata, based on the user preferences, a time of creation of the user physiological data 123 and the media content 131, by processing at least one of the input media content 131, the user physiological data, the social and personal data and the event metadata according to any other algorithms, data processing methods and by using any other hardware structure and/or any other modules.
[188] In some demonstrative embodiments, generating the one or more output media contents 114 may be done, for example, by selecting moments in time of the event according to the user physiological data generated at the time of the event.
[189] In some example embodiments, providing the input media contents 131 may be done by gathering one or more media contents from at least one or more social networks and generating the input media contents 131 based on the one or more media contents.
[190] In some demonstrative embodiments, the hardware processing circuitry 110 may generate the one or more output media contents 114 according to one or more data entities, e.g. one or more data structures, generated by at least one of the one or more engine modules, e.g. engine modules 165, 170, 175, 180, 190 and 195, and/or by one or more handler modules, e.g. handler modules 155 and 160.
[191] In some demonstrative embodiments, the hardware processing circuitry 110 may be configured to set a degree of relationship of the one or more user preferences with the time of creation of the user physiological data and a time of creation of the one or more media contents, by attaching to the one or more media contents one or more weights to indicate the degree of relationship of the one or more media contents to the one or more user preferences.
[192] One non-limiting example of the operation of the plurality of engine modules 165, 170, 175, 180, 190 and 195, and the one or more handler modules, e.g. handler modules 155 and 160, will be described now with reference to Figs. 3-10. The figures describe some example methods that may be performed by each module. In some examples, the methods described with reference to Figs. 3-10 may be performed in a sequence such as that described with reference to Figs. 14 and 15 further herein.
[193] Turning to Figs. 3a and 3b, they present schematic illustrations of a flow chart of a method 300 of generating a friends-event key data entity, e.g. data structure, by a social links engine module 170, in accordance with some demonstrative embodiments. In some demonstrative embodiments, social links engine 170 may be called first by Orchestrator module 150 in order to generate output media content 114. For example, social links engine 170 may tap into social relationships of a given person, e.g. the user, and may generate a data structure, e.g. a list and/or a table and/or the like, of the user's friends. For example, the data entity, e.g., the data structure, the list and/or the table, may be prioritized by the degree of social relationships and the friends' association to the event, which may be indicated by event indication 112.
[194] In some demonstrative embodiments, social links engine 170 may receive an event indication, e.g., event indication 112, a user name, data 129, e.g. DB_USERPROFILE, from user profile DB 128, and data 127, e.g. DB_EVENTMETADATA, from event metadata DB 126, and may output a data structure, e.g. a list and/or a table, of persons linked to the user and related to the event. For example, the data structure, e.g. the list and/or the table, may include weights (Wi), which may be used to derive the priority of the data entity elements, e.g., the data structure elements, the list elements and/or the table elements. In some examples, event indication 112 can include the event, "E". In some examples, the user name may be received as part of user indication 113.
[195] For example, the event (E) may be ELFF'16, the user name (P1) may be Bob, the data may include, for example, DB_USERPROFILE and/or DB_EVENTMETADATA, and the output may include the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS data entity (Table 12), if desired.
[196] In some demonstrative embodiments, the method 300 may include generating an Event keys data entity (Table 5), e.g., an Event keys table, including a plurality of keywords associated with an event, a plurality of values associated with the event, and a plurality of weights associated with the plurality of keywords and the plurality of values (text box 310). For example, social links engine 170 may receive data including a plurality of keywords associated with the event from event metadata DB 126, e.g., DB_EVENTMETADATA (Table 3), and may generate, for example, the Event keys data entity (Table 5), e.g., an EVENTKEYS table, which may be associated with the event "ELFF'16" and may include Keyword, Value and Weight fields as is shown below in the example of Table 5. Table 3 can be a non-limiting example of event metadata associated with the event of interest to the user.
Table 5
[197] In some demonstrative embodiments, the method 300 may include generating a Friends keys data entity (Table 6), e.g. a friends keys table, including a plurality of users associated with the event, a plurality of keywords associated with the users, a plurality of values associated with the users, and a plurality of weights associated with the users, the plurality of keywords and the plurality of values (text box 320). For example, social links engine 170 may receive data 129 related to a plurality of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE (Table 4), and from the Event keys data entity (Table 5). For example, social links engine 170 may select from the received data, data fields such as, for example, user, keyword, value, and weight and may generate, for example, a Friends keys data entity (Table 6), e.g. a FRIENDSKEYS table, as is shown below in the example of Table 6. Table 4 is in some cases a non-limiting example of a data entity that includes user preference data associated with the user, user preference data associated with the other user(s) (the friends) and data indicative of the degree of relationship of the user to the other user(s).
Table 6
[198] In some demonstrative embodiments, the method 300 may include generating a Friends weights data entity (Table 7) including a plurality of users associated with the event and a plurality of weights associated with the users, the plurality of keywords and the plurality of values (text box 330). For example, social links engine 170 may receive data including a plurality of weights of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE, and may select the Value field, e.g. Value column and/or Value attribute, and the Weight field, e.g. Weight column and/or Weight attribute, from user profile DB 128 to generate a Friends weights data entity (Table 7), e.g. a FRIENDSWEIGHTS table, as is shown below in the example of Table 7. In some cases, this process can derive a degree of relationship between the user and the other user(s).
Table 7
[199] In some demonstrative embodiments, the method 300 may include cleaning up unused keywords at an Event keys data entity (Table 5), e.g. an Event keys table, to include keywords, e.g. only keywords, and values that are in the Friends keys data entity (Table 6) (text box 340). For example, social links engine 170 may receive data including a plurality of weights of friends associated with the event from user profile DB 128, e.g. DB_USERPROFILE, and may select the Keyword field, e.g. Keyword column and/or Keyword attribute, the Value field, e.g. Value column and/or Value attribute, and the Weight field, e.g. Weight column and/or Weight attribute, from the Friends keys data entity (Table 6), e.g. the FRIENDSKEYS data entity, to generate the Event keys data entity (Table 8), e.g. an EVENTKEYS data entity, as is shown below in the example of Table 8.
Table 8
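The clean-up step of text box 340 can be illustrated with a short sketch: keep only those EVENTKEYS rows whose (Keyword, Value) pair also occurs in FRIENDSKEYS. The tuple layouts below are assumptions for illustration only, not the patent's data format.

```python
# Illustrative sketch of the clean-up step: rows of the Event keys entity
# are kept only if their (keyword, value) pair appears among the Friends
# keys rows; all other rows are dropped.

def cleanup_event_keys(event_keys, friends_keys):
    """event_keys: (keyword, value, weight); friends_keys: (user, keyword, value, weight)."""
    used_pairs = {(kw, val) for _, kw, val, _ in friends_keys}
    return [(kw, val, w) for kw, val, w in event_keys if (kw, val) in used_pairs]

event_keys = [("Discipline", "Football", 0.5), ("Discipline", "Golf", 0.5)]
friends_keys = [("Alice", "Discipline", "Football", 0.7)]
print(cleanup_event_keys(event_keys, friends_keys))  # the Golf row is dropped
```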
[200] In some demonstrative embodiments, the method 300 may include normalizing the weight values at the Event keys data entity (Table 8), the Friends keys data entity (Table 6) and the Friends weights data entity (Table 7), by setting the total sum of the weights in the data entity to 1 (text box 350). For example, after normalization, the Event keys data entity (Table 8), the Friends keys data entity (Table 6) and the Friends weights data entity (Table 7) may be as in the example below:
Table 8’
Table 6’
Table 7'
[201] In some demonstrative embodiments, the method 300 may include fusing the weights with the same keywords of the Event keys data entity (Table 8') with the Friends keys data entity (Table 6') to generate a new data entity, for example, a Friends_Event_Keys_Weights data entity (Table 9), by multiplying the weights of the Event keys data entity (Table 8') with the Friends keys data entity (Table 6') and normalizing the weights by setting the total sum of the weights of the Friends_Event_Keys_Weights data entity (Table 9) to 1 (text box 360), as is shown below in the example of Table 9.
FRIENDS_EVENT_KEYS_WEIGHTS
Table 9
[202] In some demonstrative embodiments, the method 300 may include fusing the weights with the same keywords of the Friends_Event_Keys_Weights data entity (Table 9) with the Friends weights data entity (Table 7') to generate a new data entity, for example, a Friends_Event_Keys_Weights_Friendweights data entity (Table 10).
[203] For example, generating the Friends_Event_Keys_Weights_Friendweights data entity (Table 10) may be done by multiplying the weights of the Friends_Event_Keys_Weights data entity (Table 9) with the Friends weights data entity (Table 7') and normalizing the weights by setting the total sum of the weights of the Friends_Event_Keys_Weights_Friendweights data entity (Table 10) to 1 (text box 365), as is shown below in the example of Table 10.
Table 10
[204] In some demonstrative embodiments, the method 300 may include creating a new data entity, for example, an Event_Keys_Weights_Friendsweights data entity (Table 11), by merging rows with the same keyword value pairs from the Friends_Event_Keys_Weights_Friendweights data entity (Table 10), summing the respective weights and dropping the user fields (text box 370), as is shown below in the example of Table 11.
Table 11
[205] In some demonstrative embodiments, the method 300 may include creating a new data entity, for example, a Friends_Event_Weights_Friendsweights data entity (Table 12), by merging rows with the same user from the Friends_Event_Keys_Weights_Friendweights data entity (Table 10), summing the respective weights and dropping the Keyword and Value fields (text box 380), as is shown below in the example of Table 12. In some examples, Table 12 is a non-limiting example of a data entity indicative of the association of the user, and of other user(s), with the event of interest to the user based on the degree of relationship between the user and the other user(s). In some examples, this step may include combining and reducing data of at least the following: the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the users, and the data indicative of the degree of relationship of the user to other users.
Table 12
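The row merges of paragraphs [204] and [205] both reduce to grouping rows by a chosen field and summing the weights of each group. The sketch below is an illustrative assumption (the patent does not prescribe a representation); dict-based rows stand in for the table rows.

```python
# Hypothetical sketch of the row-merging steps: group the fused rows by a
# key (the user, or the keyword-value pair), sum the weights of each
# group, and drop the remaining fields.
from collections import defaultdict

def merge_rows(rows, key):
    """Sum the weights of all rows sharing the same value of key(row)."""
    sums = defaultdict(float)
    for row in rows:
        sums[key(row)] += row["weight"]
    return dict(sums)

rows = [
    {"user": "Alice", "keyword": "Discipline", "value": "Football", "weight": 0.4},
    {"user": "Carl",  "keyword": "Discipline", "value": "Football", "weight": 0.2},
    {"user": "Alice", "keyword": "Venue",      "value": "Berlin",   "weight": 0.4},
]

# Merge by user (Table 12 style) and by keyword-value pair (Table 11 style):
print(merge_rows(rows, lambda r: r["user"]))
print(merge_rows(rows, lambda r: (r["keyword"], r["value"])))
```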
[206] In some demonstrative embodiments, the method 300 may include providing the data entity, e.g. the Friends_Event_Weights_Friendsweights table (Table 12), to Orchestrator module 150 (text box 380).
[207] Turning to Fig. 4, it presents a schematic illustration of a flow chart of a method 400 of generating a preferred media data entity by the content preference engine module 165, in accordance with some demonstrative embodiments.
[208] In some demonstrative embodiments, content preference engine module 165 may tap into the profile of the user (P1) and may generate a data entity, e.g., a list and/or a table and/or the like, of preferred media types and sizes.
[209] In some demonstrative embodiments, content preference engine module 165 may be configured to generate the data entity having one or more media types and one or more media sizes associated with the one or more media types.
[210] In some demonstrative embodiments, content preference engine module 165 may be called by Orchestrator module 150 to generate a preferred media data entity, e.g., PREFERRED_MEDIA (Table 13), by receiving from the event indication 112 an indication of the event, for example ELFF'16, from the user indication 113 an indication of a user name, for example, Bob, and data 129 from user profile DB 128, e.g., DB_USERPROFILE.
[211] In some examples, content preference engine module 165 may receive from user profile DB 128 (Table 4) a plurality of media types, a plurality of media sizes and a plurality of media weights, associated with the user, and generate the preferred media data entity, e.g., PREFERRED_MEDIA (Table 13). For example, the preferred media data entity, e.g., PREFERRED_MEDIA (Table 13), may include a media type field, a media size field and a Weight field, e.g. Weight column and/or Weight attribute (text box 410). Table 13 is a non-limiting example of user media preference data.
Table 13
[212] In some demonstrative embodiments, content preference engine module 165 may merge media sizes of media of the same type, based on a weight of the media type, and average the media weight (text box 420). For example, content preference engine module 165 may merge double audio sizes by building the weighted average of the size values and averaging the weight values to generate, for example, PREFERRED_MEDIA (Table 13-1).
Table 13-1
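The size merge of paragraph [212] — weighted average of the sizes and plain average of the weights, per media type — can be sketched as follows. The tuple layout and the concrete numbers are invented for illustration.

```python
# Illustrative sketch: rows of the same media type are collapsed by taking
# the weighted average of their sizes and the plain average of their
# weights, as described for the double audio entries in paragraph [212].
from collections import defaultdict

def merge_media(rows):
    """rows: list of (media_type, size, weight) tuples."""
    groups = defaultdict(list)
    for mtype, size, weight in rows:
        groups[mtype].append((size, weight))
    merged = []
    for mtype, entries in groups.items():
        total_w = sum(w for _, w in entries)
        avg_size = sum(s * w for s, w in entries) / total_w  # weighted average of sizes
        avg_weight = total_w / len(entries)                  # plain average of weights
        merged.append((mtype, avg_size, avg_weight))
    return merged

rows = [("audio", 10, 0.2), ("audio", 30, 0.6), ("video", 60, 0.2)]
print(merge_media(rows))
```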
[213] In some demonstrative embodiments, content preference engine module 165 may normalize the weight values by setting the total sum of the weights to 1 (text box 430), as is shown in Table 13-2 below. Table 13-2 is a non-limiting example of a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
Table 13-2
[214] In some demonstrative embodiments, content preference engine module 165 may be configured to provide the data entity, for example, the Preferred Media table (Table 13-2), to Orchestrator module 150 (text box 440).
[215] In some demonstrative embodiments, content preference engine module 165 may be configured to provide a default media type and size for a new user. For example, if for the user Bob no preferred media is defined, in some examples a default preference for media type and size can be used in this method. Similarly, in some cases this default preference could be entered into DB_USERPROFILE for the user, and perhaps be modified dynamically as the user's preferences are learned based on other behavior of theirs.
[216] Note also that in the example disclosed, the user preference for media types and size is the same for all events. In other examples, these preferences could be related to events (e.g. for sports Bob wants large size video, while for a news event he prefers text or audio).
[217] Turning to Fig. 5, it presents a schematic illustration of a flow chart of a method 500 of generating an available media data entity by a media handler module 155, in accordance with some demonstrative embodiments.
[218] In some demonstrative embodiments, media handler module 155 may tap into the media, e.g. existing media, and the user profiles to generate a data entity, e.g. a list and/or a table, of media related to the event indicated by event indication 112, e.g. ELFF'16, and/or to a user indicated by user indication 113, e.g. Bob. For example, media handler 155 may associate the media directly, for example, via the user, e.g. Bob, and/or indirectly, for example, via the user's friends, e.g. Alice and Carl.
[219] In some example embodiments, media handler module 155 may be configured to generate a data entity, e.g. AVAILABLE_MEDIA (Table 16), including media related to the event and the user. For example, the media may be prioritized according to a degree of relationship between the event, e.g. ELFF'16, and the user, e.g. Bob.
[220] In some demonstrative embodiments, media handler module 155 may be configured to generate a data entity, e.g. AVAILABLE_MEDIA (Table 16), by receiving data from at least one of user profile DB 128, event metadata DB 126, media metadata DB 120 and/or a data structure, for example, FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS (Table 12).
[221] In some demonstrative embodiments, media handler module 155 may be configured to generate a USERS_MEDIAMETADATA table (Table 14) by joining, for example, the User_Profile_DB table (Table 4) of user profile DB 128 and the Media_Metadata_DB table (Table 1) of media metadata DB 120, selecting media metadata related to users according to keyword value pairs, e.g. the same keyword value pairs, multiplying the weights corresponding to the selected keyword, e.g., multiplying weights of the User_Profile_DB table (Table 4) with weights of the Media_Metadata_DB table (Table 1) and by 10, and dropping lines with non-corresponding keyword value pairs (text box 510). For example, in the case of Table 14, the line for the Discipline "Volleyball" has been dropped. This is a non-limiting example of association of the available media contents with the user preference data associated with the user and with the user preference data associated with other user(s), by combining and reducing data of the media metadata related to the input media content and the data entity indicative of the association of the user and the other user(s) with the event of interest to the user.
Table 14
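The keyword-value join described in text box 510 can be sketched as follows. This is a hypothetical illustration, not the patented implementation; the field names, the sample rows and the factor of 10 merely follow the example in the text.

```python
# Hypothetical sketch of the Table 14 step: join user-profile rows with
# media-metadata rows on matching (keyword, value) pairs, multiply the two
# weights and by 10 (as in the example), and drop non-matching pairs.

def join_on_keyword_value(user_profile, media_metadata, scale=10):
    joined = []
    for u in user_profile:
        for m in media_metadata:
            # Keep only rows whose (keyword, value) pairs correspond.
            if (u["keyword"], u["value"]) == (m["keyword"], m["value"]):
                joined.append({
                    "user": u["user"],
                    "media_id": m["media_id"],
                    "keyword": u["keyword"],
                    "value": u["value"],
                    "weight": u["weight"] * m["weight"] * scale,
                })
    return joined

# Illustrative rows: the Volleyball line is dropped, as in Table 14.
user_profile = [
    {"user": "Bob", "keyword": "Discipline", "value": "Football", "weight": 0.7},
    {"user": "Bob", "keyword": "Discipline", "value": "Volleyball", "weight": 0.1},
]
media_metadata = [
    {"media_id": "M1", "keyword": "Discipline", "value": "Football", "weight": 0.9},
]
print(join_on_keyword_value(user_profile, media_metadata))
```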
[222] In some demonstrative embodiments, media handler module 155 may be configured to generate, for example, Table 15-1, USERS_MEDIAMETADATA_EVENTMETADATA, by joining the USERS_MEDIAMETADATA table (Table 14) with the Event_Metadata_DB table (Table 3) of event metadata DB 124 filtered on a selected event, selecting the media metadata related to the selected event according to keyword value pairs, e.g., the same keyword value pairs, multiplying the weights corresponding to the selected keyword, e.g., multiplying weights of the Event_Metadata_DB table (Table 3) with weights of the USERS_MEDIAMETADATA table (Table 14) and multiplying by 10, and dropping lines with non-corresponding keyword value pairs (text box 520). In the case of Table 15-1, for example, the lines corresponding to the Vertical "Fashion" have been dropped. This is a non-limiting example of association of the available media contents with the event, by combining and reducing data of the media metadata related to the input media content, and the event metadata associated with the event of interest to the user.
Table 15-1
[223] In some example embodiments, media handler module 155 may be configured to join, for example, the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-1) with, for example, the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS table (Table 12), by multiplying the weights, removing the column "User", and merging rows with the same attributes in the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-2), summing up the weights (text box 530).
Table 15-2
In some demonstrative embodiments, media handler module 155 may be configured to generate an Available Media data entity (Table 16) by merging rows with related Media ID values, e.g., the same Media ID values, in the USERS_MEDIAMETADATA_EVENTMETADATA table (Table 15-2), summing the weights, dropping the Keyword field, e.g., Keyword column and/or Keyword attribute, and the Value field, e.g., Value column and/or Value attribute, and normalizing the weights (text box 540). Table 16 may in some examples be referred to also as a first data entity indicative of the input media content(s).
Table 16
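The merge-and-normalize step of text box 540 can be sketched as below. This is an illustrative assumption about the data layout, not the claimed implementation; the sample weights are invented.

```python
# Hypothetical sketch of the Table 16 step: merge rows sharing a Media ID,
# sum their weights, drop the Keyword/Value fields, and normalize so the
# weights sum to 1.

def merge_and_normalize(rows):
    totals = {}
    for r in rows:
        totals[r["media_id"]] = totals.get(r["media_id"], 0.0) + r["weight"]
    total = sum(totals.values())
    # Normalization: divide each summed weight by the grand total.
    return {media_id: w / total for media_id, w in totals.items()}

rows = [
    {"media_id": "M1", "keyword": "Discipline", "value": "Football", "weight": 3.0},
    {"media_id": "M1", "keyword": "Team", "value": "Barcelona", "weight": 1.0},
    {"media_id": "M2", "keyword": "Discipline", "value": "Football", "weight": 4.0},
]
print(merge_and_normalize(rows))  # M1 and M2 each end up with weight 0.5
```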
[224] In some demonstrative embodiments, media handler module 155 may be configured to provide the data entity, e.g., the Available Media table (Table 16), to the Orchestrator module 150 (text box 550).
[225] Turning to Fig. 6, it presents a schematic illustration of a flow chart of a method 600 of generating a friends physiological (or IoT) data entity based on data captured from sensors by a physiological data handler module, e.g., IoT data handler 160, in accordance with some demonstrative embodiments.

[226] In some demonstrative embodiments, the physiological data handler module, e.g., IoT data handler 160, may be configured to tap into the IoT data, e.g., physiological data, and the user profiles, to generate a data entity, e.g., a list and/or a table, the FRIENDSIOT table (Table 20), based on IoT data, e.g., existing IoT data, related to the event as indicated by event indication 112, and the user as indicated by user indication 113. For example, an association of the IoT data may be direct, e.g., via the user, and/or indirect, e.g., via friends of the user.
[227] For example, the data entity, e.g., Table 20, may be prioritized by the degree of relation to the event and the user and/or the friends of the user.
[228] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to generate the data entity including, for example, the physiological data of the user associated with the event metadata. For example, the physiological data may be prioritized according to a degree of relationship between the event, e.g., ELFF'16, and the user, e.g., Bob.
[229] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to generate the data entity, e.g., the FRIENDSIOT table (Table 20), by receiving data from at least one of, for example, FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS (Table 12), the DB_USERPROFILES table (Table 4) of user profile DB 128, Event Metadata DB 126 (Table 3), and/or the DB_IOTMETADATA table (Table 2) of IoT metadata DB 124.
[230] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to generate, for example, an IOTMETADATA_FRIENDS table (Table 17) by removing from the DB_IOTMETADATA table (Table 2) entries that are not associated with a person in the friend list, e.g., Table 4 (text box 610). For example, in the case of Table 17, the entry associated with the user "Dora" has been removed. The generation of Table 17 may be referred to also as a non-limiting example of generation based on association of the degree of excitement data with the user and/or with the other user(s), including combining and reducing data of the degree of excitement metadata of the other user(s) and the data entity indicative of the association of the user and other user(s) with the event of interest to the user.
Table 17
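The friend-list filter of text box 610 can be sketched as a simple membership test. This is a hypothetical illustration; the record layout is an assumption, and only the "Dora" example follows the text.

```python
# Hypothetical sketch of the Table 17 step: drop IoT-metadata entries whose
# user is not in the friend list (e.g. the entry for "Dora" is removed).

def filter_to_friends(iot_metadata, friends):
    return [row for row in iot_metadata if row["user"] in friends]

iot_metadata = [
    {"iot_id": "IoT1", "user": "Bob"},
    {"iot_id": "IoT2", "user": "Alice"},
    {"iot_id": "IoT3", "user": "Dora"},
]
friends = {"Bob", "Alice", "Carl"}
print(filter_to_friends(iot_metadata, friends))
```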
[231] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to generate, for example, an IOTMETADATA_FRIENDS table (Table 18) from the IOTMETADATA_FRIENDS table (Table 17) by removing all entries that are not associated with the event, e.g., ELFF'16, as indicated for example by Table 3 (text box 620). The generation of Table 18 may be referred to also as a non-limiting example of generation based on association of the degree of excitement data with the event, including combining and reducing data of the degree of excitement metadata of the other user(s) and the event metadata associated with the event of interest to the user. In the example of Table 18, the entries for Team=Madrid and Vertical=Fashion are removed, as they are not associated with the event ELFF'16.
Table 18

[232] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to join, for example, the IOTMETADATA_FRIENDS table (Table 18) with the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS table (Table 12) to generate an IOTMETADATA_FRIENDS table (Table 19) (text box 630). Note that though Alice is in Table 12, she does not appear in Table 19, as she has no record in Table 18.
Table 19
[233] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to generate the FRIENDSIOT table (Table 20) from, for example, the merged IOTMETADATA_FRIENDS table (Table 19), by summing up the weights, dropping the User field, Keyword field and Value field, e.g., Value column and/or Value attribute, and normalizing the weights by setting the sum of all weight values in the weight field to 1 (text box 640). Table 20 is a non-limiting example of a data entity indicative of the degree of excitement data of the other user(s).
Table 20

[234] In some demonstrative embodiments, IoT Data Handler module 160 may be configured to provide the data entity, for example, the FRIENDSIOT table (Table 20), for example, to the Orchestrator module 150 (text box 650).
[235] Turning to Fig. 7, it presents a schematic illustration of a flow chart of a method 700 of generating a friends available media data entity based on data captured from sensors by an IoT-media association engine module 175, in accordance with some demonstrative embodiments.
[236] In some demonstrative embodiments, the IoT-Media association engine 175 may be configured to provide available IoT data. For example, the available IoT data of the user and/or the friends of the user may be associated with the available media. The engine may take into consideration that the media and IoT data should cover the same time. The available media may be generated at an overlapping time of generation of the IoT data. The available IoT data, e.g. physiological data, may be prioritized by the degree of the relation of the media with respect to the IoT data associated with the event, e.g. ELFF'16.
[237] In some demonstrative embodiments, the IoT-Media association engine 175 may be configured to generate a data entity, for example, a FRIENDSIOT_AVAILABLE_MEDIA table (Table 21). For example, the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21) may include one or more available media contents generated at one or more points of time during the event, e.g., ELFF'16. The one or more available media contents may be prioritized according to the physiological data of the user, e.g. IoT data, and/or the user friends' physiological data, e.g. IoT data, which may be generated at the one or more points of time during the event. For example, the data entity, e.g. the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), may be generated according to data supplied by an available media data entity and a friends physiological data entity, e.g. IoT data of the user friends.
[238] In some demonstrative embodiments, the IoT-Media association engine 175 may be configured to generate a data entity, for example, a FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), according to at least one of the FRIENDSIOT table (Table 20) and/or the AVAILABLE_MEDIA table (Table 16).
[239] In some examples, the IoT-Media association engine 175 may generate, for example, a FRIENDSIOT_AVAILABLE_MEDIA table (Table 21) by joining the FRIENDSIOT table (Table 20) with the AVAILABLE_MEDIA table (Table 16) according to join criteria, for example, overlapping time frames: for example, for each pair of Media and IoT records with overlapping time frames, multiplying the weights, dropping lines with non-corresponding time frames, and re-normalizing the weights, for example, by setting the sum of all weight values in the weight field to 1 (text box 710). Table 21 is one non-limiting example of a data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). The generation of Table 21 may also be referred to as one non-limiting example of generation based on one or more points of time during the event of interest to the user, including combining and reducing the first data entity indicative of the input media content(s) and the data entity indicative of association between the input media content(s) and the degree of excitement data of the other user(s). Note that in the creation of Table 21, several of these points in time define intervals of time during the event.
FRIENDSIOT_AVAILABLE_MEDIA
Table 21
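The join on overlapping time frames described in text box 710 can be sketched as below. This is a hypothetical illustration under the assumption that times are plain numbers (e.g. seconds from the start of the event); all sample rows are invented.

```python
# Hypothetical sketch of the Table 21 step: join media rows with IoT rows on
# overlapping time frames, multiply the weights, drop non-overlapping pairs,
# and re-normalize so the weights sum to 1.

def time_overlap(a, b):
    # a, b: (start, end) tuples; return the shared interval or None.
    start, end = max(a[0], b[0]), min(a[1], b[1])
    return (start, end) if start < end else None

def join_on_overlap(media_rows, iot_rows):
    joined = []
    for m in media_rows:
        for i in iot_rows:
            shared = time_overlap(m["time"], i["time"])
            if shared is None:
                continue  # non-corresponding time frames are dropped
            joined.append({"media_id": m["media_id"], "iot_id": i["iot_id"],
                           "overlapping_time": shared,
                           "weight": m["weight"] * i["weight"]})
    total = sum(r["weight"] for r in joined)
    for r in joined:
        r["weight"] /= total  # re-normalize: weights sum to 1
    return joined

media_rows = [{"media_id": "M1", "time": (0, 60), "weight": 0.6},
              {"media_id": "M2", "time": (100, 160), "weight": 0.4}]
iot_rows = [{"iot_id": "IoT1", "time": (30, 90), "weight": 1.0}]
print(join_on_overlap(media_rows, iot_rows))
```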
[240] In some demonstrative embodiments, the IoT-Media association engine 175 may be configured to provide the data entity, e.g., the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), to Orchestrator module 150 (text box 720).
[241] Turning to Fig. 8, it presents a schematic illustration of a flow chart of a method 800 of generating a merged friends physiological data entity based on the friends physiological (or IoT) data entity by a moments determination engine module 180, in accordance with some demonstrative embodiments.
[242] In some demonstrative embodiments, moments determination engine module 180 may be configured to determine moments in time, e.g. exciting moments. The moments may include points in time of interest derived from relevant IoT data of the user, e.g. Bob, and friends of the user.
[243] For example, moments determination engine module 180 may be configured to generate a data entity, e.g. a MERGED_FRIENDSIOT table (Table 22), which may include a list of moments and a relevancy of the moments, e.g., strong moment and/or weak moment, and may be prioritized by the degree of relation of the respective input data to the event, e.g. ELFF'16, and the user, e.g., Bob.
[244] In some demonstrative embodiments, moments determination engine module 180 may be configured to generate the data entity, e.g., the MOMENTS table (Table 23), including media generated at moments in time of the event, e.g. ELFF'16, associated with the user's physiological data and the physiological data of friends of the user, e.g. Bob's and/or friends' IoT data. The physiological data, e.g. IoT data, may be prioritized, for example, according to a degree of relationship of friends available media, a preferred media and the moments to the event and the user.
[245] In some demonstrative embodiments, moments determination engine module 180 may be configured to generate the data entity, e.g. the MOMENTS table (Table 23), based on the FRIENDSIOT table (Table 20).
[246] In some demonstrative embodiments, moments determination engine module 180 may be configured to generate a data entity, e.g. a MERGED_FRIENDSIOT table (Table 22), from the FRIENDSIOT table (Table 20) by taking, for a covered time interval, e.g. each covered time interval, for example, a row with the highest weight (text box 810). In other examples, other criteria could be used, for example giving priority to IoT sensors that have better moments detection properties. In some examples, a data entity can exist which weights the IoTIDs by the quality of their moments detection properties.
Table 22
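The per-interval selection of text box 810 can be sketched as a keep-the-maximum pass. This is an illustrative assumption about the record layout; as the text notes, other criteria (e.g. sensor quality) could replace the simple maximum below.

```python
# Hypothetical sketch of the Table 22 step: for each covered time interval,
# keep only the row with the highest weight.

def best_row_per_interval(rows):
    best = {}
    for row in rows:
        interval = row["time"]
        if interval not in best or row["weight"] > best[interval]["weight"]:
            best[interval] = row
    return sorted(best.values(), key=lambda r: r["time"])

rows = [
    {"iot_id": "IoT1", "time": (0, 60), "weight": 0.3},
    {"iot_id": "IoT2", "time": (0, 60), "weight": 0.5},
    {"iot_id": "IoT1", "time": (60, 120), "weight": 0.2},
]
print(best_row_per_interval(rows))
```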
[247] In some demonstrative embodiments, moments determination engine module 180 may be configured to generate the MOMENTS table (Table 23), for a row, e.g., each row, of the MERGED_FRIENDSIOT table (Table 22), by finding, for example, the times of the 3 highest peaks of physiological measurements of the user in the corresponding IoT data, together with the peaks of physiological measurements of the user which may be relatively high compared to the other physiological measurements, expanding the peak times, for example, by two seconds in each temporal direction, putting the expanded peak times together with the corresponding row's weight into the MOMENTS table (Table 23), and normalizing the weights in the weight field and relevancy in a relevance field (text box 820). High peaks of physiological measurements may be an example indication of high levels or degrees of excitement, strong moments, and thus of relatively high relevancy. In some examples, determination of the highest peaks can be done using analytics tools known in the art. In some examples, in case a peak is a plateau, the time of the middle of the plateau can be selected. Note that the choice of the three highest peaks of physiological measurements, as well as the choice of two seconds for expanding peak times, are non-limiting examples.
[248] Table 23 is a non-limiting example of a data entity indicative of points in time within one or more times of interest of an event of interest to the user, generated according to the degree of excitement data. In this example, the data entity was generated according to an indication of relevance that is based on the degree of excitement data of the other user(s), e.g. based on the most highly weighted degree of excitement corresponding to that time of interest.
Table 23
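The peak-expansion step of text box 820 can be sketched as follows. This is a deliberately simplified illustration: real peak detection would use analytics tools, as the text notes, and here the "peaks" are simply the highest samples. The constants 3 and 2 are the non-limiting choices from the text.

```python
# Hypothetical sketch of the Table 23 step: take the times of the 3 highest
# samples of a physiological signal (a simple stand-in for peak detection)
# and expand each peak time by 2 seconds in both temporal directions.

def top_peak_intervals(samples, k=3, expand=2):
    # samples: list of (time_seconds, measurement) pairs
    ranked = sorted(samples, key=lambda s: s[1], reverse=True)[:k]
    peak_times = sorted(t for t, _ in ranked)
    return [(t - expand, t + expand) for t in peak_times]

samples = [(10, 1.0), (20, 9.0), (30, 3.0), (40, 7.0), (50, 8.0)]
print(top_peak_intervals(samples))  # [(18, 22), (38, 42), (48, 52)]
```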
[249] In some demonstrative embodiments, moments determination engine module 180 may be configured to provide the data entity, e.g., MOMENTS table (Table 23), to Orchestrator module 150 (text box 830).
[250] In other examples, moments determination engine module 180 may be configured to generate any other data entity according to any other selected criteria.
[251] In some examples, the Moments are not time instances, but rather time intervals of different durations.
[252] Turning to Fig. 9, it presents a schematic illustration of a flow chart of a method 900 of generating a selected media data entity based on the preferred media data entity and friends available media data entity by a media selection engine module 190, in accordance with some demonstrative embodiments.
[253] In some demonstrative embodiments, media selection engine module 190 may be configured to determine which of a preselected media and the IoT data may correspond to detected moments and to generate a data entity, e.g. a list and/or a table, based on the determination. For example, media selection engine module 190 may be configured to generate a SELECTED_MEDIA table (Table 25). The SELECTED_MEDIA table (Table 25) may include the relevancy of the moment, e.g., a "strong" moment and/or a "weak" moment, and the moments may be prioritized, for example, by a degree of relation of the respective input data, e.g., physiological measurement, to the event of common interest, e.g. ELFF'16, and the user, e.g. Bob.
[254] In some demonstrative embodiments, media selection engine module 190 may be configured to generate the data entity, e.g. the SELECTED_MEDIA table (Table 25). For example, the data entity, e.g. the SELECTED_MEDIA table (Table 25), may include an indication of media generated at one or more moments of the event of interest, e.g. ELFF'16. For example, the indication of the media may indicate media associated with the physiological data of the user and that of friends of the user. The media may be prioritized according to a degree of relationship of friends available media, a preferred media, the moments of the event, and the user.
[255] In some demonstrative embodiments, media selection engine module 190 may be configured to receive a first data entity, e.g., the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), a second data entity, e.g., the PREFERRED_MEDIA table (Table 13-2), a third data entity, e.g., the MOMENTS table (Table 23), and may generate a fourth data entity, e.g., the SELECTED_MEDIA table (Table 25), based on data of the first, the second and the third data entity.
[256] For example, media selection engine module 190 may be configured to generate the SELECTED_MEDIA table (Table 25) by first generating a SELECTED_FRIENDSIOT_MEDIA table (Table 24). The media selection engine module 190 may be configured to generate the SELECTED_FRIENDSIOT_MEDIA table (Table 24) by joining the PREFERRED_MEDIA table (Table 13-2) with the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21) on a Type field, removing a Size field from the joined table, multiplying respective weights from the PREFERRED_MEDIA table (Table 13-2) and the FRIENDSIOT_AVAILABLE_MEDIA table (Table 21), and re-normalizing the weights of the joined table, e.g., the SELECTED_FRIENDSIOT_MEDIA table (Table 24) (text box 910). In some examples, the generation of Table 24 includes combining and reducing the data entity indicative of association between input media content(s) and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
Table 24
[257] In some demonstrative embodiments, media selection engine module 190 may be configured to generate the SELECTED_MEDIA table (Table 25), as is shown in text box 920, for example, by:
a. Joining the tables SELECTED_FRIENDSIOT_MEDIA (Table 24) and MOMENTS (Table 23) on the IoTID field and providing a joint table.
b. Deleting from the joint table a row where an Overlapping Time field of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) is disjoint from a Time field of the MOMENTS table (Table 23).
c. Keeping common intervals of the Overlapping Time field of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) and the Time field of the MOMENTS table (Table 23) in an Overlapping Time field of the joint table.
d. Merging weights at the joint table by multiplying weights of the SELECTED_FRIENDSIOT_MEDIA table (Table 24) with weights of the MOMENTS table (Table 23) and re-normalizing.
e. Deleting the Time field from the joined table to provide the SELECTED_MEDIA table (Table 25).
Table 25
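The steps of text box 920 can be sketched as follows. This is a hypothetical illustration under invented field names and sample rows; it joins on IoTID, intersects the intervals, drops disjoint pairs, multiplies the weights and re-normalizes.

```python
# Hypothetical sketch of the Table 25 step: join FRIENDSIOT media rows with
# MOMENTS rows on IoTID, keep only the common time interval, drop disjoint
# rows, multiply the weights, and re-normalize.

def select_media(friendsiot_media, moments):
    selected = []
    for media in friendsiot_media:
        for moment in moments:
            if media["iot_id"] != moment["iot_id"]:
                continue  # join on the IoTID field
            start = max(media["overlapping_time"][0], moment["time"][0])
            end = min(media["overlapping_time"][1], moment["time"][1])
            if start >= end:
                continue  # disjoint intervals are deleted
            selected.append({"media_id": media["media_id"],
                             "overlapping_time": (start, end),
                             "weight": media["weight"] * moment["weight"]})
    total = sum(r["weight"] for r in selected)
    return [dict(r, weight=r["weight"] / total) for r in selected]

friendsiot_media = [
    {"media_id": "M1", "iot_id": "IoT1", "overlapping_time": (30, 60), "weight": 0.7},
    {"media_id": "M2", "iot_id": "IoT1", "overlapping_time": (200, 260), "weight": 0.3},
]
moments = [{"iot_id": "IoT1", "time": (40, 50), "weight": 1.0}]
print(select_media(friendsiot_media, moments))
```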
[258] Table 25 is a non-limiting example of a second data entity that is indicative of the available media content(s), where the available media contents correspond to the points in time within the time(s) of interest of the event of interest to the user. In some examples, the second data entity is weighted based on the preferences of the user. In some examples, the generation of Table 25 includes combining and reducing the data entity indicative of points in time within the time(s) of interest, the data entity indicative of association between input media content(s), the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, and the degree of excitement data of other user(s). In some other demonstrative embodiments, media selection engine module 190 may provide the SELECTED_MEDIA table (Table 25) to Orchestrator module 150 (text box 930).

[259] Turning to Fig. 10, it presents a schematic illustration of a flow chart of a method 1100 of generating an output media data entity based on the preferred media data structure by content creation engine module 195, in accordance with some demonstrative embodiments.
[260] In some demonstrative embodiments, the content creation engine module 195 may be configured to generate, for example, content personalized to the user, e.g., Bob, in the context of the event of common interest, e.g., ELFF'16, according to the user preferences, detected moments, and available IoT data, e.g., physiological data of the user, and media.
[261] In some demonstrative embodiments, the content creation engine module 195 may be configured to generate a data entity, e.g., a MEDIA_SELECTED_NORM table (Table 26). The data entity, e.g., the MEDIA_SELECTED_NORM table (Table 26), may include, for example, content personalized to the user, e.g., Bob, associated with the event of common interest, e.g., ELFF'16, according to user preferences, moments of the event of a user interest, physiological data of the user, and media generated at the moments of the event of the user interest.
[262] In some examples, the content creation engine module 195 may be configured to generate the data entity, e.g., the MEDIA_SELECTED_NORM table (Table 26), from, for example, a first data entity, e.g., the SELECTED_MEDIA table (Table 25), and a second data entity, e.g., the PREFERRED_MEDIA table (Table 13-2).
[263] In some demonstrative embodiments, the content creation engine module 195 may be configured to generate the data entity, e.g., a MEDIA_SELECTED_NORM table (Table 26), by multiplying a Relevance field and Weight field of the SELECTED_MEDIA table (Table 25) to create a Weight field of the MEDIA_SELECTED_NORM table (Table 26), and, for a media type, e.g., each media type, keeping 3 rows with the highest weights (text box 1010). The creation of Table 26 may thus be based, in some example cases, on Relevance fields and Weight fields of Table 25. Note also that the choice of three rows is a non-limiting example.
[264] In some examples, the content creation engine module 195 may be configured to do the following for the media type, e.g., each media type (text box 1020):
a. Look up the preferred size in the PREFERRED_MEDIA table (Table 13-2).
b. Expand or cut down the media content until the preferred size is reached.
c. In case the time is not expandable due to time overlapping and/or there is no IoT/video at that time, enter into the table the desired content as is.

[265] In some demonstrative embodiments, the content creation engine module 195 may be configured to provide the MEDIA_SELECTED_NORM table (Table 26) to the Orchestrator module 150 (text box 1030). Table 26 is one non-limiting example of a third data entity indicative of the available media content(s), created as part of generating one or more output media contents to be provided to the user. In some examples, this third data entity can be indicative of the one or more media sizes associated with the one or more media types. In some examples, this third data entity is generated based on a relevance field associated with the available media content(s) and on a weight field associated with them.
Table 26
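The Relevance-times-Weight scoring of text box 1010 can be sketched as below. This is an illustrative assumption about the record layout; the cap of 3 rows per media type is the non-limiting choice from the text.

```python
# Hypothetical sketch of the Table 26 step: multiply each row's Relevance by
# its Weight to form the new Weight, then keep the 3 highest-weighted rows
# per media type.

def top_rows_per_type(rows, per_type=3):
    scored = [dict(r, weight=r["relevance"] * r["weight"]) for r in rows]
    by_type = {}
    for r in scored:
        by_type.setdefault(r["type"], []).append(r)
    kept = []
    for rows_of_type in by_type.values():
        rows_of_type.sort(key=lambda r: r["weight"], reverse=True)
        kept.extend(rows_of_type[:per_type])
    return kept

rows = [
    {"media_id": "M1", "type": "video", "relevance": 1.0, "weight": 0.4},
    {"media_id": "M2", "type": "video", "relevance": 0.5, "weight": 0.3},
    {"media_id": "M3", "type": "video", "relevance": 1.0, "weight": 0.2},
    {"media_id": "M4", "type": "video", "relevance": 0.5, "weight": 0.1},
    {"media_id": "M5", "type": "audio", "relevance": 1.0, "weight": 0.5},
]
print(top_rows_per_type(rows))  # M4 is dropped: 4th-highest video score
```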
[266] Some possible example advantages of using methods and systems, as described in the presently disclosed subject matter, for generating output media contents, will be mentioned. In some cases, such methods and systems may make use of a data structure such as a KVW data structure, e.g. as described with reference to Figs. 3-10. This may enable providing to users media associated with an event of their interest, in an efficient way. Given that a system has access, in some examples, to a large number of media content items associated with a particular event, the system need not send them all to the user. Sending them all, or more items than are needed by the user, and/or items of greater length (e.g. number of seconds) than are needed by the user, may entail use of a comparatively large transmission bandwidth, and possibly cause the content to occupy a comparatively large amount of storage on the user's terminal equipment (e.g. PC, cellphone). The alternative of sending no content at all may of course also not be acceptable, from the standpoint of service to the user.

[267] In contrast, the methods described herein, in some cases making use of KVW data structures and their manipulation, may in some examples provide a smaller volume of media to the user, while meeting the user's needs of consuming relevant content. This may be achieved by identifying those media content items, and those portions of them, that are most strongly related to one or more of the event, the areas of interest of the user and his or her friends, the strength of relationship with each friend, the degree of excitement associated with certain moments as measured for example by various IoT sensors, and the user's preference for media types and sizes. This can be performed, according to some examples, by combining and reducing data of multiple inputs and of different characteristics (e.g. media, user profiles, excitement etc.), to derive degrees of relationship of comparatively high values and thus high relevance. The media content items output to the user (e.g. Bob) can thus be optimized in some examples. In some examples, the less relevant media content items, and/or the less relevant portions of those items, need not be sent. For one non-limiting illustrative example, video of an entire 90 minute football match may not be sent, but rather only the 3 goals made, constituting only 2-3 minutes of video.
[268] One example of deriving the most relevant output media content is as follows: the method also enables combining various pieces of data to derive a more accurate measure of relevance. For example, in Table 4 we see that, although Carl has a very high level of interest of 0.8 for Basketball, measurements related to Carl regarding Basketball may have comparatively less relevance in choosing media for Bob, since Bob's friendship with Carl is at a level of 0.2, considerably less than Bob's friendship with Alice (level of 0.5). Another example is giving more weight to Bob's relationship to himself than to his relationship with friends such as Alice. As exemplified above, this in some examples results in more weight given to Bob's areas of interest, Bob's excitement levels etc., as compared to the areas of interest, excitement levels etc., of a friend such as Alice.
[269] An additional example advantage is deriving the most relevant output media content while minimizing use of processing and storage resources. By using reduction at various stages of combination, as exemplified in Figs. 3-10, in some examples the data entities, used and/or created at various steps of the process of identifying candidate output media contents, can be kept relatively smaller in size. The use of relatively smaller data entities at a next step of the process can in turn reduce, in a relative way, the processing resources required for the next step.

[270] The presently disclosed subject matter discusses generation of output media content(s), based on, for example, input media content, degree of excitement data and moments in time. However, in some examples, the concepts and methods disclosed can be generalized. Similar methods can accept and/or function on, for example, any kind of time-bound data sets. Instead of determining moments of excitement, so as to determine a specific output data, for example, the invention can perform any type of detection on a second data set ("data set 2", exemplified herein by degree of excitement data from IoT) and create a respective compilation from a first data set ("data set 1") with a multiplicity of potential material or information from which to select, originating from multiple sources (exemplified herein by input media contents). Thus, in some examples, the system can be used for arbitrary context-sensitive multi-instance relation-aware data compilation. "Context-sensitive" may in some examples refer to sensitivity to the environment of the data, e.g. date and time of day, as exemplified herein. In other examples, the sensitivity can be, as non-limiting examples, to parameters other than time, e.g. to the season, weather, location, state of health of an individual, etc. "Multi-instance" may in some examples refer to the presence of multiple instances of an entity. An example from the present disclosure is the multiple users, and the ability to create a relevant compilation for Bob based on information associated with his friends and other contact people. The sources of material of the first data set, the items of material themselves, and the portions of the items, are selected based on the second data set, and its sensitivity to contexts derived in some cases from additional data (exemplified herein by event data and user profile data). Relations between all types of entities (exemplified herein by events, media IDs, IoTIDs, users etc.) can be derived (for example using KVW data structures) to enable the selection based on relevance.
[271] Some demonstrative implementations may be used, for example, to provide a core engine for user profiling, targeted advertisements/campaigns and promotions, to orchestrate an integrated consumer experience, increase consumer engagement, promote tailored content through heterogeneous distribution channels, turn intelligence into action, and action into results, with embedding of advanced IoT analytics in media, to facilitate the expansion of the diversified user base and the online community, and for any other use.
[272] Some demonstrative embodiments may be used, for example, in areas benefitting from context-sensitive multi-instance relation-aware data compilation, and may be geared to other scenarios where personalized content for a user may be created, for context targeting user groups (instead of individual users) and/or any other areas of use cases.
[273] Thus, in some examples, the system disclosed e.g with regard to Fig 1 may he referred to as a computing system configured to create a compilation output from a first data set , the computing system comprising a processing circuitry and configured to: receive an indication to generate a compilation;
receive, from the at least one data source, the first data set, comprising a time associated with first data of the first data set, wherein the first data set is relation-aware; receive, from the at least one data source, a second data set, comprising a time associated with second data of the second data set, wherein the second data set is relation- aware;
identify at least one candidate compilation output, wherein the identifying comprises combining and reducing data of at least the first data set and the second data set,
wherein the combining and reducing data is based at least partly on a degree of relationship of the time associated with the first data and the time associated with the second data,
wherein the at least one candidate compilation output includes at least a portion of the first data set;
generate the at least one compilation output, wherein at least one compilation output media content includes at least a portion of the at least one candidate compilation output; and
provide the user with the at least one compilation output.
[274] Such a system can in some examples utilize methods disclosed herein, including but not limited to the use of KVW data structures such as disclosed herein; setting weight values by at least one of fusing the weight values of records of the new data entities, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging; and normalizing the weight values, wherein the sum of normalized weight values is equal to 1.
[275] Similarly, it should be noted that methods such as the examples disclosed with regard to Figs. 3 to 10 consider in some cases matching of shared time intervals (that two intervals being compared start and end at the same date and time). In some cases they consider overlapping of time intervals, whether full overlap (one interval contains the other, e.g. 9:00 to 10:00 is within 9:00 to 11:00) or partial overlap (e.g. comparing 9:00 to 10:00 with 9:30 to 10:30). In other example applications, the concept of partial overlap may be relevant, even when not in the context of time. E.g. if record one is associated with red, blue and yellow, and record two is associated with red, yellow, green and orange, there is a partial overlap (red and yellow). In other example applications, on the contrary, a criterion for "matching", or of selection of a relationship between two entities, may be lack of overlap (if Harry is sitting in seat 51, and Iris is sitting in seat 63, that may be considered a positive result).
[276] As described above, in non-limiting examples of some aspects of the presently disclosed subject matter, the computing system may be configured to generate one or more output media contents associated with an event of interest to the user. In some examples, in order to provide the output media content 114 the computing system is configured to generate a data structure, e.g. a data entity, which may be used to generate the output media content. A method and a system for generation of the data structure will be described below with reference to Figs. 11 and 12. Non-limiting examples of this data structure, and its uses, include Tables 1-26 described above, relating to the example application of generating output media content.
[277] In some examples, the data structure(s) can include some or all of the following fields:
• one or more entity fields,
• a keyword field,
• a value field,
• a weight field.
[278] In some examples, this data structure may also be referred to herein as a Keyword-Value-Weight (KVW) data structure. A system may include one or more of such data structures. In the example of Tables 1-26, numerous instances of such a data structure exist.
[279] These fields may be grouped together within a record or entry of the data structure. The combination of the keyword and value fields may be referred to herein also as the KV fields or a Keyword-Value pair. The combination of the keyword, value and weight fields may be referred to herein also as the KVW fields. Note also that one non-limiting example of records in the data structure is rows, such as exemplified in Tables 1-26. One non-limiting example of fields in the data structure is columns, such as exemplified in Tables 1-26.
[280] Two non-limiting examples of a KVW data structure with one entity type are Table 3, DB_EVENTMETADATA, where the entity value is "Event", and Table 4, DB_USERPROFILE, where the entity value is "User". A non-limiting example of a KVW data structure containing more than one entity field is DB_IOTMETADATA, Table 2, where at least the two fields IoTID and User may be entity fields. Note that entity fields of different KVW data structures may be, in a particular system, of different entity types, and not of the same entity type. One non-limiting example of this is user, event, IoTID, MediaID etc. that appear in different tables of Tables 1-26 that support output media content generation.
[281] A keyword value in the keyword field can indicate a feature (e.g. Team) associated with the one or more entity values, and a value in the value field can be associated to the keyword value (e.g. Team = Moscow, Istanbul). Non-limiting examples of keyword and value pairs are those of Table 4: keywords include Vertical (with values Fashion and Sport), Audiolength (with values 20 sec, 40 sec), and Friends (with values Alice, Bob, Carl).
[282] The weight field can include a weight value, or weight metric, to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, e.g. as described below. In some example cases, the weight value may indicate the degree of relation of the one or more entity fields to the combination of the keyword field and the value field. For example, in Table 4, the degree of relation between the entity "Bob" and the keyword-value pair "Team + Moscow" is indicated by weight 0.4.
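By way of illustration only, the record structure described above might be sketched as follows. This is a minimal sketch, not code from the disclosure; the field names and the helper function are illustrative assumptions. The example row reproduces the Table 4 relation between the entity "Bob" and the keyword-value pair Team + Moscow with weight 0.4.

```python
# Illustrative sketch (not from the disclosure) of one KVW record:
# one or more entity fields, plus Keyword, Value and Weight fields,
# grouped together within a single record.
def make_kvw_record(entities, keyword, value, weight):
    """Build one KVW record; `entities` maps entity-field names to values."""
    record = dict(entities)
    record.update({"keyword": keyword, "value": value, "weight": weight})
    return record

# Example row in the spirit of Table 4 (DB_USERPROFILE): the degree of
# relation between entity "Bob" and the pair Team + Moscow is 0.4.
row = make_kvw_record({"user": "Bob"}, "Team", "Moscow", 0.4)
```

In a relational implementation, each such record would be a row and each field a column, as exemplified by Tables 1-26.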
[283] In some demonstrative embodiments, a KVW data structure may be initialized with only three weight values, for example for simplicity reasons. The possible weight values may include 0 for no relation of the data applied by the row, 0.5 for medium relation of the data applied by the row, and 1 for high relation of the data applied by the row. In some other demonstrative embodiments, other or more values may be applied to the weight value in order to indicate more degrees of relation of the data associated with a record and/or with a plurality of records.
[284] In some examples, a KVW data structure may not contain an entity field. For example, EVENTKEYS, Table 5, contains only the KVW fields, without an entity field.

[285] In some examples, keywords can be used to establish relationships between the entity fields of different KVW data structures. Each entity, such as a person or event, can be associated with a set of keywords, where multiple associations of the same keyword (possibly with different values and/or weights) to one entity is explicitly allowed. As one non-limiting example, the entity value Bob can be associated twice with the same keyword Friend - once with keyword-value = Friend + Alice, and once with keyword-value = Friend + Carl. For each instance of a keyword associated to an entity, there is a value for that keyword and a respective weight associated with the keyword-value pair.
[286] One example feature of a KVW data structure is that, in some examples, it can be generated from portions of input data such as that in other existing data structures. This generation may be done according to one or more data aggregation methods. In some examples, data aggregation may include one or more of normalization, fusion and merging. As disclosed further herein, normalization may involve summing weights, fusing may involve multiplying weights, and merging may involve addition of weights or weighted averaging of values.
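The three aggregation primitives named above can be sketched under the interpretation given here (fusing multiplies weights, merging adds weights or takes a weighted average of values, normalization scales weights to sum to 1). The function names are illustrative, not taken from the disclosure; the 20 sec / 40 sec weighted-average example mirrors the Table 13 discussion below, with illustrative weights chosen to reproduce the 28 sec result.

```python
# Hedged sketches of the aggregation primitives; names are illustrative.
def fuse(weights):
    """Fusing: multiply the weight values of the records being combined."""
    result = 1.0
    for w in weights:
        result *= w
    return result

def merge_add(weights):
    """Merging by addition: sum the weight values."""
    return sum(weights)

def merge_weighted_average(values, weights):
    """Merging values by weighted averaging (e.g. 20 sec and 40 sec -> 28 sec)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def normalize(weights):
    """Normalization: scale the weights so that their sum equals 1."""
    total = sum(weights)
    return [w / total for w in weights]
```

For example, `merge_weighted_average([20, 40], [0.6, 0.4])` gives 28, and `normalize([1, 1, 2])` gives weights summing to 1.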
[287] One example of such a data aggregation method is combining at least two other data structures, also referred to herein as source (or input) data structures. In some examples, one or both of the source data structures are themselves KVW data structures. For example, Table 14 USERS_MEDIAMETADATA is generated using Table 4 and Table 1. Table 14 is in turn used, together with Table 3, to generate new Table 15-1 USERS_MEDIAMETADATA_EVENTMETADATA. In this example, all of the tables are KVW data structures - both the input Tables 1, 3 and 4, and also the output tables generated, Tables 14 and 15-1. Similarly, Table 15-1 may in some examples be generated by a combination of the three Tables 1, 3 and 4, and thus is an example of generating a KVW data structure by combining more than two source data structures. Note also that both Tables 14 and 15-1 are created by joining the relevant source data structures.
[288] Another example feature of a KVW data structure is that, in some examples, it is capable of being utilized to generate a new data structure, by combining the KVW data structure with at least one other source data structure. The new data structure may also be referred to herein as a result (or output) data structure. In some examples, the result of this combination is itself a KVW data structure. In one example, both source data structures are KVW data structures, and the combination is performed on the basis of the combination of keyword fields and value fields of the source data structures. One example of this is combining Tables 1 and 4 to generate resulting Table 14.
[289] In other examples, a KVW data structure may be combined with at least one other source data structure to generate a modified version of a data structure. In one example, both source data structures are KVW data structures, and the combination is performed on the basis of the combination of keyword fields and value fields of the two or more source data structures. In some examples, the modified data structure is itself a KVW data structure. As one non-limiting example, KVW data structure Table 18 IOTMETADATA_FRIENDS is a modified version of the same IOTMETADATA_FRIENDS (Table 17), generated by combining Table 17 with Table 3. In some examples, the other source data structure, the new data structure, or the modified data structure, is not a KVW data structure. Such a structure may be referred to herein also as a "non-KVW data structure". Non-limiting examples of non-KVW data structures include Table 7 and Table 12.
[290] In some examples, the KVW data structure can be generated by merging two or more records of other source KVW data structures. In some example cases, the merging may be performed on the basis of the combination of the keyword field and the value field. In some examples the weight value for the record of the result KVW data structure can be set by multiplying the weight values of records of the other KVW data structures. As a non-limiting example, the KVW data structure USERS_MEDIAMETADATA (Table 14) is generated by merging Table 1 and Table 4, on the basis of the Keyword and Value data fields, and multiplying weights.
[291] Note that in that example, the resulting Table 14 contains entity fields of both source data structures - both the User field of Table 4, and the MediaID, Type and Time fields of Table 1 - in addition to the KVW fields.
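A combination step of this kind might be sketched as follows: two KVW tables are joined on matching keyword-value pairs, the entity fields of both sources are carried into the result, and the result weight is set by multiplying the source weights, in the spirit of the Table 1 + Table 4 -> Table 14 example. The field names and sample rows are illustrative assumptions, not rows taken from the tables.

```python
def combine_kvw(table_a, table_b):
    """Join two KVW tables (lists of dict records) on matching
    (keyword, value) pairs. Result records carry the entity fields of
    both sources; the weight is the product of the source weights."""
    result = []
    for a in table_a:
        for b in table_b:
            if a["keyword"] == b["keyword"] and a["value"] == b["value"]:
                merged = {**a, **b}                     # entity fields of both
                merged["weight"] = a["weight"] * b["weight"]  # fuse weights
                result.append(merged)
    return result

# Illustrative rows in the spirit of Tables 4 and 1 (weights assumed):
users = [{"user": "Bob", "keyword": "Team", "value": "Moscow", "weight": 0.4}]
media = [{"media_id": "Cam2", "keyword": "Team", "value": "Moscow", "weight": 1.0}]
joined = combine_kvw(users, media)
```

Note that a record whose KV pair has no match in the other table simply does not appear in the result, which is the data-reduction behavior discussed in paragraph [296] below.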
[292] In some examples, the entity types of the entity fields of the other, source, KVW data structures, which were used to generate a third KVW data structure, are not the same. The abovementioned Tables 4 and 1, used to generate Table 14, are a non-limiting example of this.
[293] In some examples, the weight value of a result KVW data structure may be set by merging the weight values by weighted averaging. In some examples, the weight value of a KVW data structure may be set by merging the weight values by averaging. As a non-limiting example of this, Table 13-1 is generated from Table 13 by averaging of the weight values corresponding to Keyword (Type) = "Audio". Note that Tables 13-1 and 13 also exemplify another example feature of KVW data structures - the possibility of setting a Value Field value by merging the values of a Value Field (in this case the field "Size"), by weighted averaging of the Value Field values corresponding to Keyword (Type) = "Audio" - where 20 sec and 40 sec gave a weighted average value of 28 sec, based on the Weight values.
[294] In some examples, the weight value may be set, when merging two or more records of the KVW data structure, by summing or adding the weight values of two or more records of the other KVW data structures. As one non-limiting example, Friends_Event_Keys_Weights_Friendweights, Table 11, was processed by merging records with a matching combination of the keyword field and the value field from Friends_Event_Keys_Weights_Friendweights, Table 10, and summing the respective weights.
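That merging step might be sketched as follows: records sharing the same keyword-value pair are merged, their weights summed, and entity fields optionally dropped so that only the KVW fields remain, as in the Table 10 -> Table 11 step described in paragraph [297] below. The sample names and weights are illustrative, not taken from the tables.

```python
def merge_records_by_kv(table, drop_fields=()):
    """Merge records of a KVW table (list of dicts) that share the same
    (keyword, value) pair, summing their weights. Entity fields listed in
    `drop_fields` (e.g. "user") are removed, leaving only the KVW fields."""
    merged = {}
    for rec in table:
        key = (rec["keyword"], rec["value"])
        if key in merged:
            merged[key]["weight"] += rec["weight"]   # merge by addition
        else:
            merged[key] = {k: v for k, v in rec.items() if k not in drop_fields}
    return list(merged.values())

# Illustrative rows (weights assumed): two users related to Team=Moscow.
rows = [
    {"user": "Alice", "keyword": "Team", "value": "Moscow", "weight": 0.3},
    {"user": "Carl",  "keyword": "Team", "value": "Moscow", "weight": 0.2},
]
collapsed = merge_records_by_kv(rows, drop_fields=("user",))
```

The two input records collapse into a single KVW record for Team=Moscow whose weight is the sum 0.5, and the entity field "User" no longer appears.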
[295] In some examples, the KVW data structure can be generated by merging two or more data fields of the source data structure or structures, and setting the weight value by multiplying the weight values of the data fields. As one example, MEDIA_SELECTED_NORM, Table 26, is generated by merging data fields (columns) Relevance and Weight of SELECTED_MEDIA (Table 25), by multiplying the values of the two data fields.
[296] In some examples, the generation of the KVW data structure can also involve reducing data. For example, in the generation of Table 14 discussed above, the record of Table 1 corresponding to Keyword="Discipline" and Value="Volleyball" does not appear in the resulting Table 14, since Table 4 has no Keyword-Value pair corresponding to that particular combination. In another example, modified data structure EVENTKEYS, Table 8, was generated based on an earlier version of EVENTKEYS, Table 5, by combining Table 5 with Table 6 and removing records of Table 5 (City=Berlin and Team=Istanbul) that had no Keyword-Value pair match in Table 6.
[297] An additional example of reducing data is the generation of Table 11 from Table 10, discussed above, where records with matching keyword-value pairs were merged, and in addition the entity field "User" was then removed, leaving a data structure containing only the KVW fields. By contrast, the generation of Table 12 from Table 10 FRIENDS_EVENT_WEIGHTS_FRIENDS_WEIGHTS involved a different merging method that involved reducing data, one based on matching values of the entity field "User". Table 12 contains only the entity and weight fields, and is a non-limiting example of generating a data structure that is not a KVW data structure, based on a KVW data structure.
[298] The generation of modified KVW data structure Table 15-2 (Users_MediaMetadata_EventMetadata), based on the source data structures Table 15-1 (USERS_MEDIAMETADATA_EVENTMETADATA) and Table 12 (Friend_Event_Weights_Friend_Weights), exemplifies several of the example methods for KVW data structures disclosed above: generation of a modified data structure, removal of an entity field ("User") from the result data structure, and both multiplying of weight values and summing of weight values.
[299] In some example cases, as part of the generating of a KVW data structure, the weight value may then be normalized. For example, the sum of the normalized weight values of the rows may be equal to 1, e.g. as described below. In some demonstrative embodiments, the weight value may be normalized at least by one of fusing the weight values, multiplying the weight values, merging weight values by addition, or merging weight values by weighted averaging, e.g., as described below. One non-limiting example of the utilization of normalization is the generation of Table 9, FRIENDS_EVENT_KEYS_WEIGHTS. In some examples, normalization is performed once, after the desired KVW data structures have been generated. In other examples, normalization is performed at each stage, after the generation of each intermediate KVW data structure, in an example process that yields one or more ultimate desired KVW data structures as the final output.
[300] In some examples, new data structures, or modified data structures, may be generated by processing one source KVW data structure, for example by merging records and, in some cases, reducing data - without performing combination with another source data structure. As one example, records with the same value of a KV-pair may be merged, and the entity fields removed from the data structure.
[301] Note that one or more of the methods disclosed herein may be used for generating a KVW data structure based on one or more other data structures. One or more of the methods disclosed herein may also be used for generating new data structures, or modified data structures, based on the KVW data structure, in some cases by combining the KVW data structure with at least one other data structure.
[302] Attention is turned first to Fig. 11, which is a schematic illustration of a block diagram of a computing system 1100 for generating one or more data structures, in accordance with some demonstrative embodiments. For example, the computing system 1100 may include and/or be included in the content creation system 100 and/or in computing system 1300, e.g. as described below.
[303] In some demonstrative embodiments, computing system 1100 may include an I/O (Input/Output) interface 1110 configured to gather and distribute data from and to one or more data processing devices such as, for example, sensors, databases, social networks, the Internet or the like. In some cases, the interfaces to the databases shown in Fig. 1, and/or the interfaces to the inputs and outputs 112, 113 and 114, may be an example of I/O interface 1110.
[304] In some demonstrative embodiments, computing system 1100 may comprise processing circuitry 1115. Processing circuitry 1115 may comprise a processor 1120 (also referred to herein as a hardware processing circuitry) and memory 1140. Processing circuitry 1115 is shown in Fig. 11 as a broken line. In some example cases, processing circuitry 1115 may be processing circuitry 107.
[305] In some demonstrative embodiments, processing circuitry 1115 may comprise a hardware processing circuitry 1120. Hardware processing circuitry 1120 may be referred to herein interchangeably as processor 1120. For example, the hardware processing circuitry may be hardware processing circuitry 110 and/or hardware processing circuitry 1310 (see Fig. 13). Hardware processing circuitry 1120 may be configured to generate a data structure 1130, e.g. a data entity, based on portions of the data according to one or more data aggregation methods. Non-limiting examples of such methods are described further herein.
[306] In some demonstrative embodiments, computing system 1100 may include a memory 1140, which may be configured to store the one or more data structures 1130, for example KVW data structures, as described herein. For example, memory 1140 may be memory 108, and/or may be memory 1308 of Fig. 13, e.g. as described below.
[307] Some example functions of the components described in Fig. 11 are now disclosed.
[308] In some demonstrative embodiments, the processor 1120 may be configured to generate the KVW data structure 1130, by one or more of the methods disclosed herein.
[309] In some demonstrative embodiments, computing system 1100 may be configured to generate one or more data structures, e.g. data entities, including corresponding Keywords, Values, and Weights, which may be referred to as KVW data. For example, the KVW data may be filled in manually. Other example methods of populating KVW fields, including initially assigning associated weights, are disclosed above regarding generation of an output media content. Fixed weight values may provide better-than-state-of-the-art performance of the system, e.g. by setting the weight of the user to 100 and all the other weights (Wi) to one.
[310] In some demonstrative embodiments, the respective KVW data may be filled in manually, for example when the number of users, the number of events and the respective media at a given time is relatively small and/or limited.
[311] In some other example embodiments, semi-automatic and/or automatic KVW data generation may be achieved by applying content analytics, e.g. on Twitter feeds, transitive relations, e.g. a video on the event may inherit KVW data from the event, conversion of available "regular" metadata, social analytics on the user profiles, etc.
[312] Turning to Fig. 12, there is presented a schematic illustration of a flow chart of a method of generating a data structure, in accordance with some demonstrative embodiments. In some demonstrative embodiments, these functions may be performed by components of Fig. 11.
[313] In some demonstrative embodiments, computing system 1100 may gather from, and distribute data to, one or more data processing devices (text block 1210), e.g., as described herein. In some example cases, the one or more data processing devices may include the content creation system 100, databases, e.g., databases 120, 122, 124, 126, 128 and 130, servers, mobile devices, computers, cars, cell phones or the like. In some examples, this may be done by I/O interface 1110.
[314] In some demonstrative embodiments, the one or more data structures may include, for example, tables, lists, arrays, or the like, which may be generated by the processor 1120 of computing system 1100.
[315] In some demonstrative embodiments, the processor 1120 may be configured to generate the data structure based on portions of the gathered data according to one or more data aggregation methods. For example, the data structure, e.g., Tables 1-26, may include one or more records, and one or more data fields. For example, the one or more data fields include at least one of the following fields: Keyword, Value and Weight, and the data of the fields may be stored at the one or more records (text block 1220). One non-limiting example of such generation may be method 300, where Table 12 was generated based on data gathered from Tables 3 and 4. In some examples, the generated data structure may be stored in memory 1140.

[316] In some demonstrative embodiments, the processor 1120 may set, at each record, the weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field (text box 1230). Example methods of setting the weight value are disclosed herein.
[317] In some demonstrative embodiments, hardware processing circuitry 1120 may normalize the weight value at each record by equalizing a sum of the weight values of records, e.g., all the records, of the data structure to 1 (text box 1240).
[318] In some demonstrative embodiments, the weight field may include a weight metric. For example, the weight metric may be applied to determine values for different weights. For example, the weight metric may apply 3 values to a weight value: "0" may represent no influence, "0.5" may represent medium influence, and "1" may represent strong influence.
[319] For example, the weights may be influenced by factors applied to various data entities in the data aggregation processes throughout a workflow of generating the data structure. The weights may be used to weight the degree of relation between the one or more entities of the content creation system 100. For example, the weights may be influenced by social relationships between the user and his friends in generation of the content. For example, a close friend may have a higher influence than a not-so-close friend and/or a friend of a friend.
[320] In some demonstrative embodiments, the weights may be influenced by factors such as, for example, IoT data, e.g., physiological data and/or the like.
[321] In some other demonstrative embodiments, some other weight metrics, which may include more than 3 values and/or less than 3 values, may be applied in order to determine the values for the Weight field, e.g., Weight column and/or Weight attribute.
[322] Example methods of populating KVW fields, including initially assigning associated weight, are disclosed above regarding generation of an output media content.
[323] In some other demonstrative embodiments, keywords may be used to establish relationships between one or more entities. For example, an entity of the one or more entities, e.g., each entity, may include a person or an event, and may be associated with a set of keywords. A keyword with multiple associations, e.g., the same keyword with different values and/or weights, related to one entity, may be used. For example, each instance of a keyword may be associated with an entity; the keyword may include a value related to the entity, and a respective weight may relate to the entity.

[324] Some non-limiting examples of KVW data structures appear in Tables 1-26, in the context of the generation of the output media content. Example entity fields described in those tables include events, users, MediaID, IoTID, Type and Time. Example Keywords described in those tables involve subjects such as: event location (City); areas of interest of a user (Vertical, Discipline, Team, Designer); media preferences (Mediatype, Audiolength, Textlength); social relationships of users (Friend). Values include, for example, Fashion and Sport for the keyword Vertical, Basketball and Volleyball for the keyword Discipline, and Madrid and Moscow for the keyword Team. These examples are fields which are relevant to the application of generating the output media content.
[325] Similarly, the methods described with reference to Figs. 3-10 exemplify several methods of generating data structures per method 1200. For example, in method 500, existing table USERS_MEDIAMETADATA_EVENTMETADATA is modified to yield Table 15-2, by merging the original version of USERS_MEDIAMETADATA_EVENTMETADATA (Table 15-1) with Table 12. In the example of method 700, Tables 16 and 20 are joined, to yield a new table, Table 21. All of the methods disclosed herein with regard to KVW data structures may be utilized, in some examples, to implement the methods of content generation disclosed with regard to Figs. 3-10, 14 and 15.
[326] Some possible example advantages of using a data structure such as a KVW data structure, as described in the presently disclosed subject matter, will be mentioned. In some cases, for example, two or more tables may contain different information, and may be based on different types of entities. For example, one table may relate to people, e.g. users of a service, while another may relate to media content items, such as video and audio clips generated for example by different content providers. It may still be possible to manipulate or process such tables, which contain different information, e.g. by comparison, merging and other methods, by making use of keywords and values and possibly their combinations to match between records of dissimilar entities (e.g. user and media content item) on different tables. Thus, it may be possible to determine a weight value relating, for example, a user Bob to media content item Cam2 on a particular date, and thus to provide Bob with the most appropriate media content item, by making use of KVW data structures.
[327] Similarly, use of KVW data structures in computerized systems may in some examples allow inference and deduction of relationships that were not obvious. As one example, use of a KVW data structure may allow easy detection, by straightforward manipulation based on KV fields, that if Bob is related to Alice, and Alice to Carl, Bob has a relationship to Carl of a certain degree (weight). As another example, a KVW data structure may enable definition of a relationship of the entity value User=Bob both to his Friend=Alice (keyword-value pair), as well as to the keyword-value pair "Friend=Bob" (i.e. to himself). That is, the value in the "value" field of a record can be equal to the entity value of an "entity" field. This structure thus enables comparative weighting of relationships of an entity to itself, as well as to other entities, when determining the weights to assign various attributes of each. Another relationship that can be inferred in a simple manner using a KVW data structure is that between two different entities. For example, entity User with value Bob, in one data structure, can be related to an entity IoTID with value BR4, since both have defined a degree of relationship to the KV pair Team=Moscow, and in some examples the strength of the relationship between these entities can be derived straightforwardly by multiplying the weights of the two relevant records. In these example cases, the inference and deduction of relationships may allow providing useful outputs (e.g. providing Bob with the most appropriate media content item, based on interests of a wider circle of friends and friends of friends, and weighting in the correct proportion Bob's own interests and those of his friends) that in some cases may not have been possible using other data structures.
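The User=Bob / IoTID=BR4 inference described above might be sketched as follows: two KVW data structures both contain a record for the KV pair Team=Moscow, and the strength of the inferred relationship between the two entities is the product of the two weights. The 0.8 weight for BR4 is an illustrative assumption; the 0.4 weight for Bob follows the Table 4 example.

```python
def inferred_strength(table_a, table_b, keyword, value):
    """Infer the strength of a relationship between entities in two KVW
    data structures that both relate to the same keyword-value pair, by
    multiplying the weights of the two relevant records."""
    wa = next(r["weight"] for r in table_a
              if r["keyword"] == keyword and r["value"] == value)
    wb = next(r["weight"] for r in table_b
              if r["keyword"] == keyword and r["value"] == value)
    return wa * wb

# Entity User=Bob and entity IoTID=BR4 both relate to Team=Moscow:
users = [{"user": "Bob", "keyword": "Team", "value": "Moscow", "weight": 0.4}]
iot   = [{"iot_id": "BR4", "keyword": "Team", "value": "Moscow", "weight": 0.8}]
strength = inferred_strength(users, iot, "Team", "Moscow")  # 0.4 * 0.8
```

The same pattern applies to transitive friend relations (Bob-Alice, Alice-Carl): chaining weight multiplications along the shared KV pairs yields a degree of relationship between entities that never appear in the same record.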
[328] In some examples, new data structures may be created, data structures may be combined etc., new relationships may be recorded as data, and weights of relationships may be modified, all based on the use of keyword and value fields and weights, while using as input existing database tables (for example). In some examples, this provides an example advantage, in that instead of having to develop or obtain sophisticated algorithms or analytics, the desired result may be provided by simple combinations of structures such as database tables, using for example standard SQL queries to perform standard actions such as e.g. joins of tables, merger of records and duplication removal, based on keyword and value fields. This in turn may, in some examples, enable simpler and thus more robust computer programs, with a lower chance of bugs. Similarly, in some cases re-use of existing components of code may be possible across different software components, for example the engines 155, 160, 15 etc., since the engines all process data structures with common structural components. This again may also decrease the bugs in these software components. In some examples, a relatively more bug-free running of the computer programs allows the computer to generate the correct outputs for given input data, and thus perform the computer's task in a more accurate manner. In this manner, computer performance may be improved. Similarly, in some examples the decrease in the number of bugs decreases the chance that the program, and/or the computer, will crash. In this manner, computer reliability may be improved.
[329] In some examples, use of KVW data structures can enable storage space savings. This may in some examples apply both to data structures that are input to a process (e.g. Tables 1-4), as well as to those data structures generated while performing an application task such as output media content creation (for example Tables 5-26).
[330] In some examples the savings in space can be achieved as a result of combination. As one non-limiting example of this, the data structure IOTMETADATA_FRIENDS, Table 17, is decreased in size to Table 18, on the basis of a combination with Table 3 which removes a record that is not of interest. The same data structure is then converted to Table 19, on the basis of a combination with Table 3, and again the superfluous record corresponding to "Team=Madrid" is removed.
[331] In some examples the ability to merge records, and/or to merge data fields, described herein, can decrease the storage space of a data structure. One non-limiting example is merging the records of different entities based on identical KV-pair values, to eliminate unneeded duplication. One example of this is the conversion of Table 15-1 to a smaller version Table 15-2 of the same data structure, where three records for "Cam1" that share the KV-pair "Team=Moscow" were compressed into one record, by eliminating the field User which is not needed for the remainder of the particular task.
[332] Note that in some examples the weight value set by these methods reflects the weight values of the source records and/or data structures, so sufficient information remains in the smaller resultant data structure. In both combination and merger, the storage space occupied by a particular data structure is decreased, while maintaining in that data structure all information that is relevant to performance of the relevant task. By comparison, in some examples of a system that do not make use of the Keyword-Value-Weight fields, larger data structures may have to be maintained for more stages of performance of a task, by comparison to a case of use of a KVW data structure.
[333] An additional advantage, in some examples, is that a KVW-based solution such as disclosed here may in some cases be stateless, involving only combinations of data structures and merging. This may in some examples reduce the inter- and intra-component constraints to the minimally required output-input-chaining dependencies. This can in some examples allow the maximum degree of scalability and execution reordering to optimize the workflow.
[334] An additional advantage, in some examples, is that merging of records and fields, and/or use of normalization, at various stages of manipulating KVW data structures, and not only after all calculations have been performed, may simplify the calculation load, and may require the multiplication of values associated with a relatively smaller number of records. As one example, consider two tables of 100 records each, where the size of each table can be reduced to 70 records by merging rows. Combining two tables of 100 records each requires more computational effort than combining two tables of 70 records. In some examples, this may decrease the load on the processor. In some other demonstrative embodiments, instead of using keywords in relational databases, it is possible to use other means such as, for example, semantic databases and respective inference methods, or any other means.
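Why merging before combining reduces computation can be sketched as follows. This is a hypothetical illustration, not the disclosed implementation: the field names and the rule of multiplying weights on matching Key-Value pairs are assumptions for exposition.

```python
# Illustrative sketch: combining two KVW tables here means matching
# Key-Value pairs and multiplying weights, so the work grows with the
# number of record pairs examined. Shrinking each table (e.g. from 100
# to 70 records) by merging rows shrinks that pair space before the
# combination is performed.

def combine_kvw(table_a, table_b):
    """Join two KVW tables on matching (key, value) pairs, multiplying weights."""
    out = []
    for a in table_a:
        for b in table_b:
            if (a["key"], a["value"]) == (b["key"], b["value"]):
                out.append({"key": a["key"], "value": a["value"],
                            "weight": a["weight"] * b["weight"]})
    return out

a = [{"key": "Team", "value": "Moscow", "weight": 0.6},
     {"key": "Team", "value": "Madrid", "weight": 0.4}]
b = [{"key": "Team", "value": "Moscow", "weight": 0.5}]
combined = combine_kvw(a, b)
# Only the shared ("Team", "Moscow") pair survives, with weight 0.6 * 0.5.
```

A record removed by an earlier combination or merger (such as the superfluous "Team-Madrid" record above) never participates in later pairwise work.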
[335] Turning to Fig. 13, it presents an illustration of a block diagram of a machine in the form of a computing system 1300, which includes a set of instructions that, when executed, can cause the machine to perform any one or more of the methodologies discussed hereinabove, in accordance with some demonstrative embodiments. For example, the machine may operate as a standalone device.
[336] In some demonstrative embodiments, computing system 1300 may include and/or be included in computing system 105 (Fig. 1) and/or computing system 1100 (Fig. 11). Fig. 13 may in some example cases provide more detail on some possible implementations of components described with regard to Figs. 1 and 11. Note also, for example, that the processor 110 and memory 108 of Fig. 1 are shown at a high level. On the other hand, Fig. 1 provides details on example modules that may in some cases reside in processor 110. Fig. 1 also shows some specific example database functions that in some example cases may interact with processor 110.
[337] In some demonstrative embodiments, the machine may be connected, e.g. using a network, to one or more other machines. For example, in a networked deployment, the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, and/or as a peer machine in a peer-to-peer and/or distributed network environment.
[338] In some demonstrative embodiments, the machine may include a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. It should be understood that in other demonstrative embodiments, a device may include any electronic device that provides voice, video or data communication. Furthermore, while a single machine is illustrated, the term "machine" shall also be understood to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
[339] In some demonstrative embodiments, computing system 1300 may include a processing circuitry 1307. Processing circuitry 1307 may comprise a processor 1310 and a memory 1308. Processing circuitry 1307 is shown in Fig. 13 as a broken line.
[340] Processing circuitry 1307 may include a processor 1310. For example, processor 1310 may include a central processing unit (CPU), a graphics processing unit (GPU), and/or any other processing unit. Processor 1310 may be referred to herein also as hardware processing circuitry 1310. In some demonstrative embodiments, this may be hardware processing circuitry 110 or hardware processing circuitry 1120.
[341] In some demonstrative embodiments, the processing circuitry 1307 of computing system 1300 may further include a memory 1308 (shown as a broken line). In some demonstrative embodiments, this may be memory 108 or memory 1140. Memory 1308 may include one or more of the following: a main memory 1320, a static memory 1330 and a machine-readable medium 1380, which communicate with each other via a bus 1395. Note that a bus such as bus 1395 may exist also in the systems of Figs. 1 and 11, although it is not shown in those figures.
[342] In some demonstrative embodiments, computing system 1300 may include a machine-readable medium 1380, which may be configured to store one or more sets of instructions 1305, e.g. software, embodying any one or more of the methodologies or functions described herein, including those methods illustrated above.
[343] In some demonstrative embodiments, the instructions 1305 may also reside, completely or at least partially, within the main memory 1320, the static memory 1330, and/or within the processor 1310 during execution thereof by the computing system 1300. For example, the main memory 1320 and the processor 1310 also may constitute machine-readable media.
[344] The computing system 1300 may further include a video display unit 1350 (e.g. a liquid crystal display (LCD), a flat panel, a solid state display, a cathode ray tube (CRT)) and/or any type of display.

[345] In some demonstrative embodiments, computing system 1300 may include an input device 1360, e.g. a keyboard, a touch pad or the like; a cursor control device 1370, e.g. a mouse; a signal generation device 1390, e.g. a speaker and/or a remote control and/or the like; and a network interface device 1340 which may be operably coupled to a network 1345, e.g. a server, the Internet, a cloud and/or the like. In some demonstrative embodiments, some or all of these components 1350, 1360, 1370, 1380 and 1340 may be components of the I/O interface 1110.
[346] In some demonstrative embodiments, dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices, can likewise be constructed to implement the methods described herein. Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computing systems. Some embodiments may implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example system is applicable to software, firmware, and hardware implementations.
[347] In some demonstrative embodiments, the methods described herein are intended for operation as software programs running on a computer processor. Furthermore, software implementations can include, but are not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing, and may also be constructed to implement the methods described herein.
[348] In some demonstrative embodiments, the machine-readable medium 1380 contains instructions 1305, and/or receives and executes instructions 1305 from a propagated signal, so that a device connected to a network environment 1345 may send and/or receive voice, video or data, and may communicate over the network 1345 using the instructions 1305. For example, the instructions 1305 may be transmitted and/or received over network 1345 via the network interface device 1340.
[349] While the machine-readable medium 1380 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be understood to include a single medium or multiple media, e.g. a centralized or distributed database, and/or associated caches and servers, that may store one or more sets of instructions. The term "machine-readable medium" may also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
[350] The term "machine-readable medium" may accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical media such as a disk or tape; carrier wave signals such as a signal embodying computer instructions in a transmission medium; and/or a digital file attachment to e-mail or other self-contained information archive or set of archives considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.
[351] Figs. 1, 11 and 13 illustrate only generalized schematics of the system architecture, describing, by way of non-limiting example, some aspects of the presently disclosed subject matter in an informative manner only, for purposes of clarity of explanation. Only certain components are shown, as needed to exemplify the presently disclosed subject matter. Other components and sub-components, not shown, may exist.
[352] It will be understood that the teachings of the presently disclosed subject matter are not bound by what is described with reference to Figs. 1, 11 and 13.
[353] Each system component in Figs. 1, 11 and 13 can be made up of any combination of software, hardware and/or firmware, executed on a suitable device or devices, that perform the functions as defined and explained herein. Equivalent and/or modified functionality, as described with respect to each system component, can be consolidated or divided in another manner. Thus, in some examples of the presently disclosed subject matter, the system may include fewer, more, modified and/or different components, modules and functions than those shown in Figs. 1, 11 and 13. One or more of these components can be centralized in one location, or dispersed and distributed over more than one location. For example, for simplicity of exposition, the present disclosure describes the Orchestrator as calling all the engines. However, in other implementations, components can call each other directly. Each component in Figs. 1, 11 and 13 may represent a plurality of the particular component, possibly in a distributed architecture, which are adapted to independently and/or cooperatively operate to process various data and electrical inputs, and for enabling operations related to signal detection. In some cases multiple instances of a component may be utilized for reasons of performance, redundancy and/or availability. Similarly, in some cases, multiple instances of a component may be utilized for reasons of functionality or application. For example, different portions of the particular functionality may be placed in different instances of the component.
[354] The communication between the various components of Figs. 1, 11 and 13, in cases where the components are not located entirely in one location or in one physical component, can be realized by any signalling system or communication components, modules, protocols, software languages and drive signals, and can be wired and/or wireless as appropriate.
[355] Although the present specification describes components and functions implemented in the embodiments with reference to particular standards and protocols, the disclosure is not limited to such standards and protocols. Each of the standards for Internet and other packet-switched network transmission (e.g. TCP/IP, UDP/IP, HTML, HTTP) represents an example of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same functions are considered equivalents.
[356] Turning now to Figs. 14A-C, they illustrate one example of a data flow for generating output media, in accordance with certain demonstrative embodiments. In some example cases, this data flow may make use of the methods exemplified with reference to Figs. 3 to 10. In some cases, the data flow of Figs. 14A-C may be implemented with use of the systems 100 and/or 1300.
[357] The rectangles in Fig. 14 refer to procedures or methods, for example the methods described with reference to Figs. 3 to 10. Legend 1405 shows that each rectangle (block), marked with a reference numeral of the block, also lists a reference numeral of an example method, the relevant example figure, and the module that may in some examples perform the method. The figure also shows Tables, which may in some example cases be tables described herein with reference to Figs. 3 to 10. A brief description of each example table, as well as the table name from the above exposition, is also provided. These example tables represent data that may in some cases be output from one method, and may in turn be input to another method. The figure also shows databases, which may for example be those of Fig. 1. The arrows indicate example data flows, whereby data from databases and/or input tables are input to the methods, and whereby output tables are output by the methods.
[358] The order presented in Fig. 14 is a non-limiting example, presented for purposes of clarity of exposition. It will be readily apparent that other methods may be used, that methods may be combined and/or separated, that methods may be performed in an order different from that shown, and that tables other than those shown may be used to implement the overall method exemplified by Fig. 14. As one non-limiting example, in some implementations content preferences engine 165 and media selection engine 190 can be combined into one engine, and their methods combined.
[359] The example data flow of Fig. 14 may in some examples start with block 1430. This block may correspond to method 300 of Figs. 3a and 3b, exemplified by block 1530 of Fig. 15, and implemented in some examples by social links engine 170. Example inputs are event indication 112 and user indication 113, the Event Metadata Database 126 (Table 3) and User Profile Database 128 (Table 4). An example output is the FRIENDS_EVENT_WEIGHTS_FRIENDSWEIGHTS data entity (Table 12).
[360] The example data flow of Fig. 14 may in some other examples start with block 1440. This block may correspond to method 400 of Fig. 4, exemplified by block 1540 and implemented in some examples by content preferences engine 165. Example inputs are user indication 113 and User Profile Database 128 (Table 4). An example output is the PREFERRED_MEDIA data entity (Table 13-2).
[361] Block 1450 may correspond to method 500 of Fig. 5, exemplified by block 1550 and implemented in some examples by media handler 155. Example inputs are indication 112, indication 113, Table 12, Event Metadata Database 126 (Table 3), Media Metadata Database 120 (Table 1) and User Profile Database 128 (Table 4). An example output is the AVAILABLE_MEDIA data entity (Table 16).
[362] Block 1460 may correspond to method 600 of Fig. 6, exemplified by block 1560 and implemented in some examples by IoT data handler 160. In some examples, it can be performed before or in parallel with block 1450. Example inputs are indication 112, indication 113, Event Metadata Database 126 (Table 3), IoT Metadata Database 124 (Table 2) and User Profile Database 128 (Table 4). An example output is the FRIENDS_IOT data entity (Table 20).
[363] Block 1470 may correspond to method 700 of Fig. 7, exemplified by block 1570 and implemented in some examples by IoT-media association engine 175. Example inputs are Table 6 and Table 20. An example output is the data entity FRIENDSIOT_AVAILABLE_MEDIA (Table 21). Block 1480 may correspond to method 800 of Fig. 8, exemplified by block 1580 and implemented in some examples by moments determination engine 180. In some examples, it can be performed before or in parallel with block 1470. Example inputs are Table 20 and Raw IoT database 122. An example output is the MOMENTS data entity (Table 23).
[364] Block 1490 may correspond to method 900 of Fig. 9, exemplified by block 1590 and implemented in some examples by media selection engine 190. Example inputs are Table 21, Table 23 and Table 13-2. An example output is the SELECTED_MEDIA data entity (Table 25).
[365] Block 1495 may correspond to method 1000 of Fig. 10, exemplified by block 1595 and implemented in some examples by content creation engine 195. Example inputs are Table 25 and Table 13-2. An example output is the MEDIA_SELECTED_NORM data entity (Table 26).
[366] Fig. 15, disclosed further above, illustrates one example of a process for generating output media, in accordance with certain demonstrative embodiments. In some example cases, this process may make use of the methods described with reference to Figs. 3 to 10, based on data flows such as described with reference to Fig. 14. In some cases, the methods of Fig. 15 may be implemented with use of the systems 100 and/or 1300.
[367] As used herein, the phrase "for example," "such as", "for instance" and variants thereof describe non-limiting embodiments of the presently disclosed subject matter. Reference in the specification to "one case", "some cases", "other cases", "one example", "some examples”, "other examples" or variants thereof means that a particular described method, procedure, component, structure, feature or characteristic described in connection with the embodiments) is included in at least one embodiment of die presently disclosed subject matter, but not necessarily in all embodiments. The appearance of the same term does not necessarily refer to the same embodiment(s) or example(s).
[368] Usage of conditional language, such as "may", "might", or variants thereof, should be construed as conveying that one or more examples of the subject matter may include, while one or more other examples of the subject matter may not necessarily include, certain methods, procedures, components and features. Thus such conditional language is not generally intended to imply that a particular described method, procedure, component or circuit is necessarily included in all examples of the subject matter. Moreover, the usage of non-conditional language does not necessarily imply that a particular described method, procedure, component or circuit is necessarily included in all examples of the subject matter.
[369] It is noted that the teachings of the presently disclosed subject matter are not bound by the flow charts illustrated in the various figures. The operations can occur out of the illustrated order. Similarly, some of the operations or steps can be integrated into a consolidated operation or can be broken down to several operations, and/or other operations may be added. For example, operations 360 and 365 shown in succession can be executed substantially concurrently or in the reverse order. It is also noted that whilst flow charts are described with reference to system elements that realize them, such as for example processing circuitry 107, this is by no means binding, and the operations can be performed by elements other than those described herein.
[370] In embodiments of the presently disclosed subject matter, fewer, more and/or different stages than those shown in the figures can be executed. In embodiments of the presently disclosed subject matter one or more stages illustrated in the figures can be executed in a different order and/or one or more groups of stages may be executed simultaneously. As one non-limiting example, in some implementations the Orchestrator can call the social links engine 170 (method 300) and the content preferences engine 165 (method 400) in the opposite order, or in parallel.
[371] It is appreciated that certain embodiments, methods, procedures, components or features of the presently disclosed subject matter, which are, for clarity, described in the context of separate embodiments or examples, may also be provided in combination in a single embodiment or example. Conversely, various embodiments, methods, procedures, components or features of the presently disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
[372] It should also be noted that each of the figures herein, and the text discussion of each figure, describe one aspect of the presently disclosed subject matter in an informative manner only, by way of non-limiting example, for clarity of explanation only. It will be understood that the teachings of the presently disclosed subject matter are not bound by what is described with reference to any of the figures or described in other documents referenced in this application.
[373] It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
[374] It will also be understood that the system according to the invention may be, at least partly, implemented on a suitably programmed computer. Likewise, the invention contemplates a computer program being readable by a machine or computer for executing the method of the invention. The invention further contemplates a non-transitory computer-readable memory tangibly embodying a program of instructions executable by the computer for executing the method of the invention.
[375] The presently disclosed subject matter further contemplates a non-transitory computer readable storage medium having a computer readable program code embodied therein, configured to be executed so as to perform the method of the presently disclosed subject matter.
[376] In the claims that follow, alphanumeric characters and Roman numerals used to designate claim elements are provided for convenience only, and do not imply any particular order of performing the elements.
[377] It should be noted that the word "comprising" as used throughout the appended claims is to be interpreted to mean "including but not limited to".
[378] Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, defined in and by the appended claims.

Claims

1. A computing system configured to generate one or more output media contents for a user, the computing system comprising a processing circuitry and configured to: receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receive, from the at least one data source, degree of excitement data of at least one other user;
receive, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receive, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receive, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identify at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based at least partly on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the at least one candidate output media content includes at least a portion of said at least one input media content;
generate said at least one output media content, wherein at least one output media content includes at least a portion of said at least one candidate output media content; and
provide the user with the at least one output media content.
2. The computing system of claim 1, wherein the identifying at least one candidate output media content further comprises normalizing data.
3. The computing system of claim 1 or 2, wherein the generating of said at least one output media content comprises selecting the at least one output media content from the at least one candidate output media content using at least one output media selection criterion.
4. The computing system of any one of the preceding claims,
wherein the event metadata comprises a third indication of a degree of relation of the event to the at least one area of interest, the media metadata related to the at least one input media content comprises a fourth indication of a degree of relation of the at least one input media content to the at least one area of interest, and the degree of excitement metadata of the at least one other user comprises a fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest,
wherein the combining and reducing data is further based on a degree of relationship of the at least one area of interest of the user indicative of the event of interest, the first indication of a degree of relation of the user to the at least one area of interest, the second indication of a degree of a relation of the at least one other user to the at least one area of interest, the third indication of a degree of relation of the event to the at least one area of interest, the fourth indication of a degree of relation of the at least one input media content to the at least one area of interest, and the fifth indication of a degree of relation of the degree of excitement data to the at least one area of interest.
5. The computing system of any one of the preceding claims, further configured to receive, from the at least one data source, data indicative of a degree of relationship of the user to the at least one other user, wherein the identifying at least one candidate output media content further comprises combining and reducing the data indicative of a degree of relationship of the user to the at least one other user and the first data.
6. The computing system of any one of the preceding claims, wherein the degree of excitement data of the at least one other user is based on physiological data of the at least one other user.
7. The computing system of any one of the preceding claims, further configured
to receive, from the at least one data source, degree of excitement data of the user; and
to receive, from the at least one data source, degree of excitement metadata of the user;
wherein the identifying at least one candidate output media content further comprises combining and reducing the data of the degree of excitement metadata of the user and the first data.
8. The computing system of Claim 7, wherein the degree of excitement data of the user is based on physiological data of the user.
9. The computing system of any one of the preceding claims, wherein the user preference data associated with the user further comprises user media preference data, the user media preference data comprising at least one of media type preference and media size preference,
wherein the media metadata related to the at least one input media content comprises at least one of input media content type and input media content size, wherein identifying at least one candidate output media content further comprises combining and reducing the first data and the user media preference data, based on at least one of input media content type and media content size.
10. The computing system of any one of the preceding claims, wherein the at least one input media content comprise at least one media content gathered by the at least one data source from at least one of one or more social networks, one or more content platforms, one or more feeds from the Internet and one or more websites.
11. The computing system of any one of the preceding claims, wherein at least part of the first data comprises at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure, and
wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure.
12. The computing system of Claim 11, the combining and reducing data comprising setting the weight value by at least one of fusing the weight values of records, multiplying the weight values, merging weight values by addition and merging weight values by weighted averaging.
13. The computing system of Claim 12, the setting a weight value for a record of a data entity comprising normalizing the weight value, wherein the sum of normalized weight values is equal to 1.
14. The computing system of Claim 5, wherein the identifying comprises generating a data entity indicative of the association of the user and the at least one other user with the event of interest to the user, based on the degree of relationship of the user to the at least one other user.
15. The computing system of the previous claim, wherein the generating of the data entity indicative of the association of the user and the at least one other user with the event of interest to the user comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the user preference data associated with the user, the user preference data associated with the at least one other user, and the data indicative of the degree of relationship of the user to the at least one other user.
16. The computing system of Claim 9, wherein the identifying comprises generating a data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types, according to user media preference data.
17. The computing system of Claim 14 or 15, wherein the identifying comprises generating a first data entity indicative of the at least one input media content, based on at least one of: association of the one or more available media contents with the event, and association of the one or more available media contents with the user preference data associated with the user and with the user preference data associated with the at least one other user.
18. The computing system of the previous claim, wherein the generating of the first data entity indicative of one or more available media contents comprises combining and reducing data of at least the media metadata related to the at least one input media content, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
19. The computing system of claims 17 or 18, wherein the identifying comprises generating a data entity indicative of the degree of excitement data of at least one other user, based on at least one of: association of the degree of excitement data with the event, and association of the degree of excitement data with at least one of the user and the at least one other user.
20. The computing system of the previous claim, wherein the generating of the data entity indicative of the degree of excitement data of at least one other user comprises combining and reducing data of at least the degree of excitement metadata of the at least one other user, the event metadata associated with the event of interest to the user and the data entity indicative of the association of the user and the at least one other user with the event of interest to the user.
21. The computing system of claims 19 or 20, wherein the identifying comprises generating a data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user generated based on one or more points in time during the event of the interest to the user.
22. The computing system of the previous claim, wherein the generating of the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user comprises combining and reducing the first data entity indicative of the at least one input media content and the data entity indicative of association between the at least one input media content and the degree of excitement data of the at least one other user.
23. The computing system of claims 21 or 22, wherein the identifying comprises generating a data entity indicative of points in time within at least one time of interest of the event of interest to the user, according to the degree of excitement data.
24. The computing system of the previous claim, wherein the generating the data entity indicative of points in time is according to an indication of relevance that is based on the degree of excitement data of the at least one other user.
25. The computing system of claim 23 or 24, wherein for each time of interest of the at least one time of interest, generating the data entity indicative of points in time is based on the most highly weighted degree of excitement data of the degree of excitement data corresponding to the each time of interest.
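The selection recited in claims 23 to 25 can be sketched as follows (illustrative only; the data layout and names are assumptions, not part of the claims): for each time of interest, the point in time carrying the most highly weighted degree-of-excitement sample is kept.

```python
def select_points_in_time(times_of_interest, excitement_data):
    """For each time of interest (a (start, end) pair), return the point in
    time whose degree-of-excitement weight is highest within that window.

    excitement_data is a list of (timestamp, weight) pairs; the layout is
    illustrative only.
    """
    points = []
    for start, end in times_of_interest:
        in_window = [(t, w) for t, w in excitement_data if start <= t <= end]
        if in_window:
            # keep the most highly weighted excitement sample (claim 25)
            best_t, _ = max(in_window, key=lambda tw: tw[1])
            points.append(best_t)
    return points

windows = [(0, 10), (20, 30)]
samples = [(3, 0.2), (7, 0.9), (25, 0.5), (28, 0.4)]
points = select_points_in_time(windows, samples)  # → [7, 25]
```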
26. The computing system of any one of claims 23 to 25, wherein the identifying comprises generating a second data entity indicative of the one or more available media contents, wherein the one or more available media contents correspond to the points in time within the at least one time of interest of the event of interest to the user.
27. The computing system of the previous claim, wherein the second data entity is weighted based on the preferences of the user.
28. The computing system of claim 26 or 27, wherein the generating of the second data entity indicative of the one or more available media contents comprises combining and reducing the data entity indicative of points in time within at least one time of interest, the data entity indicative of association between the at least one input media content and the degree of excitement data of at least one other user and the data entity indicative of one or more media types and of one or more media sizes associated with the one or more media types.
29. The computing system of any one of claims 26 to 28, wherein generating said at least one output media content comprises generating a third data entity indicative of the one or more available media contents, wherein the generation of the third data entity is based on a relevance field associated with the one or more available media contents and on a weight field associated with the one or more available media contents.
30. The computing system of the previous claim, wherein the third data entity is indicative of the one or more media sizes associated with the one or more media types.
31. The computing system of any one of the preceding claims, wherein the event of interest to the user comprises at least one of a sports match, a contest, a concert, a show, a happening, a vacation or a trip.
32. The computing system of claim 6, wherein the physiological data of the at least one other user comprise at least one of a blood pressure, a heartbeat, a heart rate, a breathing rate, a step counter and a body temperature.
33. The computing system of claim 9, wherein the output media content comprises at least one of a video clip, an audio recording, an image and a text captured at the time of the event of interest to the user, according to the user media preference data.
34. A computing system configured to create a compilation output from a first data set, the computing system comprising a processing circuitry and configured to: receive an indication to generate a compilation;
receive, from at least one data source, the first data set, comprising a time associated with first data of the first data set, wherein the first data set is relation-aware; receive, from the at least one data source, a second data set, comprising a time associated with second data of the second data set, wherein the second data set is relation-aware;
identify at least one candidate compilation output, wherein the identifying comprises combining and reducing data of at least the first data set and the second data set,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the first data and the time associated with the second data, wherein the at least one candidate compilation output includes at least a portion of said first data set;
generate said at least one compilation output, wherein at least one compilation output media content includes at least a portion of said at least one candidate compilation output; and provide a user with the at least one compilation output.
35. A computer-implemented method to generate one or more output media contents for a user, the method comprising using a processing circuitry of a content creation system to: receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receive, from the at least one data source, degree of excitement data of at least one other user;
receive, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receive, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receive, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identify at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data,
wherein the at least one candidate output media content includes at least a portion of said at least one input media content;
generate said at least one output media content, wherein at least one output media content includes at least a portion of said at least one candidate output media content; and
provide the user with the at least one output media content.
36. A non-transitory computer readable storage medium tangibly embodying a program of instructions that, when executed by a computer comprising a processing circuitry, cause the computer to perform a method to generate one or more output media contents for a user, the method comprising: receiving an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receiving, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event;
receiving, from the at least one data source, at least one input media content; receiving, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receiving, from the at least one data source, degree of excitement data of at least one other user;
receiving, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receiving, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receiving, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identifying at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data,
wherein the at least one candidate output media content includes at least a portion of said at least one input media content;
generating said at least one output media content, wherein at least one output media content includes at least a portion of said at least one candidate output media content; and
providing the user with the at least one output media content.
37. A computing system for generating a data structure, the computing system comprising: an input/output (I/O) interface configured to gather from and distribute data to one or more data processing devices;
a processor to generate at least one data structure based on portions of the data according to one or more data aggregation methods, the at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure; and a memory configured to store the at least one data structure, wherein the at least one data structure is capable of being utilized by the processor to generate at least one of: at least one new data structure, at least one modified data structure, by combining the at least one data structure with at least one other data structure.
38. The computing system of claim 37, wherein the KVW data structure comprises at least one record, a record of the at least one record comprises:
one or more entity values in the one or more entity fields;
a keyword value in the keyword field to indicate a feature associated with the one or more entity values;
a value in the value field to be associated with the keyword value; and the weight value set by the processor to indicate the degree of relation of the one or more entity fields to the keyword field and the value field.
39. The computing system of claim 38, wherein the value in the value field can be equal to an entity value of an entity field of the one or more entity fields.
40. The computing system of claim 38 or 39, wherein the one or more data aggregation methods comprise:
combining at least two source KVW data structures on the basis of the combination of keyword fields and value fields of the at least two source data structures; and
setting the weight value for a record of the data structure by at least one of: multiplying weight values of records of the at least two source KVW data structures, summing the weight values, and merging the weight values by weighted averaging.
41. The computing system of claim 40, wherein the one or more data aggregation methods further comprise: merging two or more data fields of the at least two source KVW data structures.
42. The computing system of claim 40 or 41, wherein the processor is further configured to: merge two or more records of the KVW data structure by summing weight values of the two or more records, wherein the combination of the keyword field and the value field in the two or more records match.
43. The computing system of any one of claims 37 to 42, wherein the generation comprises reducing data.
44. The computing system of any one of claims 38 to 43, the generating of the KVW data structure further comprising: normalizing the weight values, wherein the sum of the normalized weight values of the records of the KVW data structure is equal to 1.
45. The computing system of any one of claims 37 to 44, wherein the weight value comprises 0 for no relation of the data comprised in the record, 0.5 for medium relation of the data comprised in the record, and 1 for high relation of the data comprised in the record.
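A minimal sketch of the aggregation methods of claims 40 to 42 and 44 (Python; the tuple layout and names are assumptions, not part of the claims): two source KVW structures are joined on matching keyword-value pairs, the combined weight is set by multiplying the source weights, records whose keyword-value pairs match are merged by summing, and the result is normalized.

```python
def combine_kvw(src_a, src_b):
    """Combine two source KVW structures by joining records on matching
    (keyword, value) pairs (claim 40), multiplying the source weights,
    merging matching records by summing (claim 42), and normalizing the
    resulting weights so they sum to 1 (claim 44).

    Records are (entity, keyword, value, weight) tuples; the entity types
    of the two sources need not be the same (claim 46).
    """
    combined = {}
    for ent_a, kw_a, val_a, w_a in src_a:
        for ent_b, kw_b, val_b, w_b in src_b:
            if (kw_a, val_a) == (kw_b, val_b):       # join on keyword-value
                key = ((ent_a, ent_b), kw_a, val_a)
                # weight set by multiplication; matching records summed
                combined[key] = combined.get(key, 0.0) + w_a * w_b
    total = sum(combined.values()) or 1.0
    return [(ents, kw, val, w / total)               # normalize to sum 1
            for (ents, kw, val), w in combined.items()]

users = [('user1', 'team', 'FC-A', 0.6), ('user1', 'team', 'FC-B', 0.4)]
events = [('match42', 'team', 'FC-A', 1.0)]
result = combine_kvw(users, events)
```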
46. The computing system of any one of claims 37 to 45, wherein entity types of the entity fields of the source KVW data structures are not the same.
47. The computing system of any one of Claims 37 to 46, wherein the at least one data structure is capable of being utilized by the processing circuitry to generate one or more output media contents for a user, the computing system further configured to: receive an indication to generate one or more output media contents for the user, wherein the indication is associated with at least one of: an event of interest to the user; at least one area of interest of the user indicative of the event of interest; receive, from at least one data source, event metadata associated with the event of interest to the user, the event metadata comprising a time associated with the event; receive, from the at least one data source, at least one input media content; receive, from the at least one data source, media metadata related to the at least one input media content, comprising a time associated with the at least one input media content;
receive, from the at least one data source, degree of excitement data of at least one other user;
receive, from the at least one data source, degree of excitement metadata of the at least one other user, comprising a time associated with the degree of excitement data; receive, from the at least one data source, user preference data associated with the user, wherein the user preference data associated with the user comprises a first indication of a degree of relation of the user to the at least one area of interest;
receive, from the at least one data source, user preference data associated with the at least one other user, wherein the user preference data associated with the at least one other user comprises a second indication of a degree of a relation of the at least one other user to the at least one area of interest, wherein the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user constitute first data;
identify at least one candidate output media content, wherein the identifying comprises combining and reducing data of at least the event metadata associated with the event of interest to the user, the media metadata related to the at least one input media content, the degree of excitement metadata of the at least one other user, the user preference data associated with the user, and the user preference data associated with the at least one other user,
wherein the combining and reducing data is based on a degree of relationship of the time associated with the event, the time associated with the at least one input media content and the time associated with the degree of excitement data, wherein the combining and reducing the first data comprises matching Keyword-Value pairs of the at least one Keyword-Value-Weight (KVW) data structure,
wherein the at least one candidate output media content includes at least a portion of said at least one input media content;
generate said at least one output media content, wherein at least one output media content includes at least a portion of said at least one candidate output media content; and
provide the user with the at least one output media content.
48. A non-transitory program storage device readable by a computer, tangibly embodying computer readable instructions executable by the computer to perform a method, the method comprising:
gathering data from one or more data processing devices and distributing data to the one or more data processing devices; generating at least one data structure based on portions of the data according to one or more data aggregation methods, the at least one data structure comprising one or more entity fields, a keyword field, a value field and a weight field, wherein the weight field includes a weight value to indicate a degree of relation of the one or more entity fields to the keyword field and the value field, the at least one data structure constituting at least one Keyword-Value-Weight (KVW) data structure; and storing the at least one data structure in a memory,
wherein the at least one data structure is capable of being utilized by a processing circuitry of the computer to generate at least one of: at least one new data structure, at least one modified data structure, by combining the at least one data structure with at least one other data structure.
PCT/IL2018/051285 2017-11-29 2018-11-26 System and method of generating media content and related data structures WO2019106658A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762591990P 2017-11-29 2017-11-29
US62/591,990 2017-11-29

Publications (1)

Publication Number Publication Date
WO2019106658A1 2019-06-06

Family

ID=66663880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2018/051285 WO2019106658A1 (en) 2017-11-29 2018-11-26 System and method of generating media content and related data structures

Country Status (1)

Country Link
WO (1) WO2019106658A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110246383A1 (en) * 2010-03-30 2011-10-06 Microsoft Corporation Summary presentation of media consumption
US20130268620A1 (en) * 2012-04-04 2013-10-10 Matthew Osminer Apparatus and methods for automated highlight reel creation in a content delivery network
US20160105708A1 (en) * 2014-10-09 2016-04-14 Thuuz, Inc. Customized generation of highlight show with narrative component
WO2017007810A1 (en) * 2015-07-06 2017-01-12 Equifax, Inc. Modifying data structures to indicate derived relationships among entity data objects
US20170164014A1 (en) * 2015-12-04 2017-06-08 Sling Media, Inc. Processing of multiple media streams
US20170230700A1 (en) * 2012-08-31 2017-08-10 Facebook, Inc. Sharing Television and Video Programming Through Social Networking

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115186092A (en) * 2022-07-11 2022-10-14 贝壳找房(北京)科技有限公司 Online interaction processing method and apparatus, storage medium, and program product
CN115186092B (en) * 2022-07-11 2023-06-20 贝壳找房(北京)科技有限公司 Online interactive processing method and device, storage medium and program product


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18883535; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18883535; Country of ref document: EP; Kind code of ref document: A1)