WO2011007291A1 - Method for creating a playlist (Procédé de création d'une liste de lecture) - Google Patents
- Publication number
- WO2011007291A1 (PCT/IB2010/053088)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- media objects
- trajectory
- playlist
- media
- generating
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/43—Querying
- G06F16/432—Query formulation
- G06F16/434—Query formulation using image data, e.g. images, photos, pictures taken by a user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/43—Querying
- G06F16/438—Presentation of query results
- G06F16/4387—Presentation of query results by the use of playlists
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/44—Browsing; Visualisation therefor
- G06F16/444—Spatial browsing, e.g. 2D maps, 3D or virtual spaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/48—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
Definitions
- the present invention relates to a method for generating a playlist comprising at least first media objects and second media objects to be presented simultaneously, to an electronic device adapted for simultaneously presenting at least first media objects and second media objects, and to a computer program.
- playlist refers to a list of media objects that are designated to be presented one after another.
- media objects are formed by information comprising characteristics perceivable by human beings with their senses.
- media objects may comprise time-dependent characteristics, i.e. may be capable of changing their appearance over time.
- a media object can be formed by visual, audible, audio-visual or tactile information which preferably comprises time-dependent characteristics.
- such media objects are called modalities.
- a media object can be formed by an audio signal which is changing over time such as music, by a video signal, or by a light signal changing over time such as lighting of different colors or other light effects.
- Media objects can for example be formed by audio files, in particular by audio files representing songs.
- audio file types which may form media objects are CD audio files, MP3 audio files, WMA audio files, and the like.
- Further examples for media objects are for example breeze (or wind) effects and vibration or rumble effects or other tactile information and respective control signals for such effects.
- a media object can e.g. comprise more than two modalities.
- such media objects may be stored in a database, e.g. a database comprised in a stationary or mobile device, such as a stationary or mobile computer, a PDA (personal digital assistant), a mobile phone, an MP3 player, or on mobile storage media such as CDs, DVDs, USB storage devices, and the like.
- such a database may also be accessed via a network such as the internet or a local network.
- One of the possible ways of describing an atmosphere which is known in the art is by using so-called mood labels, for example sad, aggressive, tender, and so on.
- a more generic way of describing an atmosphere by a mood or emotion is by plotting it on a two- dimensional valence-arousal space wherein valence defines the positive/negative aspect (e.g. happy-sad) and arousal defines the relaxed/excited aspect (e.g. tender/carefree).
- US 2009/0063971 A1 describes a system and method for presenting media information to users.
- Media objects such as songs are analyzed to determine a set of three or more objective characteristics that describe the media object.
- Icons representing the media objects are then presented to a user in a display representing a three-dimensional space in which each dimension corresponds to a different characteristic.
- the icons are located within the three-dimensional space based on their characteristics. It is described that a user may create a gradient curve for one or more characteristics. These curves may be created by drawing a line through the three-dimensional space or by some other user input.
- US 2007/0038671 A1 describes a program and an electronic device adapted to perform actions directed toward generating a playlist for a mobile device. It is described to capture an image, perform image analysis of the image to extract at least one attribute of the image, and use the attribute(s) so extracted to generate a playlist.
- the image can be a line drawing entered by a user through a user interface, such as a touch pad, coupled to the mobile device.
- the image is analyzed to extract attributes, such as line thickness and stroke speed, which are subsequently used to generate a playlist.
- This object is solved by a method for generating a playlist comprising at least first media objects and second media objects to be presented simultaneously according to claim 1.
- the method comprises the steps: loading a plurality of first media objects from a database; assigning coordinates to the first media objects; assigning coordinates to second media objects; generating an at least two-dimensional representation representing first media objects and corresponding second media objects; receiving a user input in form of a trajectory in the at least two-dimensional representation; and generating a playlist comprising subsequent sets of combinations of first media objects and second media objects to be presented simultaneously based on the trajectory.
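As a rough illustration, the claimed sequence of steps can be sketched in Python. Everything below (the `Song` fields, the color mapping, the distance threshold) is a hypothetical choice for illustration and is not taken from the claims:

```python
from dataclasses import dataclass

@dataclass
class Song:                       # a first media object
    title: str
    valence: float                # 0.0 (sad) .. 1.0 (happy)
    arousal: float                # 0.0 (relaxed) .. 1.0 (excited)

def color_for(valence, arousal):
    # a second media object: a light color assigned to each coordinate pair
    # (assumed mapping: valence drives red, arousal drives blue)
    return int(255 * valence), 128, int(255 * arousal)

def generate_playlist(songs, trajectory, max_dist=0.15):
    """trajectory: ordered (valence, arousal) points drawn by the user."""
    playlist, seen = [], set()
    for px, py in trajectory:     # playlist order follows the trajectory direction
        for s in songs:
            close = (s.valence - px) ** 2 + (s.arousal - py) ** 2 <= max_dist ** 2
            if close and s.title not in seen:
                seen.add(s.title)
                # each entry combines a song with the light effect at its mood
                playlist.append((s, color_for(s.valence, s.arousal)))
    return playlist
```

A horizontal left-to-right stroke at high arousal, for instance, yields a playlist that starts with sad, excited songs and ends with happy, excited ones, each paired with a light color.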
- a playlist comprising two media objects to be presented in combination can be conveniently prepared by a single user input.
- the at least two-dimensional representation can preferably be presented on a suitable visual user interface such as a display known in the art.
- the visual user interface is also adapted for receiving the user input. This can e.g. be achieved by a touch screen or digital pen interface as the visual user interface.
- the user input can be in the form of a curve which is drawn on the visual user input device.
- the user input interface is, however, not restricted to the types described before.
- a playlist comprising at least two media objects (first and second media objects) to be presented simultaneously is provided.
- a playlist comprising more than two media objects (e.g. three or more media objects) to be presented simultaneously is generated.
- audio information can be used as first media objects, lighting as second media objects, and e.g. tactile information such as wind or vibrations as further media objects.
- the combinations of first media objects and second media objects present in the playlist are selected depending on the positions of the coordinates of the first and second media objects relative to the trajectory.
- a comprehensible way for adding media objects to the playlist is provided which a user will intuitively understand.
- all the first media objects with coordinates within a defined maximum distance from the trajectory can be added to the playlist.
- the corresponding second media objects can e.g. be formed by those situated at the same coordinates as the first media objects or by those situated at the corresponding position (e.g. closest distance) of the trajectory.
- the order of the combinations of the first media objects and second media objects in the playlist is determined by the direction of the trajectory.
- the ordering of the media objects in the playlist is determined in a comprehensible way. For example, combinations of first and second media objects corresponding to the beginning of the trajectory can be arranged at the beginning of the playlist and combinations corresponding to the end of the trajectory can be arranged at the end of the playlist.
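This selection-and-ordering rule can be sketched by projecting each media object onto the drawn polyline and sorting by the arc length of the projection. A minimal Python sketch, assuming the trajectory is captured as an ordered list of sample points (all names are illustrative):

```python
import math

def project_on_segment(p, a, b):
    """Distance from point p to segment a-b, and the position t in [0, 1]
    of the closest point along the segment."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy), t

def order_along_trajectory(objects, trajectory, max_dist):
    """objects: list of (name, (x, y)); trajectory: ordered (x, y) samples.
    Returns the names within max_dist of the curve, ordered along its direction."""
    selected = []
    for name, pos in objects:
        best_d, best_s = float("inf"), 0.0
        arc = 0.0                              # arc length consumed so far
        for a, b in zip(trajectory, trajectory[1:]):
            d, t = project_on_segment(pos, a, b)
            seg_len = math.hypot(b[0] - a[0], b[1] - a[1])
            if d < best_d:
                best_d, best_s = d, arc + t * seg_len
            arc += seg_len
        if best_d <= max_dist:                 # the "defined maximum distance"
            selected.append((best_s, name))
    return [name for _, name in sorted(selected)]
```

Objects near the start of the stroke come first; objects farther than `max_dist` from every segment are simply skipped.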
- the user can generate the playlist in a very intuitive and convenient manner.
- the position of the trajectory in the at least two-dimensional representation, the direction of the trajectory, and at least one further feature of the user input are determined.
- further information can also be input via a single user input such that multi-modality (such as for instance a combination of music, color of light, and dynamic light effects) can be controlled conveniently.
- the at least one further feature is at least one of the smoothness of the trajectory, the speed with which the trajectory is generated, and the pressure with which a user input is performed.
- the smoothness of the trajectory and the speed at which the trajectory is generated can e.g. easily be analyzed using known techniques. This can be performed e.g. when the trajectory is input (virtually) with a computer mouse or the like, or when the trajectory is input via a touch screen, touch pad, digital pen interface, or the like.
- the pressure with which the user input is performed can e.g. be detected with a suitably adapted touch screen, touch pad, or digital pen interface.
- position and direction of the trajectory can be used to determine which first and second media objects are added to the playlist and in which order, and the further feature can be used to determine dynamical effects of the first media object or the second media object, or the like.
- dynamic properties of the second media objects are determined by the at least one further feature. This can e.g. be dynamic light effects such as a change in color or brightness, blinking or flashing effects, etc.
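The smoothness and input-speed features can be estimated from timestamped samples of the stroke. A sketch with an assumed jaggedness measure and an assumed mapping to a 0-1 dynamics level (neither is prescribed by the patent):

```python
import math

def trajectory_features(samples):
    """samples: list of (x, y, t_seconds) captured while the user draws."""
    length = sum(math.hypot(x2 - x1, y2 - y1)
                 for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:]))
    duration = samples[-1][2] - samples[0][2]
    speed = length / duration if duration > 0 else 0.0
    # jaggedness: mean absolute turning angle between consecutive segments
    turns = []
    for (x1, y1, _), (x2, y2, _), (x3, y3, _) in zip(samples, samples[1:], samples[2:]):
        a1 = math.atan2(y2 - y1, x2 - x1)
        a2 = math.atan2(y3 - y2, x3 - x2)
        turns.append(abs(math.atan2(math.sin(a2 - a1), math.cos(a2 - a1))))
    jaggedness = sum(turns) / len(turns) if turns else 0.0
    return speed, jaggedness

def dynamics_level(jaggedness):
    # jagged input -> highly dynamic light effects; smooth input -> calm light
    return min(1.0, jaggedness / math.pi)
```

A perfectly straight stroke gives a jaggedness of zero and therefore static light, while a zigzag drives the dynamics level toward one.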
- the media objects in the database are classified according to at least two objective characteristics, i.e. corresponding values of these objective characteristics are assigned to the media objects.
- objective characteristics which have proved to give satisfactory results with respect to the classification of music are e.g. valence (representing the positive/negative aspect, i.e. to which extent a song is happy or sad) and arousal (representing the relaxed/excited aspect, i.e. whether a song is rather tender or carefree).
- the first media objects are audio files.
- in particular for audio files as first media objects, there is a demand for methods and devices for generating playlists in a convenient manner.
- the audio files can for instance be formed by music files such as songs which can be provided in different file types as known in the art.
- the second media objects are light effects to be displayed together with corresponding first media objects. Studies have found that in particular the combination of music and lighting effects is essential for creating an atmosphere in a certain location.
- the object is also solved by an electronic device adapted for simultaneously presenting at least first media objects and second media objects according to claim 10.
- the electronic device comprises: a memory for storing a database comprising a plurality of first media objects; a visual user interface adapted for displaying information to a user; and a processor adapted to perform the following steps: load a plurality of first media objects from the database; assign coordinates to first media objects; assign coordinates to second media objects; generate an at least two-dimensional representation representing first media objects and corresponding second media objects on the visual user interface; receive a user input in form of a trajectory in the at least two-dimensional representation; and generate a playlist comprising subsequent sets of combinations of first media objects and second media objects to be presented simultaneously based on the trajectory.
- the electronic device achieves substantially the same advantages which have been described above with respect to the method. It should be noted that "at least first media objects and second media objects" means that a larger number of media objects (three or more) to be presented simultaneously is also possible.
- the visual user interface is adapted to function as a user input device.
- the visual user interface is a bi-directional user interface which enables both providing information to the user and receiving information from the user.
- the visual user interface can be realized by a touch screen or digital pen interface.
- the visual user interface can e.g. be provided on the electronic device itself or separate, e.g. on a remote control.
- the electronic device comprises a speaker for presenting sound signals as first media objects.
- the device is capable of presenting the first media objects to a user without requiring further components.
- the speaker can e.g. be provided as a built-in speaker or as an external speaker.
- the speaker can also be realized as head phones or ear phones.
- the electronic device comprises a light source for presenting light signals as second media objects.
- the device is capable of presenting the second media objects to the user without requiring further components.
- the light source can for instance be formed by a living color lamp.
- in particular, the combination of a speaker for outputting music and one or more light sources for generating (colored) light effects is relevant.
- the object is also solved by a computer program comprising program code which, when executed on a digital data processor, performs the method according to any one of claims 1 to 9.
- the computer program achieves the advantages which have been described above with respect to the method.
- the computer program has the advantage that the method can be realized on any hardware comprising a suitable digital data processor such as e.g. a stationary or mobile computer, a PDA, a mobile phone, and the like.
- the computer program is provided as a computer program product, i.e. in a tangible form.
- the computer program is stored on a machine-readable carrier.
- a machine-readable carrier can for instance be formed by a CD, a DVD, a USB-device, or other storage medium known in the art.
- Fig. 1 schematically shows an electronic device adapted for simultaneously presenting first media objects and second media objects.
- Fig. 2 is a schematic illustration of a two-dimensional representation of two objective characteristics on a visual user interface.
- Fig. 3 is a schematic illustration of a visual representation of first media objects.
- Fig. 4 is a schematic block diagram for explaining a method for generating a playlist.
- the electronic device 1 adapted for simultaneously presenting at least first media objects and second media objects is a mobile device.
- the electronic device 1 comprises a digital pen interface as a visual user interface 3 which is capable of visually presenting information to a user and of receiving a user input generated by means of a digital pen 6.
- the visual user interface 3 simultaneously is a user input interface.
- the visual user interface 3 is integrated in a casing 2 of the electronic device 1 which contains the necessary electronic components such as at least one processor 8, at least one memory 7, and the like.
- the electronic device 1 is formed by a PDA (personal digital assistant).
- the visual user interface can also be provided separate from the electronic device, e.g. can be integrated to a remote control or provided as a separate device.
- the electronic device 1 is provided with a speaker 4 for presenting first media objects to a user.
- the first media objects are song files which are stored in a database in the electronic device 1 and which can be loaded to be presented via the speaker 4.
- although an external speaker is shown in Fig. 1, the speaker 4 may also be built-in or provided as head phones or ear phones.
- the electronic device 1 is provided with a light source 5.
- although only one light source 5 is exemplarily shown in Fig. 1, a plurality of light sources may be provided.
- although the light source 5 is schematically shown in Fig. 1 as a separate device connected to the electronic device 1, the light source (or sources) may for instance also be integrated with the electronic device 1.
- the light source 5 is adapted such that light of different colors can be presented to a user. In particular, different colors can be presented in a time-dependent way. Further, the light source 5 is adapted such that dynamic light effects can be presented to a user.
- the light source 5 can be a large light source or combination of light sources capable of generating different lighting effects in a location or may be a small light source or combination of light sources (e.g. only illuminating the electronic device 1).
- the electronic device 1 is particularly adapted for presenting light effects of different color to a user as second media objects.
- the first media objects are formed by audio files and the second media objects are formed by light effects (e.g. lighting of different colors).
- the electronic device 1 is provided with a digital data processor 8 as is known for e.g. portable devices such as PDAs.
- the digital data processor 8 is adapted such that the process steps which will be described in the following are executed by the digital data processor.
- this can be realized by the digital data processor 8 being a common general purpose processor and by an appropriate software running on the processor or by the digital data processor 8 being a special purpose hardware particularly adapted for this purpose or by a combination of special purpose and general-purpose or multi-purpose elements.
- the electronic device 1 is provided with a suitable memory 7 in which the database of first media objects and possibly other data objects can be stored.
- the electronic device 1 is further provided with suitable circuitry known in the art to enable operation of the electronic device 1.
- in a step S1, a plurality of first media objects is loaded from the database.
- the first media objects are audio files, in particular songs.
- the database is located in the memory 7 in the electronic device 1.
- the first media objects are loaded from an external database.
- the first media objects in the database are classified according to at least two different objective characteristics.
- the first media objects, being song files, are classified according to the objective characteristics "valence", i.e. a value indicating how sad or happy the content of the media object is, and "arousal", i.e. a value indicating how relaxed or excited the content of the media object is.
- in a step S2, depending on the "valence" value and the "arousal" value assigned to a first media object in the database, coordinates on a two-dimensional plane are assigned to the first media objects.
- a first coordinate is assigned according to the "valence” value and a second coordinate is assigned according to the "arousal” value of the respective first media object.
- the first media objects are mapped on a two-dimensional valence- arousal plane according to the atmosphere or mood/emotion the content of the first media objects conveys to a user.
- in a step S3, different colors are assigned to the coordinates of the two-dimensional valence-arousal plane such that a certain color corresponds to each point on this plane.
- the colors can e.g. be automatically determined in such a way that specific colors are assigned to certain moods (i.e. coordinate pairs on the graphical representation such as the valence-arousal plane) or can e.g. be set by a user according to his or her preference.
- the assignment of colors to moods/coordinates on the graphical representation is also stored in the database.
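One plausible (purely hypothetical) implementation of this color assignment treats the valence-arousal plane as a hue wheel around the neutral center; the patent does not prescribe any particular mapping:

```python
import colorsys
import math

def color_for_mood(valence, arousal):
    """Map a (valence, arousal) pair in [-1, 1] x [-1, 1] to an RGB light color.

    Hue follows the angle around the neutral center; saturation grows with
    the distance from neutral, so a neutral mood yields white light.
    """
    angle = math.atan2(arousal, valence)            # -pi .. pi
    hue = (angle / (2 * math.pi)) % 1.0             # 0 .. 1
    sat = min(1.0, math.hypot(valence, arousal))
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)
    return int(255 * r), int(255 * g), int(255 * b)
```

The resulting color per coordinate pair could then be stored in the database alongside the media objects.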
- in a step S4, a two-dimensional graphical representation representing the first media objects and the corresponding second media objects is provided.
- a two-dimensional graphical representation of the audio files according to their valence-arousal coordinates and of the corresponding colors of light is provided on the visual user interface 3.
- an example of such a representation is given in Fig. 3, wherein the filled and empty circles 10 indicate different songs which can be found at different locations on the valence-arousal plane due to the differences in the assigned coordinates.
- the background color of the representation may represent the corresponding color which is assigned to the respective coordinates/first media objects. However, it should be noted that this is not necessary.
- it is also not necessary that icons (such as the circles 10 in Fig. 3) are displayed for the first media objects; it is sufficient that positions in the representation are assigned to the first media objects, i.e. that a fixed relation between the coordinates in the representation and the first media objects exists.
- a user input in form of a trajectory in the at least two-dimensional representation is received.
- this can be achieved by a user making a curve with a digital pen in the representation if the visual user interface device 3 is formed as a digital pen interface.
- this can e.g. be achieved by making a curve with a finger or other pointing device in the case of a touch pad.
- this can be done (virtually) by moving a cursor on the representation, e.g. with a mouse or other known navigation device.
- An example for such a trajectory in the graphical representation is shown as the curve C4 in Fig. 3.
- the curves C1, C2, and C3 in Fig. 2 represent further examples of trajectories in the representation which can be input by a user.
- in a step S6, a playlist comprising subsequent sets of combinations of first media objects and second media objects to be presented simultaneously is generated based on the trajectory.
- the combinations of first media objects and second media objects present in the playlist are selected depending on the positions of the coordinates of the first and second media objects relative to the trajectory. Further, the order of the combinations of the first media objects and second media objects in the playlist is determined by the direction of the trajectory. How this is achieved will now be described.
- the device 1 is adapted such that all first media objects which are within a certain distance from the trajectory are added to the playlist.
- in the example, all the first media objects formed by audio files which lie within this distance from the trajectory C4 are selected.
- the direction D4 in which the trajectory is input by the user is schematically indicated by an arrow in Fig. 3.
- the selected first media objects are added to the playlist corresponding to their location in the direction of the trajectory C4.
- first rather “sad” songs will be placed in the playlist and the subsequent songs will be increasingly more "happy”. All the selected songs will be rather "excited” (and thus not “relaxed”).
- further examples of possible trajectories C1, C2, and C3 (and the corresponding directions D1, D2, D3) are shown in Fig. 2 (icons indicating the exact positions of first media objects are omitted).
- with the trajectory C1, only "sad" first media objects will be selected, starting at a neutral value of "arousal" (neither "relaxed" nor "excited"), with the subsequent first media objects becoming increasingly "excited".
- with the trajectory C2, rather "sad" songs will be added first, thereafter increasingly "excited" and more "happy" songs will be added, while at the end of the playlist still increasingly "happy" songs will be added which are decreasingly "excited".
- second media objects to be presented simultaneously with the first media objects are also selected by inputting the trajectory.
- the second media objects are colors of lighting to be presented together with the audio files (songs).
- the first media object which will be placed on the first position in the playlist based on the trajectory C4 is represented by the circle 10'.
- the color of light which is assigned to the coordinates of the circle 10' will be presented together with this first media object.
- this color of light (as a second media object) will be placed in the first position of the playlist together with the first media object.
- the first media object indicated by the circle 10" will be placed together with the second media object (i.e. another color of light) having the corresponding coordinates. This procedure is repeated for the following first and second media objects in the playlist.
- first and second media objects are added to the playlist in a comprehensible manner by a single user input on the visual user interface 3.
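Pairing each selected song with the light color assigned to its coordinates can be sketched like this (a hypothetical valence-to-hue mapping; the patent text does not fix any particular color assignment, so the hue and brightness formulas below are purely illustrative):

```python
import colorsys

def light_color_for(x, y):
    """Map playlist-plane coordinates to an RGB light color.

    Hypothetical mapping: valence x in [0, 1] drives hue (blue for
    "sad" towards yellow for "happy"), arousal y in [0, 1] drives
    brightness.
    """
    hue = 0.6 - 0.45 * x           # 0.6 (blue) -> 0.15 (yellow)
    value = 0.4 + 0.6 * y          # "relaxed" songs get dimmer light
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return (round(r * 255), round(g * 255), round(b * 255))

def pair_with_light(ordered_songs):
    """Attach the light color at each song's coordinates, yielding the
    consecutive (first media object, second media object) entries."""
    return [(name, light_color_for(x, y)) for name, (x, y) in ordered_songs]
```

Each playlist entry then carries both the audio file and the color of light to be presented with it, so a single trajectory selects both media types.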
- At least one further feature of the trajectory (C1, C2, C3, or C4) is determined and exploited for generating the playlist.
- the further feature can e.g. be formed by the smoothness of the trajectory, the speed with which the trajectory is generated, or the pressure with which a user input is performed.
- more than one further feature can advantageously be determined and exploited for determining further properties of the playlist or the second media objects.
- dynamic properties of the second media objects are determined by the at least one further feature.
- the dynamic properties are dynamic light effects such as changes in brightness over time, blinking or flashing effects. These dynamic properties can e.g. be formed by the dynamic aspect of colored light. How this is achieved will now be described with reference to Fig. 2.
- with respect to the trajectory C1, a rather jagged trajectory is given, while with respect to the trajectory C2, a smoother trajectory is given.
- the smoother the line (e.g. trajectory C2), the less dynamic the selected light effects; the more jagged the line (e.g. trajectory C1), the more dynamic the selected light effects.
- it is also possible that such a feature of the trajectory is separately analyzed for several sections of the trajectory.
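One way to quantify the smoothness feature is the mean turning angle between consecutive trajectory segments (an illustrative sketch; the function names and the normalization constant are assumptions, and the same measure could equally be applied per section of the trajectory):

```python
import math

def jaggedness(points):
    """Mean absolute turning angle (radians) between consecutive
    segments of a trajectory given as a list of (x, y) points."""
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a = math.atan2(y1 - y0, x1 - x0)
        b = math.atan2(y2 - y1, x2 - x1)
        # Wrap the heading change into [-pi, pi] before taking |.|
        turn = abs((b - a + math.pi) % (2 * math.pi) - math.pi)
        angles.append(turn)
    return sum(angles) / len(angles) if angles else 0.0

def dynamics_level(points, max_angle=math.pi / 2):
    """Map jaggedness to a dynamics level in [0, 1]: a smoother input
    selects calmer light, a more jagged input more dynamic light."""
    return min(jaggedness(points) / max_angle, 1.0)
```

A straight stroke yields level 0 (steady light), while a sharp zigzag saturates at 1 (maximum blinking or flashing dynamics).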
- the pressure with which the user input is performed can be detected.
- the user input device has to be capable of detecting the pressure.
- the thickness of the trajectory will e.g. depend on the pressure with which the input is performed. This case is schematically shown for the trajectory C3 in Fig. 2.
- the first and last sections of the trajectory are performed with lower pressure, while the intermediate section is performed with increased pressure (schematically indicated by the differing thickness of the trajectory C3).
- the dynamic effects can be chosen such that the more pressure the user exerts during input, the more dynamics are selected, while with less pressure exerted, less dynamics are selected.
- other features of the trajectory can additionally or alternatively be exploited.
- One further example is the speed with which the trajectory is drawn by the user. For example, a high speed can be used to select more dynamics and a lower speed to select less dynamics. The speed can e.g. be detected with suitable run-time algorithms.
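The speed feature can be estimated from timestamped input samples (a sketch under the assumption that the user interface delivers (x, y, t) samples while the trajectory is drawn; the sample format is not specified in the text):

```python
import math

def drawing_speed(samples):
    """Average drawing speed from timestamped input samples (x, y, t):
    total path length divided by the elapsed drawing time."""
    total_dist = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:])
    )
    duration = samples[-1][2] - samples[0][2]
    return total_dist / duration if duration > 0 else 0.0
```

A fast stroke (high average speed) would then select more dynamics and a slow stroke less, analogously to the smoothness and pressure features.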
- the at least one further feature is determined and exploited for determining a further property of the second media objects.
- the at least one further feature can e.g. also be used for selecting a further property of the first media object (such as e.g. the volume in the case of audio files being the first media objects).
- while audio files have been described as specific first media objects and light effects as specific second media objects, the invention is not limited to this combination; other combinations of first and second media objects as defined in the introductory portion shall also be encompassed.
- while a combination of first and second media objects to be presented simultaneously has been described with respect to the embodiments, a combination of more (different) media objects to be presented simultaneously is also possible.
- tactile effects can be added as third media objects to be presented together with the first and second media objects.
- these further media objects are also added to the playlist by exploiting further features of the trajectory input by the user.
Abstract
The invention relates to a method of generating a playlist comprising first media objects and second media objects to be presented simultaneously. The method comprises the steps of: loading a plurality of first media objects from a database (S1); assigning coordinates to the first media objects (S2); assigning coordinates to the second media objects (S3); generating an at least two-dimensional representation representing the first media objects and the corresponding second media objects (S4); receiving a user input in the form of a trajectory (C1, C2, C3, C4) in the at least two-dimensional representation (S5); and generating a playlist containing consecutive sets of combinations of first media objects and second media objects to be presented simultaneously on the basis of the trajectory (S6).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP09165270.1 | 2009-07-13 | ||
EP09165270 | 2009-07-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011007291A1 true WO2011007291A1 (fr) | 2011-01-20 |
Family
ID=42985408
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2010/053088 WO2011007291A1 (fr) | 2009-07-13 | 2010-07-06 | Method of generating a playlist
Country Status (2)
Country | Link |
---|---|
TW (1) | TW201110012A (fr) |
WO (1) | WO2011007291A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018011057A1 (fr) | 2016-07-15 | 2018-01-18 | Philips Lighting Holding B.V. | Lighting control |
WO2022043041A1 (fr) * | 2020-08-26 | 2022-03-03 | Signify Holding B.V. | Determining an order of reproduction of songs on the basis of differences between light scripts |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1244033A2 (fr) * | 2001-03-21 | 2002-09-25 | Matsushita Electric Industrial Co., Ltd. | Playlist generation device and device, system, method, program, and recording medium for providing audio information |
US20070038671A1 (en) | 2005-08-09 | 2007-02-15 | Nokia Corporation | Method, apparatus, and computer program product providing image controlled playlist generation |
WO2007072338A1 (fr) * | 2005-12-20 | 2007-06-28 | Koninklijke Philips Electronics, N.V. | Illumination source producing a spectral output extending beyond a user interface of a mobile device |
US20080163056A1 (en) * | 2006-12-28 | 2008-07-03 | Thibaut Lamadon | Method and apparatus for providing a graphical representation of content |
US20090063971A1 (en) | 2007-08-31 | 2009-03-05 | Yahoo! Inc. | Media discovery interface |
- 2010-07-06 WO PCT/IB2010/053088 patent/WO2011007291A1/fr active Application Filing
- 2010-07-12 TW TW099122878A patent/TW201110012A/zh unknown
Non-Patent Citations (3)
Title |
---|
HARVILLE M ET AL: "MediaBeads: An Architecture for Path-Enhanced Media Applications", IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, 27 June 2004 (2004-06-27), pages 1 - 5, XP002304362 * |
P. KNEES ET AL: "An innovative three dimensional user interface for exploring music collections enriched with meta-information from the web", MM '06 PROCEEDINGS OF THE 14TH ANNUAL ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, 23 October 2006 (2006-10-23) - 27 October 2006 (2006-10-27), Santa Barbara, California, USA, pages 8PP, XP002607831 * |
VIGNOLI F ET AL: "Visual Playlist Generation on the Artist Map", PROCEEDINGS ANNUAL INTERNATIONAL SYMPOSIUM ON MUSIC INFORMATIONRETRIEVAL, 14 September 2005 (2005-09-14), pages 520 - 523, XP002360088 * |
Also Published As
Publication number | Publication date |
---|---|
TW201110012A (en) | 2011-03-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101275355B1 (ko) | Dynamic control of list searching based on list item attributes | |
US9570091B2 (en) | Music playing system and music playing method based on speech emotion recognition | |
US8923995B2 (en) | Directional audio interface for portable media device | |
US9529492B2 (en) | Reproduction of file series | |
JP5666122B2 (ja) | Display information control device and method | |
AU2010259077B2 (en) | User interface for media playback | |
KR102115397B1 (ko) | Portable device and method for displaying a playlist of the portable device | |
US8248436B2 (en) | Method for graphically displaying pieces of music | |
JPWO2007066662A1 (ja) | Content search device, content search system, server device for a content search system, content search method, computer program, and content output device with search function | |
KR20130084543A (ko) | Apparatus and method for providing a user interface | |
US10628017B2 (en) | Hovering field | |
CN114564604B (zh) | Media collection generation method and apparatus, electronic device, and storage medium | |
CN103207751A (zh) | Method, apparatus, and device for running an application | |
CN105684012B (zh) | Providing contextual information | |
WO2011007291A1 (fr) | Method of generating a playlist | |
JP2012208898A (ja) | Information processing device, playlist generation method, and playlist generation program | |
JP2012058877A (ja) | Playlist creation device | |
JP5670732B2 (ja) | Method for selecting at least one of a collection of content items | |
US20160170593A1 (en) | A Hovering Field | |
US10420190B2 (en) | Audio apparatus, driving method for audio apparatus, and computer readable recording medium | |
CN117221630A (zh) | Multimedia content recommendation method, apparatus, device, medium, and program product | |
TWI494794B (zh) | Electronic device and data selection method thereof | |
US20150012569A1 (en) | Method, apparatus and computer program product for conversion of a media file | |
KR20140014537A (ko) | Method and apparatus for playing media content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10740333; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 10740333; Country of ref document: EP; Kind code of ref document: A1 |