WO2017094328A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2017094328A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
notification
music data
music
Prior art date
Application number
PCT/JP2016/077514
Other languages
French (fr)
Japanese (ja)
Inventor
律子 金野
幸由 廣瀬
進太郎 増井
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017094328A1 publication Critical patent/WO2017094328A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10KSOUND-PRODUCING DEVICES; METHODS OR DEVICES FOR PROTECTING AGAINST, OR FOR DAMPING, NOISE OR OTHER ACOUSTIC WAVES IN GENERAL; ACOUSTICS NOT OTHERWISE PROVIDED FOR
    • G10K15/00Acoustics not otherwise provided for
    • G10K15/02Synthesis of acoustic waves

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, and a program.
  • In recent years, an information processing apparatus that generates a user preference vector based on content meta information corresponding to the content used by a user, and that introduces other users based on the user preference vector, has been proposed (for example, Patent Document 1).
  • However, the technique of Patent Document 1 does not notify the user of which parts of a piece of music actually moved the other user's emotions.
  • Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of notifying the parts of a piece of music at which a user was emotionally moved while listening.
  • According to the present disclosure, there is provided an information processing apparatus including: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that the notification information is notified to at least one of the first user and a second user in association with the portion at the time of reproduction of the second music data.
  • According to the present disclosure, there is also provided an information processing method including: acquiring, by a processor, the notification information described above; and outputting the notification information to a notification unit so that the notification information is notified to the first user or the second user in association with the portion when the second music data is reproduced.
  • Further, according to the present disclosure, there is provided a program that causes a computer to function as: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that the notification information is notified to the first user or the second user in association with the portion at the time of reproduction of the second music data.
  • FIG. 1 is a schematic diagram of an information processing system according to an embodiment of the present disclosure.
  • FIGS. 2 to 4 are conceptual diagrams illustrating examples of services provided in the information processing system according to the embodiment of the present disclosure.
  • FIG. 5 is a block diagram illustrating an outline of the functional configuration of the server (first information processing apparatus) according to the embodiment. FIGS. 6 to 8 are tables of sensitivity information used by the recommended user determination unit of the server shown in FIG. 5.
  • Several figures are sequence diagrams illustrating examples of the operation of the information processing system according to the embodiment of the present disclosure.
  • Further figures show other examples of the information used by the recommended user determination unit of the server shown in FIG. 5, and other examples of the data used by the preference information generation unit of the server shown in FIG. 5.
  • Further figures show displays of preference information according to modifications of the present disclosure.
  • Another figure is a schematic diagram showing the editing of music data by an edited music data generation unit provided in a user terminal according to a modification of the present disclosure.
  • FIG. 1 is a schematic diagram of an information processing system 1000 according to an embodiment of the present disclosure.
  • An information processing system 1000 illustrated in FIG. 1 includes a server 100 and a plurality of user terminals 200, which are communicably connected via a network 300.
  • the server 100 is an example of a first information processing apparatus according to the present disclosure.
  • For example, the server 100 collects information about each user's sensitivity to music (hereinafter also simply referred to as "sensitivity information") from the user terminals 200, and transmits the sensitivity information, or information generated based on it, to other user terminals 200.
  • Here, sensitivity to music refers to the movements of the listener's (user's) emotions that occur during the playback of music data, that is, the experience of being "moved" by the music.
  • Such emotional movements can be detected based on, for example, changes in the user's biometric information during music data playback.
  • the biological information is not particularly limited; examples include heart rate, body temperature, sweating, blood pressure, pulse, breathing, blinking, eye movement, gaze duration, pupil size, brain waves, body movement, posture, skin temperature, and skin electrical resistance.
  • Such changes in biological information can be detected by, for example, an input unit 226 described later, or by other sensors such as a heart rate monitor, a blood pressure monitor, an electroencephalograph, a pulse meter, a thermometer, an acceleration sensor, a gyro sensor, or a geomagnetic sensor. The information detected by the input unit 226 or by these sensors can then be used as sensitivity information.
  • In the present specification, music data refers to data including music information. Therefore, music data includes not only data consisting solely of sound information but also moving image data including images such as still images and moving images. The music data may further include other information, such as information on the lighting of light emitting elements, the generation of vibration, and the operation of other applications. Further, in the present disclosure, a "portion of music data" is a concept that includes not only a part of the music data but also the whole of it.
  • the user terminal 200 is an example of a second information processing apparatus according to the present disclosure.
  • the user terminal 200 is an electronic device that can reproduce music data, such as a smartphone, a mobile phone, a tablet, a media player, a desktop computer, and a laptop computer.
  • Each user terminal 200 collects the sensitivity information of the user who uses it and transmits that information to the server 100.
  • In addition, each user terminal 200 receives the sensitivity information of other users, or information based on it, from the server 100 and notifies its user of this information.
  • the user terminal 200 is configured to be able to communicate with other user terminals 200 either via the server 100 or directly. Thereby, a user who operates a user terminal 200 can converse and exchange information with other users, for example, by chat or messenger.
  • the network 300 is a wired or wireless transmission path for information transmitted from a device connected to the network 300.
  • the network 300 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various types of LANs (Local Area Network) including Ethernet (registered trademark), WAN (Wide Area Network), and the like.
  • the network 300 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • the information processing system 1000 as described above can provide, for example, the services shown in FIGS. 2 to 4 to the user who owns the user terminal 200.
  • FIGS. 2 to 4 are conceptual diagrams illustrating examples of services provided in the information processing system 1000 according to the embodiment of the present disclosure.
  • the information processing system 1000 provides services such as “recommended user display” shown in FIG. 2, “preference information display” shown in FIG. 3, and “reproduction of notification information” shown in FIG.
  • FIG. 2 is an example of a service that recommends a user to another user using sensitivity information.
  • In FIG. 2, during reproduction of the music data 400, each user terminal 200 detects the occurrence and position of movements of the emotions of user A, user B, and user C (hereinafter also referred to as "sensitivity points") 401A, 401B, and 401C in association with each part of the music data.
  • Information about the detected sensitivity point is transmitted from each user terminal 200 to the server 100 as sensitivity information.
  • the server 100 compares these pieces of sensitivity information and determines which user to introduce (recommend) to the user A. For example, in FIG. 2, the music data 400 contains a portion 410 corresponding to the A melody (verse), a portion 420 corresponding to the B melody (bridge), and a portion 430 corresponding to the chorus.
  • the sensitivity points 401A of the user A and the sensitivity points 401B of the user B overlap at the beginning of the portion 410.
  • On the other hand, the sensitivity points 401A of the user A and the sensitivity points 401C of the user C overlap both at the beginning of the portion 410 and in the middle of the portion 420.
  • Accordingly, the server 100 determines that the user C, whose sensitivity points 401C overlap more with the sensitivity points 401A, should be introduced to the user A. Then, information about the user C is transmitted from the server 100 to the user terminal 200 of the user A and displayed on that user terminal 200.
  • In this way, a user can use the information processing system 1000 to find other users whose sensitivities to music are close to his or her own, and users with close sensitivities can interact with each other. That is, the present disclosure can provide a new and improved information processing apparatus, information processing system, information processing method, and program capable of presenting, to a given user, other users whose sensitivity to music is close to that user's own.
  • FIG. 3 is an example of a service for notifying the user of notification information based on the sensitivity information of other users.
  • the music data 400 is reproduced on the user terminal 200 possessed by the user C, and the sensitivity point 401C corresponding to this is detected.
  • Information on the detected sensitivity points is transmitted from the user terminal 200 of the user C to the server 100 as sensitivity information.
  • the server 100 generates notification information 403 based on the received sensitivity information and transmits it to the user terminal 200 of the user A.
  • the user terminal 200 of the user A who has received the notification information 403 presents the notification information 403 to the user A along with the reproduction of the music when the music data 400 is reproduced.
  • the notification information 403 is sound information (sound effects).
  • Thereby, along with the reproduction of the music data 400, the user A can know the parts of the music that moved the emotions of the user C (the parts where the user C was "moved"). Therefore, the user A can sense the user C's sensitivity to the music in detail, feel closer to the user C, and feel sympathy with the user C. As a result, even when listening to music alone, a user can feel as if listening together with other users, as when attending a live venue.
  • That is, the present disclosure can provide a new and improved information processing apparatus, information processing method, and program capable of notifying a user of the parts of a piece of music that moved another user's emotions during appreciation of the music.
  • FIG. 4 shows an example of a service that represents a user together with preference information based on sensitivity information.
  • In the user information image 500 shown in FIG. 4, the user image 501 of the user himself or herself is displayed, and the mood images 503A, 503B, and 503C of the user are arranged around the user image 501.
  • The preference information represented by the mood images 503A, 503B, and 503C is generated based on sensitivity information obtained when the user reproduces music data, and each image is displayed in a different color according to the mood it represents.
  • the mood image 503A indicates preference information of, for example, "Euphoric" and is displayed in red (a dot pattern in the figure).
  • the mood image 503B indicates, for example, "Happy" and is displayed in green (hatching including broken lines in the figure).
  • the mood image 503C indicates, for example, "Joyful" and is displayed in yellow (a pattern including hexagons in the figure).
  • Such a user information image 500, including the mood images 503A, 503B, and 503C, is displayed not only on the user terminal 200 of the user concerned but also on the user terminals 200 of other users.
  • the mood in the present specification is the atmosphere that appears at the time of reproduction of a musical piece, or a portion of it, related to music data, and includes, for example, the tone, the musical idea, and the feeling of the music.
  • Such moods include, for example, "Euphoric", "Happy", "Joyful", "Mild", "Sad", "Solemn", "Bright", "Healing", "Fresh", and "Elegant", as described below.
  • Such a kind of mood can be estimated based on the feature amount of music. Further, the mood of the music can be analyzed by a mood analysis unit described later.
  • A user who views the user information image 500 can clearly determine what preferences the user displayed in the user information image 500 has for music.
  • Moreover, the user preference information displayed in the user information image 500 is based on occasions on which the user's emotions actually moved while music data was being reproduced. Therefore, such preference information represents the user's actual preferences more accurately than preference information based simply on music genre or playback history. That is, according to the present disclosure, it is possible to provide a new and improved information processing apparatus, information processing method, and program capable of displaying preference information that reflects a user's sensitivity to music.
  • FIG. 5 is a block diagram illustrating an outline of a functional configuration of the server 100 according to the present embodiment.
  • the server 100 includes a receiving unit 102, a user information database 104, a music information database 106, a recommended user determination unit 108, a preference information generation unit 110, a notification information generation unit 112, and a transmission unit 114.
  • the receiving unit 102 is connected to the network 300 and can receive information from electronic devices such as the user terminals 200 via the network 300. Specifically, the receiving unit 102 receives, from a user terminal 200, information about the user who owns the user terminal 200, such as sensitivity information, user information, reproduction history, and held music information, as well as the meta information and mood information of the music data stored in the user terminal 200.
  • When receiving the user information and sensitivity information of the user having the user terminal 200, the receiving unit 102 inputs the user information and sensitivity information to the user information database 104. Further, when receiving the meta information and mood information of music data, the receiving unit 102 inputs this music information to the music information database 106.
  • the user information database 104 constitutes a storage unit together with the music information database 106.
  • the user information database 104 stores information on a user who owns the user terminal 200.
  • Examples of such user information include a user ID, user name, user image, information on the user's favorite users, music data reproduction history, song IDs of held music data, sensitivity information, preference information, and sound effect information.
  • the music information database 106 records information related to music data. Examples of such information include meta information such as song ID, song name, artist information, album information, cover image, and genre, as well as the mood information of each piece of music data.
  • the recommended user determination unit 108 determines a user to be recommended (presented) to a certain user based on the relevance of sensitivity information between users.
  • the recommended user determination unit 108 functions as a user specifying unit that compares the information including the sensitivity information and specifies the first user associated with the sensitivity information related to the sensitivity information about the second user.
  • Specifically, the recommended user determination unit 108 determines a recommended user by calculating the products of the relative frequencies of sensitivity points among users, as described with reference to FIGS. 6 to 8. FIGS. 6 to 8 are tables relating to the sensitivity information used by the recommended user determination unit 108 of the server 100 shown in FIG. 5.
  • the recommended user determination unit 108 first acquires each user's sensitivity points for the music data A1 from the user information database 104, and obtains data as shown in Table 601.
  • Table 601 shows the number of sensitivity points for each portion of a certain piece of music data for the users X1, X2, and X3.
  • Here, the users who are candidates for recommendation are, for example, users who hold at least some of the songs A1 to An held by the user who is to receive the presentation of the recommended user.
  • Further, the music data of the songs A1 to An used for determining the recommended user is music data for which sensitivity information has been generated, by a method described later, for the user who is to receive the presentation of the recommended user.
  • Preferably, sensitivity information has also been generated for all the users who are candidates when the recommended user is determined.
  • Note that the sensitivity points are generated based on the user's body movements with respect to each part of the music data. Furthermore, the number of sensitivity points is a value obtained by integrating, for each part of the music data, the frequency of body movements during reproduction of the music data.
  • In this example, each portion of the music data is a section obtained by dividing the music data into a plurality of phrases, more specifically into a start period, a middle period, and an end period. The portion may be a phrase itself, or a segment obtained by dividing each phrase of the music data into a plurality of parts; the latter is preferable.
  • Next, for each user, the recommended user determination unit 108 calculates the relative frequency, within the music data, of the numbers of sensitivity points listed in Table 601, and obtains relative frequency information as shown in Table 603.
  • Furthermore, the recommended user determination unit 108 multiplies the relative frequencies of the sensitivity points shown in FIG. 7 between users for each portion of the music data, and obtains the products of the relative frequencies between users as shown in FIG. 8. The sum of these products over the portions is then taken as the degree of sensitivity relatedness between the users for the music piece A1. For example, when one of the users X2 and X3 is to be recommended to the user X1, the user X2, whose sum of products of relative frequencies with the user X1 is larger, has a higher sensitivity relatedness to the user X1 than the user X3 does.
  • the recommended user determination unit 108 performs the above calculation for the music pieces A2 to An as well, and sums the products over the portions of the music pieces A1 to An to obtain the overall degree of sensitivity relatedness between users. Then, the recommended user determination unit 108 selects, from the candidate users, a user having a relatively high sensitivity relatedness with the user who is to receive the presentation, and determines that user as the recommended user. Note that a plurality of recommended users may be determined.
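  • As an illustrative sketch (not part of this publication; the user names and counts below are hypothetical stand-ins for Tables 601 to 605), the computation described above can be expressed in Python: each user's per-part sensitivity point counts for a song are normalized to relative frequencies, the frequencies of two users are multiplied part by part, and the sum of the products over shared songs serves as the degree of sensitivity relatedness.

```python
# Hypothetical sensitivity point counts per user, per song, per part
# (part = start / middle / end period), in the spirit of Table 601.
counts = {
    "X1": {"A1": [4, 1, 5]},
    "X2": {"A1": [3, 2, 5]},
    "X3": {"A1": [0, 8, 2]},
}

def relative_frequencies(parts):
    """Relative frequency of sensitivity points within one song (cf. Table 603)."""
    total = sum(parts)
    return [p / total if total else 0.0 for p in parts]

def relatedness(user_a, user_b):
    """Sum, over shared songs, of the per-part products of relative frequencies."""
    score = 0.0
    shared = counts[user_a].keys() & counts[user_b].keys()
    for song in shared:
        fa = relative_frequencies(counts[user_a][song])
        fb = relative_frequencies(counts[user_b][song])
        score += sum(a * b for a, b in zip(fa, fb))
    return score

# Recommend to X1 whichever candidate has the higher sensitivity relatedness.
candidates = ["X2", "X3"]
best = max(candidates, key=lambda u: relatedness("X1", u))
print(best, {u: round(relatedness("X1", u), 3) for u in candidates})
```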
  • the recommended user determination unit 108 transmits the user information about the determined recommended user to the user terminal 200 of the user who is to receive the presentation via the transmission unit 114.
  • the recommended user determination unit 108 may input information about the determined recommended user into the user information database 104.
  • In this case as well, the user information about the determined recommended user is transmitted to the user terminal 200 of the user who is to receive the presentation.
  • the preference information generation unit 110 generates information related to the user's preference for music, that is, preference information.
  • the preference information generation unit 110 generates preference information based on the user's sensitivity information regarding the mood included in the music data.
  • The preference information is generated based not only on the user's sensitivity information but also on the moods of the music in the music data and on the music held by the user. One or more pieces of music may be used for generating the sensitivity information.
  • the mood included in each musical piece may be one or plural.
  • the preference information can be obtained based on the user's sensitivity information for a first mood included in a first song and the user's sensitivity information for a second mood, different from the first mood, included in a second song.
  • the first music piece and the second music piece may be the same or different music pieces.
  • FIGS. 9 to 13 show examples of the data used in the preference information generation unit 110 of the server 100 shown in FIG. 5.
  • the type of mood and the number of users are limited to simplify the description.
  • In this example, the relative frequency of the number of sensitivity points for each mood and the relative frequency of appearance of each mood are calculated, and the preference information is obtained by averaging the two.
  • First, for each user, the preference information generation unit 110 reads from the user information database 104 the song IDs of the songs held by the user, that is, the list of held songs, together with the user's sensitivity information for each song. In addition, the preference information generation unit 110 reads from the music information database 106 the meta information, including the mood information, of the songs corresponding to those song IDs.
  • Next, based on the list of songs held by each user and the mood information of those songs, the preference information generation unit 110 accumulates the number of appearances of each mood for each user, as shown in Table 607 of FIG. 9.
  • the moods shown in FIG. 9 are the moods included in each portion of the music held by each user. Such a portion of a piece of music may be a phrase, a section, a segment obtained by dividing these, or a measure.
  • the preference information generation unit 110 calculates the relative frequency of appearance of each mood included in the music held for each user.
  • Similarly, based on the list of songs held by each user and the sensitivity information for each song, the preference information generation unit 110 accumulates the number of sensitivity points for each mood of the held songs for each user, as shown in Table 611 of FIG. 11.
  • Then, the preference information generation unit 110 calculates the relative frequency of the number of sensitivity points for each mood of the music held by each user.
  • Further, the relative frequency of the number of sensitivity points and the relative frequency of appearance of each mood obtained above are averaged for each user and each mood to obtain user preference information as shown in FIG. 13. Since this preference information is generated based on the user's sensitivity information, it reflects the occasions on which the user's emotions actually moved in response to the music more accurately than preference information generated simply from the held music or the reproduction history.
  • For example, the relative frequency of the "upbeat" mood in the music held by the user X1 is 0.39, whereas the preference information of the user X1 for "upbeat" is 0.59.
  • Similarly, the relative frequencies for the moods of the music held by the user X2 are 0.37 for "serious" and 0.26 for "sad", whereas the preference information of the user X2 is 0.26 for both "serious" and "sad".
  • Further, the relative frequencies for the moods of the music held by the user X3 are 0.31 for "upbeat", 0.23 for "serious", and 0.19 for "fun", whereas the preference information of the user X3 is 0.29 for "upbeat", 0.18 for "serious", and 0.20 for "fun". The order of the user X3's preferences for "serious" and "fun" is therefore reversed relative to the held music.
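  • A minimal sketch of this averaging step (not from the publication; the mood names follow the translated example above, and the counts are hypothetical values chosen so that the appearance frequencies reproduce the user X3 figures of 0.31 / 0.23 / 0.19):

```python
def relative(freqs):
    """Normalize a {mood: count} mapping to relative frequencies."""
    total = sum(freqs.values())
    return {m: (c / total if total else 0.0) for m, c in freqs.items()}

def preference(appearances, sense_points):
    """Average the relative appearance frequency of each mood (cf. Tables 607-609)
    with the relative frequency of sensitivity points per mood (cf. Tables 611-613)."""
    ra, rs = relative(appearances), relative(sense_points)
    moods = ra.keys() | rs.keys()
    return {m: (ra.get(m, 0.0) + rs.get(m, 0.0)) / 2 for m in moods}

# Hypothetical counts for one user: how often each mood appears in the held
# songs, and how many sensitivity points were detected for each mood.
appearances = {"upbeat": 31, "serious": 23, "fun": 19, "other": 27}
sense_points = {"upbeat": 27, "serious": 13, "fun": 21, "other": 39}
print({m: round(v, 2) for m, v in preference(appearances, sense_points).items()})
# -> {'upbeat': 0.29, 'serious': 0.18, 'fun': 0.2, 'other': 0.33}
```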
  • the preference information generation unit 110 stores the preference information generated in this way in the user information database 104. Alternatively, the preference information generation unit 110 transmits the preference information to the user terminal 200 via the transmission unit 114.
  • the notification information generation unit 112 generates notification information.
  • the notification information is information indicating a method for notifying the user, or another user, of sensitivity information together with the reproduction of the music when the music data is reproduced on the user terminal 200.
  • the notification information is generated based on the sensitivity information for the music data of the user.
  • the music data (second music data) reproduced together with the notification information may be the same as, or different from, the music data (first music data) used to generate the sensitivity information.
  • the second music data and the first music data are related. For example, both the first music data and the second music data may be data related to the same music piece.
  • When the two differ, they may, for example, be stored in different storage devices or have different file formats, such as an mp3 file and an mp4 file.
  • Alternatively, the first music data may be data related to a first piece of music, and the second music data may be data related to a second piece of music that includes a portion related to (for example, similar to) a part of the first piece of music. Related portions include portions having the same or similar melody, rhythm, or mood.
  • The target user whose sensitivity information is used is, for example, a user specified by the user terminal 200 that receives the notification information, a recommended user determined by the recommended user determination unit 108, a specific limited user (for example, a favorite user or a user belonging to a favorite group), or an arbitrarily determined user.
  • the notification information may be generated based on the sensitivity information of one user, or may be generated based on the sensitivity information of a plurality of users. These items can be changed by, for example, an instruction from the user terminal 200 that receives the notification information.
  • the notification means in the user terminal 200 can be, for example, sound such as sound effects, vibration, light emission by an LED lamp, or the display of images, characters, and the like. These notification means are appropriately selected in the user terminal 200.
  • the notification information can include sound information related to sound such as sound effects, vibration information related to vibration, light emission information related to light emission, and image display information related to image display.
  • Hereinafter, the function of the notification information generation unit 112 will be described with respect to the generation of notification information, in particular sound information and image display information.
  • First, the notification information generation unit 112 acquires from the user information database 104 the user information of the target user for the target music data, including sensitivity information, user image, and preference information. Further, the notification information generation unit 112 acquires the meta information of the target music data from the music information database 106.
  • Next, the notification information generation unit 112 generates sound information based on the user information, including the sensitivity information, and the meta information. Specifically, the notification information generation unit 112 creates sound information that causes a sound (sound effect) to be output, during reproduction of the music data, at the portions of the music data corresponding to the user's sensitivity information, that is, the portions where sensitivity points were detected.
  • Here, a portion of the music data can be a phrase unit, a measure unit, or a unit of segments obtained by dividing these. Moreover, it is preferable that the sound effect occurs just before or after such a unit division so as not to disturb the appreciation of the music.
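  • One possible way to realize this timing rule, sketched here under assumed data structures (the function and effect names are hypothetical, not from the publication), is to snap each sound-effect cue to the unit boundary adjacent to the detected sensitivity point:

```python
def sound_cues(sensitivity_times, boundaries, effect="applause", before=True):
    """Map each sensitivity point time (seconds) to a cue at the nearest
    preceding (or following) unit boundary, so the effect plays at a unit
    division rather than over the phrase itself."""
    cues = []
    for t in sensitivity_times:
        past = [b for b in boundaries if b <= t]
        future = [b for b in boundaries if b > t]
        cue_time = (past[-1] if before and past else
                    future[0] if future else t)
        cues.append({"time": cue_time, "effect": effect})
    return cues

# Hypothetical example: unit boundaries every 15 s, two sensitivity points.
print(sound_cues([12.4, 48.9], boundaries=[0, 15, 30, 45, 60]))
# -> [{'time': 0, 'effect': 'applause'}, {'time': 45, 'effect': 'applause'}]
```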
  • the notification information generation unit 112 selects a sound effect to be used for sound information from a plurality of kinds of sound effects prepared in advance, for example.
  • the sound effect is not particularly limited, and can be, for example, an electronic sound, a sound that can occur in nature, or a simulated sound or an edited sound thereof.
  • Sounds that can occur in nature include, for example, shouts such as "Bravo", voices such as cheers, sounds from human body movements such as applause and footsteps, sounds derived from biological information such as heart sounds, instrument sounds, whistles, cracker sounds, and recordings of audience audio at live venues, concert halls, and the like.
  • the notification information generation unit 112 may select a sound effect based on user information or meta information of music data.
  • the notification information generation unit 112 may change the character of the voice or the wording of the shout based on, for example, the gender and age of the target user included in the user information.
  • Moreover, when sound effect information is contained in the user information, the notification information generation unit 112 may set the sound effect based on that sound effect information.
  • the sound effect information includes sound effect designation information for designating which sound effect should be selected, and unique sound effect data relating to a user-specific sound effect. In particular, when a user-specific sound effect is used, the personality of the target user is reflected in the notification. Further, the notification information generation unit 112 may select a sound effect based on the mood information of the target music data.
  • the notification information generation unit 112 may change the volume of the sound effect and the type of the sound effect according to the number of users.
  • the notification information generation unit 112 may also set sound effects in accordance with output devices such as speakers and earphones provided in the user terminal 200.
  • the notification information generation unit 112 may set a virtual position where the sound effect occurs using a sound image localization technique. Further, the notification information generation unit 112 may set sound effects according to the environment around the user terminal 200.
  • the notification information generation unit 112 generates image display information for image display based on user information including sensitivity information and meta information of music data.
  • the image display information includes information regarding the displayed notification image and information regarding the method of displaying the notification image.
  • the notification image is not particularly limited and can be any image, such as a polygon, a star polygon, an ellipse, a circle, a fan-shaped or other geometric figure, an image containing user information, another figure showing emotional movement, or an animation.
  • When the notification image is an image including user information, the user who owns the user terminal 200 can identify the user whose emotions moved during the reproduction of the music data.
  • An image including such user information is convenient because the target user can be identified particularly when there are a plurality of target users.
  • the image including user information is configured to include, for example, a user image included in the user information and user preference information.
  • the animation may include a figure indicating an amount corresponding to the sensitivity information of the user from whom the sensitivity information was detected, for example, an amount corresponding to a change in that user's biological information.
  • In this case, the user who owns the user terminal 200 can grasp the degree of emotional movement, with respect to the music, of the user from whom the sensitivity information was detected.
  • the display method of the notification image can be any method that indicates at which position in the music data the target user's emotions moved.
  • Examples of such a method include a method of displaying the notification image for a certain period of time around the time at which a sensitivity point in the music data was detected, and a method of displaying the notification image at the position where the sensitivity point was detected along the time axis of a progress bar image of the music data shown on the display unit.
  • the notification information generation unit 112 can also generate notification information for light emission information and vibration information according to the above method.
  • the notification information generation unit 112 transmits the generated notification information to the user terminal 200 via the transmission unit 114.
  • the notification information may be transmitted periodically or may be transmitted every time the notification information is generated. Further, the transmission method may be changed according to the setting by the user terminal 200.
  • the user terminal 200 of a transmission destination can be at least one of a user (first user) involved in acquisition of sensitivity information and another user (second user). In other words, the notification information can be notified to at least one of the first user and the second user.
  • the transmission unit 114 is connected to the network 300 and can transmit information to electronic devices such as the user terminals 200 via the network 300. Specifically, the transmission unit 114 can transmit to a user terminal 200 the preference information generated by the preference information generation unit 110, the user information of the recommended user determined by the recommended user determination unit 108, and the notification information generated by the notification information generation unit 112. Further, various information stored in the user information database 104 and the music information database 106 can be transmitted to the user terminal 200 in response to a request from the user terminal 200.
  • FIG. 14 is a block diagram illustrating an outline of a functional configuration of the user terminal 200 according to the present embodiment.
  • the user terminal 200 includes a receiving unit 202, an output control unit 204, an output unit 206, a music playback unit 214, a sensitivity information generation unit 216, a mood analysis unit 218, a storage unit 220, an input unit 226, a position information detection unit 228, and a transmission unit 230.
  • the receiving unit 202 is connected to the network 300, and can receive information from electronic devices such as the server 100 and other user terminals 200 via the network 300. Specifically, the receiving unit 202 receives, from the server 100, user information such as preference information about recommended users and other users (first users), as well as notification information; the receiving unit 202 therefore functions as an information acquisition unit. The receiving unit 202 can also receive messages of other users from other user terminals 200. The receiving unit 202 inputs the received information to each component of the user terminal 200, for example, the storage unit 220 and the output control unit 204.
  • the output control unit 204 controls the output of information by the output unit 206 described later. Specifically, the output control unit 204 inputs information to the output unit 206 and instructs the output unit 206 to output that information.
  • the output control unit 204 functions as, for example, a display control unit that controls image display on the display unit 208 of the output unit 206, a sound output control unit that controls the sound output of the speaker 210, and a vibration generation control unit that controls the vibration generation unit 212.
  • the output control unit 204 is also an example of a presentation information output unit that outputs the user information of the recommended user (first user) to the display unit 208 so that the user (second user) who owns the user terminal 200 can recognize it. The output control unit 204 as a presentation information output unit can not only output user information to the output unit 206 but also control the output unit 206. Furthermore, the output control unit 204 is an example of a preference information output unit that outputs preference information to the display unit 208, which displays the preference information. As a preference information output unit, the output control unit 204 can likewise not only output preference information to the output unit 206 but also control the output unit 206.
  • the output control unit 204 generates and updates a user interface screen and causes the display unit 208 to display it.
  • the generation and update of the user interface screen are triggered by, for example, an input by the user at the input unit 226, a request from each unit in the user terminal 200, or reception of information from the server 100 or another user terminal 200.
  • the configuration of the user interface screen will be described later.
  • In addition, according to the contents of the input trigger, the output control unit 204 controls the speaker 210 and external audio output devices such as earphones or external speakers (neither shown) so that sound effects adapted to the user interface, or the audio of the music data decoded by the music playback unit 214, are output from the speaker 210 or such an external audio output device.
  • the output control unit 204 controls the vibration generating unit 212 to generate vibration or the LED lamp (not shown) to emit light according to the contents of the input trigger.
  • the output control unit 204 performs control so that the output unit 206 outputs the notification information acquired from the server 100 via the receiving unit 202.
  • the output control unit 204 is a notification information output unit that outputs notification information to the output unit (notification unit) 206 so that the notification information is notified to the user in association with the music data part when the music data is reproduced. It is an example.
  • the output control unit 204 as the notification information output unit can control the output unit 206 as well as output the notification information to the output unit 206. More specifically, the output control unit 204 as the notification information output unit determines and changes the notification information notification method, and outputs instruction information about the determination and change to the output unit 206 together with the notification information. The output unit 206 is controlled.
  • the output control unit 204 can change the method of notification by the output unit (notification unit) 206 according to information regarding at least one of the user related to the notification information, the music data (first music data) related to the acquisition of the sensitivity information, and the music data (second music data) reproduced together with the notification information.
  • The change of the notification method includes not only a change of the notification means, such as sound, light, vibration, or display, but also, for the same notification means, a change in the degree of output (for example, volume, light quantity, vibration amount, or amount of displayed information) and a change in the content of the notification (for example, a change of sound effect, a change in the way light blinks, a change in the vibration pattern, or a change in the displayed image or characters). For example, the change of the notification method can be a change of the output sound (sound information) or a change of the output volume.
  • For example, the output control unit 204 can determine the notification method used by the output unit according to the type of music of the music data to be played, for example according to the genre or mood of the music in the reproduced data.
  • the output control unit 204 may also determine the notification method used by the output unit 206 according to the amount of notification information acquired for the music data. For example, the output control unit 204 may control the display unit 208 so as to limit the notification information displayed when the acquired notification information exceeds a certain amount. In this case, for example, on the normal playback screen (the player 710 described later), only an icon indicating the presence of multiple pieces of notification information is displayed along the progress bar 713, while the display unit 208 may be controlled so that details of the notification information are displayed when the progress bar 713 is enlarged.
  • the output control unit 204 may change the volume of the sound effect and the type of the sound effect according to the number of users who are the target of notification.
  • the output control unit 204 can change the notification method so that it corresponds to the number of users who are targets of the notification. For example, when the number of such users is 10 or fewer, a quiet sound effect can be used; when it is 11 to 100, a medium sound effect; and when it is 101 or more, a loud, bustling sound effect. Thereby, the user who receives the notification can grasp how much interest other users have in the music being appreciated.
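  • A sketch of this scaling rule (the thresholds follow the text above; the effect names and volume values are hypothetical):

```python
def crowd_effect(listener_count):
    """Scale the notification sound with the number of users whose
    sensitivity information is being notified."""
    if listener_count <= 10:
        return {"effect": "sparse_applause", "volume": 0.3}  # quiet
    if listener_count <= 100:
        return {"effect": "applause", "volume": 0.6}         # medium
    return {"effect": "crowd_cheer", "volume": 0.9}          # loud, bustling

print(crowd_effect(7), crowd_effect(42), crowd_effect(250))
```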
  • the output control unit 204 can determine a notification method by the output unit 206 according to the surrounding environment of the user terminal 200 when the music data is reproduced. For example, the type of sound effect can be changed depending on whether the surrounding environment is relatively quiet or relatively noisy.
  • the output control unit 204 can change the notification method in accordance with an instruction from a user who owns the user terminal 200.
  • the notification information can be set not to be notified in response to a user instruction.
  • the user can select the type of notification information not to be notified (sound effect, notification image, etc.).
  • the user can appropriately select a user to be notified.
  • the output control unit 204 can change the notification method in accordance with an instruction from a user who is a notification target.
  • the output control unit 204 can control the output unit 206 so as to output a sound effect and a notification image specified by a user who is a notification target.
  • the output unit 206 outputs information in response to a control instruction from the output control unit 204.
  • the output unit 206 includes a display unit 208, a speaker 210, and a vibration generation unit 212.
  • the output unit 206 is an example of a presentation unit that presents a recommended user so that a user who owns the user terminal 200 can recognize. Further, the output unit 206 also functions as a notification unit that notifies the notification information described above.
  • the display unit 208 includes a display, and displays an image such as a still image and / or a moving image in accordance with a control instruction from the output control unit 204.
  • the speaker 210 is a device for generating sound waves.
  • the speaker 210 generates sound according to a control instruction from the output control unit 204.
  • the vibration generating unit 212 is configured by a device that can generate vibration by a motor or the like.
  • the vibration generation unit 212 generates a vibration in response to a control instruction from the output control unit 204 and causes the user terminal 200 to vibrate.
  • the music playback unit 214 includes a decoder, acquires music data from the music database 224 provided in the storage unit 220, and decodes the music data. Then, the music playback unit 214 outputs the decoded music data as information including sound via the output control unit 204 and the output unit 206. The music playback unit 214 further inputs progress information about the progress of the music during playback of the music data to the sensitivity information generation unit 216.
  • the sensitivity information generation unit 216 detects the user's emotional movements (sensitivity points) with respect to the music data and generates sensitivity information. Specifically, when the music progress information is input from the music playback unit 214, the sensitivity information generation unit 216 detects the sensitivity points input from the input unit 226 and associates them with the time axis of the music data. Next, the sensitivity information generation unit 216 collects the sensitivity points associated with the time axis throughout the music data and generates the sensitivity information.
  • At this time, the sensitivity information generation unit 216 can use the portions of the music data as the time axis of the music data. The portions of the music data can be classified as described above. Information on these portions is generated through analysis of the music data by the mood analysis unit 218 and can be acquired by the sensitivity information generation unit 216.
  • the sensitivity information is generated based on the user's body movement with respect to each part of the music data.
  • Sensitivity information includes body motion information calculated based on the frequency of the user's body motions for each part of the music data. More specifically, the body motion information is calculated by integrating, for each part of the music data, the frequency of the user's body motions.
  • Body movement is an indicator that readily appears as emotional movement when a user appreciates music and that can be measured relatively objectively. Therefore, using body motion information as part of the sensitivity information improves the accuracy of the sensitivity information.
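  • For illustration, the per-part integration of body motions described above might be sketched as follows (the tap times and part boundaries are hypothetical, and this is not the publication's implementation):

```python
import bisect

def body_motion_info(tap_times, part_boundaries):
    """Integrate the frequency of body motions (e.g., screen taps) per part
    of the music data. `part_boundaries` are the start times of each part,
    in seconds, in ascending order; returns one count per part."""
    counts = [0] * len(part_boundaries)
    for t in tap_times:
        idx = bisect.bisect_right(part_boundaries, t) - 1
        if idx >= 0:
            counts[idx] += 1
    return counts

# Hypothetical: parts start at 0 s (start), 60 s (middle), 180 s (end).
print(body_motion_info([5.2, 7.9, 65.0, 190.3, 200.1], [0, 60, 180]))
# -> [2, 1, 2]
```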
  • the mood analysis unit 218 analyzes the music data and obtains mood information about the music data. Specifically, the mood analysis unit 218 first acquires the music data stored in the music database 224 of the storage unit 220. Next, the mood analysis unit 218 obtains time-pitch data by decomposing the waveform of the music obtained from the music data into energy along two axes, time and pitch. Here, the pitch is resolved into 12 semitones per octave. For example, the mood analysis unit 218 divides the music data along the time axis into portions corresponding to one second of music each, and extracts the energy of each frequency band corresponding to each of the 12 scale degrees of one octave.
  • the mood analysis unit 218 analyzes feature quantities such as beat structure, chord progression, key, and music structure according to the music theory based on the information obtained by the analysis.
  • the mood analysis unit 218 estimates the mood of each piece of music included in the music data based on the obtained feature amount.
  • The moods can be classified into multiple types such as, for example, "Euphoric", "Happy", "Joyful", "Mild", "Sad", "Solemn", "Bright", "Healing", "Fresh", and "Elegant". Then, the mood analysis unit 218 determines, for each classified mood, a value corresponding to the feature amounts of the portion of the music data.
  • the mood analysis unit 218 can estimate the mood having the highest value according to the feature amount among the plurality of classified moods as the mood of the target portion.
  • a plurality of moods can be assigned to each part.
  • a plurality of moods are determined in descending order of the value corresponding to the feature amount.
  • The estimation may also be performed based on, for example, a pattern table prepared in advance that indicates the relationship between feature amounts and moods.
  • The above analysis can be performed, for example, by adopting a twelve-tone analysis technique.
  • the mood analysis unit 218 generates mood information indicating the correspondence between the mood and each part of the music data.
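  • As a hedged illustration of this estimation step (the pattern table below is randomly generated and purely hypothetical; an actual implementation would learn it or hand-tune it from the feature-to-mood relationships described above), each mood can be scored for one part of the music data and the highest-scoring moods selected:

```python
import numpy as np

MOODS = ["Euphoric", "Happy", "Joyful", "Mild", "Sad",
         "Solemn", "Bright", "Healing", "Fresh", "Elegant"]

# Hypothetical pattern table: one weight vector per mood over a feature
# vector (e.g., 12 chroma-band energies plus two further features).
rng = np.random.default_rng(0)
PATTERN_TABLE = rng.random((len(MOODS), 14))

def estimate_moods(features, top_k=1):
    """Score every mood for one music-data part and return the top-k moods
    in descending order of the value corresponding to the feature amounts."""
    scores = PATTERN_TABLE @ features
    order = np.argsort(scores)[::-1][:top_k]
    return [(MOODS[i], float(scores[i])) for i in order]

# One part's features: 12 chroma-band energies + 2 extra features (hypothetical).
part_features = rng.random(14)
print(estimate_moods(part_features, top_k=3))
```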
  • Alternatively, the mood analysis unit 218 may generate the mood information based on music recognition information, such as CDDB (Compact Disc DataBase) entries or a Music ID, included in the meta information of the music data.
  • music recognition information may also be obtained from an external database.
  • the mood analysis unit 218 provides not only mood information but also various analysis data for music data.
  • the mood analysis unit 218 can also classify music data into appropriate parts based on the various analysis data.
  • the mood analysis unit 218 inputs analysis information including the obtained mood information to the music database 224 and transmits it to the server 100 via the transmission unit 230.
  • the storage unit 220 stores various information necessary for controlling the user terminal 200.
  • the storage unit 220 includes a user information storage unit 222 and a music database 224.
  • the user information storage unit 222 stores information regarding the user who owns the user terminal 200. Examples of such information relating to the user include communication history with other users or groups in addition to the information stored in the user information database 104 described above.
  • the music database 224 stores music data of the music held by the user and information related to the music data, for example, meta information such as music ID, music name, artist information, album information, cover image, genre, and mood information of each music data. .
  • the input unit 226 is a device configured to be able to input information from a user or another device.
  • the input unit 226 includes a touch panel.
  • the input unit 226 receives, for example, various instructions from the user and information related to emotional movement of the user's music.
  • the emotional movement of the user with respect to the music is detected as a body movement during reproduction of the music data.
  • body movement is detected by a user input such as a user tapping a predetermined part of the touch panel.
  • Changes in biological information, including body movements, as emotional responses of the user to music can also be automatically detected by a biological information detection unit such as a heart rate monitor, a sphygmomanometer, an electroencephalograph, a pulse meter, a thermometer, an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
  • the position information detection unit 228 is a device configured to be able to detect the position information of the user terminal, for example, a GPS (Global Positioning System).
  • the position information detection unit 228 inputs the detected position information of the user terminal 200 to the storage unit 220 and, as necessary, transmits it to the server 100 and other user terminals 200 via the transmission unit 230.
  • the transmission unit 230 is connected to the network 300 and can transmit information to electronic devices such as the server 100 or other user terminals 200 via the network 300. Specifically, the transmission unit 230 transmits to the server 100 information related to the user who owns the user terminal 200, such as sensitivity information, user information, reproduction history, and held music information, as well as the meta information and mood information of the music data stored in the user terminal 200. In addition, messages to other users input via the input unit 226 are transmitted to the other user terminals 200.
  • FIG. 15 is a screen transition diagram of the user interface of the user terminal 200 according to the embodiment of the present disclosure.
  • FIGS. 16 to 22 are examples of the user interface screen of the user terminal 200 according to the embodiment of the present disclosure.
  • the user interface screens of the user terminal 200 include a main menu 700 as the first hierarchy; a player 710, a library 720, a recommendation 730, a contact 740, and a setting 750 as the second hierarchy; and a song 760, a user profile 770, a timeline 781, a group profile 782, a chat 783, and a my profile 784 as the third hierarchy.
  • These user interface screens are generated by the output control unit 204 and displayed on the display unit 208.
  • the main menu 700 shown in FIG. 16 is a screen displayed when the user terminal 200 is activated. Above the main menu 700, a user information image 500 of the user who owns the user terminal 200 is displayed.
  • a user image 501 having a circular outer shape is displayed, and an image (mood image) 503 related to the mood based on preference information about the user is displayed so as to surround the user image 501.
  • the display format of the user image 501 and the preference information 503 is the same as that shown in FIG.
  • a plurality of mood images 503A, 503B, and 503C in the present embodiment are displayed in different colors around the user image 501.
  • the mood images 503A, 503B, and 503C are images that are arranged along the edge of the user image 501 and form a ring in this embodiment.
  • the mood images 503A, 503B, and 503C include a plurality of different colors, and such colors represent different moods.
  • Such mood images 503A to 503C represent, for example, the three items of the user's preference information having the largest numerical values in the above-described calculation. Thereby, the user who owns the user terminal 200 can objectively recognize his/her own preferences.
  • FIG. 17 is a diagram illustrating an example of a mood image of a user according to the present embodiment.
  • Each of the mood images 503A to 503J shown in FIG. 17 is color-coded (differentiated by hatching in the figure) and shows a different preference. More specifically, the mood image 503A indicates "Euphoric", 503B "Happy", 503C "Joyful", 503D "Mild", 503E "Sad", 503F "Solemn", 503G "Bright", 503H "Healing", 503I "Fresh", and 503J "Elegant".
  • a user name 505 and favorite information 507 are displayed on the right side of the user image 501 of the user information image 500.
  • A player button 701, a library button 702, a recommendation button 703, a contact button 704, and a setting button 705 are arranged below the user information image 500 of the main menu 700.
  • When the user presses each of these buttons, the user interface screen changes to the player 710, the library 720, the recommendation 730, the contact 740, or the setting 750, respectively.
  • a player 710 shown in FIG. 18 is a screen displayed when music data is reproduced.
  • a music information image 711 of music data is displayed in the upper center of the player 710, and an operation button 712 is disposed below the music information image 711.
  • By pressing the operation button 712, an operation related to the reproduction of music data can be performed.
  • From the player 710, the user interface screen can also change to a song 760, a screen that displays information related to the music.
  • a progress bar 713 indicating the progress of the music is arranged below the operation button 712.
  • Along the progress bar 713, the user's own sensitivity points 716 and user information images 500 representing the sensitivity information of other users are shown. The images and arrangement positions of the sensitivity points 716 and the user information images 500 are generated based on the notification information.
  • Thereby, the user can visually recognize the portions where another user's emotions were stirred, and can compare his/her own sensitivity points 716 with the user information images 500 representing the other users' sensitivity information.
  • Since preference information is shown in the user information image 500, the user can also compare that preference information with his/her own preferences and examine how well the other user's sensibility matches his/her own taste in music.
  • a sensitivity input button 714 is arranged on the right side of the operation button 712.
  • When the user presses the sensitivity input button 714, body movement information for constituting the sensitivity information is input to the input unit 226.
  • a moving bubble animation 715 is displayed around the music information image 711.
  • The animation 715 is generated based on the notification information and displayed based on the user's own sensitivity information. For example, in the animation 715, more bubbles are displayed at portions where the total of the user's own sensitivity points is large, or in music pieces where that total is large. With such an animation 715, the user can re-experience his/her sensitivity to the music and raise his/her mood.
  • Below the progress bar 713, a comment 717 is further displayed together with the user information image 500 arranged along the progress bar 713.
  • The comment 717 shows a comment on the music made by another user.
  • When a user information image 500 is pressed, the user interface screen changes to the user profile 770 of the corresponding user.
  • The user interface screen can also shift to a timeline 781 that displays, in time series, the music listened to by the user corresponding to the user information image 500.
  • music data held by the user is displayed in the library 720.
  • When music data is selected in the library 720, the user interface screen shifts to the player 710 and the music data can be reproduced.
  • The user interface screen can also transition to the song 760, where the user can browse music information about the music data.
  • a recommendation 730 shown in FIG. 19 is a screen that displays the recommended users determined by the recommended user determination unit 108.
  • the recommendation 730 includes a user tab 731, a group tab 732, a near-by tab 733, and a search button 734.
  • In the user tab 731, the user information images 500 of recommended users determined by the recommended user determination unit 108 are listed according to their degree of recommendation.
  • Each user information image 500 shows a user image 501A and preference information 503.
  • Thereby, the user who owns the user terminal 200 can consider not only each recommended user's degree of recommendation but also that user's preferences in music.
  • When a user information image 500 is pressed, the user interface screen changes to the user profile 770 of the corresponding recommended user.
  • When the group tab 732 is pressed, groups recommended to the user are listed. Such groups are selected from the groups to which the recommended users determined by the recommended user determination unit 108 belong, and from groups in which the characteristics of the music or artists posted in the group chat are close to the user's preference information.
  • When a listed group is pressed, the user interface screen changes to the group profile 782 of the corresponding group.
  • When the user presses the near-by tab 733, users existing in the vicinity of the user are listed from among the user's favorite users and the users in groups in which the user is participating.
  • When a listed user is pressed, the user interface screen changes to a chat 783 that allows chatting with that user.
  • In addition to the user tab 731, the group tab 732, and the near-by tab 733, a search button 734 is displayed.
  • When the user presses the search button 734, the user interface screen changes to a screen for searching for other users or groups.
  • the contact 740 shown in FIG. 20 is a screen for the user who owns the user terminal 200 to contact other users.
  • the contact 740 includes a user tab 741, a favorite tab 742, a group tab 743, and a search button 744.
  • The user information images 500 of favorite users registered in the user's favorites are listed.
  • Each user information image 500 shows a user image 501A and preference information 503.
  • When a user information image 500 is pressed, the user interface screen changes to a chat 783 that allows chatting with the corresponding user.
  • In addition to the user tab 741, the favorite tab 742, and the group tab 743, a search button 744 is displayed.
  • When the user presses the search button 744, the user interface screen changes to a screen for searching for other users or groups.
  • the setting 750 is a screen for performing various settings of the user terminal 200 and the server 100.
  • In the setting 750, a user profile and user-specific notification information can be set by user input.
  • In the setting 750, for example, the user can select whether the origin of the sensitivity information underlying the notification information acquired by the user terminal 200 should be a single user, a plurality of users, favorite users, or all users.
  • the song 760 shown in FIG. 21 is a screen that displays information related to music data and comments of each user regarding the music related to the music data.
  • a song information image 761 of music data is displayed on the upper part of the song 760, and an operation button 762 is arranged beside it. By pressing the operation button 762, an operation relating to the reproduction of music data can be performed.
  • a user image 501 of the user who owns the user terminal 200 and preference information surrounding the user image 501 are displayed together with a comment 763.
  • the comment 763 is a comment on the music posted by the user who owns the user terminal 200.
  • a reply button 765 is disposed below the comment 763. When the user presses the reply button 765, a response comment to the comment 763 can be input.
  • a comment 764 for the music is displayed together with a user image 501A of another user who posted the comment 764 and a mood image 503 arranged around the user image 501A.
  • a reply button 765 is arranged below the comment 764. When the user presses the reply button 765, a response comment can be input to the adjacent comment 764.
  • a user profile 770 shown in FIG. 22 is a screen that displays information about users other than the user who owns the user terminal 200.
  • A user image 501A of the target user and a mood image 503 arranged around the user image 501A are displayed.
  • a user name 505A and favorite information 507A are displayed below the user image 501A. The number of favorite users added by the target user is described in the favorite information 507A.
  • a favorite addition button 771 and a message transmission button 772 are displayed below the user name 505A and favorite information 507A.
  • When the favorite addition button 771 is pressed, the target user is added to the favorite users of the user who owns the user terminal 200.
  • When the message transmission button 772 is pressed, the user interface screen shifts to the chat 783. As a result, a message can be transmitted to the target user.
  • a feed tab 773, a song tab 774, a group tab 775, and a favorite tab 776 are arranged in the horizontal direction.
  • When the feed tab 773 is pressed, an action history 777 of the target user, such as a music appreciation history, a favorite user addition history, and a group participation history, is displayed.
  • When the song tab 774 is pressed, music data possessed by the target user is listed.
  • When the group tab 775 is pressed, groups in which the target user is participating are listed.
  • When the favorite tab 776 is pressed, favorite users of the target user are listed.
  • When a listed user is pressed, the user profile 770 of the corresponding user is displayed.
  • When a listed group is pressed, the group profile 782 of the corresponding group is displayed.
  • When an image for music data displayed under the feed tab 773 or the song tab 774 is pressed, the song 760 for the corresponding music data is displayed.
  • The timeline 781 shown in FIG. 15 is a screen that lists the songs that favorite users have listened to.
  • When a displayed user is pressed, the user interface screen changes to that user's profile 770.
  • When displayed music information is pressed, the user interface screen transitions to the song 760 for the corresponding music data.
  • the group profile 782 is a screen that displays information about the group.
  • a group is a structural unit that is usually configured by a plurality of users and formed with a predetermined purpose.
  • the predetermined purpose includes, for example, exchanges about favorite artists, genres, songs, and the like.
  • the group profile 782 displays information on members (users) participating in the group and information posted by the members, for example, comments and music information.
  • When a displayed member is pressed, the user interface screen transitions, depending on the setting, to the user profile 770 of the corresponding user or to a chat 783 with that user.
  • When posted music information is pressed, the user interface screen transitions to the song 760 displaying the corresponding music information.
  • buttons for the user to join and leave the group are arranged as appropriate.
  • Chat 783 is a screen for performing chat, that is, sending and receiving messages between users.
  • In the chat 783, the messages transmitted by each user are displayed together with each user's image.
  • My profile 784 is a screen that displays information about the user who owns the user terminal 200.
  • information related to the user who owns the user terminal 200 is displayed in a format corresponding to the user profile 770.
  • a user who owns the user terminal 200 can grasp how his / her information is disclosed to other users by referring to the my profile 784.
  • The user interface screens used in the user terminal 200 are not limited to the illustrated modes; screens can be added or omitted as appropriate. Moreover, in each screen described above, the elements relating to each button or piece of information may be added or omitted.
  • each of the user terminals 200A and 200B is a terminal that is arbitrarily selected from the user terminals 200 described above, and thus has the same configuration as the user terminal 200. Further, the description will be made assuming that the owner of the user terminal 200A is the first user and the owner of the user terminal 200B is the second user.
  • the operation flow of the information processing system 1000 will be described separately for display of recommended users, display of preference information, and reproduction of notification information.
  • First, the user terminal 200A generates the sensitivity information of the first user, who owns the user terminal 200A, for each piece of music data (S801). Specifically, sensitivity points are input from the input unit 226 in time with the music of the music data reproduced and output by the music playback unit 214, the output control unit 204, and the output unit 206. The sensitivity points are input by the first user tapping the sensitivity input button 714 in the player 710 of the user interface described above. Next, the sensitivity information generation unit 216 generates the sensitivity information by associating each input sensitivity point with the time axis of the music data, as sketched below.
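As a concrete illustration of S801, the terminal essentially time-stamps each tap against the playback clock and then buckets the taps into portions of the track. The following Python sketch shows one way this could be assembled; the class name, the 5-second portion length, and the message format are assumptions made for illustration, not part of the disclosure.

```python
import time

class SensitivityRecorder:
    """Collects sensitivity points (taps) during playback and associates
    each point with the elapsed position on the music's time axis."""

    def __init__(self, track_id: str):
        self.track_id = track_id
        self._start = None
        self.taps = []  # elapsed seconds at which the user tapped

    def start_playback(self):
        self._start = time.monotonic()

    def on_tap(self):
        # Called whenever the user taps the sensitivity input button (714).
        if self._start is not None:
            self.taps.append(time.monotonic() - self._start)

    def to_sensitivity_info(self, portion_length: float = 5.0):
        # Bucket taps into fixed-length portions of the track, yielding a
        # portion-index -> point-count map that can be sent to the server.
        points = {}
        for t in self.taps:
            idx = int(t // portion_length)
            points[idx] = points.get(idx, 0) + 1
        return {"track": self.track_id, "points": points}
```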
  • the user terminal 200A transmits the generated sensitivity information to the server 100 via the transmission unit 230 (S803).
  • the user terminal 200B also generates sensitivity information of the second user (S805).
  • the user terminal 200B transmits the sensitivity information of the second user to the server 100 (S807).
  • the server 100 acquires sensitivity information of each user including the first user and the second user (S809).
  • the server 100 determines a user to be presented to the second user as a recommended user from the plurality of users who have acquired the sensitivity information (S811).
  • the specific procedure for determining recommended users is as described above, and the determination is made based on the relevance of the sensitivity information of each user to the music data portion.
  • Here, it is assumed that the recommended user is the first user. Note that there may be a plurality of recommended users.
  • the server 100 transmits the user information of the first user as the recommended user to the user terminal 200B (S813), and the user terminal 200B acquires the user information of the first user (S815).
  • the output control unit 204 of the user terminal 200B controls the display unit 208 to display the user information of the first user, and the display unit 208 displays the user information (S817).
  • Thereby, the second user can recognize and, in the present embodiment, browse the user information of the first user.
  • Such display of user information is performed, for example, by displaying the user information of the first user in the above-described recommendation 730 on the user interface screen.
  • the user terminal 200A analyzes the mood information of music data in the mood analysis unit 218 (S821).
  • the user terminal 200A generates sensitivity information of the first user who owns the user terminal 200A for each music data (S823).
  • The user terminal 200A transmits the obtained mood information and sensitivity information to the server 100 via the transmission unit 230 (S825).
  • the server 100 acquires mood information and sensitivity information (S827).
  • the preference information generation unit 110 of the server 100 generates the first user's preference information based on the mood information and the sensitivity information (S829).
  • the server 100 transmits the user information stored in the user information database 104 together with the generated preference information to the user terminal 200B (S831).
  • the user terminal 200B receives the first user's preference information and user information (S833).
  • The output control unit 204 of the user terminal 200B controls the display unit 208 to display the received preference information and user information of the first user, and the display unit 208 displays them.
  • Such display of user information and preference information can be performed, for example, by the method described above as shown in FIG.
  • the user terminal 200A generates sensitivity information of the first user who owns the user terminal 200A for certain music data (S841).
  • the user terminal 200A transmits the generated sensitivity information to the server 100 via the transmission unit 230 (S843).
  • the server 100 acquires the sensitivity information of the first user (S845).
  • the notification information generation unit 112 of the server 100 generates notification information based on the sensitivity information regarding the music data and the meta information of the music data stored in the music information database 106 (S847).
  • the server 100 transmits the generated notification information to the user terminal 200B via the transmission unit 114 (S849).
  • the user terminal 200B receives the notification information (S851).
  • the notification information is reproduced in accordance with the progress of the music in the music data (S853).
  • the notification information is reproduced as shown in FIG. 3, for example. Specifically, the notification information is reproduced so that the sound effect 403C is generated at the portion where the sensitivity point 401C of the user C as the first user is generated.
  • the user information image 500 is displayed at a position where a sensitivity point is generated by each user.
  • The user's own sensitivity points 716 are also displayed as notification information.
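As a rough sketch of how S853 might be realized, the loop below consumes a stream of playback positions and fires a sound effect whenever playback reaches a timestamp carried by the notification information, mirroring how the sound effect 403C is emitted at the portion of sensitivity point 401C. The function names and the representation of the notification information as a list of timestamps are assumptions for illustration.

```python
def reproduce_notifications(timestamps, position_stream, play_effect):
    """Fire a sound effect when playback reaches each notified portion.

    timestamps: timestamps (seconds) of other users' sensitivity points
    position_stream: iterable yielding the current playback position
    play_effect: callback that outputs the sound effect
    """
    pending = sorted(timestamps)
    for position in position_stream:
        while pending and pending[0] <= position:
            play_effect(pending.pop(0))
```

In practice, position_stream could be a generator that polls the music playback unit 214 for the current position a few times per second.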
  • the recommended user determining unit 108 uses the product of the sensitivity points for each part of the music between the users as an index of the sensitivity relationship between the users.
  • However, the present disclosure is not limited to this; it suffices that sensitivity information for portions of the users' music data is used in some form to evaluate the relevance of sensitivity between users.
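Treating each user's sensitivity information for one piece of music as a map from portion index to point count, the index described above reduces to a sum of products over the portions of the music. A minimal sketch, with names assumed for illustration:

```python
def sensitivity_relevance(points_a: dict, points_b: dict) -> int:
    """Sum, over portions of one piece of music, of the product of two
    users' sensitivity points for that portion. Portions where either
    user generated no points contribute nothing."""
    shared = points_a.keys() & points_b.keys()
    return sum(points_a[p] * points_b[p] for p in shared)
```

For example, sensitivity_relevance({3: 2, 7: 1}, {3: 1, 9: 4}) evaluates to 2, since only portion 3 is shared between the two users.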
  • FIGS. 26 to 28 show other examples of information used by the recommended user determination unit 108 of the server 100.
  • the sensitivity relationship between users may be evaluated based on mood information of music data held by the users.
  • The sensitivity points for each mood of held music, as in FIG. 11, are generated based on the mood information of the music data held by the user and the sensitivity points for each portion of the user's music data.
  • For example, the product of the relative frequencies of the integrated values of the sensitivity points for each mood of music, as in FIG. 12, is calculated between users, and the sum of these products can be used as an index of the relevance of sensitivity between the users.
  • evaluation of the relevance of sensibility between users may be performed based on user preference information.
  • the preference information as shown in FIG. 13 has a value related to preference for each mood.
  • The values related to the preference for each mood can be multiplied between two users, mood by mood, and the resulting sum calculated as an index of the relevance of sensitivity between those users. If such a sum (fitness) is greater than or equal to a certain value, or is relatively large within a certain range compared to other users, the recommended user determination unit 108 can determine the user with such fitness as a recommended user.
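With preference information represented as a mood-to-value map, the fitness is a mood-wise product summed over all moods, and the recommendation rule can be sketched as below. The threshold value and data shapes are placeholders, not values taken from the disclosure.

```python
def fitness(pref_a: dict, pref_b: dict) -> float:
    # Sum over moods of the product of the two users' preference values.
    return sum(v * pref_b.get(mood, 0.0) for mood, v in pref_a.items())

def recommended_users(target_pref: dict, candidates: dict, threshold: float = 0.5):
    # candidates: user_id -> preference vector; returns IDs whose fitness
    # with the target user meets the threshold, best match first.
    scored = {uid: fitness(target_pref, pref) for uid, pref in candidates.items()}
    return [uid for uid, s in sorted(scored.items(), key=lambda kv: -kv[1])
            if s >= threshold]
```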
  • As described above, the recommended user determination unit 108 can determine a recommended user even when no sensitivity information has been generated for music data common to the users. That is, in the present disclosure, the recommended user may be determined based on the sensitivity information of each user and does not necessarily need to be based on sensitivity information about common music data.
  • the recommended user may be determined based on the reproduction history of the music data of the user who receives the recommendation of the recommended user. For example, in the music data playback history, the sensitivity point in the sensitivity information of each music data may be multiplied by a coefficient corresponding to the playback frequency of each music data. Further, the coefficient may simply increase as the reproduction frequency increases. In addition to this, the coefficient may be set in consideration of an elapsed time after the music data is reproduced. For example, even if the reproduction frequency is the same, the coefficient can be reduced as the elapsed time from the reproduction of the music data is longer. In such a case, the user terminal 200 has a reproduction history database that stores a user's reproduction history, and the reproduction history is transmitted to the server 100 as appropriate.
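One plausible shape for such a coefficient combines the playback count with an exponential decay over the time elapsed since the last playback; the half-life below is an assumed tuning parameter, not a value given in the disclosure.

```python
import math

def playback_coefficient(play_count: int, days_since_last_play: float,
                         half_life_days: float = 30.0) -> float:
    """Weight for a track's sensitivity points: grows with how often the
    track was played and shrinks as the last playback recedes in time."""
    decay = math.exp(-math.log(2.0) * days_since_last_play / half_life_days)
    return play_count * decay
```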
  • In the above description, the preference information generation unit 110 generates the preference information based on the sensitivity information for the moods of the user's music and the moods of the music held by the user, but the present disclosure is not limited to this. It suffices to generate preference information based on the user's sensitivity information for one or more moods included in music and the moods of that music. FIGS. 29 and 30 show other examples of data used in the preference information generation unit 110 of the server 100.
  • the preference information generation unit 110 may generate preference information based only on the sensitivity information for the mood of the user's music.
  • Alternatively, the preference information generation unit 110 may generate preference information based on the sensitivity information for the moods of the user's music, the moods of the music held by the user, and the reproduction history of the user. For example, based on the reproduction history, the preference information generation unit 110 accumulates the number of occurrences of each mood over all reproduced music pieces as shown in table 621 in FIG. 28, and calculates the relative frequency of each mood over all reproduced music pieces as shown in table 623 in FIG. 29. Next, the preference information generation unit 110 averages, for each user and each mood, the relative frequencies in table 623 in FIG. 29, table 613 in FIG. 12, and table 609 in FIG. 10, thereby obtaining the user preference information as shown in FIG. 30.
  • In the process of accumulating the mood counts of all played music, the preference information generation unit 110 may multiply the moods of each piece of music by a predetermined coefficient according to the frequency with which that music is reproduced. That is, each piece of music may be weighted according to its playback frequency.
  • the coefficient can be the same as the coefficient at the time of determining the recommended user described above.
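The weighted accumulation described above can be sketched as follows; the record format of the history ((moods, play count, coefficient) tuples) is an assumption made for the example.

```python
from collections import defaultdict

def preference_from_history(history):
    """Accumulate, per mood, the playback-weighted counts over all
    reproduced tracks and normalize them to relative frequencies."""
    totals = defaultdict(float)
    for moods, play_count, coefficient in history:
        for mood in moods:
            totals[mood] += play_count * coefficient
    grand_total = sum(totals.values()) or 1.0
    return {mood: value / grand_total for mood, value in totals.items()}
```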
  • the preference information generation unit 110 may generate a plurality of preference information based on different information.
  • at least one of the preference information is generated based on the user's sensibility information and the mood of the music for one or more moods included in the music.
  • In the above description, the user information image 500 has been described on the assumption that a user image 501 having a circular outer shape is displayed and a mood image 503 about the user is displayed so as to surround the user image 501, but the present disclosure is not limited to this. FIGS. 31 to 35 show displays of mood images according to modified examples of the present disclosure.
  • the user image may be a polygon, for example, a polygon having 3 to 12 corners, particularly a quadrangle, and an annular mood image may be displayed along the outer periphery thereof.
  • the number of mood images to be displayed is not particularly limited.
  • the mood images 513A to 513C and 523A to 523C may be configured by arranging a plurality of color bands representing the mood.
  • the mood images 513A to 513C and 523A to 523C can be displayed as the background of the user image 501.
  • The mood image may be displayed as vertical stripes as shown in FIGS. 31 and 32, as horizontal stripes, or as stripes at a predetermined angle from the horizontal.
  • the output control unit 204 may control the display unit 208 so that the mood images 513A to 513C are displayed evenly regardless of the user's preference.
  • Alternatively, the output control unit 204 may control the display unit 208 such that the mood image 513B related to the preference information with higher user preference (a first mood) is displayed closer to the user image 501 than the mood images 513A and 513C related to other preference information (second moods).
  • Similarly, the output control unit 204 may control the display unit 208 such that the mood image 523A related to the preference information with higher user preference (a second mood) is displayed larger than the mood images related to the other preference information (a first mood and a third mood).
  • For example, based on the relationship between the value for the first mood calculated by the preference information generation unit 110 and the value for the second mood, the output control unit 204 can determine the ratio of the area of the image for the first mood to the area of the image for the second mood. More specifically, the areas of the mood images 513A to 513C can be determined according to the value calculated for each mood by the preference information generation unit 110, as sketched below.
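In other words, display area is allocated in proportion to the per-mood values. A minimal sketch (the function name, and expressing the ring as a number of arc degrees to divide, are assumptions):

```python
def mood_image_areas(mood_values: dict, total_arc_degrees: float = 360.0) -> dict:
    """Split a ring's arc among mood images in proportion to the value
    calculated for each mood, so a mood with twice the preference value
    receives twice the arc length (and hence twice the area)."""
    weight_sum = sum(mood_values.values()) or 1.0
    return {mood: total_arc_degrees * value / weight_sum
            for mood, value in mood_values.items()}
```

For example, mood_image_areas({"Happy": 0.5, "Sad": 0.3, "Fresh": 0.2}) yields arcs of 180, 108, and 72 degrees, respectively.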
  • The output control unit 204 may also display the user's preference information with respect to a reference portion: among the moods of music, an image related to a first mood, whose user preference is higher than that of a second mood, is displayed closer to the reference portion.
  • the reference portion is a point, a line, or a part having a certain area on the screen of the display unit 208 set to display the mood image.
  • Such a reference portion can be, for example, a display part of the user image 501.
  • For example, the output control unit 204 may control the display unit 208 such that, among the mood images 533A to 533I, those for moods with higher user preference are displayed closer to the reference portion. In this case as well, the output control unit 204 may control the display unit 208 so that a mood image related to preference information with higher user preference is displayed larger than mood images related to other preference information.
  • Further, the output control unit 204 may control the display unit 208 to display mood images 563, 573, and 583 related to a plurality of pieces of preference information based on different information.
  • mood images 563, 573, and 583 are superimposed and displayed so as to form concentric circles.
  • the mood image 563 described at the center in the figure is based on the sensitivity information for the mood of the user's music.
  • The mood image 573, arranged along the outer periphery of the mood image 563, is based on the moods of the music held by the user.
  • the mood image 583 arranged along the outer periphery of the mood image 573 is based on the reproduction history reproduced by the user.
  • the preference information for the mood images 563, 573, and 583 is generated by the above-described preference information generation unit 110.
  • the user terminal 200 may include an edited music data generation unit that generates edited music data obtained by editing the music data based on the music data.
  • the edited music data generation unit can edit music data based on sensitivity information about the user's music data.
  • FIG. 36 is a schematic diagram illustrating editing of music data by the edited music data generation unit included in the user terminal according to the modified example of the present disclosure.
  • the edited music data generation unit extracts and combines the subsections 411, 421, and 431 where the sensitivity points 401 are generated, and generates new edited music data 400A. Since such edited music data 400A is extracted from the portion where the user's sensitivity point 401 is generated, the user can view only the favorite portion of the music data 400 with the edited music data 400A.
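A rough sketch of such extraction and combination is shown below, using a fixed window around each sensitivity point in place of the subsections 411, 421, and 431; the window length, the mono PCM representation, and the function name are assumptions.

```python
def build_edited_track(samples, sample_rate, tap_times, window_seconds=5.0):
    """Concatenate the subsections of a track around each sensitivity
    point into new edited music data (a flat list of PCM samples)."""
    edited = []
    last_end = 0
    for t in sorted(tap_times):
        start = max(int((t - window_seconds / 2) * sample_rate), last_end)
        end = min(int((t + window_seconds / 2) * sample_rate), len(samples))
        if end > start:
            edited.extend(samples[start:end])
            last_end = end
    return edited
```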
  • the user terminal 200 may include a biological information detection unit that can detect a change in the biological information of the user.
  • a biological information detection unit a device corresponding to the biological information to be detected is appropriately selected.
  • the biological information detection unit is a heart rate monitor, a blood pressure monitor, an electroencephalograph, a pulse meter, a thermometer, an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
  • the acceleration sensor, the gyro sensor, and the geomagnetic sensor can be used, for example, for detecting body movement and body position, depending on the configuration.
  • For example, when the biological information detection unit is an acceleration sensor configured to be attached near the user's head, neck, or trunk, it is possible to detect the user's movement in time with the reproduction of music data.
  • the body movement can be detected by comparing the tempo of the music that each part of the music data has with the period of the amplitude in the user's movement.
  • the sensitivity point is detected when the correlation between the tempo of the music and the period of the amplitude is a certain level or more.
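A hedged sketch of this comparison is given below. The disclosure does not specify the correlation measure, so simple peak counting on an acceleration signal stands in for estimating the movement period, and the tolerance is an arbitrary placeholder.

```python
def body_movement_matches_tempo(tempo_bpm, accel_samples, sample_rate,
                                tolerance=0.15):
    """Return True when the dominant period of the user's movement agrees
    with the beat period of the music portion within the tolerance."""
    beat_period = 60.0 / tempo_bpm
    # Crude period estimate: average spacing between local maxima.
    peaks = [i for i in range(1, len(accel_samples) - 1)
             if accel_samples[i - 1] < accel_samples[i] > accel_samples[i + 1]]
    if len(peaks) < 2:
        return False
    movement_period = (peaks[-1] - peaks[0]) / (len(peaks) - 1) / sample_rate
    return abs(movement_period - beat_period) / beat_period <= tolerance
```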
  • In the above embodiment, the user terminal 200 has been described as including the mood analysis unit 218.
  • However, the present disclosure is not limited to this; the user terminal 200 need not include the mood analysis unit.
  • Instead, the server 100 may have a mood analysis unit. In such a case, mood analysis of music data does not have to be performed at each user terminal 200. Further, by performing mood analysis of music data collectively in the server 100, duplicate mood analysis of the same music data can be avoided.
  • In this case, the server 100 may have a music database for storing music data, or the server 100 may acquire the music data on which mood analysis is to be performed from an external music database.
  • the server 100 has been described as including the preference information generation unit 110.
  • the present disclosure is not limited to this, and the user terminal 200 may include a preference information generation unit.
  • preference information generated in the user terminal 200 is appropriately transmitted to the user information database 104 of the server 100.
  • FIG. 37 is a block diagram showing a hardware configuration of the server 100 shown in FIG.
  • the server 100 includes a CPU 120, a ROM 122, a RAM 124, a storage device 126, and a communication device 128.
  • the CPU 120 is a processor that functions as an arithmetic processing unit and a control unit, and controls the overall operation in the server 100 or a part of the operation in the server 100 according to various programs recorded in the ROM 122, the RAM 124, and the storage device 126.
  • the ROM 122 stores programs used by the CPU 120, calculation parameters, and the like.
  • the RAM 124 primarily stores programs used by the CPU 120, parameters that change as appropriate during execution of the programs, and the like.
  • the CPU 120, the ROM 122, and the RAM 124 are connected to each other by a host bus constituted by an internal bus such as a CPU bus.
  • The storage device 126 is a device for data storage configured as an example of a storage unit, such as the user information database 104 and the music information database 106.
  • the storage device 126 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 126 stores programs executed by the CPU 120, various data, various data acquired from the outside, and the like.
  • the communication device 128 is a communication interface configured with, for example, a communication device for connecting to the network 300.
  • the communication device 128 is, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 128 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or other communication devices.
  • each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Moreover, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out the present embodiment.
  • Through the cooperation of these components, the functions of the receiving unit 102, the user information database 104, the music information database 106, the recommended user determination unit 108, the preference information generation unit 110, the notification information generation unit 112, and the transmission unit 114 are realized.
  • FIG. 38 is a block diagram showing a hardware configuration of the user terminal 200 shown in FIG.
  • the user terminal 200 includes a CPU 240, a ROM 242, a RAM 244, a display device 246, a speaker 210, an input device 248, a storage device 250, a communication device 252, and a vibration device 254.
  • the configurations of the CPU 240, the ROM 242, the RAM 244, and the storage device 250 can be the same as the configurations of the CPU 120, the ROM 122, the RAM 124, and the storage device 126 described above, and thus the description thereof is omitted.
  • the speaker 210 has been described above.
  • the display device 246 is a device that can visually notify the acquired information to the user.
  • the display device 246 constitutes the display unit 208.
  • The display device 246 can be, for example, a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, or a lamp. Further, in the illustrated embodiment, the display device 246 is incorporated in the user terminal 200, but the present disclosure is not limited to this, and the display device may exist outside the user terminal 200.
  • the input device 248 is an operation means operated by the user such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
  • the input device 248 may be, for example, remote control means (so-called remote control) using infrared rays or other radio waves, or an external connection device such as a mobile phone or a PDA that supports the operation of the user terminal 200. There may be.
  • the input device 248 is configured by, for example, an input control circuit that generates an input signal based on information input by the user using the above-described operation means and outputs the input signal to the CPU 240.
  • the user of the user terminal 200 can input various data and instruct processing operations to the user terminal 200 by operating the input device 248.
  • the communication device 252 includes a communication device for connecting to a wired and / or wireless wide area network (WAN) as necessary.
  • the vibration device 254 is a device for generating vibration that constitutes the vibration generating unit 212.
  • the vibration device 254 can generate vibration by, for example, rotation of a motor having an eccentric mass.
  • Through the cooperation of these components, the functions of the receiving unit 202, the output control unit 204, the output unit 206, the music playback unit 214, the sensitivity information generation unit 216, the mood analysis unit 218, the storage unit 220, the input unit 226, the position information detection unit 228, and the transmission unit 230 are realized.
  • <Computer program> A computer program for causing the hardware of each device in the information processing system 1000 described above, or hardware such as the CPU 120, the ROM 122, the RAM 124, and the storage device 126 built into the server 100, to perform the functions of each device described above can also be created.
  • For example, the server 100 may download and install such a computer program, thereby implementing functions such as the receiving unit 102, the user information database 104, the music information database 106, the recommended user determination unit 108, the preference information generation unit 110, the notification information generation unit 112, and the transmission unit 114.
  • a storage medium storing the computer program is also provided.
  • Similarly, a computer program for causing hardware such as the CPU 240, the ROM 242, the RAM 244, and the storage device 250 built into the user terminal 200 to perform the functions of each device described above can be created.
  • The user terminal 200 may download and install such a computer program, thereby implementing functions such as the receiving unit 202, the output control unit 204, the output unit 206, the music playback unit 214, the sensitivity information generation unit 216, the mood analysis unit 218, the storage unit 220, the input unit 226, the position information detection unit 228, and the transmission unit 230.
  • a storage medium storing the computer program is also provided.
  • (1) An information processing apparatus having: an information acquisition unit for acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit for outputting the notification information to a notification unit so that the notification information is notified to at least one of the first user and a second user, in association with the part, at the time of reproduction of the second music data.
  • (3) The information processing apparatus according to (1) or (2), wherein the notification by the notification unit is performed by outputting a sound at the part of the second music data corresponding to the sensitivity information of the first user, in accordance with the reproduction of the first music data.
  • The first music data is data relating to a first music piece, and the second music data is data relating to a second music piece including a portion related to a part of the first music piece.
  • (6) The information processing apparatus according to any one of (1) to (3), wherein the information processing apparatus includes a display unit, and the notification by the notification unit is performed by displaying a notification image at a position corresponding to the sensitivity information along the time axis of a progress bar image of the second music data displayed on the display unit.
  • (7) The information processing apparatus according to (6), wherein the notification image includes user information regarding the first user, and the user information includes preference information for music of the first user.
  • (8) The information processing apparatus according to any one of (1) to (7), wherein the notification information output unit changes the notification method by the notification unit according to an instruction from the first user or the second user.
  • (9) The information processing apparatus according to any one of (1) to (8), wherein the notification information output unit changes the notification method by the notification unit according to at least one of information on at least one of the first user and the second user, and information on at least one of the first music data and the second music data.
  • (10) The information processing apparatus according to (9), wherein the notification by the notification unit is performed by outputting a sound at the part of the second music data corresponding to the sensitivity information of the first user in accordance with the reproduction of the first music data, and changing the notification method is changing the output sound.
  • (11) The information processing apparatus according to (9) or (10), wherein the notification information output unit determines the notification method by the notification unit according to the type of music of the second music data to be reproduced.
  • (12) The information processing apparatus according to any one of (9) to (11), wherein the notification information output unit determines the notification method by the notification unit according to the ambient environment of the information processing apparatus when the second music data is reproduced.
  • (13) The information processing apparatus according to any one of (9) to (12), wherein the notification information output unit determines the notification method by the notification unit according to the number of pieces of the notification information acquired for the first music data.
  • (14) The information processing apparatus according to any one of (1) to (13), wherein the sensitivity information is generated based on at least one of a change in the biological information of the first user when the first music data is reproduced and a body movement of the first user when the first music data is reproduced.
  • (15) The information processing apparatus according to (14), wherein the sensitivity information includes body motion information calculated based on the frequency of body motion of the first user for each portion of the first music data.
  • (16) The information processing apparatus according to (15), wherein the body motion information is calculated by integrating, for each portion, the frequency of the user's body motion for each portion of the first music data.
  • (17) The information processing apparatus according to any one of (14) to (16), wherein the body movement is detected by comparing the tempo of the music that each portion of the first music data has with the period of the amplitude in the movement of the first user.
  • (18) The information processing apparatus according to any one of (14) to (16), wherein the body movement is detected based on an input of the first user.
  • (19) In the information processing apparatus, the notification information output unit controls the notification unit to display an animation based on the notification information.
  • (20) The information processing apparatus according to (19), wherein the notification information output unit controls the notification unit to display the animation including a diagram showing an amount corresponding to a change in the biological information of the first user at the time of reproduction of the first music data.
  • (21) An information processing method including: acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and outputting, by a processor, the notification information to a notification unit so that the notification information is notified to the first user or a second user in association with the part when the second music data is reproduced.
  • (22) A program for causing a computer to function as: an information acquisition unit for acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that the notification information is notified to the first user or a second user in association with the part at the time of reproduction of the second music data.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

[Problem] To provide an information processing device, an information processing method, and a program that are novel and improved, and that can provide notification of a section that a user had feelings about while appreciating music. [Solution] An information processing device comprising: an information acquisition unit that acquires notification information with respect to a section of second music data, said second music data being related to first music data, said notification information being generated on the basis of a first user's feeling information while playing back the first music data; and a notification information output unit that outputs the notification information to a notification unit such that at least one of the first user and the second user is notified of the notification information in association with the section while playing back the second music data.

Description

Information processing apparatus, information processing method, and program
The present disclosure relates to an information processing apparatus, an information processing method, and a program.
In recent years, so-called music sharing services, which share music information between users via the Internet, have been proposed. In such music sharing services, users introduce songs to one another, which provides each user with opportunities to encounter new songs.
In addition, an information processing apparatus has been proposed that generates a user preference vector based on content meta information corresponding to the content used by the user, and introduces other users based on that user preference vector (for example, Patent Document 1).
Patent Document 1: JP 2009-157899 A
However, the information processing apparatus described in Patent Document 1 does not notify the user of which parts of the music actually moved other users' emotions.
Therefore, the present disclosure proposes a new and improved information processing apparatus, information processing method, and program capable of notifying the parts that a user responded to while listening to music.
According to the present disclosure, there is provided an information processing apparatus having: an information acquisition unit for acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit for outputting the notification information to a notification unit so that the notification information is notified to at least one of the first user and a second user, in association with the part, at the time of reproduction of the second music data.
In addition, according to the present disclosure, there is provided an information processing method including: acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and outputting, by a processor, the notification information to a notification unit so that the notification information is notified to the first user or a second user in association with the part when the second music data is reproduced.
Further, according to the present disclosure, there is provided a program for causing a computer to function as: an information acquisition unit for acquiring notification information for a part of second music data related to first music data, generated based on sensitivity information of a first user at the time of reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that the notification information is notified to the first user or a second user in association with the part at the time of reproduction of the second music data.
As described above, according to the present disclosure, it is possible to notify the parts that a user responded to while listening to music.
Note that the above effects are not necessarily limiting; together with or instead of the above effects, any of the effects described in this specification, or other effects that can be understood from this specification, may be achieved.
The accompanying drawings are as follows: a schematic diagram of an information processing system according to an embodiment of the present disclosure; conceptual diagrams illustrating examples of a service provided in the information processing system (three figures); a block diagram outlining the functional configuration of the server (first information processing apparatus) according to the embodiment; tables of sensitivity information used by the recommended user determination unit of the server (three figures); examples of data used in the preference information generation unit of the server (five figures); a block diagram outlining the functional configuration of the user terminal (second information processing apparatus); a screen transition diagram of the user interface of the user terminal; examples of user interface screens of the user terminal (seven figures); sequence diagrams showing examples of the operation of the information processing system (three figures); other examples of information used by the recommended user determination unit (three figures); other examples of data used in the preference information generation unit (two figures); displays of preference information according to modified examples of the present disclosure (five figures); a schematic diagram showing editing of music data by the edited music data generation unit of a user terminal according to a modified example; a block diagram showing the hardware configuration of the server; and a block diagram showing the hardware configuration of the user terminal.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted. Similar components are distinguished by appending different letters after the same reference numeral; however, when a plurality of components having substantially the same functional configuration need not be particularly distinguished, only the common reference numeral is given.
The description proceeds in the following order.
1. Overview of the information processing system
2. Configuration example of the server (first information processing apparatus)
3. Configuration example of the user terminal (second information processing apparatus)
4. User interface examples of the user terminal
5. Operation example of the information processing system
6. Modifications
7. Hardware configuration example of the server
8. Hardware configuration example of the user terminal
9. Computer program
<1. Overview of the information processing system>
First, a schematic configuration of the information processing system 1000 according to an embodiment of the present disclosure will be described with reference to FIGS. 1 to 4.
FIG. 1 is a schematic diagram of the information processing system 1000 according to an embodiment of the present disclosure. The information processing system 1000 shown in FIG. 1 includes a server 100 and a plurality of user terminals 200, which are communicably connected to one another via a network 300.
The server 100 is an example of a first information processing apparatus according to the present disclosure. The server 100 collects, from the user terminals 200, information about each user's sensitivity to music (hereinafter also referred to simply as "sensitivity information"), and transmits the sensitivity information, or information based on it, to other user terminals 200.
Here, in the present disclosure, "sensitivity to music" refers to the movement of a viewer's (user's) emotions that occurs during playback of music data, the so-called "groove" response. Such emotional movement can be detected based on, for example, changes in the user's biological information during playback of the music data. The biological information is not particularly limited; examples include heart rate, body temperature, perspiration, blood pressure, pulse, respiration, blinking, eye movement, gaze duration, pupil diameter, brain waves, body movement, body posture, skin temperature, and electrical skin resistance. Such changes in biological information can be detected by, for example, an input unit 226 described later, or by other sensors such as a heart rate monitor, a blood pressure monitor, an electroencephalograph, a pulse meter, a thermometer, an acceleration sensor, a gyro sensor, or a geomagnetic sensor. The information detected by the input unit 226 or by these sensors can then be used as sensitivity information.
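To make the idea concrete, the following is a minimal sketch, not taken from the disclosure itself, of how emotional movement ("sensitivity points") might be detected from acceleration-sensor data during playback; the sampling scheme, threshold value, and function names are illustrative assumptions.

```python
import math

# Hypothetical sketch: flag "sensitivity points" where body movement spikes
# during playback. accel_samples holds (x, y, z) readings in units of g,
# taken at a fixed rate; the 1.5 g threshold is an assumed tuning value.
def detect_sensitivity_points(accel_samples, sample_rate_hz, threshold=1.5):
    points = []
    for i, (x, y, z) in enumerate(accel_samples):
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold:  # a movement spike, read as emotion moving
            points.append(i / sample_rate_hz)  # playback time in seconds
    return points
```

Each detected timestamp would then be associated with the portion of the music data being played at that moment before being sent to the server 100.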
In the present disclosure, "music data" refers to data containing music information. Music data therefore includes not only data consisting solely of sound information but also moving-image data containing images such as still images and videos. Music data may further include other information, for example information concerning the lighting of light-emitting elements, the generation of vibration, or the operation of other applications. Also, in the present disclosure, a "portion of music data" is a concept that covers not only a part of the music data but also the whole of it.
The user terminal 200 is an example of a second information processing apparatus according to the present disclosure. The user terminal 200 is an electronic device capable of playing back music data, such as a smartphone, mobile phone, tablet, media player, desktop computer, or laptop computer. Each user terminal 200 collects the sensitivity information of its user and transmits it to the server 100. Each user terminal 200 also receives, from the server 100, the sensitivity information of other users or information based on it, and notifies its user of that information.
The user terminals 200 are also configured to be able to communicate with one another, either via the server 100 or directly. A user operating a user terminal 200 can thereby converse and exchange information with other users, for example by chat or messenger.
The network 300 is a wired or wireless transmission path for information transmitted from devices connected to it. For example, the network 300 may include public networks such as the Internet, telephone networks, and satellite communication networks, as well as various LANs (Local Area Networks) including Ethernet (registered trademark) and WANs (Wide Area Networks). The network 300 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
The information processing system 1000 described above can provide the services shown in FIGS. 2 to 4, for example, to the users who own the user terminals 200. FIGS. 2 to 4 are conceptual diagrams illustrating examples of services provided in the information processing system 1000 according to the embodiment of the present disclosure. The information processing system 1000 provides services such as the "display of recommended users" shown in FIG. 2, the "reproduction of notification information" shown in FIG. 3, and the "display of preference information" shown in FIG. 4.
(Display of recommended users)
FIG. 2 shows an example of a service that uses sensitivity information to recommend users to other users. As shown in FIG. 2, when music data 400 is played back on each user terminal 200, the user terminal 200 detects the occurrence and position of the emotional movements (hereinafter also referred to as "sensitivity points") 401A, 401B, and 401C of user A, user B, and user C, associating each with the corresponding portion of the music data. Information about the detected sensitivity points is transmitted from each user terminal 200 to the server 100 as sensitivity information.
The server 100 compares these pieces of sensitivity information and determines which user should be introduced (recommended) to user A. For example, in FIG. 2, a portion 410 corresponding to the A verse, a portion 420 corresponding to the B verse, and a portion 430 corresponding to the chorus of the music data 400 are played back. Here, user A's sensitivity point 401A and user B's sensitivity point 401B overlap at the beginning of portion 410, whereas user A's sensitivity point 401A and user C's sensitivity points 401C overlap both at the beginning of portion 410 and in the middle of portion 420. In this way, the sensitivity information of user A (the first user) for a portion of the first music data and the sensitivity information of user C (the second user) for a portion of the second music data are related. The server 100 therefore determines that user C, whose sensitivity points 401C overlap more with sensitivity points 401A, should be introduced to user A. Information about user C is then transmitted from the server 100 to user A's user terminal 200 and displayed on that terminal.
As a result, a user can use the information processing system 1000 to find users whose sensitivity to music is closer to his or her own, enabling exchanges between users with similar musical sensibilities. That is, the present disclosure can provide a new and improved information processing apparatus, information processing system, information processing method, and program capable of presenting to a given user other users with a similar sensitivity to music.
Note that the processing in the information processing apparatus proposed in Patent Document 1 does not take into account how a user actually felt about the content the user consumed. The information processing apparatus of Patent Document 1 therefore does not necessarily introduce users with similar sensibilities.
(Reproduction of notification information)
FIG. 3 shows an example of a service that notifies a user of notification information based on another user's sensitivity information. First, the music data 400 is played back on the user terminal 200 owned by user C, and the corresponding sensitivity points 401C are detected. Information about the detected sensitivity points is transmitted from user C's user terminal 200 to the server 100 as sensitivity information.
The server 100 generates notification information 403 based on the received sensitivity information and transmits it to user A's user terminal 200. Having received the notification information 403, user A's user terminal 200 presents it to user A in time with the music when the music data 400 is played back. In the illustrated example, the notification information 403 is sound information (a sound effect).
User A can thereby learn, as the music data 400 plays, which portions of the piece moved user C's emotions (the portions where user C "got into" the music). User A can thus sense user C's musical sensibility in more detail and more intimately, and can empathize with user C. As a result, even when listening to music alone, a user can feel as if listening together with other users, as at a live venue. That is, the present disclosure can provide a new and improved information processing apparatus, information processing method, and program capable of notifying users of the portions where emotions moved while listening to music.
Note that the information processing apparatus described in Patent Document 1 gives the user no notification of which portions of the music actually moved another user's emotions.
(Display of preference information)
FIG. 4 shows an example of a service that presents a user together with preference information based on sensitivity information. In the user information image 500 shown in FIG. 4, the user's own user image 501 is displayed, and the user's mood images 503A, 503B, and 503C are arranged around it. The mood images 503A, 503B, and 503C are generated based on the sensitivity information obtained when the user plays back music data, and each is color-coded according to the mood it represents. Mood image 503A indicates, for example, "euphoric" preference information and is displayed in red (the dotted pattern in the figure). Mood image 503B indicates, for example, "happy" and is displayed in green (the hatching including broken lines in the figure). Mood image 503C indicates, for example, "joyful" and is displayed in yellow (the pattern including hexagons in the figure). The user information image 500 is displayed not only on the user terminal 200 of the user represented by the mood images 503A, 503B, and 503C but also on the user terminals 200 of other users. Here, a "mood" in this specification is the atmosphere that appears when a piece of music, or a portion of it, associated with music data is played back, and includes, for example, the tone, character, and feeling of the piece. Moods can be classified into a plurality of types, for example "euphoric", "happy", "joyful", "mild", "sad", "solemn", "bright", "healing", "fresh", and "elegant", as described later. The type of mood can be estimated based on feature quantities of the piece, and the mood of a piece can be analyzed by a mood analysis unit described later.
A user who sees the user information image 500 can thereby clearly judge what musical preferences the user displayed in the image has. Moreover, the preference information displayed in the user information image 500 is based on the user's emotions actually having moved when music data was played back. Such preference information therefore represents the user's actual preferences more accurately than preference information based merely on music genres or playback history. That is, the present disclosure can provide a new and improved information processing apparatus, information processing method, and program capable of displaying preference information that reflects a user's sensitivity to music.
Note that the introduction reason proposed in Patent Document 1 is generated solely from content meta information and does not take into account how the user actually felt about the content the user consumed. The information processing apparatus of Patent Document 1 therefore cannot adequately present the preferences of the user to be introduced as the reason for the introduction.
<2. Configuration example of the server (first information processing apparatus)>
Next, the configuration of the server 100 according to the present embodiment will be described with reference to FIGS. 5 to 13.
FIG. 5 is a block diagram showing an outline of the functional configuration of the server 100 according to the present embodiment. As shown in FIG. 5, the server 100 includes a receiving unit 102, a user information database 104, a music information database 106, a recommended user determination unit 108, a preference information generation unit 110, a notification information generation unit 112, and a transmission unit 114.
(Receiving unit)
The receiving unit 102 is connected to the network 300 and can receive information via the network 300 from electronic devices such as the user terminals 200. Specifically, the receiving unit 102 receives from a user terminal 200 information about the user who owns it, such as sensitivity information, user information, playback history, and information on the pieces of music the user holds, as well as the meta information and mood information of the music data stored on the user terminal 200.
Upon receiving the user information and sensitivity information of a user who has a user terminal, the receiving unit 102 inputs them to the user information database 104. Upon receiving the meta information and mood information of music data, the receiving unit 102 inputs them to the music information database 106.
(User information database)
The user information database 104, together with the music information database 106, constitutes a storage unit. The user information database 104 stores information on the users who own the user terminals 200. Such information includes, for example, user profile information such as a user ID, user name, and user image; information on the user's favorite users; music data playback history; the song IDs of the music data the user holds; sensitivity information; preference information; and sound effect information.
(Music information database)
The music information database 106 records information related to music data, for example meta information such as the song ID, song title, artist information, album information, cover image, genre, and mood information of each piece of music data.
(Recommended user determination unit)
The recommended user determination unit 108 determines which user to recommend (present) to a given user based on the relatedness of the users' sensitivity information. In other words, the recommended user determination unit 108 functions as a user specifying unit that compares information including sensitivity information and specifies a first user whose associated sensitivity information is related to the sensitivity information of a second user. In the present embodiment, the recommended user determination unit 108 determines the user to recommend by computing, for example, the products of the relative frequencies of sensitivity points between users, as described with reference to FIGS. 6 to 8, which are tables of the sensitivity information used by the recommended user determination unit 108 of the server 100 shown in FIG. 5.
The recommended user determination unit 108 first acquires each user's sensitivity points for music data A1 from the user information database and obtains data such as that shown in Table 601. Table 601 shows, for users X1, X2, and X3, the number of sensitivity points (sensitivity point count) for each portion of a given piece of music data.
Here, the candidate users for recommendation are, for example, users who hold at least some of the pieces A1 to An held by the user who is to receive the recommendation. The music data for pieces A1 to An used to determine the recommended user is music data for which sensitivity information has been generated, by the method described later, for the user receiving the recommendation. It is also preferable that sensitivity information for this music data has been generated for all candidate users considered in the determination.
In the present embodiment, sensitivity points are generated based on the user's body movements in response to each portion of the music data. The sensitivity point count is a value obtained by accumulating, for each portion of the music data, the frequency of body movements during playback.
In the illustrated example, the portions of the music data are sections obtained by further dividing each phrase of the music data, specifically into three parts: beginning, middle, and end. The present disclosure is not limited to this; a portion may be a phrase itself, a strain or a section obtained by dividing one, or a measure. However, given that a user's emotional movement tends to occur in response to a coherent unit of the music data, such as a phrase, the portions are preferably sections obtained by dividing each phrase of the music data into several parts.
Next, as shown in FIG. 7, the recommended user determination unit 108 computes, for each user, the relative frequency within the music data of the sensitivity point counts listed in Table 601, and obtains relative frequency information such as that shown in Table 603.
Next, the recommended user determination unit 108 multiplies the relative frequencies of the sensitivity point counts shown in FIG. 7 between users, portion by portion, and obtains the products of the relative frequencies between users as shown in FIG. 8. The sum (total) of the products over the portions is then taken as the degree of relatedness between the users' sensibilities for piece A1. For example, when recommending either user X2 or user X3 to user X1, user X2, whose sum of relative-frequency products is larger, has a higher sensibility relatedness to user X1 than user X3 does. Likewise, when recommending either user X1 or user X3 to user X2, user X1, whose sum of relative-frequency products is larger, has a higher sensibility relatedness to user X2 than user X3 does.
The recommended user determination unit 108 performs the same calculation for pieces A2 to An and accumulates the per-portion product sums over pieces A1 to An to obtain the degree of sensibility relatedness between users. The recommended user determination unit 108 then selects, from the candidate users, a user whose sensibility relatedness to the user receiving the recommendation is relatively high, and determines that user as the recommended user. A plurality of recommended users may be determined.
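The scoring just described can be summarized in a short sketch. The following is a minimal, hypothetical implementation, with assumed names and data layout, that computes per-portion relative frequencies of sensitivity points, multiplies them pairwise between users, and sums the products over the shared pieces:

```python
# Hypothetical sketch of the recommended-user scoring described above.
# points[user][song] is a list of sensitivity-point counts, one entry per
# portion of the song (e.g. beginning/middle/end of each phrase).
def relative_frequencies(counts):
    total = sum(counts)
    return [c / total if total else 0.0 for c in counts]

def relatedness(points, user_a, user_b):
    """Sum, over shared songs, of per-portion products of relative frequencies."""
    score = 0.0
    for song in set(points[user_a]) & set(points[user_b]):
        freqs_a = relative_frequencies(points[user_a][song])
        freqs_b = relative_frequencies(points[user_b][song])
        score += sum(fa * fb for fa, fb in zip(freqs_a, freqs_b))
    return score

def recommend(points, target, candidates, top_n=1):
    ranked = sorted(candidates, key=lambda u: relatedness(points, target, u),
                    reverse=True)
    return ranked[:top_n]

# Example: X2's sensitivity points overlap X1's, X3's do not.
points = {
    "X1": {"A1": [3, 0, 1, 0, 2, 0]},
    "X2": {"A1": [2, 0, 1, 0, 2, 1]},
    "X3": {"A1": [0, 2, 0, 1, 0, 3]},
}
print(recommend(points, "X1", ["X2", "X3"]))  # -> ["X2"]
```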
The recommended user determination unit 108 transmits the user information of the determined recommended user, via the transmission unit 114, to the user terminal 200 of the user receiving the recommendation. Alternatively, the recommended user determination unit 108 may input information about the determined recommended user to the user information database 104; in that case, the user information of the determined recommended user is transmitted to the user terminal 200 of the user receiving the recommendation either in response to a request from that terminal or periodically.
(Preference information generation unit)
The preference information generation unit 110 generates information about a user's musical preferences, that is, preference information. The preference information generation unit 110 generates the preference information based on the user's sensitivity information for the moods contained in the pieces of the music data. In the present embodiment, the preference information is generated based not only on the user's sensitivity information but also on the moods of the pieces in the music data and on the pieces the user holds. One piece or a plurality of pieces may be used to generate the sensitivity information, and each piece may contain one mood or a plurality of moods. For example, the preference information can be obtained based on the user's sensitivity information for a first mood contained in a first piece and the user's sensitivity information for a second mood, different from the first mood, contained in a second piece. Here, the first piece and the second piece may be the same piece or different pieces.
A concrete example of the preference information generation process in the preference information generation unit 110 will now be described with reference to FIGS. 9 to 13, which show examples of the data used in the preference information generation unit 110 of the server 100 shown in FIG. 5. In the illustrated example, the types of mood and the number of users are limited for simplicity of explanation. In the present embodiment, the relative frequency of sensitivity point counts per mood and the relative frequency of appearance per mood are computed, and the preference information is obtained by averaging the two.
First, the preference information generation unit 110 reads from the user information database 104, for each user, the song IDs of the pieces the user holds, that is, the list of held pieces, and the user's sensitivity information for each piece. The preference information generation unit 110 also reads from the music information database 106 the meta information, including mood information, of the pieces corresponding to the song IDs.
Next, based on each user's list of held pieces and the mood information of those pieces, the preference information generation unit 110 totals, for each user, the number of occurrences of each mood contained in the held pieces, as shown in Table 607 of FIG. 9. The moods shown in FIG. 9 are the moods contained in each portion of the pieces each user holds; such portions may be phrases, strains, sections obtained by dividing them, or measures.
Next, as shown in Table 609 of FIG. 10, the preference information generation unit 110 computes, for each user, the relative frequency of appearance of each mood contained in the held pieces.
Next, based on each user's list of held pieces and the sensitivity information for each piece, the preference information generation unit 110 totals, for each user, the sensitivity points for each mood of the held pieces, as shown in Table 611 of FIG. 11.
Next, as shown in Table 613 of FIG. 12, the preference information generation unit 110 computes, for each user, the relative frequency of the sensitivity point counts for each mood of the held pieces.
The relative frequency of sensitivity point counts and the relative frequency of mood appearance obtained above are then averaged for each user and each mood to obtain the user's preference information, as shown in FIG. 13. Because such preference information is generated based on the user's sensitivity information, it reflects the occasions on which the user's emotions actually moved in response to music more accurately than preference information generated merely from held-piece information or playback history.
For example, in FIG. 10, the relative frequency of "euphoric" among the moods of the pieces held by user X1 is 0.39, whereas, once the sensitivity information is reflected, user X1's preference for "euphoric" becomes 0.59, as shown in FIG. 13. In FIG. 10, the relative frequencies for user X2 are 0.37 for "solemn" and 0.26 for "sad"; with the sensitivity information reflected, FIG. 13 shows both at the same level of 0.26. For user X3, the relative frequencies in FIG. 10 are 0.31 for "euphoric", 0.23 for "solemn", and 0.19 for "joyful"; with the sensitivity information reflected, FIG. 13 shows 0.29 for "euphoric", 0.18 for "solemn", and 0.20 for "joyful", so user X3's preference order for "solemn" and "joyful" is reversed.
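This averaging step can be illustrated with a brief sketch. The following is a hypothetical implementation, with an assumed data layout, that reproduces the numbers of the user X1 example above:

```python
from collections import Counter

# Hypothetical sketch of the preference computation described above: for each
# mood, average (a) the relative frequency with which the mood appears among
# the portions of the user's held pieces and (b) the relative frequency of
# the user's sensitivity points falling on portions with that mood.
def preference_info(portion_moods, sensitivity_point_moods):
    appear = Counter(portion_moods)
    points = Counter(sensitivity_point_moods)
    n_appear = sum(appear.values())
    n_points = sum(points.values())
    moods = set(appear) | set(points)
    return {m: (appear[m] / n_appear + points[m] / n_points) / 2
            for m in moods}

# Example echoing user X1: "euphoric" appears in 39% of portions but draws
# 79% of the sensitivity points, so the preference value rises to 0.59.
library = ["euphoric"] * 39 + ["happy"] * 30 + ["sad"] * 31
hits = ["euphoric"] * 79 + ["happy"] * 11 + ["sad"] * 10
print(preference_info(library, hits))  # {"euphoric": 0.59, ...}
```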
The preference information generation unit 110 stores the preference information generated in this way in the user information database 104, or transmits it to the user terminal 200 via the transmission unit 114.
(Notification information generation unit)
The notification information generation unit 112 generates notification information. Here, the notification information is information indicating how, when music data is played back on a user terminal 200, the sensitivity information of the user or of another user is to be conveyed in time with the playback of the piece. The notification information is generated based on a user's sensitivity information for given music data. The music data played back together with the notification (the second music data) may be the same as or different from the music data used to generate the sensitivity information (the first music data); the second music data is, however, related to the first music data. For example, the first and second music data may both relate to the same piece, as when the same piece is stored on different storage devices or in different file formats such as an mp3 file and an mp4 file. Alternatively, the first music data may relate to a first piece and the second music data to a second piece containing a portion related (similar) to a portion of the first piece, such as a portion having the same or a similar melody, rhythm, or mood.
The target user can be, for example, a user designated by the user terminal 200 that receives the notification information, a recommended user determined by the recommended user determination unit 108, a specific limited user such as a favorite user or a user belonging to a favorite group, or an arbitrarily determined user. The notification information may be generated based on the sensitivity information of a single user or of a plurality of users. These settings can be changed, for example, by instruction from the user terminal 200 that receives the notification information.
The notification means on the user terminal 200 can be, for example, sound such as a sound effect, vibration, light emission from an LED lamp or the like, and the display of images, text, and so on. The notification means is selected as appropriate on the user terminal 200.
Accordingly, the notification information can include sound information relating to sounds such as sound effects, vibration information relating to vibration, light emission information relating to light emission, and image display information relating to image display. The functions of the notification information generation unit 112 in generating notification information, in particular sound information and image display information, are described below.
The notification information generation unit 112 acquires from the user information database 104 the user information of the target user for the target music data, including sensitivity information, user image, and preference information. The notification information generation unit 112 also acquires the metadata of the target music data from the music information database 106.
The notification information generation unit 112 generates sound information based on the user information, including the sensitivity information, and the metadata. Specifically, the notification information generation unit 112 creates the sound information so that, during playback of the music data, a sound (sound effect) is output at the portions of the music data corresponding to the user's sensitivity information, that is, the portions where sensitivity points were detected. The portions of the music data may be phrases, measures, or sections into which they are divided. The sound effect preferably occurs before or after the unit section so as not to interfere with the enjoyment of the music.
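A cue schedule of this kind might be assembled as in the following sketch, which is a hedged illustration with assumed data structures rather than the patent's implementation; it places each effect-sound cue just before the portion in which a sensitivity point was detected:

```python
# Hypothetical sketch: build a cue list that plays an effect sound slightly
# before each portion where a sensitivity point was detected, so the cue
# does not mask the music itself. The lead-in time is an assumed value.
LEAD_IN_SEC = 0.3

def build_sound_info(portions, sensitivity_portions, effect="cheer.wav"):
    """portions: list of (start_sec, end_sec) for each portion of the piece.
    sensitivity_portions: indices of portions with detected sensitivity points."""
    cues = []
    for idx in sensitivity_portions:
        start_sec, _ = portions[idx]
        cues.append({"time": max(0.0, start_sec - LEAD_IN_SEC),
                     "sound": effect})
    return cues

portions = [(0.0, 5.2), (5.2, 10.4), (10.4, 15.8)]
print(build_sound_info(portions, [0, 2]))  # cues just before portions 0 and 2
```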
The notification information generation unit 112 also selects the sound effect to be used in the sound information, for example from a plurality of prepared sound effects. The sound effect is not particularly limited and can be, for example, an electronic sound, a sound that can occur in nature, or an imitation or edited version of these. Sounds that can occur in nature include, for example, shouts such as "bravo", voices such as cheers, sounds of human body movement such as applause and footsteps, instrument sounds, sounds derived from biological information such as heartbeats, whistling, party crackers, and recordings of audience sound at live or concert venues.
The notification information generation unit 112 may also select the sound effect based on the user information or the meta information of the music data. For example, the notification information generation unit 112 may change the character of the voice or the wording of the shout based on the gender and age of the target user included in the user information. When the user information contains sound effect information, the notification information generation unit 112 may set the sound effect based on it; here, the sound effect information includes sound effect designation information specifying which sound effect to select, and unique sound effect data for a user-specific sound effect. In particular, when a user-specific sound effect is used, the target user's individuality is reflected in the sound effect. The notification information generation unit 112 may also select the sound effect based on the mood information of the target music data.
When there are multiple target users, the notification information generation unit 112 may change the volume or type of the sound effect according to the number of users.
The notification information generation unit 112 may also set the sound effect to match the output device of the user terminal 200, such as a speaker or earphones. It may set a virtual position from which the sound effect appears to originate, using sound image localization techniques, and it may set the sound effect according to the environment around the user terminal 200.
The notification information generation unit 112 also generates image display information for image display based on the user information, including the sensitivity information, and the meta information of the music data. The image display information includes information about the notification image to be displayed and information about the method of displaying it.
The notification image is not particularly limited and can be any image, such as geometric figures including polygons, star polygons, ellipses, circles, and sectors; images containing user information; other figures representing emotional movement; or animations. When the notification image contains user information, the user who owns the user terminal 200 can identify the user whose emotions moved during playback of the music data; this is convenient particularly when there are multiple target users, since it allows them to be distinguished. An image containing user information is configured to include, for example, the user image contained in the user information and the user's preference information. When the notification image is an animation, the animation can include a figure whose size corresponds to the sensitivity information of the user for whom it was detected, for example to the magnitude of the change in that user's biological information. Because the figure changes according to the sensitivity information, the user who owns the terminal 200 can grasp how strongly the other user's emotions moved in response to the piece (how much that user "got into" it).
The notification image can be displayed by any method that indicates at which position in the music data the target user's emotions moved. Examples include displaying the notification image for a fixed time around the point at which the sensitivity point was detected, and displaying the notification image at the detected position along the time axis of a progress bar image of the music data shown on the display unit.
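As a rough illustration of the progress-bar style of display, the sketch below renders detected positions as markers along a track timeline; this text-based rendering is purely illustrative and stands in for the graphical progress bar described in the disclosure:

```python
# Hypothetical sketch: mark sensitivity-point positions along a timeline,
# analogous to placing notification images along a progress bar.
def timeline_markers(duration_sec, point_times_sec, width=40):
    bar = ["-"] * width
    for t in point_times_sec:
        bar[min(width - 1, int(t / duration_sec * width))] = "*"
    return "".join(bar)

# A 4-minute track with emotion movement at 12 s, 97.5 s, and 180 s:
print(timeline_markers(240.0, [12.0, 97.5, 180.0]))
```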
The notification information generation unit 112 can also generate notification information for light emission information and vibration information in the same manner as above.
The notification information generation unit 112 transmits the generated notification information to the user terminal 200 via the transmission unit 114. The notification information may be transmitted periodically or each time it is generated, and the transmission method may be changed according to settings made by the user terminal 200. The destination user terminal 200 can belong to at least one of the user involved in acquiring the sensitivity information (the first user) and another user (the second user); that is, the notification can be made to at least one of the first user and the second user.
(Transmission unit)
The transmission unit 114 is connected to the network 300 and can transmit information via the network 300 to electronic devices such as the user terminals 200. Specifically, the transmission unit 114 can transmit to a user terminal 200 the preference information generated by the preference information generation unit 110, the user information of the recommended user determined by the recommended user determination unit 108, and the notification information generated by the notification information generation unit 112. It can also transmit the various kinds of information stored in the user information database 104 and the music information database 106 to a user terminal 200 in response to a request from that terminal.
<3. Configuration example of the user terminal (second information processing apparatus)>
Next, the configuration of the user terminal 200 according to the present embodiment will be described with reference to FIG. 14.
FIG. 14 is a block diagram showing an outline of the functional configuration of the user terminal 200 according to the present embodiment. As shown in FIG. 14, the user terminal 200 includes a receiving unit 202, an output control unit 204, an output unit 206, a music playback unit 214, a sensitivity information generation unit 216, a mood analysis unit 218, a storage unit 220, an input unit 226, a position information detection unit 228, and a transmission unit 230.
(Receiving unit)
The receiving unit 202 is connected to the network 300 and can receive information via the network 300 from electronic devices such as the server 100 and other user terminals 200. Specifically, the receiving unit 202 receives from the server 100 user information, such as preference information, about recommended users and other users (first users), as well as notification information; the receiving unit 202 therefore functions as an information acquisition unit. The receiving unit 202 can also receive messages from other users' user terminals 200. The receiving unit 202 inputs the received information to the respective components of the user terminal 200, for example the storage unit 220 and the output control unit 204.
(Output control unit)
The output control unit 204 controls the output of information by the output unit 206 described later. Specifically, the output control unit 204 inputs information to the output unit 206 and instructs it to output that information. The output control unit 204 functions, for example, as a display control unit that controls image display on the display unit 208 of the output unit 206, as an audio output control unit that controls audio output from the speaker 210, and as a vibration generation control unit that controls the vibration generation unit. The output control unit 204 is also an example of a presentation information output unit that outputs user information to the display unit 208, which presents the user information of a recommended user (first user) so that the user who owns the user terminal 200 (second user) can recognize it; as a presentation information output unit, the output control unit 204 not only outputs user information to the output unit 206 but can also control the output unit 206. The output control unit 204 is further an example of a preference information output unit that outputs preference information to the display unit 208, which displays it; as a preference information output unit as well, the output control unit 204 not only outputs preference information to the output unit 206 but can also control the output unit 206.
More specifically, the output control unit 204 generates and updates user interface screens and causes the display unit 208 to display them. The generation and updating of a user interface screen are triggered by, for example, user input at the input unit 226, a request from a component within the user terminal 200, or the reception of information from the server 100 or another user terminal 200. The configuration of the user interface screens is described later.
According to the content of the input trigger, the output control unit 204 also controls the speaker 210 or an external audio output device, such as externally connected earphones or an external speaker (neither shown), so that a sound effect matched to the user interface or the audio of the music data decoded by the music playback unit 214 is output from that device.
According to the content of the input trigger, the output control unit 204 also controls the vibration generation unit 212 to generate vibration, or an LED lamp (not shown) to emit light.
Furthermore, the output control unit 204 controls the output unit 206 so that it outputs the notification information acquired from the server 100 via the receiving unit 202. The output control unit 204 is thus an example of a notification information output unit that outputs notification information to the output unit (notification unit) 206 so that, during playback of music data, the notification information is conveyed to the user in association with the corresponding portion of the music data.
As a notification information output unit, the output control unit 204 not only outputs the notification information to the output unit 206 but can also control the output unit 206. More specifically, as a notification information output unit, the output control unit 204 determines and changes the method of conveying the notification, and outputs to the output unit 206, together with the notification information, instruction information reflecting that determination or change. Here, the output control unit 204 can change the method of notification by the output unit (notification unit) 206 according to at least one of information about the user who received the notification information and information about at least one of the music data involved in acquiring the sensitivity information (the first music data) and the music data played back together with the notification (the second music data). Changing the notification method includes not only changing the notification means (sound, light, vibration, or display) but also, within the same means, changing the degree of output (for example, volume, light intensity, vibration intensity, or the amount of displayed information) and changing the content of the notification (for example, changing the sound effect, the light-blinking pattern, the vibration pattern, or the displayed image or text). For example, when the output unit (notification unit) 206 notifies by outputting sound at the portions of the second music data corresponding to the user's sensitivity information in time with the playback, the change in notification method can be a change in the output sound (sound information) or in the output volume. When functioning as a notification control unit, the output control unit 204 can also determine the notification method according to the kind of piece in the music data to be played back; for example, the notification method can be determined according to the genre or mood of the piece.
 The output control unit 204 may also determine the notification method used by the output unit 206 according to the number of pieces of notification information acquired for the music data. For example, when the acquired notification information exceeds a certain number, the output control unit 204 may control the display unit 208 so as to limit the notification information displayed. In this case, for example, the display unit 208 may be controlled so that the normal playback screen (the player 710 described later) shows only an icon along the progress bar 713 indicating the presence of multiple pieces of notification information, while enlarging the progress bar 713 reveals their details.
 The output control unit 204 may also change the volume or type of the sound effect according to the number of users to be notified. For example, it can change the notification method so that the number of notified users can be inferred. For instance, with 10 or fewer users a quiet "whispering" sound effect may be used; with 11 to 100 users, a medium-volume "murmuring" sound effect; and with 101 or more users, a loud "clamoring" sound effect. The notified user can thus grasp how many other users are interested in the music being listened to.
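 Although the disclosure does not specify an implementation, the tiered selection just described can be sketched minimally as follows; only the 10/100/101 user-count thresholds come from the text above, while the effect file names and volume values are illustrative assumptions.

```python
# Minimal sketch of the crowd-size-dependent sound effect described above.
# Only the user-count thresholds come from the disclosure; the effect file
# names and volume levels are illustrative assumptions.

def select_crowd_effect(num_notified_users: int) -> tuple[str, float]:
    """Return a (sound_effect, volume) pair suggesting how many users reacted."""
    if num_notified_users <= 10:
        return ("whisper.wav", 0.2)   # quiet "whispering" effect
    elif num_notified_users <= 100:
        return ("murmur.wav", 0.5)    # medium-volume "murmuring" effect
    else:
        return ("clamor.wav", 0.9)    # loud "clamoring" effect

print(select_crowd_effect(42))  # ('murmur.wav', 0.5)
```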
 The output control unit 204 can also determine the notification method of the output unit 206 according to the ambient environment of the user terminal 200 when the music data is reproduced. For example, the type of sound effect may be changed depending on whether the surroundings are relatively quiet or relatively noisy.
 The output control unit 204 can also change the notification method in accordance with an instruction from the user who owns the user terminal 200. For example, the terminal can be set, per the user's instruction, not to present the notification information. In this case, the user can select which types of notification (sound effects, notification images, and so on) are suppressed. The user can also select, as appropriate, which users are to be notified.
 The output control unit 204 can likewise change the notification method in accordance with an instruction from the user who is the target of the notification. For example, the output control unit 204 can control the output unit 206 to output a sound effect or notification image designated by that user.
 (Output unit)
 The output unit 206 outputs information in response to control instructions from the output control unit 204. The output unit 206 includes a display unit 208, a speaker 210, and a vibration generating unit 212. The output unit 206 is an example of a presentation unit that presents recommended users so that the user who owns the user terminal 200 can recognize them. The output unit 206 also functions as the notification unit that delivers the notification information described above.
 (Display unit)
 The display unit 208 includes a display and, in response to control instructions from the output control unit 204, displays images such as still images and/or moving images.
 (Speaker)
 The speaker 210 is a device for generating sound waves. The speaker 210 produces sound in response to control instructions from the output control unit 204.
 (Vibration generating unit)
 The vibration generating unit 212 is a device capable of generating vibration, for example by means of a motor. In response to control instructions from the output control unit 204, the vibration generating unit 212 generates vibration and vibrates the user terminal 200.
 (Music playback unit)
 The music playback unit 214 includes a decoder; it acquires music data from the music database 224 in the storage unit 220 and decodes the music data. The music playback unit 214 then outputs the decoded music data as information including sound via the output control unit 204 and the output unit 206.
 The music playback unit 214 further inputs progress information, indicating how far the music has advanced during reproduction of the music data, to the sensitivity information generation unit 216.
 (Sensitivity information generation unit)
 The sensitivity information generation unit 216 detects movements of the user's emotion (sensitivity points) in response to the music data and generates sensitivity information. Specifically, when progress information on the music is input from the music playback unit 214, the sensitivity information generation unit 216 detects the sensitivity points input from the input unit 226 and associates them with the time axis of the music data. The sensitivity information generation unit 216 then collects the sensitivity points associated with the time axis over the whole piece of music and generates the sensitivity information.
 In the present embodiment, the sensitivity information generation unit 216 uses the parts of the music data as its time axis. The parts of the music data can be delimited as described above. Information on these parts is generated through the mood analysis unit 218's analysis of the music data and can be acquired by the sensitivity information generation unit 216.
 In the present embodiment, the sensitivity information is generated based on the user's body movement in response to each part of the music data. The sensitivity information includes body movement information calculated from the frequency of the user's body movement for each part of the music data; more specifically, the body movement information is calculated by accumulating, part by part, the frequency of the user's body movement for each part. Body movement readily appears as a movement of emotion when a user listens to music and is an index that can be measured comparatively objectively. Using body movement information of this kind as part of the sensitivity information therefore improves the accuracy of the sensitivity information.
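 As a minimal sketch of this accumulation, assuming that the parts of a piece are given by their start times in seconds and that each tap on the sensitivity input button yields a timestamp (both data-layout assumptions, since the disclosure does not fix a format):

```python
# Accumulate tapped sensitivity points (body movements) per part of a piece,
# given the start time of each part in seconds. The data layout is assumed.

from bisect import bisect_right

def tally_sensitivity_points(tap_times: list[float],
                             part_starts: list[float]) -> dict[int, int]:
    """Count taps falling into each part; part_starts must be sorted."""
    counts: dict[int, int] = {}
    for t in tap_times:
        part = bisect_right(part_starts, t) - 1  # index of the containing part
        if part >= 0:
            counts[part] = counts.get(part, 0) + 1
    return counts

# Taps at 3 s, 34 s, and 35 s against parts starting at 0 s, 30 s, and 60 s.
print(tally_sensitivity_points([3.0, 34.0, 35.0], [0.0, 30.0, 60.0]))
# {0: 1, 1: 2}
```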
 (Mood analysis unit)
 The mood analysis unit 218 analyzes the music data and obtains mood information about it. Specifically, the mood analysis unit 218 first acquires the music data stored in the music database 224 of the storage unit 220. The mood analysis unit 218 then obtains time-pitch data by resolving the waveform of the music obtained from the music data along two axes: time, and energy per pitch. Here the pitch is resolved into 12 pitches per octave (do, re, mi, and so on). For example, the mood analysis unit 218 divides the music data along the time axis into portions corresponding to one second of music and extracts the energy in each frequency band corresponding to each of the 12 semitones of an octave.
 Next, the mood analysis unit 218 analyzes, in accordance with music theory, feature quantities such as the beat structure, chord progression, key, and musical structure from the information obtained by this analysis. Based on the obtained feature quantities, the mood analysis unit 218 estimates the mood of each part of the music contained in the music data. Such moods can be classified into multiple types, for example "Euphoric", "Happy", "Joyful", "Mild", "Sad", "Solemn", "Bright", "Healing", "Fresh", and "Elegant". For each classified mood, the mood analysis unit 218 determines a value corresponding to the feature quantities of the part of the music data in question. The mood analysis unit 218 can then estimate that, among the classified moods, the mood with the highest value is the mood of the target part. A plurality of moods can also be assigned to each part; in that case, multiple moods are determined according to their values, for example in descending order of the value corresponding to the feature quantities. Such estimation may also be performed based on, for example, a previously prepared pattern table indicating the relationship between feature quantities and moods. The above analysis can be performed, for example, by adopting a 12-tone analysis technique. The mood analysis unit 218 then generates mood information indicating the correspondence between moods and the parts of the music data. The present disclosure is not limited to the above: the mood analysis unit 218 may instead generate the mood information based on music recognition information, such as a CDDB (Compact Disc DataBase) entry or a Music ID, contained in the meta information of the music data. Such music recognition information may also be obtained from an external database.
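 A hedged sketch of the final estimation step, assuming per-part feature vectors have already been extracted; the feature dimensionality and the scoring weights are placeholders, since the disclosure specifies only that each mood receives a value derived from the feature quantities and that the highest-valued mood is chosen:

```python
# Score each classified mood from a part's feature quantities and pick the
# highest-scoring mood. Feature extraction and weights are placeholders.

import numpy as np

MOODS = ["Euphoric", "Happy", "Joyful", "Mild", "Sad",
         "Solemn", "Bright", "Healing", "Fresh", "Elegant"]

def estimate_part_mood(features: np.ndarray, weights: np.ndarray) -> str:
    """weights has shape (len(MOODS), len(features)); one score per mood."""
    scores = weights @ features
    return MOODS[int(np.argmax(scores))]

rng = np.random.default_rng(0)
features = rng.random(8)                 # e.g. beat, chord, and key features
weights = rng.random((len(MOODS), 8))    # placeholder mood-scoring weights
print(estimate_part_mood(features, weights))
```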
 As described above, the mood analysis unit 218 provides not only mood information but also various analysis data about the music data. Based on this analysis data, the mood analysis unit 218 can also divide the music data into appropriate parts.
 The mood analysis unit 218 inputs the analysis information, including the obtained mood information, to the music database 224 and transmits it to the server 100 via the transmission unit 230.
 (Storage unit)
 The storage unit 220 stores various information necessary for controlling the user terminal 200. The storage unit 220 includes a user information storage unit 222 and a music database 224.
 The user information storage unit 222 stores information about the user who owns the user terminal 200. Besides the information stored in the user information database 104 described above, such information includes the user's communication history with other users or groups.
 The music database 224 stores the music data of the pieces the user owns and information about that music data, for example meta information such as the music ID, title, artist information, album information, cover image, genre, and mood information of each piece.
 (Input unit)
 The input unit 226 is a device configured so that information can be input by the user or by other equipment. In the present embodiment, the input unit 226 includes a touch panel. Input to the input unit 226 includes, for example, various instructions from the user and information about the movement of the user's emotion in response to the music. As described above, in the present embodiment the movement of the user's emotion in response to the music is detected as body movement during reproduction of the music data. Specifically, in the present embodiment, body movement is detected through the user's input, such as the user tapping a predetermined area of the touch panel. The present disclosure is not limited to this mode: changes in biological information, including body movement, reflecting the movement of the user's emotion in response to the music can also be detected automatically by a biological information detection unit such as a heart rate monitor, sphygmomanometer, electroencephalograph, pulse meter, thermometer, acceleration sensor, gyro sensor, or geomagnetic sensor.
 (Position information detection unit)
 The position information detection unit 228 is a device configured to detect the position information of the user terminal, for example a GPS (Global Positioning System) receiver. The position information detection unit 228 acquires the detected position information of the user terminal 200, inputs it to the storage unit 220, and, as necessary, transmits it to the server 100 and other user terminals 200 via the transmission unit 230.
 (Transmission unit)
 The transmission unit 230 is connected to the network 300 and can transmit information via the network 300 to electronic devices such as the server 100 and other user terminals 200. Specifically, the transmission unit 230 transmits to the server 100 information about the user who owns the user terminal 200, such as sensitivity information, user information, playback history, and owned-music information, as well as the meta information and mood information of the music data stored in the user terminal 200. It also transmits messages to other users, input via the input unit 226, to the other users' terminals 200.
 <4. Example of the user interface of the user terminal>
 The configuration example of the user terminal 200 according to the present embodiment has been described above. Next, examples of the user interface of the user terminal 200 according to the present embodiment will be described with reference to FIGS. 15 to 22. FIG. 15 is a screen transition diagram of the user interface of the user terminal 200 according to the embodiment of the present disclosure, and FIGS. 16 to 22 show examples of user interface screens of the user terminal 200 according to the embodiment of the present disclosure.
 As shown in FIG. 15, the user interface screens of the user terminal 200 according to the present embodiment comprise a main menu 700 as the first level; a player 710, a library 720, a recommend screen 730, a contact screen 740, and a settings screen 750 as the second level; and a song screen 760, a user profile 770, a timeline 781, a group profile 782, a chat 783, and a my-profile screen 784 as the third level. These user interface screens are generated by the output control unit 204 and displayed on the display unit 208.
 The main menu 700 shown in FIG. 16 is the screen displayed when the user terminal 200 starts up. At the top of the main menu 700, the user information image 500 of the user who owns the user terminal 200 is displayed.
 In the user information image 500, a user image 501 with a circular outline is displayed, and images relating to mood (mood images) 503, based on preference information about the user, are displayed so as to surround it. The display format of the user image 501 and the preference information 503 is the same as that shown in FIG. 4. As shown in FIG. 4, a plurality of different mood images, three in the present embodiment (503A, 503B, and 503C), are displayed in different colors around the user image 501. The mood images 503A, 503B, and 503C form a ring, in this embodiment an annulus, arranged along the edge of the user image 501. Each of the mood images 503A, 503B, and 503C is composed of a plurality of different colors, each color representing a different mood. The mood images 503A to 503C represent, for example, the three items of the user's preference information with the largest values in the calculation described above. This allows the user who owns the user terminal 200 to recognize his or her own preferences objectively.
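 As a hypothetical illustration of this selection (the preference values below are placeholders, not values from the disclosure), picking the three highest-valued moods is a simple top-k operation:

```python
# Pick the three moods with the largest preference values for display as
# mood images. The preference values are placeholders.

preferences = {"Euphoric": 0.31, "Happy": 0.22, "Joyful": 0.14,
               "Sad": 0.08, "Healing": 0.25}

top3 = sorted(preferences, key=preferences.get, reverse=True)[:3]
print(top3)  # ['Euphoric', 'Healing', 'Happy']
```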
 The mood image is not limited to the illustrated form and may be selected, for example, from those shown in FIG. 17. FIG. 17 shows an example of a user's mood images according to the present embodiment. The mood images 503A to 503J shown in FIG. 17 are color-coded (distinguished by hatching in the figure), each indicating a different preference. Specifically, mood image 503A represents "Euphoric", 503B "Happy", 503C "Joyful", 503D "Mild", 503E "Sad", 503F "Solemn", 503G "Bright", 503H "Healing", 503I "Fresh", and 503J "Elegant".
 To the right of the user image 501 in the user information image 500, a user name 505 and favorite information 507 are displayed.
 Below the user information image 500 of the main menu 700, a player button 701, a library button 702, a recommend button 703, a contact button 704, and a settings button 705 are arranged. When the user taps the touch panel over one of these images, the user interface screen transitions to the corresponding screen: the player 710, the library 720, the recommend screen 730, the contact screen 740, or the settings screen 750.
 The player 710 shown in FIG. 18 is the screen displayed during reproduction of music data. In the upper center of the player 710, a music information image 711 for the music data is displayed, with operation buttons 712 arranged below it. Pressing the operation buttons 712 enables operations related to the reproduction of the music data. Pressing the music information image 711 transitions the user interface screen to the song screen 760, which displays information about that piece of music.
 Below the operation buttons 712, a progress bar 713 indicating the progress of the music is arranged. Along the progress bar 713, the user's own sensitivity points 716 and user information images 500, representing the sensitivity information of other users, are shown. The images and placement of the sensitivity points 716 and the user information images 500 are generated based on the notification information.
 The user can therefore see the parts where other users' emotions were moved (where they "grooved"), and, by comparing his or her own sensitivity points 716 with the user information images 500 representing other users' sensitivity information, can recognize the parts where he or she empathizes with other users. Since preference information appears in the user information images 500, the user can also compare that preference information with his or her own preferences and consider how well his or her musical sensibility matches that of other users.
 To the right of the operation buttons 712, a sensitivity input button 714 is arranged. When the user presses the sensitivity input button 714 in time with the reproduction of the music data, body movement information for constructing sensitivity information is input to the input unit 226.
 Around the music information image 711, an animation 715 of moving bubbles is displayed. The animation 715 is generated based on the notification information and is displayed based on the user's own sensitivity information. For example, in the animation 715, more bubbles are displayed in parts, or in pieces, where the sum of the user's own sensitivity points is large. Through such an animation 715, the user can recognize anew his or her own sensibility toward the music, which can lift the user's mood.
 Below the progress bar 713 and the user information images 500 along it, comments 717 are further displayed together with user information images 500. The comments 717 show comments that other users have made on the piece of music.
 When one of the user information images 500 shown on the player 710 is pressed, the user interface screen transitions to the user profile 770 of the user corresponding to that image. Depending on the settings, pressing a user information image 500 on the player 710 may instead transition the user interface screen to a timeline 781 that displays, in chronological order, the pieces of music the corresponding user has listened to.
 Next, the library 720 displays the music data the user owns. When the user selects music data as appropriate, the user interface screen moves to the player 710 and the music data becomes playable. Alternatively, the user interface screen transitions to the song screen 760, where information about the piece of music can be browsed.
 Next, the recommend screen 730 shown in FIG. 19 displays the recommended users determined by the recommended user determination unit 108. The recommend screen 730 has a user tab 731, a group tab 732, a nearby tab 733, and a search button 734.
 When the user presses the user tab 731, the user information images 500 of the recommended users determined by the recommended user determination unit 108 are listed according to their degree of recommendation. In each user information image 500, a user image 501A and preference information 503 are displayed. This allows the user carrying the user terminal 200 to judge not only the degree to which each user is recommended but also that user's musical preferences. When a user information image shown after pressing the user tab 731 is pressed, the user interface screen transitions to the user profile 770 displaying the corresponding recommended user.
 When the user presses the group tab 732, groups recommended to the user are listed. Such groups are selected from groups to which the recommended users determined by the recommended user determination unit 108 belong, as well as groups in which the characteristics of the music or artists posted in the group's chat are close to the user's preference information. When an image of a group shown after pressing the group tab 732 is pressed, the user interface screen transitions to the group profile 782 displaying the corresponding group.
 When the user presses the nearby tab 733, those of the user's favorite users and of the users in groups the user participates in who are located near the user are listed. When an image of a user shown after pressing the nearby tab 733 is pressed, the user interface screen transitions to a chat 783 in which the user can chat with the corresponding user. In this way, for example, when the user is attending a live concert, the user can learn of other users near the venue and can communicate with them or meet them in person.
 The recommend screen 730 also displays a search button 734 arranged above the user tab 731, the group tab 732, and the nearby tab 733. When the user presses the search button 734, the user interface screen transitions to a screen for searching for other users or groups.
 The contact screen 740 shown in FIG. 20 is a screen through which the user who owns the user terminal 200 contacts other users. The contact screen 740 has a user tab 741, a favorites tab 742, a group tab 743, and a search button 744.
 When the user presses the user tab 741, other users the user has contacted before are listed according to the chronology or frequency of the contact history. When an image of another user appearing after pressing the user tab 741 is pressed, the user interface screen transitions to a chat 783 in which the user can chat with the corresponding user.
 When the user presses the favorites tab 742, the user information images 500 of the favorite users registered among the user's favorites are listed. In each user information image 500, a user image 501A and preference information 503 are displayed. When a user information image 500 appearing after pressing the favorites tab 742 is pressed, the user interface screen transitions to a chat 783 with the corresponding user.
 When the user presses the group tab 743, the groups in which the user participates are listed. When an image of a group appearing after pressing the group tab 743 is pressed, the user interface screen transitions to the group profile 782 of the corresponding group.
 The contact screen 740 also displays a search button 744 arranged above the user tab 741, the favorites tab 742, and the group tab 743. When the user presses the search button 744, the user interface screen transitions to a screen for searching for other users or groups.
 The settings screen 750 is a screen for making various settings for the user terminal 200 and the server 100. On the settings screen 750, for example, the user's profile and user-specific notification information are set through the user's input. Also on the settings screen 750, the user can select, for example, whether the sensitivity information underlying the notification information the user terminal 200 acquires should originate from a single user, from multiple users, from favorite users, or from all users.
 The song screen 760 shown in FIG. 21 displays information about the music data and each user's comments on the corresponding piece of music. At the top of the song screen 760, a music information image 761 for the music data is displayed, with operation buttons 762 arranged beside it. Pressing the operation buttons 762 enables operations related to the reproduction of the music data.
 Below the music information image 761 of the song screen 760, the user image 501 of the user who owns the user terminal 200, with preference information surrounding it, is displayed together with a comment 763. The comment 763 is a comment on the piece of music posted by the user who owns the user terminal 200. Below the comment 763, a reply button 765 is arranged. When the user presses the reply button 765, a response to the comment 763 can be entered.
 Below the user image 501 and the comment 763, comments 764 on the piece of music are displayed, each together with the user image 501A of the other user who posted it and the mood images 503 arranged around that image. A reply button 765 is arranged below each comment 764. When the user presses the reply button 765, a response to the adjacent comment 764 can be entered.
 The user profile 770 shown in FIG. 22 displays information about users other than the one who owns the user terminal 200. In the upper center of the user profile 770, the target user's user image 501A and the mood images 503 arranged around it are displayed. Below the user image 501A, a user name 505A and favorite information 507A are displayed. The favorite information 507A records the number of favorite users added by the target user.
 Below the user name 505A and the favorite information 507A, an add-favorite button 771 and a send-message button 772 are displayed. When the add-favorite button 771 is pressed, the target user is added to the favorite users of the user who owns the user terminal 200. When the send-message button 772 is pressed, the user interface screen moves to the chat 783, allowing a message to be sent to the target user.
 Below the add-favorite button 771 and the send-message button 772, a feed tab 773, a song tab 774, a group tab 775, and a favorites tab 776 are arranged side by side.
 When the feed tab 773 is pressed, the displayed user's action history 777 is shown, for example the music listening history, the history of added favorite users, and the history of group participation. When the song tab 774 is pressed, the music data owned by the target user is listed. When the group tab 775 is pressed, the groups in which the target user participates are listed. When the favorites tab 776 is pressed, the target user's favorite users are listed.
 When an image of a user displayed after pressing the feed tab 773 or the favorites tab 776 is pressed, the user profile 770 of the corresponding user is displayed. When an image of a group displayed after pressing the feed tab 773 or the group tab 775 is pressed, the group profile 782 of the corresponding group is displayed. Further, when an image of music data displayed after pressing the feed tab 773 or the song tab 774 is pressed, the song screen 760 for the corresponding music data is displayed.
 The timeline 781 shown in FIG. 15 lists the pieces of music that favorite users have listened to. When a user image displayed on the timeline 781 is pressed, the user interface screen transitions to the user profile 770. When an image of music data displayed on the timeline 781 is pressed, the user interface screen transitions to the song screen 760 for the corresponding music data.
 The group profile 782 displays information about a group. Here, a group is a unit usually composed of multiple users and formed for a given purpose, for example exchanges about favorite artists, genres, or pieces of music.
 The group profile 782 displays information about the members (users) participating in the group and information posted by the members, for example comments and music information. When an image of a member displayed in the group profile 782 is pressed, the user interface screen transitions to the corresponding user's user profile 770 or, depending on the settings, to a chat 783 with that user. When an image of posted music information displayed in the group profile 782 is pressed, the user interface screen transitions to the song screen 760 displaying the corresponding music information.
 The group profile 782 also includes, as appropriate, buttons for the user to join or leave the group.
 The chat 783 is a screen for chatting, that is, exchanging messages, between users. In the chat 783, the messages each user has sent are displayed together with an image of each user.
 The my-profile screen 784 displays information about the user who owns the user terminal 200, in a format corresponding to the user profile 770. By referring to the my-profile screen 784, the user who owns the user terminal 200 can see how his or her information is disclosed to other users.
 An example of the user interface screens has been described above. The user interface screens used on the user terminal 200 are, however, not limited to the illustrated forms; screens may be added or omitted as appropriate. On each of the screens described above, elements relating to buttons or information may also be added or omitted as needed. Transitions between screens other than those described above may also be configured.
 <5. Operation example of the information processing system>
 Next, the flow of operation of the information processing system 1000 described above will be explained. FIGS. 23 to 25 are sequence diagrams showing an example of the operation of the information processing system 1000 according to the embodiment of the present disclosure. In the figures, the user terminals 200A and 200B are terminals arbitrarily selected from among the user terminals 200 described above and therefore have the same configuration as the user terminal 200. The description assumes that the owner of the user terminal 200A is the first user and the owner of the user terminal 200B is the second user. The operation flow of the information processing system 1000 is described below in three parts: display of recommended users, display of preference information, and reproduction of notification information.
 (Display of recommended users)
 As shown in FIG. 23, the user terminal 200A first generates, for each piece of music data, the sensitivity information of the first user, who owns the user terminal 200A (S801). Specifically, sensitivity points are input from the input unit 226 in time with the music of the music data reproduced and output by the music playback unit 214, the output control unit 204, and the output unit 206. The sensitivity points are input when the first user taps the sensitivity input button 714 on the player 710 of the user interface screens described above. The sensitivity information generation unit 216 then associates each input sensitivity point with the time axis of the music data and generates the sensitivity information.
 Next, the user terminal 200A transmits the generated sensitivity information to the server 100 via the transmission unit 230 (S803).
 Meanwhile, the user terminal 200B also generates the second user's sensitivity information (S805). The user terminal 200B then transmits the second user's sensitivity information to the server 100 (S807). The server 100 acquires the sensitivity information of each user, including the first user and the second user (S809).
 Next, in the recommended user determination unit 108, the server 100 determines, from among the multiple users whose sensitivity information has been acquired, the user to be presented to the second user as the recommended user (S811). The specific procedure for determining the recommended user is as described above: the determination is made based on the relatedness of each user's sensitivity information for parts of the music data. In the following description the recommended user is assumed to be the first user; there may also be multiple recommended users.
 Next, the server 100 transmits the user information of the first user, as the recommended user, to the user terminal 200B (S813), and the user terminal 200B acquires the first user's user information (S815).
 Next, the output control unit 204 of the user terminal 200B controls the display unit 208 to display the first user's user information, and the display unit 208 displays it (S817). The second user can thereby recognize, in the present embodiment browse, the first user's user information. Such display of user information is performed, for example, by displaying the first user's user information on the recommend screen 730 of the user interface screens described above.
 (Display of preference information)
 Next, the method of displaying preference information will be described. As shown in FIG. 24, the user terminal 200A first analyzes the mood information of the music data in the mood analysis unit 218 (S821). The user terminal 200A then generates, for each piece of music data, the sensitivity information of the first user, who owns the user terminal 200A (S823). The user terminal 200A transmits the obtained mood information and sensitivity information to the server 100 via the transmission unit 230 (S825). The server 100 acquires the mood information and sensitivity information (S827).
 Next, the preference information generation unit 110 of the server 100 generates the first user's preference information based on the mood information and the sensitivity information (S829). The server 100 transmits the generated preference information, together with the user information stored in the user information database 104, to the user terminal 200B (S831).
 The user terminal 200B receives the first user's preference information and user information (S833). The output control unit 204 of the user terminal 200B then controls the display unit 208 to display the received preference information and user information of the first user, and the display unit displays them. Such display of user information and preference information can be performed, for example, by the method described above and shown in FIG. 4.
 (Reproduction of notification information)
 Next, the method of reproducing notification information will be described.
 As shown in FIG. 25, the user terminal 200A first generates the sensitivity information of the first user, who owns the user terminal 200A, for a certain piece of music data (S841). The user terminal 200A then transmits the generated sensitivity information to the server 100 via the transmission unit 230 (S843). The server 100 acquires the first user's sensitivity information (S845).
 Next, the notification information generation unit 112 of the server 100 generates notification information based on the sensitivity information relating to the music data and the meta information of the music data stored in the music information database 106 (S847). The server 100 transmits the generated notification information to the user terminal 200B via the transmission unit 114 (S849). The user terminal 200B receives the notification information (S851).
 Next, when the second user operates the user terminal 200B and the music data is reproduced on the user terminal 200B, the notification information is reproduced in time with the progress of the music (S853). The notification information is reproduced, for example, as shown in FIG. 3: specifically, it is reproduced so that the sound effect 403C occurs at the part where the sensitivity point 401C of user C, the first user, occurred. Also, for example, as shown in FIG. 18, user information images 500 are displayed along the progress bar 713 at the positions where each user's sensitivity points occurred, and the user's own sensitivity points 716 are displayed along the progress bar 713 as notification information.
 <6. Modifications>
 The embodiment of the present disclosure has been described above. Some modifications of the above embodiment of the present disclosure are described below. Each of the modifications described below may be applied to the above embodiment alone or in combination with others. Each modification may also be applied in place of a configuration described in the above embodiment of the present disclosure, or in addition to such a configuration.
 [First modification]
 In the embodiment described above, the recommended user determination unit 108 used the product, taken between users, of the sensitivity points for each part of a piece of music as the index of the relatedness of the users' sensibilities. The present disclosure is not limited to this; it suffices that each user's sensitivity information for a part of the music data is used in evaluating the relatedness of sensibilities between users. FIGS. 26 to 28 show other examples of information used by the recommended user determination unit 108 of the server shown in FIG. 5.
 For example, the relatedness of sensibilities between users may also be evaluated based on the mood information of the music data the users own. For example, the per-mood sensitivity points of the owned music shown in FIG. 11 are generated based on the mood information of the music data each user owns and on the user's sensitivity points for each part of that music data. Then, as shown in FIG. 26, the product of the relative frequencies of the accumulated per-mood sensitivity point values shown in FIG. 12 is calculated between users, and the sum of these products can serve as the index of the relatedness of sensibilities between the users.
 Also, for example, the relatedness of sensibilities between users may be evaluated based on the users' preference information. Preference information such as that shown in FIG. 13 has a value relating to preference for each mood. As shown in FIG. 27, the sum over moods of the products of these per-mood preference values can be calculated between users, and this total can serve as the index of the relatedness of sensibilities between the users. When such a total (degree of matching) is at or above a certain value, or is relatively large within a certain range compared with other users, the recommended user determination unit 108 can determine the user having that degree of matching to be a recommended user.
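 Both of these variants reduce to the same computation: an elementwise product of two users' per-mood values, summed over moods. A minimal sketch, assuming the per-mood values (relative frequencies of sensitivity points, or preference values) are held in dictionaries and using an illustrative recommendation threshold:

```python
# Mood-wise relatedness index: multiply two users' per-mood values and sum
# over moods. The threshold value is an illustrative assumption.

MOODS = ["Euphoric", "Happy", "Joyful", "Mild", "Sad",
         "Solemn", "Bright", "Healing", "Fresh", "Elegant"]

def relatedness(user_a: dict[str, float], user_b: dict[str, float]) -> float:
    """Sum over moods of the product of the two users' per-mood values."""
    return sum(user_a.get(m, 0.0) * user_b.get(m, 0.0) for m in MOODS)

def is_recommended(user_a, user_b, threshold: float = 0.2) -> bool:
    return relatedness(user_a, user_b) >= threshold

a = {"Euphoric": 0.5, "Happy": 0.3, "Sad": 0.2}
b = {"Euphoric": 0.4, "Joyful": 0.4, "Sad": 0.2}
print(relatedness(a, b), is_recommended(a, b))  # ≈ 0.24 True
```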
 When the product is calculated per mood as described above, the recommended user determination unit 108 can determine recommended users even when no sensitivity information has been generated for music data common to the users. That is, in the present disclosure, the determination of recommended users only needs to be based on each user's sensitivity information; it need not necessarily be based on sensitivity information about common music data.
 Also, for example, the recommended user may be determined taking into account the music data playback history of the user who receives the recommendation. For example, in the playback history of the music data, each sensitivity point in the sensitivity information of each piece of music data may be multiplied by a coefficient corresponding to how frequently that piece is played. The coefficient may simply grow as the playback frequency increases. The coefficient may additionally be set in consideration of the time elapsed since the music data was played; for example, even when the playback frequency is the same, the coefficient can be made smaller the longer the time elapsed since playback. In such a case, the user terminal 200 has a playback history database that stores the user's playback history, and the playback history is transmitted to the server 100 as appropriate.
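 As a hedged sketch of such a coefficient, assuming a form that is linear in play count and decays exponentially with days since the last play (the disclosure fixes only the monotonic behavior, not the functional form or constants):

```python
# Frequency-and-recency weighting coefficient for sensitivity points.
# The exponential half-life form and its constants are assumptions.

import math

def playback_weight(play_count: int,
                    days_since_last_play: float,
                    half_life_days: float = 30.0) -> float:
    """Coefficient multiplying each sensitivity point of a track."""
    recency = math.exp(-math.log(2) * days_since_last_play / half_life_days)
    return play_count * recency

print(playback_weight(play_count=10, days_since_last_play=0))   # 10.0
print(playback_weight(play_count=10, days_since_last_play=30))  # ≈ 5.0
```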
 [Second modification]
 In the embodiment described above, the preference information generation unit 110 generated the preference information based on the user's sensitivity information toward the moods of the music and the moods of the music the user owns. The present disclosure is not limited to this; it suffices that the preference information is generated based on the user's sensitivity information toward one or more moods contained in the music and on the moods of the music. FIGS. 29 and 30 show other examples of data used in the preference information generation unit 110 of the server 100 shown in FIG. 5.
For example, the preference information generation unit 110 may generate the preference information on the basis of the user's sensitivity information for the moods of music alone.
Also, for example, the preference information generation unit 110 may generate the preference information on the basis of the user's playback history, in addition to the user's sensitivity information for the moods of music and the moods of the music the user owns. For example, on the basis of the playback history, the preference information generation unit 110 totals the mood counts over all played music as shown in table 621 of FIG. 28, and calculates the relative frequency of each mood over all played music as shown in table 623 of FIG. 29. Next, the preference information generation unit 110 averages, per user and per mood, the relative frequencies in table 623 of FIG. 29, table 613 of FIG. 12, and table 609 of FIG. 10, and obtains the user's preference information as shown in FIG. 30.
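A minimal sketch of this averaging step, assuming each of the three sources has already been reduced to a relative-frequency distribution over moods as in the tables of FIGS. 10, 12, and 29 (the mood names are illustrative):

```python
MOODS = ["happy", "relaxed", "energetic", "sad"]  # illustrative mood labels

def preference(sensitivity_rf: dict[str, float],
               library_rf: dict[str, float],
               history_rf: dict[str, float]) -> dict[str, float]:
    """Average, mood by mood, the relative frequencies from the user's
    sensitivity information, owned library, and playback history."""
    sources = (sensitivity_rf, library_rf, history_rf)
    return {m: sum(src.get(m, 0.0) for src in sources) / len(sources)
            for m in MOODS}
```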
Note that, in the process of totaling the mood counts over all played music, the preference information generation unit 110 may multiply the moods of each piece of music by a predetermined coefficient corresponding to the playback frequency of that piece; that is, the pieces may be weighted according to how often they are played. The coefficient can be the same as the coefficient used when determining the recommended user described above.
Further, the preference information generation unit 110 may generate a plurality of pieces of preference information based on different information. In this case, at least one piece of the preference information is generated on the basis of the user's sensitivity information for one or more moods contained in music and the moods of the music.
[Third Modification]
In the embodiment described above, the user information image 500 displays the user image 501 with a circular outline, surrounded by the mood images 503 for the user; however, the present disclosure is not limited to this. FIGS. 31 to 35 show displays of mood images according to modifications of the present disclosure.
For example, the user image may be a polygon, for example a polygon having 3 to 12 corners, in particular a quadrangle, and a ring-shaped mood image, for example, may be displayed along its periphery. The number of mood images to be displayed is also not particularly limited.
Further, as shown in FIGS. 31 and 32, the mood images 513A to 513C and 523A to 523C may each be composed of a plurality of color bands representing moods arranged side by side. In this case, as shown in FIGS. 31 and 32, the mood images 513A to 513C and 523A to 523C can be displayed as the background of the user image 501. The mood images may be displayed as vertical stripes as in FIGS. 31 and 32, as horizontal stripes, or as stripes at a predetermined angle to the horizontal.
Also, as shown in FIG. 31, the output control unit 204 may control the display unit 208 so that the mood images 513A to 513C are displayed with equal size regardless of the strength of each of the user's preferences. In this case, the output control unit 204 may, for example, control the display unit 208 so that the mood image 513B relating to the preference information for which the user's preference is higher (a first mood) is displayed closer to the user image 501 than the mood images 513A and 513C relating to the other preference information (second moods).
Also, as shown in FIG. 32, the output control unit 204 may control the display unit 208 so that the mood image 523A relating to the preference information for which the user's preference is higher (a second mood) is displayed larger than the mood images relating to the other preference information (a first mood and a third mood). In this case, the output control unit 204 can determine the ratio of the area of the image for the second mood to the area of the image for the first mood on the basis of, for example, the relationship between the value for the first mood and the value for the second mood calculated by the preference information generation unit 110. More specifically, the area of each mood image can be determined according to the value for each mood calculated by the preference information generation unit 110.
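One way to realize this sizing rule is sketched below, assuming the per-mood preference values from the preference information generation unit 110 and a fixed total drawing area (an invented layout parameter):

```python
def mood_image_areas(mood_values: dict[str, float],
                     total_area: float) -> dict[str, float]:
    """Allocate drawing area so the area ratio between any two mood images
    equals the ratio of their preference values."""
    total = sum(mood_values.values())
    if total == 0:
        # No recorded preference: fall back to an even split.
        return {m: total_area / len(mood_values) for m in mood_values}
    return {m: total_area * v / total for m, v in mood_values.items()}
```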
The output control unit 204 may also control the display unit 208 so that, in an image that has a reference portion and displays the user's preference information, an image relating to a first mood for which the user's preference is higher than for a second mood is displayed closer to the reference portion than the image relating to the second mood. Here, the reference portion is a point, a line, or a region having a certain area on the screen of the display unit 208 that is set for displaying the mood images. Such a reference portion can be, for example, the display position of the user image 501.
For example, as shown in FIGS. 33 and 34, the output control unit 204 may control the display unit 208 so that, among the mood images 533A to 533I, moods for which the user's preference is higher are displayed closer to the reference portion, with the center of the image taken as the reference portion. In this case as well, the output control unit 204 may control the display unit 208 so that a mood image relating to preference information for which the user's preference is higher is displayed larger than the mood images relating to the other preference information.
Further, when a plurality of pieces of preference information are displayed, the output control unit 204 may, as shown in FIG. 35, control the display unit 208 so that mood images 563, 573, and 583 relating to a plurality of pieces of preference information based on different information are displayed. In FIG. 35, the mood images 563, 573, and 583 are displayed superimposed so as to form concentric circles. For example, the mood image 563 at the center of the figure is based on the user's sensitivity information for the moods of music. The mood image 573 arranged along the periphery of the mood image 563 is based on the moods of the music the user owns. The mood image 583 arranged along the periphery of the mood image 573 is based on the user's playback history. The preference information for these mood images 563, 573, and 583 is generated by the preference information generation unit 110 described above.
[Fourth Modification]
The user terminal 200 may also include an edited music data generation unit that generates edited music data by editing music data. For example, the edited music data generation unit can edit music data on the basis of the user's sensitivity information about that music data. FIG. 36 is a schematic diagram showing editing of music data by the edited music data generation unit provided in a user terminal according to a modification of the present disclosure.
As shown in FIG. 36, suppose that during playback of the music data 400, sensitivity points 401 occur in the subsections 411, 421, and 431 of the portions 410, 420, and 430 of the music data 400. In such a case, the edited music data generation unit extracts and joins the subsections 411, 421, and 431 in which the sensitivity points 401 occurred, and generates new edited music data 400A. Since the edited music data 400A is made up of the portions in which the user's sensitivity points 401 occurred, the user can listen to only the favorite parts of the music data 400 through the edited music data 400A.
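A minimal sketch of this extraction, modeling the audio as a raw sample array and the hit subsections as (start, end) times in seconds (both assumptions for illustration):

```python
def make_edited_track(samples: list[float], sample_rate: int,
                      hit_segments: list[tuple[float, float]]) -> list[float]:
    """Concatenate, in time order, the subsections of the original track
    in which sensitivity points occurred during playback."""
    edited: list[float] = []
    for start, end in sorted(hit_segments):
        edited.extend(samples[int(start * sample_rate):int(end * sample_rate)])
    return edited

# e.g. subsections 411, 421 and 431 of music data 400 (times invented):
# edited = make_edited_track(samples, 44100,
#                            [(12.0, 20.0), (58.0, 66.0), (101.0, 109.0)])
```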
[Fifth Modification]
In the embodiment described above, a sensitivity point is detected when the sensitivity input button 714 is pressed on the user terminal 200; however, the present disclosure is not limited to this. For example, the user terminal 200 may include a biological information detection unit capable of detecting a change in the user's biological information. As such a biological information detection unit, a device corresponding to the biological information to be detected is selected as appropriate; for example, a heart rate monitor, a blood pressure monitor, an electroencephalograph, a pulse meter, a thermometer, an acceleration sensor, a gyro sensor, or a geomagnetic sensor.
Among these, the acceleration sensor, the gyro sensor, and the geomagnetic sensor can be used, depending on their configuration, to detect body movement or body posture, for example. For instance, when the biological information detection unit is an acceleration sensor, attaching it near the user's head, neck, or trunk makes it possible to detect the user's movement in time with the playback of music data.
In such a case, body movement can be detected by comparing the tempo of the music in each portion of the music data with the period of the amplitude of the user's movement. For example, a sensitivity point is detected when the correlation between the tempo of the music and the period of the amplitude is at or above a certain level.
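A sketch of one such comparison, estimating the dominant period of an accelerometer signal from its autocorrelation and matching it against the beat period implied by the section's tempo; the search window and tolerance are assumed values, and the signal is assumed to span at least two seconds.

```python
import numpy as np

def moves_with_tempo(accel: np.ndarray, fs: float, tempo_bpm: float,
                     tolerance: float = 0.15) -> bool:
    """True when the user's movement period matches the beat period."""
    x = accel - accel.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # nonnegative lags
    lo, hi = int(0.2 * fs), int(2.0 * fs)               # plausible movement periods
    lag = lo + int(np.argmax(ac[lo:hi]))                # strongest peak in window
    movement_period = lag / fs            # seconds per movement cycle
    beat_period = 60.0 / tempo_bpm        # seconds per beat
    return abs(movement_period - beat_period) / beat_period <= tolerance
```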
[Sixth Modification]
In the embodiment described above, the user terminal 200 includes the mood analysis unit 218; however, the present disclosure is not limited to this, and the user terminal 200 may omit the mood analysis unit while another electronic device, for example the server 100, includes one. In such a case, the mood of music data does not need to be analyzed at each user terminal 200. Moreover, by performing mood analysis of music data centrally at the server 100, duplicate mood analysis of the same music data can be prevented.
When the server 100 has a mood analysis unit, the server 100 may itself have a music database storing music data, or the server 100 may acquire the music data to be analyzed from an external music database.
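The deduplication could be realized, for example, by keying analysis results on a track identifier so each piece of music data is analyzed at most once across all user terminals; the cache below is a sketch under that assumption (the analyzer callable and the identifiers are invented):

```python
class MoodAnalysisCache:
    """Server-side cache mapping track identifiers to mood values."""

    def __init__(self, analyze):
        self._analyze = analyze                          # assumed analyzer callable
        self._results: dict[str, dict[str, float]] = {}

    def mood_of(self, track_id: str, audio) -> dict[str, float]:
        """Return cached mood values, analyzing only on the first request."""
        if track_id not in self._results:
            self._results[track_id] = self._analyze(audio)
        return self._results[track_id]
```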
[Seventh Modification]
In the embodiment described above, the server 100 includes the preference information generation unit 110; however, the present disclosure is not limited to this, and the user terminal 200 may include a preference information generation unit. In such a case, for example, the preference information generated at the user terminal 200 is transmitted to the user information database 104 of the server 100 as appropriate.
<7. Server hardware configuration example>
Next, the hardware configuration of the server 100 described above will be described. FIG. 37 is a block diagram showing the hardware configuration of the server 100 shown in FIG. 1. The server 100 includes a CPU 120, a ROM 122, a RAM 124, a storage device 126, and a communication device 128.
The CPU 120 is a processor that functions as an arithmetic processing device and a control device, and controls all or part of the operation within the server 100 according to various programs recorded in the ROM 122, the RAM 124, and the storage device 126. The ROM 122 stores programs, operation parameters, and the like used by the CPU 120. The RAM 124 temporarily stores programs used by the CPU 120, parameters that change as appropriate during their execution, and the like. The CPU 120, the ROM 122, and the RAM 124 are connected to one another by a host bus composed of an internal bus such as a CPU bus.
The storage device 126 is a device for data storage configured as an example of a storage unit, such as one storing the user information database 104 and the music information database 106. The storage device 126 is composed of, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 126 stores the programs executed by the CPU 120, various data, various data acquired from outside, and the like.
The communication device 128 is a communication interface composed of, for example, a communication device for connecting to the network 300. The communication device 128 is, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 128 can transmit and receive signals and the like to and from, for example, the Internet or other communication devices in accordance with a predetermined protocol such as TCP/IP.
The above has shown an example of a hardware configuration capable of realizing the functions of the server 100 according to an embodiment of the present disclosure. Each of the components described above may be configured using general-purpose members, or may be configured by hardware specialized for the function of each component. The hardware configuration used can be changed as appropriate according to the technical level at the time this embodiment is carried out.
Through the cooperation of the hardware and software of the server 100 described above, the functions of, for example, the receiving unit 102, the user information database 104, the music information database 106, the recommended user determination unit 108, the preference information generation unit 110, the notification information generation unit 112, and the transmission unit 114 are realized.
<8. Hardware configuration of user terminal>
Next, an example of the hardware configuration of the user terminal 200 will be described. FIG. 38 is a block diagram showing the hardware configuration of the user terminal 200 shown in FIG. 1.
As shown in FIG. 38, the user terminal 200 includes a CPU 240, a ROM 242, a RAM 244, a display device 246, the speaker 210, an input device 248, a storage device 250, a communication device 252, and a vibration device 254. The configurations of the CPU 240, the ROM 242, the RAM 244, and the storage device 250 can be the same as those of the CPU 120, the ROM 122, the RAM 124, and the storage device 126 described above, so their description is omitted. The speaker 210 has been described above.
The display device 246 is a device capable of visually notifying the user of acquired information, and constitutes the display unit 208. The display device 246 can be, for example, a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, or a lamp. In the illustrated embodiment, the display device 246 is built into the user terminal 200, but the present disclosure is not limited to this, and the display device may be located outside the user terminal 200.
The input device 248 is an operation means operated by the user, such as a mouse, a keyboard, a touch panel, buttons, switches, and levers. The input device 248 may also be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA that supports operation of the user terminal 200. Furthermore, the input device 248 includes, for example, an input control circuit that generates an input signal on the basis of information input by the user using the above operation means and outputs the signal to the CPU 240. By operating the input device 248, the user of the user terminal 200 can input various data to the user terminal 200 and instruct it to perform processing operations.
In addition to the configuration of the communication device 128 described above, the communication device 252 includes, as necessary, a communication device for connecting to a wired and/or wireless wide area network (WAN).
The vibration device 254 is a device for generating vibration, and constitutes the vibration generating unit 212. The vibration device 254 can generate vibration by, for example, the rotation of a motor having an eccentric mass.
Through the cooperation of the hardware and software of the user terminal 200 described above, the functions of, for example, the receiving unit 202, the output control unit 204, the output unit 206, the music playback unit 214, the sensitivity information generation unit 216, the mood analysis unit 218, the storage unit 220, the input unit 226, the position information detection unit 228, and the transmission unit 230 are realized.
<9. Computer program>
A computer program can also be created for causing the hardware of each device in the information processing system 1000 described above, such as the CPU 120, the ROM 122, the RAM 124, and the storage device 126 built into the server 100, to perform the functions of each device described above. In particular, by downloading and installing a computer program, the server 100 may be provided with the functions of the receiving unit 102, the user information database 104, the music information database 106, the recommended user determination unit 108, the preference information generation unit 110, the notification information generation unit 112, the transmission unit 114, and so on. A storage medium storing the computer program is also provided.
Likewise, a computer program can be created for causing hardware such as the CPU 240, the ROM 242, the RAM 244, and the storage device 250 built into the user terminal 200 to perform the functions of each device described above. In particular, by downloading and installing a computer program, the user terminal 200 may be provided with the functions of the receiving unit 202, the output control unit 204, the output unit 206, the music playback unit 214, the sensitivity information generation unit 216, the mood analysis unit 218, the storage unit 220, the input unit 226, the position information detection unit 228, the transmission unit 230, and so on. A storage medium storing the computer program is also provided.
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is evident that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
In addition, the effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure can produce other effects that are evident to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus including: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to at least one of the first user and the second user in association with the portion.
(2)
The information processing apparatus according to (1), wherein the notification by the notification unit is performed by sound, vibration, light, or display.
(3)
The information processing apparatus according to (1) or (2), wherein the notification by the notification unit is performed by outputting a sound in the portion of the second music data corresponding to the sensitivity information of the first user in accordance with the reproduction of the first music data.
(4)
The information processing apparatus according to any one of (1) to (3), wherein the first music data and the second music data are data related to the same music piece.
(5)
The information processing apparatus according to any one of (1) to (3), wherein the first music data is data relating to a first piece of music, and the second music data is data relating to a second piece of music including a portion related to a part of the first piece of music.
(6)
The information processing apparatus according to any one of (1) to (3), wherein the notification unit includes a display unit, and the notification by the notification unit is performed by displaying a notification image at a position corresponding to the sensitivity information along a time axis of a progress bar image of the second music data displayed on the display unit.
(7)
The information processing apparatus according to (6), wherein the notification image includes user information relating to the first user, and the user information includes the first user's preference information for music.
(8)
The information processing apparatus according to any one of (1) to (7), wherein the notification information output unit changes the notification method of the notification unit according to an instruction from the first user or the second user.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the notification information output unit changes the notification method of the notification unit according to at least one of information relating to at least one of the first user and the second user and information relating to at least one of the first music data and the second music data.
(10)
The information processing apparatus according to (9), wherein the notification by the notification unit is performed by outputting a sound in the portion of the second music data corresponding to the sensitivity information of the first user in accordance with the reproduction of the first music data, and changing the notification method is changing the output sound.
(11)
The information processing apparatus according to (9) or (10), wherein the notification information output unit determines the notification method of the notification unit according to the type of music of the second music data to be reproduced.
(12)
The information processing apparatus according to any one of (9) to (11), wherein the notification information output unit determines the notification method of the notification unit according to the ambient environment of the information processing apparatus when the second music data is reproduced.
(13)
The information processing apparatus according to any one of (9) to (12), wherein the notification information output unit determines the notification method of the notification unit according to the number of pieces of the notification information acquired for the first music data.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the sensitivity information is generated on the basis of at least one of a change in the biological information of the first user during reproduction of the first music data and a body movement of the first user during reproduction of the first music data.
(15)
The information processing apparatus according to (14), wherein the sensitivity information includes body motion information calculated based on a frequency of body motion of the first user for each portion of the first music data.
(16)
The information processing apparatus according to (15), wherein the body motion information is calculated by totaling, for each portion of the first music data, the frequency of the user's body motion for that portion.
(17)
The information processing apparatus according to any one of (14) to (16), wherein the body movement is detected by comparing the tempo of the music in each portion of the first music data with the period of the amplitude of the first user's movement.
(18)
The information processing apparatus according to any one of (14) to (16), wherein the body movement is detected based on an input of the first user.
(19)
The information processing apparatus according to any one of (1) to (18), wherein the notification information output unit controls the notification unit to display an animation based on the notification information.
(20)
The information processing apparatus according to (19), wherein the notification information output unit controls the notification unit to display the animation including a figure showing an amount corresponding to a change in the biological information of the first user during reproduction of the first music data.
(21)
An information processing method including: obtaining notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a processor outputting the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to the first user or a second user in association with the portion.
(22)
A program for causing a computer to function as: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to the first user or a second user in association with the portion.
100  Server
102  Receiving unit
104  User information database
106  Music information database
108  Recommended user determination unit
110  Preference information generation unit
112  Notification information generation unit
114  Transmission unit
200, 200A, 200B  User terminal
202  Receiving unit
204  Output control unit
206  Output unit
208  Display unit
210  Speaker
212  Vibration generating unit
214  Music playback unit
216  Sensitivity information generation unit
218  Mood analysis unit
220  Storage unit
222  User information storage unit
224  Music database
226  Input unit
228  Position information detection unit
230  Transmission unit
300  Network
1000  Information processing system

Claims (20)

1. An information processing apparatus comprising: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to at least one of the first user and the second user in association with the portion.
2. The information processing apparatus according to claim 1, wherein the notification by the notification unit is performed by sound, vibration, light, or display.
3. The information processing apparatus according to claim 1, wherein the notification by the notification unit is performed by outputting a sound in the portion of the second music data corresponding to the sensitivity information of the first user in accordance with the reproduction of the first music data.
4. The information processing apparatus according to claim 1, wherein the first music data and the second music data are both data relating to the same piece of music.
5. The information processing apparatus according to claim 1, wherein the first music data is data relating to a first piece of music, and the second music data is data relating to a second piece of music including a portion related to a part of the first piece of music.
6. The information processing apparatus according to claim 1, wherein the notification unit includes a display unit, and the notification by the notification unit is performed by displaying a notification image at a position corresponding to the sensitivity information along a time axis of a progress bar image of the second music data displayed on the display unit.
7. The information processing apparatus according to claim 6, wherein the notification image includes user information relating to the first user, and the user information includes the first user's preference information for music.
8. The information processing apparatus according to claim 1, wherein the notification information output unit changes the notification method of the notification unit according to at least one of information relating to at least one of the first user and the second user and information relating to at least one of the first music data and the second music data.
9. The information processing apparatus according to claim 8, wherein the notification by the notification unit is performed by outputting a sound in the portion of the second music data corresponding to the sensitivity information of the first user in accordance with the reproduction of the first music data, and changing the notification method is changing the output sound.
10. The information processing apparatus according to claim 8, wherein the notification information output unit determines the notification method of the notification unit according to the type of music of the second music data to be reproduced.
11. The information processing apparatus according to claim 8, wherein the notification information output unit determines the notification method of the notification unit according to the ambient environment of the information processing apparatus when the second music data is reproduced.
12. The information processing apparatus according to claim 8, wherein the notification information output unit determines the notification method of the notification unit according to the number of pieces of the notification information acquired for the first music data.
13. The information processing apparatus according to claim 1, wherein the sensitivity information is generated on the basis of at least one of a change in the biological information of the first user during reproduction of the first music data and a body movement of the first user during reproduction of the first music data.
14. The information processing apparatus according to claim 13, wherein the sensitivity information includes body movement information calculated on the basis of the frequency of the first user's body movement for each portion of the first music data.
15. The information processing apparatus according to claim 13, wherein the body movement is detected by comparing the tempo of the music in each portion of the first music data with the period of the amplitude of the first user's movement.
16. The information processing apparatus according to claim 13, wherein the body movement is detected on the basis of an input of the first user.
17. The information processing apparatus according to claim 1, wherein the notification information output unit controls the notification unit to display an animation based on the notification information.
18. The information processing apparatus according to claim 17, wherein the notification information output unit controls the notification unit to display the animation including a figure showing an amount corresponding to a change in the biological information of the first user during reproduction of the first music data.
19. An information processing method comprising: obtaining notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a processor outputting the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to the first user or a second user in association with the portion.
20. A program for causing a computer to function as: an information acquisition unit that acquires notification information for a portion of second music data related to first music data, the notification information being generated on the basis of sensitivity information of a first user during reproduction of the first music data; and a notification information output unit that outputs the notification information to a notification unit so that, during reproduction of the second music data, the notification information is notified to the first user or a second user in association with the portion.
PCT/JP2016/077514 2015-11-30 2016-09-16 Information processing device, information processing method, and program WO2017094328A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-233860 2015-11-30
JP2015233860 2015-11-30

Publications (1)

Publication Number Publication Date
WO2017094328A1 true WO2017094328A1 (en) 2017-06-08

Family

ID=58796942

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/077514 WO2017094328A1 (en) 2015-11-30 2016-09-16 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2017094328A1 (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002366173A (en) * 2001-06-05 2002-12-20 Open Interface Inc Method and device for sensitivity data calculation
JP2008186444A (en) * 2007-01-05 2008-08-14 Yahoo Japan Corp Sensitivity matching method, device and computer program
JP2010250528A (en) * 2009-04-15 2010-11-04 Yahoo Japan Corp Feeling matching device, feeling matching method, and program
WO2012173021A1 (en) * 2011-06-13 2012-12-20 ソニー株式会社 Information processing device, information processing method and program
JP2014097188A (en) * 2012-11-14 2014-05-29 Pioneer Electronic Corp Terminal device, communication system, determination result recording method of terminal device, and program
JP2014191374A (en) * 2013-03-26 2014-10-06 Dna:Kk System capable of providing a plurality of digital contents and method using the same
WO2014192457A1 (en) * 2013-05-30 2014-12-04 ソニー株式会社 Client device, control method, system and program

Similar Documents

Publication Publication Date Title
WO2017094326A1 (en) Information processing device, information processing method, and program
JP4555072B2 (en) Localized audio network and associated digital accessories
US11163520B2 (en) Multimedia experience according to biometrics
WO2017094327A1 (en) Information processing device, information processing system, information processing method, and program
US20150373455A1 (en) Presenting and creating audiolinks
US10506268B2 (en) Identifying media content for simultaneous playback
JP6535497B2 (en) Music recommendation system, program and music recommendation method
JP2016066389A (en) Reproduction control device and program
CN106777115A (en) Song processing method and processing device
CN110870322B (en) Information processing apparatus, information processing method, and computer program
JP2014130467A (en) Information processing device, information processing method, and computer program
Krzyzaniak et al. Six types of audio that DEFY reality! A taxonomy of audio augmented reality with examples
WO2017094328A1 (en) Information processing device, information processing method, and program
WO2015168299A1 (en) Biometric-music interaction methods and systems
JP7268547B2 (en) Information processing device and program
JP2014123085A (en) Device, method, and program for further effectively performing and providing body motion and so on to be performed by viewer according to singing in karaoke
JP5649607B2 (en) Terminal device and music playback device
CN114329001B (en) Display method and device of dynamic picture, electronic equipment and storage medium
US11740656B1 (en) Systems and methods for interaction of wearable communication devices
Jung et al. Peripheral notification with customized embedded audio cues
KR20160066879A (en) Video display device and operating method thereof
US20210110846A1 (en) Information processing apparatus, information processing method, and program
JP2022163281A (en) robot
JP6232304B2 A viewing system for singing videos that determines the display priority in consideration of the viewer's evaluation
JP2023080768A (en) Content playlist creation device, content playlist creation method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16870268

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16870268

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP