JP3774358B2 - Content service method using mobile communication terminal - Google Patents


Info

Publication number
JP3774358B2
JP3774358B2 (application number JP2000215032A)
Authority
JP
Japan
Prior art keywords
content
communication terminal
illumination
mobile communication
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2000215032A
Other languages
Japanese (ja)
Other versions
JP2002033802A (en)
Inventor
大和 佐々木
亮 本田
修 菊地
Original Assignee
株式会社東芝 (Toshiba Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東芝 (Toshiba Corporation)
Priority to JP2000215032A
Publication of JP2002033802A
Application granted
Publication of JP3774358B2
Legal status: Expired - Fee Related
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M3/00Automatic or semi-automatic exchanges
    • H04M3/42Systems providing special services or facilities to subscribers
    • H04M3/487Arrangements for providing information services, e.g. recorded voice services, time announcements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers; Analogous equipment at exchanges
    • H04M1/72Substation extension arrangements; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selecting
    • H04M1/725Cordless telephones
    • H04M1/72519Portable communication terminals with improved user interface to control a main telephone operation mode or to indicate the communication status
    • H04M1/72522With means for supporting locally a plurality of applications to increase the functionality
    • H04M1/72561With means for supporting locally a plurality of applications to increase the functionality for supporting an internet browser application

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a service method for image content such as moving images and still images, audio content, and other content using a mobile communication terminal.
[0002]
[Prior art]
In the field of mobile communication, the adoption of high-speed, large-capacity communication systems represented by W-CDMA (Wideband Code Division Multiple Access) is scheduled for the near future. With such systems, mobile communication terminals (cellular phones), in particular the Internet-capable terminals now spreading rapidly, will become still more sophisticated. It is expected that they will be able to easily receive content that is difficult to deliver today, such as moving-image content (television images, footage shot with digital cameras), high-definition still-image content (photographs, paintings), and audio content such as high-quality music.
[0003]
[Problems to be solved by the invention]
When image and audio content is received on a mobile communication terminal, the problem is that both video and audio lack power and presence, because the terminal's display screen is small and the speaker mounted on the terminal is too small to produce high volume. In addition, mobile communication terminals are easier to use than personal computers, and the number of users is expected to grow; yet content service methods that fully exploit these characteristics have not been sufficiently considered.
[0004]
Therefore, a main object of the present invention is to provide a content service method capable of executing a service related to image and audio content by effectively using a mobile communication terminal compatible with a high-speed and large-capacity communication method.
[0005]
One of the more specific purposes of the present invention is to enable the reproduction of powerful and realistic content on a mobile communication terminal.
[0008]
[Means for Solving the Problems]
In order to solve the above problems, the present invention displays, when creating content to be distributed to a mobile communication terminal via a network, an editing screen on which at least one of the vibration state of a vibrator unit provided in the mobile communication terminal and the display state of an illumination unit provided in the mobile communication terminal can be set according to the content, and control information for controlling at least one of the vibration state and the display state through input operations on the editing screen is added to the content distributed to the mobile communication terminal.
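The control information described here can be pictured as timeline-aligned metadata carried inside the distributed content. The following is a minimal sketch of such a format; all class, field, and function names are hypothetical illustrations, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class EffectCue:
    """One control entry tied to a span of the content timeline (hypothetical format)."""
    start_frame: int
    end_frame: int
    kind: str      # "vibration" or "illumination"
    pattern: str   # e.g. "short-pulse", "blink-fast"

@dataclass
class Content:
    """Distributable content with embedded vibrator/illumination control information."""
    frames: list
    cues: list = field(default_factory=list)

def cues_for_frame(content: Content, frame: int) -> list:
    """Return the cues the terminal should act on while showing a given frame."""
    return [c for c in content.cues if c.start_frame <= frame <= c.end_frame]

# Content with a vibration cue over frames 10-20 and an illumination cue over 15-30.
content = Content(frames=list(range(100)))
content.cues.append(EffectCue(10, 20, "vibration", "short-pulse"))
content.cues.append(EffectCue(15, 30, "illumination", "blink-fast"))
```

A terminal-side player would call `cues_for_frame` once per displayed frame and drive the vibrator and illumination units accordingly.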
[0009]
According to the present invention, an editing tool is provided for displaying an editing screen having: a video display area for displaying the plurality of videos that form the source of content to be distributed to the mobile communication terminal; a dialogue input area for entering dialogue corresponding to the video in the video display area; a sound setting area for setting sound corresponding to the video in the video display area; a vibration setting area for making settings that control the vibration state of a vibrator unit provided in the mobile communication terminal in correspondence with the video in the video display area; and an illumination setting area for making settings that control the display state of an illumination unit provided in the mobile communication terminal in correspondence with the video in the video display area.
[0010]
Further, according to the present invention, when content is distributed to a mobile communication terminal via a network, a program including a function for controlling at least one of the vibration state of a vibrator unit provided in the mobile communication terminal and the display state of an illumination unit of the mobile communication terminal according to the content is attached to the content, or is distributed to the mobile communication terminal prior to the content.
[0011]
Here, the program may further have a dialogue display function for displaying, when the mobile communication terminal that has downloaded the content via the network is in a state of not outputting audio, the dialogue text included in the content on the display screen of the mobile communication terminal, and may further have an advertisement display function for displaying advertisement information in that display area when the mobile communication terminal is in a state of outputting audio.
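The switch between the dialogue display function and the advertisement display function reduces to a single conditional on the terminal's audio state. A hypothetical sketch (the function name and arguments are illustrative, not from the patent):

```python
def select_display_text(audio_muted: bool, dialogue: str, advertisement: str) -> str:
    """Sketch of the viewer-program behavior described above: with audio muted
    (e.g. manner mode), the dialogue text included in the content is shown so
    the story can still be followed; with audio on, the same display area is
    reused for advertisement information."""
    return dialogue if audio_muted else advertisement

# Muted terminal: show dialogue. Audible terminal: show the advertisement.
muted_view = select_display_text(True, "Hello!", "Ad: concert tickets on sale")
audible_view = select_display_text(False, "Hello!", "Ad: concert tickets on sale")
```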
[0019]
Here, an editing tool that can be used when creating the content may be distributed to the plurality of mobile communication terminals in response to a download request from a mobile communication terminal. This editing tool displays, for example, an editing screen having: a video display area for displaying the plurality of videos that form the source of the content; a dialogue input area for entering dialogue corresponding to the video in the video display area; a sound setting area for setting sound corresponding to the video in the video display area; a vibration setting area for making settings that control the vibration state of a vibrator unit provided in a mobile communication terminal in correspondence with the video in the video display area; and an illumination setting area for making settings that control the display state of an illumination unit provided in the mobile communication terminal in correspondence with the video in the video display area.
[0020]
A mobile communication terminal according to the present invention has a function of downloading content from outside and is provided with a vibrator unit that vibrates the terminal housing upon an incoming call and an illumination unit that performs a display operation upon an incoming call, and is further provided with means for controlling, in accordance with a program downloaded together with the content, the vibration state of the vibrator unit and the display state of the illumination unit according to the content.
[0022]
Still another mobile communication terminal according to the present invention has a function of downloading content from outside and is provided with a vibrator unit that vibrates the terminal housing upon an incoming call and an illumination unit that performs a display operation upon an incoming call, and is further provided with means for displaying, in accordance with a program downloaded in advance, content downloaded in advance when a call is received from a specific party, and for controlling the vibration state of the vibrator unit and the display state of the illumination unit.
[0023]
DETAILED DESCRIPTION OF THE INVENTION
Embodiments of the present invention will be described below with reference to the drawings.
[First Embodiment]
(About system configuration)
FIG. 1 shows the schematic configuration of the entire system according to the first embodiment of the present invention. A mobile communication terminal 1 can connect to the Internet 3 via a predetermined carrier communication network 2. Connected to the Internet 3, as elements relevant to the present invention, are: a personal computer (hereinafter, PC) 4 owned by an animation writer; an animation distribution server 5 managed by an animation distributor; and a content distribution server 7 managed by a content distributor that distributes image content such as moving images and still images and audio content such as music. An animation library 6 is further connected to the animation distribution server 5, and an information service server 8 owned by a provider of various information services is connected to the content distribution server 7.
[0024]
In FIG. 1, the animation distribution server 5 and the animation library 6 are provided separately from the content distribution server 7, but may be provided integrally with the content distribution server 7.
[0025]
(About mobile communication terminal 1)
First, the mobile communication terminal 1 will be described. As shown in the external view of FIG. 2, the mobile communication terminal 1 is provided, like an ordinary mobile phone, with an antenna 11, a key input unit 12, a display unit 13 such as a color liquid crystal display, a receiving speaker 14, a transmitting microphone 15, an incoming-call lamp 16, a jack 17 for connecting an earphone (or headphones), and a digital camera 18. With the digital camera 18, the user can readily capture images of himself or herself, people nearby, and the surrounding scenery.
[0026]
Furthermore, as an element peculiar to the present embodiment, biosensors 19 are installed at positions contacted by the user's hand when holding the mobile communication terminal 1, for example at several locations on both side faces. The biosensor 19 detects the pressure, temperature, humidity (perspiration), and the like of the user's hand holding the mobile communication terminal 1.
[0027]
FIG. 3 shows the internal configuration of the mobile communication terminal 1. The configuration of the mobile communication terminal 1 is assumed to be compatible with the W-CDMA system.
The operation as an ordinary mobile phone is as follows. On reception, a radio-frequency signal received by the antenna 11 is input to the receiving unit 21 via the duplexer 20 and amplified to an appropriate level, then mixed, under the control of the CPU 30, with the reception local signal from the synthesizer 23 and converted into an intermediate-frequency signal. The intermediate-frequency signal is input to the CDMA codec 24, where orthogonal demodulation and despreading produce received data in a predetermined format. When the received radio-frequency signal carries voice, the received data is decompressed by the voice codec 25 according to a predetermined voice decoding method into PCM audio data. The PCM audio data is decoded by the PCM codec 26 into an analog audio signal, which is supplied to the receiving speaker 14 via the power amplifier 27 and output as sound.
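The reception path amounts to a fixed pipeline of stages, from the antenna to the speaker. The following schematic sketch records the stage order described above; the actual signal processing is deliberately stubbed out, and all function names are illustrative, not part of the patent:

```python
def receive_chain(rf_signal):
    """Schematic W-CDMA receive path per the description above.
    Each stage is a placeholder that logs its name; real DSP is out of scope."""
    stages = []

    def amplify(s):       # receiving unit 21: amplify to an appropriate level
        stages.append("amplify"); return s

    def downconvert(s):   # mix with local signal from synthesizer 23 -> intermediate frequency
        stages.append("downconvert"); return s

    def despread(s):      # CDMA codec 24: orthogonal demodulation and despreading
        stages.append("despread"); return s

    def voice_decode(s):  # voice codec 25: decompress into PCM audio data
        stages.append("voice_decode"); return s

    def dac(s):           # PCM codec 26: PCM -> analog, then power amp 27 -> speaker 14
        stages.append("dac"); return s

    out = dac(voice_decode(despread(downconvert(amplify(rf_signal)))))
    return out, stages

_, stage_order = receive_chain("rf-signal")
```

The transmit path of paragraph [0028] is the mirror image: microphone, PCM encoding, voice compression, spreading and quadrature modulation, upconversion, antenna.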
[0028]
On transmission, the user's voice is input as an audio signal via the transmitting microphone 15, amplified by the preamplifier 28, and encoded by the PCM codec 26 into PCM audio data. The PCM audio data is compressed by the voice codec 25 according to a predetermined voice encoding method into transmission data. The transmission data is input to the CDMA codec 24, spread using a PN code, and then orthogonally modulated and input to the transmitting unit 22 as a quadrature-modulated signal. In the transmitting unit 22, this signal is mixed with the transmission local signal from the synthesizer 23 and converted into a radio-frequency signal, then supplied to the antenna 11 through the duplexer 20 and radiated into space as a radio wave toward a base station of the carrier communication network 2 in FIG. 1.
[0029]
The CPU 30 controls each unit and performs various operations and processing of video and character information. Connected to it, in addition to the CDMA codec 24, voice codec 25, and PCM codec 26, are the key input unit 12, display unit 13, camera 18, and biosensor 19 shown in FIG. 2, as well as a vibrator unit 31, illumination unit 32, sound speaker unit 33, and memory 34 not shown in FIG. 2. Power is supplied to each unit from a power supply circuit 36 connected to a battery 35 under the control of the CPU 30.
[0030]
The vibrator unit 31 vibrates the housing of the mobile communication terminal 1 using, for example, an eccentric motor, and is originally intended to notify the user of an incoming call by mechanical vibration. In the present embodiment, the CPU 30 also drives it during reproduction of moving-image content, with various vibration patterns according to the scene of the content.
[0031]
The illumination unit 32 corresponds to the key illumination lamp provided beneath the push buttons of the key input unit 12 in FIG. 2, or to the incoming-call lamp 16 in FIG. 2. In the present embodiment, as with the vibrator unit 31, the CPU 30 controls it to blink in various display patterns according to the scene during reproduction of moving-image content.
[0032]
The sound speaker 33 is a speaker, also called a sounder, installed for example on the back of the mobile communication terminal 1; in an ordinary mobile phone it outputs ring tones and key-operation sounds. In the present embodiment it is also used to output the speech of characters and sound effects during reproduction of moving-image content. When the manner button included in the key input unit 12 of FIG. 2 is pressed, sound output from the sound speaker 33 is stopped.
[0033]
The memory 34 serves as a working memory holding data being processed by the CPU 30, and also as a buffer storing content data and a content reproduction program (a so-called viewer program) when moving-image content is received. The memory 34 may be only a semiconductor memory built into the mobile communication terminal 1, or may include both a so-called memory card (a removable, card-shaped semiconductor memory) and a built-in semiconductor memory.
[0034]
FIG. 4 is a block diagram functionally showing the part of the CPU 30 that drives and controls the vibrator unit 31 and the illumination unit 32. An incoming-call display control circuit 41 generates an incoming-call vibration control signal and an incoming-call illumination control signal in predetermined patterns when the mobile communication terminal 1 receives a call; these control signals are input, via OR gates 43 and 44 respectively, to a vibrator drive circuit 45 and an illumination drive circuit 46.
[0035]
Meanwhile, a scene display control circuit 42 generates, during reproduction of moving-image content, a scene vibration control signal and a scene illumination control signal that were programmed in advance (for example by the animation writer) according to the scene, based on the content data; these signals are input, via the OR gates 43 and 44 respectively, to the vibrator drive circuit 45 and the illumination drive circuit 46.
[0036]
The vibrator unit 31 and the illumination unit 32 therefore vibrate or blink in a predetermined pattern when a call arrives, but vibrate or blink in various patterns according to the scene when moving-image content is reproduced. As a result, during reproduction the user can enjoy the content with more power and presence than if the video were simply displayed on the screen of the display unit 13.
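The role of the OR gates in FIG. 4 is that each drive circuit responds to whichever controller is active. This can be stated as a one-line logical model (the function name is illustrative only):

```python
def vibrator_driven(incoming_call_signal: bool, scene_signal: bool) -> bool:
    """Logical model of OR gate 43: the vibrator drive circuit 45 is driven
    whenever either the incoming-call display control circuit 41 or the
    scene display control circuit 42 asserts its control signal."""
    return incoming_call_signal or scene_signal
```

OR gate 44 behaves identically for the illumination drive circuit 46, so an incoming call and a scene cue never block each other.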
[0037]
(About the flow of video content service)
Next, a schematic flow of the processing of the moving image content service in the present embodiment will be described using the flowchart shown in FIG.
Taking animation as an example of moving-image content, the animation writer first downloads an editing tool from the animation distribution server 5 of the animation distributor and uses it to create and edit an animation video (steps S1 to S3). The editing tool is software that lets the animation writer create and edit animation efficiently, and includes a library of animation materials and animation creation tools stored in the animation library 6. The editing tool and the animation editing process of step S3 are described in detail later.
[0038]
Step S3 generates the original content from which the moving-image content finally distributed to users is produced; this original content is transferred to the content distribution server 7 of the content distributor via the Internet 3 (or via a dedicated line) (step S4). The content distribution server 7 edits the transferred original content by adding or inserting various information as described later, and registers the edited moving-image content in a memory (not shown) (steps S5 to S6). With the content to be distributed registered in the content distribution server 7 in this way, the moving-image content can be distributed to users.
[0039]
When the user of the mobile communication terminal 1 issues a download request for moving-image content, the request is sent to the content distribution server 7 via the carrier communication network 2 and the Internet 3 (step S7), and the moving-image content is downloaded via the carrier communication network 2 to the mobile communication terminal 1 that issued the request (step S8). Download methods include a buffering method and a streaming method, described later. The downloaded moving-image content is reproduced as appropriate by the user's playback instruction (step S9).
[0040]
The moving-image content is, for example, animation at 12 frames per second. Short content can be downloaded at once, but here the moving-image content is assumed to be a so-called series consisting of a sequence of works, downloaded in several installments (in this example, two), one work at a time. In step S8 the first work is downloaded, and it is reproduced in step S9. When the moving-image content is a four-frame cartoon, the unit of a work may be a single frame; when one work consists of several chapters, it may be downloaded chapter by chapter.
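Splitting a series into installments, whatever the unit (work, frame, or chapter), is a simple chunking operation. A hypothetical helper sketching this (names and the list-of-scenes representation are illustrative assumptions):

```python
def split_into_installments(scenes: list, unit_size: int) -> list:
    """Split a content series into download installments. A 'unit' could be
    one work of a series, one frame of a four-frame cartoon, or one chapter,
    depending on how the distributor partitions the content."""
    return [scenes[i:i + unit_size] for i in range(0, len(scenes), unit_size)]

# A ten-scene series delivered as two works of five scenes each,
# matching the two-installment example in the text.
series = list(range(10))
installments = split_into_installments(series, 5)
```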
[0041]
A user who has played and watched the first work and wants to see the continuation performs, for example, a "view continuation" operation to issue another download request; the download request for the second work is then sent to the content distribution server 7 via the carrier communication network 2 and the Internet 3 (step S10). In addition, with the uplink from the mobile communication terminal 1 to the content distribution server 7 maintained, information obtained when the user performed specific operations during reproduction of the moving-image content, together with information obtained by the camera 18 and the biosensor 19 (collectively, feedback information), is continuously fed back from the mobile communication terminal 1 to the animation distribution server 5 via the carrier communication network 2 and the Internet 3 (step S11).
[0042]
After the feedback information has been returned from the mobile communication terminal 1 to the animation distribution server 5 and the content distribution server 7, the second work is downloaded in response to the download request of step S10 (step S12) and is reproduced as appropriate by the user's playback instruction (step S13). When reproduction of the second work ends, feedback information is again returned to the animation distribution server 5 and the content distribution server 7 as described above (step S14). In this example, distribution of the moving-image content is thus completed with the second work.
[0043]
Incidentally, once the mobile communication terminal 1 has downloaded and is reproducing the final (second) work in step S13, the uplink from the mobile communication terminal 1 to the content distribution server 7 is no longer set up. To return feedback information in step S14, the mobile communication terminal 1 must access the content distribution server 7 and establish an uplink. In general, however, the user has no motivation to actively access the content distribution server 7 again after viewing the moving-image content, since doing so only incurs an extra communication charge.
[0044]
Therefore, when the final work is downloaded or reproduced, the content distribution server 7 transfers, as text or voice, a message indicating that a privilege will be given to the user, for example "Press the '3' button and your next content viewing fee will be reduced by 30%", to encourage feedback.
[0045]
Alternatively, instead of returning feedback information in step S14, the information may be held in the memory 34 of the mobile communication terminal 1 until the next download request is made, and fed back at that time.
[0046]
Next, the animation distribution server 5 compiles statistics on the feedback information sent from the mobile communication terminal 1 in steps S11 and S14 (step S15) and transfers the resulting statistical data to the animation writer (step S16). The animation writer reflects this statistical data in subsequent work with the editing tool (step S17).
[0047]
The content distribution server 7 likewise compiles statistics on the feedback information sent from the mobile communication terminal 1 in steps S11 and S14 (step S18) and distributes the resulting statistical data to research companies and institutions as necessary (step S19). When the feedback information to be processed and the statistical method are the same, the statistical processing of steps S15 and S18 may be performed by just one of the animation distribution server 5 and the content distribution server 7 (for example, the content distribution server 7).
What information is fed back, and how it is used once compiled into statistics, is described in detail later.
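The per-scene evaluation feedback described later (button "1" for a liked scene, "0" for an uninteresting one) lends itself to a simple tally on the server side. A hypothetical sketch of that aggregation step (the event tuple format and function name are illustrative assumptions):

```python
from collections import Counter

def tally_scene_feedback(events: list) -> dict:
    """Aggregate feedback events (scene_id, rating) into per-scene counts,
    the kind of statistics the distribution server could return to the
    animation writer. Ratings follow the example message in the text:
    '1' = liked the scene, '0' = found it uninteresting."""
    stats: dict = {}
    for scene, rating in events:
        stats.setdefault(scene, Counter())[rating] += 1
    return stats

# Three users rated scene 1, one user rated scene 2.
events = [(1, "1"), (1, "1"), (1, "0"), (2, "0")]
stats = tally_scene_feedback(events)
```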
[0048]
Hereinafter, the process in the main steps of FIG. 5 will be described in more detail.
(About animation editing process)
FIG. 6 shows an example of the animation editing screen 60 used in step S3 of FIG. 5 when the animation writer edits an animation created on the PC 4; this screen is produced by the editing tool downloaded from the animation distribution server 5 in step S1. On the animation editing screen 60, the frames of the animation video already created are displayed side by side in chronological order in a video display area 61 at the top of the screen; below it are arranged, in order, a spoken-dialogue input area 62, a text-dialogue input area 63, a sound (sound-effect) input area 64, a vibration setting area 65, an illumination setting area 66, and a dialogue input window 67. A user evaluation display area 68 is placed at the bottom of the animation editing screen 60.
[0049]
In the spoken-dialogue input area 62, the range over which a line of dialogue is to be played can be specified with a bar aligned with the video displayed in the video display area 61, and the character who speaks the line can be selected. In the text-dialogue input area 63, the range over which a line of dialogue is to be displayed can likewise be specified with a bar aligned with the video in the video display area 61. The dialogue itself is entered into these areas 62 and 63 using the dialogue input window 67.
[0050]
In the sound input area 64, the range over which sound is to be output from the sound speaker unit 33 of the mobile communication terminal 1 in correspondence with the video displayed in the video display area 61 can be specified, and the sound pattern can be selected from sound effects such as rain, wind, thunder, waves, animal cries, car engine noise, aircraft noise, and crowd noise.
[0051]
In the vibration setting area 65, the range over which the vibrator unit 31 of the mobile communication terminal 1 is to vibrate in correspondence with the video displayed in the video display area 61 can be specified, and the vibration pattern (vibration period, vibration strength, and so on) can be selected.
[0052]
Similarly, in the illumination setting area 66, the range over which the illumination unit 32 of the mobile communication terminal 1 is to operate in correspondence with the video displayed in the video display area 61 can be specified, and the display pattern (for example, the order in which the elements of the illumination unit 32 light, the blinking period, and the brightness) can be selected.
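Each setting area thus amounts to a track of (range, pattern) entries aligned with the frame strip in the video display area. A minimal sketch of such a track model, with overlap checking as a plausible editor constraint (the helper name, tuple format, and the no-overlap rule are illustrative assumptions, not stated in the patent):

```python
def add_track_entry(track: list, start: int, end: int, pattern: str) -> list:
    """Append a (start_frame, end_frame, pattern) entry to an edit-screen
    track, e.g. the vibration setting area 65 or illumination setting area 66.
    Overlapping ranges within one track are rejected (assumed editor rule)."""
    for s, e, _ in track:
        if start <= e and end >= s:
            raise ValueError("range overlaps an existing entry")
    track.append((start, end, pattern))
    return track

# A vibration track: strong short-period vibration over frames 0-4,
# then weak long-period vibration over frames 8-12.
vibration_track: list = []
add_track_entry(vibration_track, 0, 4, "strong/short-period")
add_track_entry(vibration_track, 8, 12, "weak/long-period")
```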
[0053]
Using such an animation editing screen 60, the animation writer can easily edit an effective animation by visual means. The result of this editing, namely the animation video with spoken and text dialogue, sound information, vibration control information, and illumination control information added, is transferred to the content distribution server 7 as the original content in step S4 of FIG. 5.
[0054]
On the animation editing screen 60 of FIG. 6, besides allowing spoken and text dialogue and sound information to be entered and specified easily, components that the mobile communication terminal 1 originally carries for incoming-call display and for visibility in the dark, namely the vibrator unit 31 and the illumination unit 32, are put to effective use, so that original content can be created whose reproduction gains power and presence from the mechanical vibration of the terminal housing and from varied illumination. Using the vibrator unit 31 and the illumination unit 32 for effective playback in this way is impossible in animation playback on movie screens, television receivers, or PCs, has not been done conventionally, and is a characteristic unique to the reproduction of moving-image content on the mobile communication terminal 1.
[0055]
(About content editing process)
Next, the content editing process in step S5 in FIG. 5 will be described with reference to FIG.
FIG. 7A shows the original content created using the animation editing screen 60 of FIG. 6, consisting of a plurality of scenes 0, 1, 2, and so on; scene 0 is, for example, a title screen. Into this original content a message is inserted, for example between scene 0 and scene 1, as shown in FIG. 7B. The message is addressed to the user of the mobile communication terminal 1, for example "Happy Birthday", or "Please press the '1' button for a scene you like and the '0' button for a scene you find uninteresting", as illustrated.
[0056]
Of these examples, the former is created from the personal information (address, name, date of birth, telephone number, e-mail address, and so on) of the user of the mobile communication terminal 1 registered in advance in the content distribution server 7, and is inserted into the moving-image content downloaded to the user who issued the download request. The latter is a message by which the animation writer (or the animation distributor or content distributor) probes the user's evaluation of each scene of the moving-image content (whether it is interesting or not); the user's evaluation results for this message are received by the animation distributor as the feedback information described above, compiled into statistics, and transferred to the animation writer so that they can be reflected in subsequent animation creation.
[0057]
As shown in FIG. 7C, questionnaires are inserted at appropriate points into the original content of FIG. 7A or the content after message insertion of FIG. 7B. A questionnaire may or may not relate directly to the moving-image content; in the illustrated example it asks about personal preference, for example "Which do you like, Japanese confectionery or cake?". The content distributor receives the user's answers to such questionnaires as the feedback information described above, compiles (aggregates) them into statistics, and distributes the statistical data to research companies and the like, where it can be used for product marketing.
[0058]
As shown in FIG. 7D, a course selection screen is inserted into the original content of FIG. 7A, the content after message insertion of FIG. 7B, or the content after questionnaire insertion of FIG. 7C. The course selection screen is used when the content takes the so-called multi-story format, containing several stories whose course diverges partway through; as illustrated, a prompt such as "Choose course A or course B" asks the user of the mobile communication terminal 1 to select the course he or she wishes to view. The animation distributor receives the selected course information as the feedback information described above, compiles it into statistics, and transfers the statistical data to the animation writer so that it can be reflected in the subsequent creation of new animation or the revision of animation already created.
[0059]
Furthermore, as shown in FIG. 7(e), advertisement information on various products, information on events such as movies, concerts, and plays, notice information on programs that broadcast stations such as TV and radio are scheduled to broadcast, notice information on books that a publisher plans to publish, and the like are inserted at appropriate positions in the content. Suppose that a user of the mobile communication terminal 1 who finds interesting advertisement information during reproduction of the moving image content rewinds the reproduction to the screen on which the advertisement information is displayed, stops it, and views it at leisure. The content distributor receives information indicating that such a specific operation (special playback operation) of the mobile communication terminal 1 has been performed as the feedback information described above, compiles it into statistics, and distributes the statistical data to a research company or the like, where it can be used for marketing of products.
[0060]
In this way, in the content editing process, the original content is edited by inserting additional information such as messages, questionnaires, course selection screens, and advertisements, thereby generating the final moving image content to be distributed to the user of the mobile communication terminal 1.
[0061]
(About downloading and playback of video content)
Next, various examples of methods for downloading the moving image content created in this way from the content distribution server 7 to the mobile communication terminal 1 and playing it will be described with reference to FIGS. 8 to 14. To download and reproduce (view) the above-described moving image content on the mobile communication terminal 1, a viewing program corresponding to the moving image content (for example, a Java (registered trademark) program; hereinafter referred to as the viewer program) is used.
[0062]
FIGS. 8 to 10 show various examples in which the viewer program itself is downloaded from the content distribution server 7 along with the content data.
In FIG. 8, when a download request is issued from the mobile communication terminal 1, the content data of the first work is downloaded together with the viewer program, stored (buffered) in the memory 34 of FIG. 3, and then reproduced (steps S21 to S22). Next, when a download request is issued again from the mobile communication terminal 1, the content data of the second work is downloaded by the viewer program downloaded in step S21, buffered, and then played based on the user's playback instruction (steps S23 to S24). Thereafter, the download, buffering, and reproduction based on the user's playback instruction are repeated in the same manner up to the Nth content data (steps S25 to S26).
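The flow of FIG. 8 can be sketched as follows. This is a minimal illustration with assumed names (the patent specifies the behavior, not an API): the viewer program arrives together with the first work, and only content data is downloaded for the second through Nth works, each item being fully buffered before playback.

```python
# Hypothetical stand-in for the content distribution server 7.
class Server:
    def download_with_viewer(self, i):
        # First request (step S21): viewer program plus the first work.
        return "viewer", f"content-{i}"

    def download(self, i):
        # Later requests: content data only, reusing the viewer.
        return f"content-{i}"

def buffered_session(server, n_items):
    viewer, log = None, []
    for i in range(1, n_items + 1):
        if viewer is None:
            viewer, data = server.download_with_viewer(i)
        else:
            data = server.download(i)      # viewer from step S21 is reused
        buffer = data                      # buffered in memory 34 first
        log.append(f"play({buffer}) via {viewer}")  # playback only after buffering
    return log
```

Calling `buffered_session(Server(), 3)` yields one "play" entry per downloaded work, mirroring the repeated request/buffer/play cycle of steps S21 to S26.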
[0063]
In FIG. 9, when a download request is issued from the mobile communication terminal 1, only the viewer program is downloaded first (step S30). Subsequently, the content data of the first work is downloaded and buffered by the viewer program, and reproduced based on the user's playback instruction (steps S31 to S32). Thereafter, each time a download request is issued from the mobile communication terminal 1, the content data up to the Nth work is downloaded and buffered by the viewer program downloaded in step S30 and reproduced based on the user's playback instruction (steps S33 to S36).
[0064]
The examples of FIGS. 8 and 9 described above use the so-called buffering method, in which, for each download request, playback is performed only after the download is finished and buffering is complete.
On the other hand, FIG. 10 shows an example of the streaming method, in which playback proceeds while the content data is being downloaded. When a download request is issued from the mobile communication terminal 1, only the viewer program is downloaded first (step S40), and then the content data of the first work is downloaded by this viewer program and played back simultaneously (step S41). Thereafter, each time a download request is issued from the mobile communication terminal 1, the content data up to the Nth work is downloaded and simultaneously played back by the viewer program downloaded in step S40 (steps S42 to S43).
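The streaming behavior of FIG. 10 can be sketched as below (names and chunk size are illustrative, not from the patent): each piece of content data is handed to playback as it arrives, instead of waiting for the whole download to finish.

```python
def stream_chunks(content, chunk_size=4):
    """Stand-in for the network: yields the content piece by piece."""
    for i in range(0, len(content), chunk_size):
        yield content[i:i + chunk_size]

def streaming_play(content):
    played = []
    for chunk in stream_chunks(content):  # download ...
        played.append(chunk)              # ... and play simultaneously (step S41)
    return "".join(played)
```

Because each chunk is consumed as soon as it is produced, only one chunk needs to be held at a time; this is what lets the streaming method get by with very little buffer space.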
[0065]
On the other hand, FIGS. 11 and 12 show cases in which the viewer program is preinstalled in the mobile communication terminal 1 and stored, for example, in the memory 34 of FIG. 3, and the content data is downloaded using this embedded viewer program: FIG. 11 shows the buffering method and FIG. 12 the streaming method.
[0066]
That is, in FIG. 11, when a download request is issued from the mobile communication terminal 1, the viewer program stored in the memory 34 is first read by the CPU 30 (step S50), and then the content data of the first work is downloaded and reproduced based on the user's playback instruction (steps S51 to S52). Thereafter, each time a download request is issued from the mobile communication terminal 1, the content data up to the Nth work is downloaded by the viewer program read in step S50 and reproduced based on the user's playback instruction (steps S53 to S56).
[0067]
In FIG. 12, when a download request is issued from the mobile communication terminal 1, the viewer program stored in the memory 34 is first read by the CPU 30 (step S60), and the content data of the first work is then downloaded by this viewer program and played back simultaneously (step S61). Thereafter, each time a download request is issued from the mobile communication terminal 1, the content data up to the Nth work is downloaded and simultaneously played back by the viewer program read in step S60 (steps S62 to S63).
[0068]
Further, when the content distribution server 7 of FIG. 1 can transmit content data by both the buffering and streaming methods, the user may select on the mobile communication terminal 1 side, as shown in FIG. 13, whether to use the buffering method or the streaming method (step S70), and the terminal may switch accordingly between playing after downloading the content data by the buffering method (step S71) and playing while downloading by the streaming method (step S72).
[0069]
Further, the memory 34 of FIG. 3, used as a buffer for temporarily storing the viewer program and content data at the time of download and playback of moving image content, is built into the mobile communication terminal 1 or mounted in the form of a memory card, so its capacity is limited and it is difficult to buffer a large amount of content data. Therefore, as shown in FIG. 14, the remaining capacity of the memory 34 (buffer) is checked (step S80); if the remaining capacity exceeds a threshold value, playback is performed by the buffering method, which uses the memory for buffering content data (step S81), and otherwise playback is performed by the streaming method, which uses little or none of it (step S82).
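The decision of FIG. 14 amounts to a single threshold comparison. A minimal sketch (the threshold value and units are arbitrary choices for illustration; the patent only states that the remaining capacity of the memory 34 is compared with a threshold):

```python
def choose_method(remaining_kb, threshold_kb=512):
    # Step S80: check the remaining buffer capacity against the threshold.
    if remaining_kb > threshold_kb:
        return "buffering"   # step S81: download fully into memory 34, then play
    return "streaming"       # step S82: play while downloading, minimal buffer use
```

With this automatic selection, a terminal with ample free memory gets the easy stop/rewind handling of the buffering method, while a nearly full terminal can still play the content by streaming.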
[0070]
In this way, by automatically switching between the buffering method and the streaming method according to the remaining buffer capacity of the memory 34, playback is normally performed by the buffering method, which allows the image to be stopped or rewound easily and is easy to use, while playback by the streaming method remains possible when the remaining buffer capacity is low; thus the moving image content can be played back flexibly.
[0071]
(About the video content playback screen)
FIG. 15 shows a moving image content playback screen. A video display area 51 and a character display area 52 are assigned to the display screen 50 of the display unit 13 of the mobile communication terminal 1 shown in FIG. 2. In this example, the character display area 52 is a band-like area set below the video display area 51, but it may be positioned above the video display area 51 or, in some cases, beside it. In the character display area 52 (also referred to as a telop display area), dialogue is displayed in FIG. 15(a), and an advertisement is displayed in FIG. 15(b).
[0072]
Such switching of the display contents of the character display area 52 is performed according to the procedure shown in the flowchart of FIG. 16. As shown in FIG. 2, the mobile communication terminal 1 has a manner button as part of the key input unit 12. When this manner button is pressed to enter the so-called manner mode, no ring tone is emitted from the sound speaker unit 33 of FIG. 3 even if the mobile communication terminal 1 receives an incoming call, in order to avoid disturbing others; instead, the user is notified of the incoming call by the blinking of the incoming call lamp 16, the vibration of the vibrator unit 31 of FIG. 3, the lighting of the illumination unit 32 under the key input unit 12, and so on.
[0073]
In the manner mode, it is naturally desirable, from the viewpoint of avoiding disturbance, that the dialogue at the time of reproducing the moving image content not be output as sound from the sound speaker unit 33.
[0074]
Therefore, as shown in FIG. 16(a), it is determined whether or not the mobile communication terminal 1 is set to the manner mode (step S91). In the manner mode, the dialogue is displayed as characters in the character display area 52 as shown in FIG. 15(a) (step S92). Thus, even in the manner mode, in which the dialogue is not output from the sound speaker unit 33, the dialogue can be recognized as characters, that is, as subtitles.
[0075]
On the other hand, when not in the manner mode, the dialogue is output as sound from the sound speaker unit 33, so other information, for example an advertisement inserted into the moving image content as shown in FIG. 7(e), is displayed as telop characters in the character display area 52 as shown in FIG. 15(b) (step S93). In this way, the character display area 52 can always be used effectively.
[0076]
In FIG. 16(a), the dialogue is always displayed in the character display area 52 in the manner mode, but even in the manner mode the dialogue can be heard as sound through the earphone 17. FIG. 16(b) is an example that takes this point into consideration: in the manner mode it is further determined whether or not the earphone 17 is in use (step S94), and the dialogue is displayed when the earphone 17 is not in use (step S92), while an advertisement is displayed when the earphone 17 is in use (step S93).
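The FIG. 16(b) decision for the character display area 52 can be sketched as a small function (function and argument names are assumptions for illustration): dialogue is shown as a telop only when it cannot be heard, and otherwise the area is reused for an advertisement.

```python
def character_area_content(manner_mode, earphone_in_use):
    # Steps S91/S94: manner mode with no earphone means the speech is inaudible.
    if manner_mode and not earphone_in_use:
        return "dialogue"       # step S92: subtitles stand in for the muted speech
    return "advertisement"      # step S93: speech is audible, so reuse the area
```

The design point is that the band-like area never sits idle: it always carries either the subtitles the user needs or advertisement information the distributor can collect reactions to.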
[0077]
(About feedback information)
Next, the details of the feedback information described above and how it is used will be described. In the case of the present embodiment, as mentioned earlier, the feedback information returned from the mobile communication terminal 1 to the content distribution server 7, the matters that can be inferred from it, and the effects of using it are as follows.
[0078]
(1) Information indicating that the user performed a special playback operation (specifically, operations such as pause (still), segment repeat, frame advance, and fast forward (skip)) when reproducing the moving image content. From this special playback operation information, it is possible to know which scenes of which moving image content the user is interested in and which scenes the user finds boring. For example, it can be seen that the user showed interest in scenes for which pause, segment repeat, frame advance, and the like were performed.
[0079]
(2) Information on the user's face image obtained from the digital camera 18. By analyzing facial expressions from the face images, it is possible to know in which scenes of which moving image content the user is interested, feels impressed, or is surprised, and which scenes the user finds boring.
Note that if the face image information is used as is as feedback information, the amount of information may be too large, so it is desirable to binarize it, or quantize it to several gradations, before feedback. In addition, when the mobile communication terminal 1 has an image compression function based on, for example, the MPEG-4 system, it is also effective to compress the face image information before feeding it back.
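The reduction suggested above can be sketched as follows (the number of levels is an arbitrary illustration; the patent only calls for binarization or quantization to several gradations): an 8-bit grayscale face image is mapped to a few levels before being fed back, so the returned information stays small.

```python
def quantize(pixels, levels=4):
    """Map 8-bit grayscale values (0-255) onto `levels` gradations."""
    step = 256 // levels
    return [min(p // step, levels - 1) for p in pixels]
```

With `levels=2` this degenerates to binarization; with `levels=4` each pixel needs only two bits instead of eight, a fourfold reduction before any further compression.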
[0080]
(3) Information obtained by the biosensor 19 (specifically, information on the pressure, temperature, and humidity when the user holds the mobile communication terminal 1). For example, when the values of pressure, temperature, humidity, and the like become high, it can be seen that the user feels excitement or tension at that scene.
[0081]
(4) Information fed back by the user of the mobile communication terminal 1 in response to a message such as "Please press the '1' button for a scene you like and the '0' button for a scene you find uninteresting", inserted into the moving image content as in Example 2 shown in FIG. 7(b). From this information, it is possible to know accurately and directly which scenes interest the user and which do not.
[0082]
Accordingly, the feedback information (1) to (4) described above is compiled into statistics as described above, and the obtained statistical data is displayed, for example, in the user evaluation display area 68 of the animation editing screen 60 shown in FIG. 6. When an anime writer creates a new animation or reworks an animation that has already been created, reflecting this statistical data makes it possible to create and newly distribute animation works that as many users as possible can enjoy.
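The compilation into statistics can be sketched for the button presses of (4) (the data shapes are assumptions; the patent does not fix a format): per-scene tallies of "1" (like) and "0" (dislike) presses, in a form suitable for display in the user evaluation display area 68.

```python
from collections import Counter

def tally(presses):
    """presses: iterable of (scene_id, button) pairs fed back by users."""
    likes, dislikes = Counter(), Counter()
    for scene, button in presses:
        (likes if button == "1" else dislikes)[scene] += 1
    # One (likes, dislikes) pair per scene that received any press.
    return {s: (likes[s], dislikes[s]) for s in likes.keys() | dislikes.keys()}
```

The same aggregation shape would serve the questionnaire answers of (5) or the special-playback events of (1), with the button field replaced accordingly.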
[0083]
(5) Information obtained from the user's reply operation to a questionnaire inserted in the moving image content as shown in FIG. 7(c). From such information, each user's preference regarding food and drink can be known, and further, each user's preferences regarding products, movie and concert genres, actors and talents, and the like can be known.
[0084]
(6) Information of (1) to (3) above obtained while an advertisement inserted in the moving image content as shown in FIG. 7(e) is reproduced in the character display area 52 as shown in FIG. 15(b) or displayed on the entire display screen 50, for example. Such feedback information indicating the user's reaction to the advertisement information is very effective when a research company or the like performs various kinds of marketing and promotion.
[0085]
[Second Embodiment]
A schematic flow of the processing of the moving image content service in the second embodiment of the present invention will be described using the flowchart shown in FIG. 17. In the present embodiment, with the same configuration as the first embodiment, a series of services is performed in which stories for moving image content are solicited from a plurality of users of the mobile communication terminal 1, a professional anime writer creates moving image content based on the submitted stories, the moving image content is published so that each user can view it and a popularity contest is held, and each user is encouraged to invest in the user selected as the top prize winner.
[0086]
More specifically, first, the content distributor solicits entries for an animation story contest (step S101). This solicitation may be made on the content distributor's homepage, or may be notified to each user of the mobile communication terminal 1 by e-mail or other methods. Assume that, of the users A, B, ..., N of the mobile communication terminal 1, user A applied for the contest, that is, provided a story (step S102). This application may take, for example, the form of user A accessing the content distributor's homepage and writing the story in a predetermined column there, or the form of transmitting the story to the content distribution server 7 of the content distributor by e-mail.
[0087]
The story that user A submitted is transferred from the content distributor to the PC 4 owned by the anime writer. Using this story as the original, the anime writer downloads an editing tool from the animation distribution server 5 of the animation distributor as in the first embodiment and creates and edits the animation video (steps S103 to S105). In step S105, the original content that is the source of the moving image content to be distributed to users is generated, and this original content is transferred to the content distribution server 7 of the content distributor via the Internet 3 (or via a dedicated line) (step S106).
[0088]
The content distribution server 7 edits the transferred original content in the same manner as described above and registers the edited moving image content in a memory (not shown) (steps S107 to S108). After the content to be distributed is registered in the content distribution server 7 in this way, it is published, for example, on the content distributor's homepage so that user A, user B, ..., user N can view it (step S109).
[0089]
When a user of the mobile communication terminal 1, for example user B, makes a download request for the moving image content published by the content distributor, the request is sent to the content distribution server 7 via the carrier communication network 2 and the Internet 3 (step S110), and the content is downloaded from the content distribution server 7 via the Internet 3 and the carrier communication network 2 to the mobile communication terminal 1 of user B, who issued the download request (step S111). The download method may be the same as in the first embodiment, using either the buffering method or the streaming method.
[0090]
The downloaded moving image content is reproduced as appropriate by user B's playback instruction operation (step S112). During this reproduction, user B can, for example, operate the terminal in response to a message such as "Please press the '1' button for a scene you like and the '0' button for a scene you find uninteresting" shown in FIG. 7(b). In the example of FIG. 17, the moving image content is downloaded and played back once, but the download and playback may be performed a plurality of times as in the first embodiment.
[0091]
When the reproduction of the moving image content is completed in this way, the information obtained when the user performed specific operations during reproduction of the moving image content and the information obtained by the digital camera 18 and the biosensor 19 are fed back from user B's mobile communication terminal 1 to the content distribution server 7 via the carrier communication network 2 and the Internet 3 as the evaluation results for the moving image content (step S113).
[0092]
In the state where the mobile communication terminal 1 has downloaded and reproduced the moving image content in step S112, no uplink is set from user B's mobile communication terminal 1 to the content distribution server 7, so in order to return the feedback information in step S113, user B's mobile communication terminal 1 needs to access the content distribution server 7 and set up the uplink. Therefore, for example, at the time of contest solicitation in step S101 or at the time of publication of the entries in step S109, the content distribution server 7 sends the user a message, by text or voice, indicating that some privilege will be given, for example, "If you participate in the voting, press button 3; your next content viewing fee will be 30% off", thereby encouraging feedback.
[0093]
Next, the content distribution server 7 tabulates the evaluation results, which are the feedback information sent from the mobile communication terminals 1 of the users in step S113 (step S114), and announces the user (story) selected as the top prize winner as the tabulation result (step S115). In this case, if the winner's consent is obtained, the winner's name and e-mail address may be announced together. Further, along with the announcement of the tabulation result, an invitation is made to invest in the top prize winner (step S116). The announcement of the tabulation result in step S115 and the solicitation of investment in step S116 are made, for example, on the content distributor's homepage. When soliciting investment, the investment amount may be set, for example, at 1,000 yen per unit, or a format may be used in which a user participating in the investment can specify the investment amount arbitrarily.
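The tabulation and winner selection of steps S114 to S115 can be sketched as follows (the data shape is an assumption; in practice the evaluation results of (1) to (4) above would be reduced to per-entry scores first):

```python
from collections import Counter

def top_prize(votes):
    """votes: iterable of entry names, one per favourable evaluation received."""
    counts = Counter(votes)
    winner, _ = counts.most_common(1)[0]  # entry with the most favourable results
    return winner, dict(counts)
```

The returned counts correspond to the tabulation result announced on the homepage, and the winner is the entry put forward for the investment solicitation of step S116.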
[0094]
When a user of the mobile communication terminal 1, for example user N, informs the content distributor of the investment and the investment amount by e-mail or by writing on the homepage, the investment amount is collected, for example in a form added to the communication fee that user N pays to the carrier operating the carrier communication network 2, and is then transferred to the account of the content distributor. The content distributor then pays the invested user an amount obtained by subtracting a predetermined fee from the investment amount.
[0095]
In the field of animation there is a constant demand for innovative stories, but in fact the writers who produce such stories (the original authors of animations) appear to be in short supply compared with anime artists, and this trend is expected to become stronger as demand increases in the future. According to the present embodiment, when a user of the mobile communication terminal 1 comes up with a promising story idea while out and about, he or she can apply easily. Content distributors can therefore collect stories with excellent substance relatively easily and discover excellent newcomers who had remained hidden, greatly contributing to the development of the animation field.
[0096]
[Third Embodiment]
Next, a schematic flow of the processing of the moving image content service in the third embodiment of the present invention will be described using the flowchart shown in FIG. 18. In the present embodiment, with the same configuration as the first embodiment, a series of services is performed in which the provision of moving image content (animation works) is solicited from a plurality of users of the mobile communication terminal 1, the submitted moving image content is published so that each user can view it and a popularity contest is held, and each user is encouraged to invest in the user selected as the top prize winner.
[0097]
First, the content distributor solicits entries for a moving image content contest (step S121). This solicitation may be made on the content distributor's homepage, or may be notified to each user of the mobile communication terminal 1 by e-mail or other methods. Of the users A, B, ..., N of the mobile communication terminal 1, user A applies for the contest, that is, provides moving image content. In this case, user A may create the moving image content from scratch with his or her own tools, but in the present embodiment the editing tool is downloaded from the animation distribution server 5 of the animation distributor in the same manner as in the first embodiment, and the animation video is created and edited (steps S123 to S124). The completed animation work is then submitted by transferring it as original content from user A's mobile communication terminal 1 to the content distribution server 7 of the content distributor via the carrier communication network 2 and the Internet 3 (step S125).
[0098]
The content distribution server 7 edits the transferred original content in the same manner as described above and registers the edited moving image content in a memory (not shown) (steps S126 to S127). After the content to be distributed is registered in the content distribution server 7 in this way, it is published, for example, on the content distributor's homepage so that user A, user B, ..., user N can view it (step S128).
[0099]
When a user of the mobile communication terminal 1, for example user B, makes a download request for the moving image content published by the content distributor, the request is sent to the content distribution server 7 via the carrier communication network 2 and the Internet 3 (step S129), and the content is downloaded from the content distribution server 7 to the mobile communication terminal 1 of user B, who issued the download request, via the Internet 3 and the carrier communication network 2 (step S130). The download method may be the same as in the first embodiment, using either the buffering method or the streaming method.
[0100]
The downloaded moving image content is reproduced as appropriate by user B's playback instruction operation (step S131). During this reproduction, user B can, for example, operate the terminal in response to a message such as "Please press the '1' button for a scene you like and the '0' button for a scene you find uninteresting" shown in FIG. 7(b). In the example of FIG. 18, the moving image content is downloaded and played back once, but the download and playback may be performed a plurality of times as in the first embodiment.
[0101]
When the reproduction of the moving image content is completed in this way, the information obtained when the user performed specific operations during reproduction of the moving image content and the information obtained by the digital camera 18 and the biosensor 19 are fed back from user B's mobile communication terminal 1 to the content distribution server 7 via the carrier communication network 2 and the Internet 3 as the evaluation results for the moving image content (step S132).
[0102]
In the state where the mobile communication terminal 1 has downloaded and reproduced the moving image content in step S131, no uplink is set from user B's mobile communication terminal 1 to the content distribution server 7, so in order to return the feedback information in step S132, user B's mobile communication terminal 1 needs to access the content distribution server 7 and set up the uplink. Therefore, for example, at the time of contest solicitation in step S121 or at the time of publication of the entries in step S128, the content distribution server 7 sends the user a message, by text or voice, indicating that some privilege will be given, for example, "If you participate in the voting, press button 3; your next content viewing fee will be 30% off", thereby encouraging feedback.
[0103]
Next, the content distribution server 7 tabulates the evaluation results, which are the feedback information sent from the mobile communication terminals 1 of the users in step S132 (step S133), and announces the user (original content) selected as the top prize winner as the tabulation result (step S134). In this case, if the winner's consent is obtained, the winner's name and e-mail address may be announced together. Further, along with the announcement of the tabulation result, an invitation is made to invest in the top prize winner (step S135). The announcement of the tabulation result in step S134 and the solicitation of investment in step S135 are made, for example, on the content distributor's homepage. When soliciting investment, the investment amount may be set, for example, at 1,000 yen per unit, or a format may be used in which a user participating in the investment can specify the investment amount arbitrarily.
[0104]
When a user of the mobile communication terminal 1, for example user N, informs the content distributor of the investment and the investment amount by e-mail or by writing on the homepage, the investment amount is collected, for example in a form added to the communication fee that user N pays to the carrier operating the carrier communication network 2, and is then transferred to the account of the content distributor. The content distributor then pays the invested user an amount obtained by subtracting a predetermined fee from the investment amount.
[0105]
Although this is not limited to animation, the number of creators and writers who produce image content such as video and still images is generally insufficient, and the shortage is expected to become more serious in the future. The present embodiment can greatly contribute to the discovery of newcomers involved in such content production.
[0106]
In particular, in the present embodiment, the editing tool described above is also made available to users of the mobile communication terminal 1, and by providing an environment in which this tool can be used freely, even users unfamiliar with content creation and editing can use it to easily create and edit content that plays back with the powerful vibration control and illumination control functions of the mobile communication terminal 1; an effect of heightening users' willingness to create can therefore be expected.
[0107]
[Other Embodiments]
In each of the embodiments described above, the case where the content is mainly animation has been described as an example, but the content targeted by the present invention also includes still image content such as natural images and paintings (more preferably content including sound and voice such as music and narration), moving image content other than animation, audio content such as music, and the like.
[0108]
In each of the embodiments described above, an example was described in which vibration control and illumination control are performed according to the content at the time of content playback. However, it is also effective to give the mobile communication terminal a function of performing vibration control and illumination control while displaying pre-downloaded content, according to a pre-downloaded program, upon an incoming call from a specific party (including ordinary telephone calls, e-mail, and short mail).
[0109]
【The invention's effect】
As described above, according to the present invention, a mobile communication terminal that can connect to the Internet and supports a high-speed, large-capacity communication method can be used effectively, and various services relating to image and audio content that have not been realized before can be executed.
[Brief description of the drawings]
FIG. 1 is a diagram showing a schematic configuration of an entire system according to a first embodiment of the present invention.
FIG. 2 is an external view showing an example of a portable communication terminal used in the present invention.
FIG. 3 is a block diagram showing an internal configuration of a mobile communication terminal used in the present invention.
FIG. 4 is a block diagram functionally showing the configuration related to vibration control and illumination control by the CPU in FIG. 3;
FIG. 5 is a flowchart showing a flow of processing of a moving image content service in the embodiment.
FIG. 6 is a diagram showing a specific example of an animation editing screen in the embodiment
FIG. 7 is a time chart showing a specific example of the moving image content editing process in the embodiment.
FIG. 8 is a flowchart showing a first procedure related to downloading and playback of moving image content by the mobile communication terminal user in the embodiment;
FIG. 9 is a flowchart showing a second procedure related to downloading and playback of moving image content by the mobile communication terminal user in the embodiment;
FIG. 10 is a flowchart showing a third procedure relating to download and playback of moving image content by the mobile communication terminal user in the embodiment;
FIG. 11 is a flowchart showing a fourth procedure related to downloading and playback of moving image content by the mobile communication terminal user in the embodiment;
FIG. 12 is a flowchart showing a fifth procedure relating to download and playback of moving image content by the mobile communication terminal user in the embodiment;
FIG. 13 is a flowchart showing a first method relating to reproduction of moving image content by the mobile communication terminal user in the embodiment;
FIG. 14 is a flowchart showing a second method related to reproduction of moving image content by the mobile communication terminal user in the embodiment;
FIG. 15 is a diagram showing an example of a video content playback screen in the embodiment
FIG. 16 is a flowchart showing a procedure related to switching display contents in the character display area of the video content playback screen in the embodiment;
FIG. 17 is a flowchart showing a flow of processing of a moving image content service according to the second embodiment of the present invention.
FIG. 18 is a flowchart showing a flow of processing of a moving image content service according to the third embodiment of the present invention.
[Explanation of symbols]
1 ... Mobile communication terminal
2 ... Carrier communication network
3 ... Internet
4 ... PC of animation artist
5 ... Anime distribution server
6 ... Anime Library
7 ... Content delivery server
8 ... Various information service servers
11 ... Antenna
12 ... Key input part with illumination
13 ... Display section
14 ... Speaker for receiving
15 ... Speaker microphone
16 ... Incoming call lamp
17 ... Earphone
18 ... Digital camera
19 ... Biosensor
30 ... CPU
31 ... Vibrator part
32 ... Illumination part
33 ... Speaker for sound
34 ... Memory (buffer)
50 ... Display screen
51 ... Video display area
52 ... Character display area
60 ... Animation editing screen
61 ... Video image display area
62 ... Voice input area
63 ... Dialogue (text) input area
64 ... Sound (sound effect) input area
65 ... Vibration setting area
66 ... Illumination setting area
67 ... Dialog input window
68 ... User evaluation display area

Claims (5)

  1. A content distribution system comprising:
    a portable communication terminal comprising (a) a function of downloading content from outside, (b) a vibrator unit that vibrates the terminal housing, (c) an illumination unit that blinks, (d) vibrator driving means for driving the vibrator unit and illumination driving means for driving the illumination unit, (e) incoming call display control means for supplying a first vibration control signal and a first illumination control signal, which cause the vibrator unit and the illumination unit to vibrate and blink in predetermined patterns upon an incoming call, to the vibrator driving means and the illumination driving means through first and second gates, respectively, and (f) scene display control means for supplying, according to a downloaded program, a second vibration control signal and a second illumination control signal that control the vibration state of the vibrator unit and the display state of the illumination unit according to the scene of the content, to the vibrator driving means and the illumination driving means through the first and second gates, respectively; and
    a distribution server that distributes the program to the portable communication terminal together with, or prior to, the content to be distributed to the portable communication terminal.
  2. The content distribution system according to claim 1, wherein the program further has a dialogue display function for displaying dialogue contained in the content on the display screen of the portable communication terminal at least when the portable communication terminal that has downloaded the content via a network is in a voice non-output state.
  3. The content distribution system according to claim 1, wherein the program further has a dialogue control function for displaying text included in the content in a predetermined display area on the display screen of the mobile communication terminal when the mobile communication terminal that has downloaded the content via a network is in a voice non-output state, and for displaying advertisement information in the display area when the mobile communication terminal is in a voice output state.
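The switching behavior recited in claims 2 and 3 — rendering the content's dialogue as on-screen text while the terminal is muted, and filling the same display area with advertisement information while audio is playing — can be sketched as a simple selector. This is an illustrative model only; the function and parameter names are hypothetical and do not appear in the patent.

```python
# Illustrative sketch of the dialogue control function of claims 2 and 3:
# the predetermined display area shows the scene's dialogue text while the
# terminal is in a voice non-output (muted) state, and advertisement
# information while it is in a voice output state.

def character_display_area(scene_dialogue, advertisement, voice_output):
    """Return what the predetermined display area should show."""
    if voice_output:
        # Audio carries the dialogue, so the area is free for advertisements.
        return advertisement
    # Muted playback: render the dialogue as text instead.
    return scene_dialogue
```

Under this model, `character_display_area("Hello!", "Ad: new titles", voice_output=False)` returns the dialogue text, and the same call with `voice_output=True` returns the advertisement.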
  4. A mobile communication terminal having a function of downloading content from outside, comprising:
    A vibrator unit that vibrates the terminal housing;
    Illumination part that blinks,
    Vibrator driving means for driving the vibrator unit;
    Illumination driving means for driving the illumination unit;
    Incoming call display control means for supplying a first vibration control signal and a first illumination control signal, which cause the vibrator unit and the illumination unit to vibrate and blink in predetermined patterns upon an incoming call, to the vibrator driving means and the illumination driving means through first and second gates, respectively; and
    Scene display control means for supplying, according to a program downloaded together with or prior to the content, a second vibration control signal and a second illumination control signal that control the vibration state of the vibrator unit and the display state of the illumination unit according to the scene of the content, to the vibrator driving means and the illumination driving means through the first and second gates, respectively.
  5. A mobile communication terminal having a function of downloading content from outside, comprising:
    A vibrator unit that vibrates the terminal housing;
    Illumination part that blinks,
    Vibrator driving means for driving the vibrator unit;
    Illumination driving means for driving the illumination unit;
    Incoming call display control means for supplying a first vibration control signal and a first illumination control signal, which cause the vibrator unit and the illumination unit to vibrate and blink in predetermined patterns upon an incoming call, to the vibrator driving means and the illumination driving means through first and second gates, respectively;
    Content display means for displaying pre-downloaded content according to a pre-downloaded program when a call is received from a specific partner;
    Scene display control means for supplying, in conjunction with the display by the content display means, a second vibration control signal and a second illumination control signal that control the vibration state of the vibrator unit and the display state of the illumination unit according to the scene of the content, to the vibrator driving means and the illumination driving means through the first and second gates, respectively.
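The control path common to claims 1, 4, and 5 — two signal sources (incoming-call alerting and content scene effects) feeding one vibrator driver and one illumination driver through a shared pair of gates — can be sketched as below. This is an illustrative Python model under stated assumptions; all names are hypothetical and none are taken from the patent, which claims the architecture in hardware-neutral means-plus-function terms.

```python
# Illustrative model of the claimed control path: incoming call display
# control and scene display control both reach the vibrator and illumination
# drivers through shared first/second gates, so a downloaded program can
# reuse the call-alert hardware to render content-driven effects.

def vibrator_driver(pattern):
    # Stand-in for hardware that pulses the vibration motor.
    return f"vibrate[{pattern}]"

def illumination_driver(pattern):
    # Stand-in for hardware that blinks the key/lamp illumination.
    return f"blink[{pattern}]"

class Gate:
    """Routes whichever control signal is currently supplied to its driver."""
    def __init__(self, driver):
        self.driver = driver

    def supply(self, control_signal):
        return self.driver(control_signal)

first_gate = Gate(vibrator_driver)       # feeds the vibrator driving means
second_gate = Gate(illumination_driver)  # feeds the illumination driving means

def incoming_call_display_control():
    # First control signals: fixed, predetermined call-alert patterns.
    return (first_gate.supply("call-vibrate"),
            second_gate.supply("call-blink"))

def scene_display_control(scene):
    # Second control signals: patterns taken from the content's scene data,
    # as interpreted by the downloaded program.
    return (first_gate.supply(scene["vibration"]),
            second_gate.supply(scene["illumination"]))
```

In this sketch, `incoming_call_display_control()` yields the fixed alert pair, while `scene_display_control({"vibration": "rumble", "illumination": "flash"})` drives content-specific effects through the same two gates — the sharing that distinguishes the claims from a terminal with call-alert-only vibration.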
JP2000215032A 2000-07-14 2000-07-14 Content service method using mobile communication terminal Expired - Fee Related JP3774358B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000215032A JP3774358B2 (en) 2000-07-14 2000-07-14 Content service method using mobile communication terminal

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2000215032A JP3774358B2 (en) 2000-07-14 2000-07-14 Content service method using mobile communication terminal
PCT/JP2000/006617 WO2002007414A1 (en) 2000-07-14 2000-09-26 Method for information service using portable communication terminal
KR1020027003359A KR100595794B1 (en) 2000-07-14 2000-09-26 Method for information service using portable communication terminal
KR1020057008157A KR100610714B1 (en) 2000-07-14 2000-09-26 Method of electronic mailing using portable communication terminal
KR1020057008155A KR100713754B1 (en) 2000-07-14 2000-09-26 Method and apparatus for information service using portable communication terminal

Publications (2)

Publication Number Publication Date
JP2002033802A JP2002033802A (en) 2002-01-31
JP3774358B2 true JP3774358B2 (en) 2006-05-10

Family

ID=18710522

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000215032A Expired - Fee Related JP3774358B2 (en) 2000-07-14 2000-07-14 Content service method using mobile communication terminal

Country Status (3)

Country Link
JP (1) JP3774358B2 (en)
KR (3) KR100713754B1 (en)
WO (1) WO2002007414A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4747537B2 (en) * 2004-09-07 2011-08-17 日本電気株式会社 Portable terminal, control method therefor, and computer program for portable terminal
US8035585B2 (en) * 2004-12-17 2011-10-11 Sony Ericsson Mobile Communications Ab Graphic data files including illumination control and related methods and computer program products
JP2006179985A (en) * 2004-12-20 2006-07-06 Dowango:Kk Content distribution system, content distribution server system, content distribution method, and content distribution program
JP2006179986A (en) * 2004-12-20 2006-07-06 Dowango:Kk Content distribution system, content distribution server system, content distribution method, and content distribution program
KR101131856B1 (en) 2006-11-03 2012-03-30 엘지전자 주식회사 Apparatus For Transmitting Broadcast Signal And Method Of Transmitting And Receiving Broadcast Signal Using Same
KR101134926B1 (en) * 2006-11-03 2012-04-17 엘지전자 주식회사 Broadcast Terminal And Method Of Controlling Vibration Of Broadcast Terminal
JP4875483B2 (en) * 2006-12-26 2012-02-15 ソフトバンクモバイル株式会社 Content reproduction method and information processing apparatus
JP2008225119A (en) * 2007-03-13 2008-09-25 Pioneer Electronic Corp Content reproduction device, content reproduction method, content reproduction program and recording medium
JP2009152952A (en) * 2007-12-21 2009-07-09 Nec Corp Distribution system, distribution method, and program
US9313245B2 (en) * 2007-12-24 2016-04-12 Qualcomm Incorporated Adaptive streaming for on demand wireless services
JP4618301B2 (en) 2008-01-08 2011-01-26 ソニー株式会社 How to provide content
WO2009151789A2 (en) 2008-04-17 2009-12-17 Sony Corporation Dual-type of playback for multimedia content
KR101615872B1 (en) * 2009-05-08 2016-04-27 삼성전자주식회사 A method for transmitting haptic function in mobile terminal and system thereof
JP5093785B2 (en) * 2009-10-28 2012-12-12 キヤノンマーケティングジャパン株式会社 Image sharing system, image management apparatus, image sharing method and program.
US9098984B2 (en) * 2013-03-14 2015-08-04 Immersion Corporation Haptic effects broadcasting during a group event
US10139907B2 (en) * 2014-06-16 2018-11-27 Immersion Corporation Systems and methods for foley-style haptic content creation

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0522437A (en) * 1991-07-10 1993-01-29 Nec Corp Prize competition application system
JPH10150505A (en) * 1996-11-19 1998-06-02 Sony Corp Information communication processing method and information communication processing unit
JPH1198224A (en) * 1997-09-18 1999-04-09 Sharp Corp Receiving device equipped with incoming reporting functionality and incoming reporting method and record medium for recording incoming report control program of receiving device
JPH11136365A (en) * 1997-10-31 1999-05-21 Hitachi Ltd Information distribution system
JP3243488B2 (en) * 1997-11-21 2002-01-07 バンダイネットワークス株式会社 Portable electronic equipment
JPH11163971A (en) * 1997-11-26 1999-06-18 Kyocera Corp System for reporting incoming call of radio telephone terminal
JPH11168498A (en) * 1997-12-02 1999-06-22 Casio Comput Co Ltd Network system, electronic mail supporting device and storage medium
JPH11187087A (en) * 1997-12-17 1999-07-09 Toshiba Corp Communication equipment with incoming call vibrator function
JPH11234368A (en) * 1998-02-13 1999-08-27 Nec Mobile Commun Ltd Transmitting and receiving method
JP3628181B2 (en) * 1998-07-30 2005-03-09 京セラ株式会社 Communication terminal device
JP2000101687A (en) * 1998-09-18 2000-04-07 Canon Inc Radio telephone equipment and picture display device
KR100484209B1 (en) * 1998-09-24 2005-09-30 삼성전자주식회사 Digital Content Encryption / Decryption Device and Method
JP2000134332A (en) * 1998-10-26 2000-05-12 J-Phone Tokai Co Ltd Incoming melody sending device for portable telephone
KR20000036731A (en) * 2000-03-27 2000-07-05 이영만 Computer-controlled content playback device made impossible to duplicate

Also Published As

Publication number Publication date
KR20020027645A (en) 2002-04-13
WO2002007414A1 (en) 2002-01-24
KR100713754B1 (en) 2007-05-07
KR100595794B1 (en) 2006-07-03
JP2002033802A (en) 2002-01-31
KR100610714B1 (en) 2006-08-09
KR20050048694A (en) 2005-05-24
KR20050046024A (en) 2005-05-17

Similar Documents

Publication Publication Date Title
JP6490635B2 (en) Information processing apparatus, information processing method, and program
US10067739B2 (en) Unitary electronic speaker device for receiving digital audio data and rendering the digital audio data
US9378278B2 (en) Method and system for constructing and presenting a consumption profile for a media item
US20180007107A1 (en) Information processing apparatus, information processing method and program
US9357245B1 (en) System and method for providing an interactive, visual complement to an audio program
US8982679B2 (en) Playlist sharing methods and apparatus
US20170229151A1 (en) Resuming A Playing Of A Video Responsive To A Beginning Of A Segment.
US8769634B2 (en) System and/or method for distributing media content
CN102884529B (en) For social activity summary can adaptive layout
CN102244812B (en) Video content recommendation
US9380282B2 (en) Providing item information during video playing
DE602005005730T2 (en) Control method for information provision, information reproduction system and information providing device
US8132209B2 (en) Information processing device
US6072521A (en) Hand held apparatus for simulating two way connectivity for one way data streams
US7823080B2 (en) Information processing apparatus, screen display method, screen display program, and recording medium having screen display program recorded therein
JP5250100B2 (en) Programming, distribution and consumption of media content
EP1497994B1 (en) User terminal, media system and method of delivering objects relating to broadcast media stream to user terminal
US7130616B2 (en) System and method for providing content, management, and interactivity for client devices
US8621531B2 (en) Real-time on demand server
CN100511208C (en) System and method for providing a multimedia contents service based on user's preferences
ES2279616T3 Personalized time-shifted programming.
US20130086277A1 (en) System, method, and computer readable medium for creating a video clip
JP4800953B2 (en) Video playback method and system
KR100597667B1 (en) mobile communication terminal with improved user interface
JP4346688B2 (en) Audio visual system, headend and receiver unit

Legal Events

Date Code Title Description
A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20041005

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20041206

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20050405

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20050606

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20050725

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20050816

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20051012

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20051115

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20060116

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20060214

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20060217

LAPS Cancellation because of no payment of annual fees