KR20140118220A - Mobile terminal and control method thereof - Google Patents
- Authority
- KR
- South Korea
- Prior art keywords
- sound source
- source data
- server
- target device
- mobile terminal
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/60—Subscription-based services using application servers or record carriers, e.g. SIM application toolkits
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
- H04W88/00—Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
- H04W88/02—Terminal devices
Description
The present invention relates to a mobile terminal connected to a server via a network and a control method thereof.
Terminals may be divided into mobile/portable terminals and stationary terminals depending on whether they are movable. Mobile terminals may further be divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.
As the functions of terminals have diversified, the terminal has come to be implemented as a multimedia device with composite functions such as capturing still images and video, playing music or video files, gaming, and receiving broadcasts. To support and enhance these functions, improvements to both the structural and software aspects of the terminal may be considered.
As terminal functionality increases in this way, the terminal can further extend its functions by communicating with external electronic devices and servers.
Accordingly, an object of an embodiment of the present invention is to provide a mobile terminal, and a control method thereof, that connects to a server, freely shares sound source data with other electronic devices, and enables the sound source data to be played back simultaneously on any electronic device that responds to the sharing request.
A mobile terminal according to an embodiment of the present invention includes: a main body; a wireless communication unit that connects to a server through a network and receives sound source data capable of stream reproduction; and a controller that transmits a sharing request message for the sound source data to a target device via the server and, when a response message from the target device connected to the server is received from the server, controls the wireless communication unit so that the sound source data is stream-reproduced simultaneously on the main body and the target device.
In one embodiment, the mobile terminal further includes a user input unit for receiving a control command that executes either a first mode or a second mode, which differ in whether the sound source data is shared with the target device. The controller causes the sound source data to be stream-reproduced based on a first input signal in the first mode, and to be stream-reproduced simultaneously on the main body and the target device based on a second input signal in the second mode.
In one embodiment, the mobile terminal further comprises: a microphone; an audio output unit for outputting a sound signal input through the microphone together with the stream-reproduced sound source data; and a display unit for displaying score data, received from the server, corresponding to the sound source data and the sound signal when stream reproduction of the sound source data is completed.
In one embodiment, the display unit displays score data corresponding to the sound source data and the sound signal output from the main body in the first mode, and displays score data corresponding to the sound source data and the sound signals output from the main body and the target device in the second mode.
In one embodiment, when the sound signal includes a plurality of voice characteristics, the controller extracts the plurality of voice characteristics included in the sound signal and provides information about the extracted voice characteristics to the server, and the display unit displays a plurality of score data, received from the server, corresponding respectively to the plurality of voice characteristics.
In one embodiment, the controller may generate a recording file of the sound source data and the sound signal output through the audio output unit, and transmit the generated recording file to the server in accordance with a first control signal so that it can be provided to the user.
In one embodiment, the display unit is configured to receive touch input and outputs an image object corresponding to the recording file in accordance with the first control signal. When a first touch input is applied to the image object, a playback function for the recording file is performed; when a second touch input is applied to the image object, an editing function is performed on the recording file to generate a second recording file, and the second recording file is provided to the target device.
In one embodiment, the wireless communication unit transmits the sound source data to the target device using a near field radio signal when the target device is close to the main body.
In one embodiment, the sound source data is selected from sound source data stored in the server or in a memory associated with the server.
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, comprising: connecting to a server through a network; selecting sound source data capable of stream reproduction from the server; transmitting, to the server, a sharing request message for the selected sound source data addressed to a target device; and, when a response message from the target device connected to the server is received from the server, controlling the sound source data to be stream-reproduced simultaneously on the main body and the target device based on a control signal.
In one embodiment, the control method further comprises: receiving a control command for executing either a first mode or a second mode, which differ in whether the sound source data is shared with the target device; controlling the sound source data to be stream-reproduced based on a first control signal in the first mode; and controlling the sound source data to be stream-reproduced simultaneously on the main body and the target device based on a second control signal in the second mode.
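As a rough illustration of the claimed control flow, the sequence above (connect, select, request sharing, and begin simultaneous playback upon acceptance) might be sketched as follows. The class and method names are hypothetical; the patent does not disclose an implementation.

```python
# Hypothetical sketch of the claimed sharing flow; all names are illustrative.
class Server:
    def __init__(self, catalog):
        self.catalog = catalog        # sound source data available for streaming
        self.connected = set()

    def connect(self, device):
        self.connected.add(device)

    def forward_share_request(self, sender, target, track):
        # The server relays the sharing request and returns the target's response.
        if target in self.connected:
            return target.respond_to_share(sender, track)
        return "Reject"

class Device:
    def __init__(self, name, accept=True):
        self.name = name
        self.accept = accept
        self.now_playing = None

    def respond_to_share(self, sender, track):
        return "Accept" if self.accept else "Reject"

    def play_stream(self, track):
        self.now_playing = track

def share_and_play(server, body, target, track):
    """Connect both devices, relay the sharing request via the server, and
    on an 'Accept' response start stream reproduction on body and target."""
    server.connect(body)
    server.connect(target)
    response = server.forward_share_request(body, target, track)
    if response == "Accept":
        body.play_stream(track)
        target.play_stream(track)
    return response

server = Server(["song A"])
body, target = Device("body"), Device("target")
result = share_and_play(server, body, target, "song A")
```

The "Accept"/"Reject" strings mirror the response messages described later in the specification; a declined request leaves both devices idle.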
According to the mobile terminal and control method of embodiments of the present invention, a selected sound source can be shared with a target device connected to the server, and when the sound source data is reproduced it is simultaneously reproduced on any target device that responded to the sharing request, making it possible to realize a function of singing together on a plurality of mobile terminals that are spaced apart from one another.
FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
FIGS. 2A and 2B are perspective views illustrating the appearance of a mobile terminal according to an embodiment of the present invention.
FIGS. 3A and 3B are diagrams illustrating examples of a connection between a mobile terminal and a server according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating the operation modes, which differ depending on whether selected sound source data is shared, according to an embodiment of the present invention.
FIG. 6 is a flowchart illustrating a control method in which a plurality of users sing along with sound source data stream-reproduced on a single mobile terminal, according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a control method for sharing a recording file, generated along with stream-reproduced sound source data, with other electronic devices, according to an embodiment of the present invention.
FIG. 8A is a conceptual diagram illustrating the control method of FIG. 4, according to an embodiment of the present invention.
FIG. 8B is a conceptual diagram illustrating the method of FIG. 7, according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings so that those skilled in the art can easily carry out the technical idea of the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and like parts are denoted by similar reference numerals throughout the specification.
The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where a configuration is applicable only to mobile terminals.
FIG. 1 is a block diagram illustrating a mobile terminal according to an embodiment of the present invention.
Referring to FIG. 1, the mobile terminal may include the components described below.
Hereinafter, the above components will be described in order.
The wireless communication unit may include one or more modules that enable wireless communication between the mobile terminal and a wireless communication system or network, such as the broadcast receiving module, mobile communication module, wireless Internet module, short-range communication module, and position information module described below.
The broadcast receiving module 111 receives broadcast signals and broadcast related information from an external broadcast management server through a broadcast channel. Here, the broadcast-related information means information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network. In this case, the broadcast-related information may be received by the
The
The
The short-
The position information module 115 is a module for acquiring the position of the
1, an A / V (Audio / Video)
The
The
The
The
The touch sensor may have the form of a touch film, a touch sheet, a touch pad, or the like. The touch sensor may be configured to convert a pressure applied to a specific portion of the
When the touch sensor and the
If there is a touch input via the touch screen, corresponding signals are sent to a touch controller (not shown). The touch controller processes the signals transmitted from the touch sensor, and then transmits data corresponding to the processed signals to the
In the case where the touch screen is electrostatic, it can be configured to detect the proximity of a sensing object from changes in the electric field caused by the object's approach. Such a touch screen may be classified as a proximity sensor 141.
The proximity sensor 141 refers to a sensor that detects the presence or absence of an object to be sensed without mechanical contact using an electromagnetic force or infrared rays. The proximity sensor 141 has a longer life than the contact type sensor and its utilization is also high. Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
Hereinafter, for convenience of explanation, the action of a sensing object approaching the touch screen without contacting it is referred to as a "proximity touch," while the action of the sensing object actually contacting the touch screen is referred to as a "contact touch."
The proximity sensor 141 detects the presence or absence of a proximity touch and proximity touch patterns (for example, the proximity touch distance, direction, speed, time, position, and movement state), and information corresponding to the detected proximity touch operation and pattern can be output to the touch screen.
The
The
The
At least one display (or display element) included in the
There may be two or
The
The
The haptic module 155 generates various tactile effects that the user can feel. A typical example is vibration; the intensity and pattern of the vibration generated by the haptic module 155 are controllable. For example, different vibrations may be synthesized and output, or output sequentially.
In addition to vibration, the haptic module 155 can generate various other effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through an injection or suction port, a brush against the skin surface, contact with an electrode, an electrostatic force, and the sensation of warmth or cold using a heat-absorbing or heat-emitting element.
The haptic module 155 can be configured not only to transmit tactile effects through direct contact but also to allow the user to feel them through the muscular sense of a finger or arm. Two or more haptic modules 155 may be provided depending on the configuration of the mobile terminal.
The
The
The
The identification module is a chip for storing various information for authenticating the usage right of the
The
The
The
The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using software, hardware, or a combination thereof.
According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and other electronic units for performing the described functions. In some cases, such embodiments may be implemented by the controller itself.
According to a software implementation, embodiments such as the procedures and functions described herein may be implemented with separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented in a software application written in a suitable programming language. Such software codes may be stored in the
Hereinafter, a method of processing user input to the
The
Various kinds of time information can be displayed on the
The
The
One function of the
2A and 2B are perspective views showing the appearance of the
FIG. 2A shows a front side and one side of the
Referring to FIG. 2A, the
The terminal body includes a case (a casing, a housing, a cover, and the like) that forms an appearance. In the embodiment, the case may be divided into a
The cases may be formed by injection-molding a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).
The
The
The
The first or
Referring to FIG. 2B, a rear camera 121 'may be additionally mounted on the rear surface of the terminal body, that is, the
For example, the
Meanwhile, the
The
A rear sound output unit 152 'may be additionally disposed on the rear surface of the terminal body. The rear sound output unit 152 'may perform a stereo function together with the front sound output unit 152 (see FIG. 2A), and may perform a speakerphone function during a call.
In addition to the antenna for communication, an
A
The
The
3A is a conceptual diagram illustrating a communication network to which a mobile terminal and a server are connected according to an embodiment of the present invention. The communication network to which the mobile terminal and the server are connected is composed of an electronic device and an IP (Internet Protocol) server.
Referring to FIG. 3A, the electronic device may include all electronic devices capable of wireless communication such as a mobile terminal, a laptop, and a television.
The electronic device can transmit data to the RNC control station through the Node B base station. An RNC (Radio Network Controller) control station is also called a radio network control station. The RNC control station can perform functions for radio resource management of the asynchronous mobile communication system, management of the base station in the wireless subscriber network, and management of interfaces between the radio network controller and other network elements.
The data transmitted to the RNC control station may be transmitted to the IP server via the network. Networks can be classified as packet networks and circuit networks.
That is, when wireless communication is established between the electronic device and the IP server, the data is divided into packet units; if the packets are delivered to the IP server over possibly different paths, packet by packet, the data is transmitted through the packet network. In this case, the data path can be shared with other packets during transmission.
On the other hand, if the data travels to the IP server over the same path from the establishment to the release of the wireless connection, the data is transmitted through the circuit network. In this case, a predetermined data path is used exclusively for the data transmission.
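The distinction drawn above, packet-switched delivery over varying shared paths versus circuit-switched delivery over one dedicated path, can be illustrated with a toy sketch. This is purely illustrative; real SGSN/GGSN packet handling is far more involved.

```python
# Toy illustration: packet switching splits data into sequence-numbered units
# that may arrive out of order over different paths; reassembly restores the
# original byte stream.
def packetize(data: bytes, mtu: int):
    """Split data into (sequence number, payload) packets of at most mtu bytes."""
    return [(seq, data[i:i + mtu])
            for seq, i in enumerate(range(0, len(data), mtu))]

def reassemble(packets):
    """Sequence numbers restore the original order regardless of arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"sound source data stream"
packets = packetize(message, mtu=5)
packets.reverse()                      # simulate out-of-order arrival
restored = reassemble(packets)
```

In the circuit-switched case there is nothing to reorder: the single dedicated path preserves ordering by construction, at the cost of holding the path for the whole connection.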
In more detail, the packet network may include an SGSN and a GGSN.
Here, Serving GPRS Support Node (SGSN) is also referred to as a packet switching support node. SGSN refers to a node that is responsible for delivering data packets to and from a mobile station within a service area. The SGSN can perform packet routing and transmission functions. The location register of the SGSN may store location information of a General Packet Radio Service (GPRS) user registered in the SGSN, a user profile (e.g., International Mobile Station Identification Number (IMSI)), and the like.
Next, the Gateway GPRS Support Node (GGSN), also referred to as a packet gateway support node, is the node responsible for connecting the GPRS backbone network to external packet data networks. The GGSN converts GPRS packets received from the SGSN into an appropriate Packet Data Protocol (PDP) format and transmits them to the IP server, and can also convert the PDP address of incoming packet data into the recipient's Global System for Mobile Communications (GSM) address. In addition, the GGSN can store the user's profile and the address of the SGSN registered in the SGSN's location register.
On the other hand, the circuit network includes a mobile switching center (MSC). The MSC, also called the mobile switching center, can control the entire system.
Specifically, the MSC may select the path through which data received from the RNC control station is to be forwarded to the IP server. To this end, the MSC can perform control functions necessary for the mobility of electronic devices and the efficient operation of frequency resources. In addition, the MSC can perform a central control function of processing signals originated or received from the Node B base station and adjusting the Node B base station so that it can be operated efficiently.
As described above, data is transmitted from the electronic device to the RNC control station through the Node B base station, and data transmitted to the RNC control station can be transmitted to the IP server through the packet network and the circuit network.
3B is a conceptual diagram illustrating an example of a communication network to which a mobile terminal and a server are connected according to an embodiment of the present invention. As shown in FIG. 3B, the communication network may include a
Here, the electronic device may be at least one of electronic devices capable of wireless communication such as a mobile terminal, a laptop, and a television, as shown in FIG. 3A.
The
Meanwhile, the
In order to connect the
Herein, "connecting" the
As described above, the connection between the
Hereinafter, a method of sharing sound source data of another electronic device with the mobile terminal in a state where the mobile terminal and the server are connected will be described with reference to FIGS. 4 and 8A. FIG. 4 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention. FIG. 8A is a conceptual diagram illustrating a control method of FIG.
Meanwhile, in the following embodiments, an electronic device will be described as an example of a mobile terminal, and the mobile terminal may include at least one of the components discussed above with reference to FIGS. 1, 2A, and 2B.
4, a
Here, the
In addition, 'stream reproduction' means that the selected sound source data is transmitted in the form of streaming from the
In addition, the 'server' disclosed herein can store and provide at least one piece of sound source data, can calculate score data (for example, a song score and a rank) based on an acoustic signal received from the outside, and can provide lyrics data synchronized with the sound source data together with the corresponding lyrics synchronization data.
The
In order to share sound source data with the
When the response message of the
Hereinafter, a method for controlling the mobile terminal 100a to share sound source data with the
First, the
On the other hand, if the connection to the
When the sound source data is selected as described above, the
Here, the friend list means a list of target devices in which identification information is stored in at least one of the
In this case, the identification information may be a Uniform Resource Identifier (URI), for example at least one of a tel URI and a SIP URI.
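For illustration, tel URIs and SIP URIs can be distinguished by their scheme. The sketch below uses only the Python standard library and is a minimal stand-in; real identifier handling would involve full RFC 3966 (tel) and RFC 3261 (SIP) parsing.

```python
# Minimal sketch: classify a device identifier by its URI scheme.
from urllib.parse import urlparse

def classify_identifier(uri: str) -> str:
    """Return 'tel', 'sip', or 'unknown' based on the URI scheme."""
    scheme = urlparse(uri).scheme.lower()
    if scheme == "tel":
        return "tel"
    if scheme in ("sip", "sips"):
        return "sip"
    return "unknown"
```

For example, `classify_identifier("tel:+82-10-1234-5678")` yields `"tel"` and `classify_identifier("sip:user@example.com")` yields `"sip"`; a server could use such a scheme check to decide how to route a sharing request to a target device.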
When the target device to share sound source data is selected, the
In response to the sharing request message being transmitted, the
When the 'Accept' response message is received from the
That is, when a control signal for instructing stream reproduction of sound source data is input through the
At this time, if stream reproduction is stopped in the
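Simultaneous reproduction implies synchronized control: a play or stop issued on one device must be mirrored on the others so the devices stay in step. A sketch of such a controller (hypothetical names; the patent does not specify the mechanism):

```python
# Hypothetical sketch: a controller that mirrors play/pause state across all
# devices sharing the same stream-reproduced sound source data.
class SyncedPlayback:
    def __init__(self, devices):
        self.devices = devices
        self.state = "stopped"

    def _broadcast(self, state):
        # Propagate the new state to every participating device.
        self.state = state
        for device in self.devices:
            device["state"] = state

    def play(self):
        self._broadcast("playing")

    def pause(self):
        # A pause triggered anywhere pauses playback everywhere.
        self._broadcast("paused")

body, target = {"name": "body"}, {"name": "target"}
session = SyncedPlayback([body, target])
session.play()
session.pause()
```

In a real system the broadcast would travel through the server rather than a shared in-process list, and network latency would have to be compensated to keep the streams aligned.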
Meanwhile, in another embodiment, when the target device is close to the
When the sound source data is simultaneously reproduced in the
At this time, the
8A, a
8A, when the song reproduction is completed, that is, when stream reproduction of the sound source data is completed, the
As described above, embodiments of the present invention can share a selected sound source with a target device connected to a server and simultaneously reproduce the sound source data on any target device that responds to the sharing request, making it possible to implement a function of singing together on a plurality of mobile terminals that are spaced apart from one another.
Hereinafter, a control method of the mobile terminal that varies depending on whether or not the sound source data is shared will be described with reference to FIGS. 1, 4, and 5. In this regard, FIG. 5 is a flowchart illustrating the operation modes, which differ depending on whether selected sound source data is shared, according to an embodiment of the present invention.
Referring to FIG. 5, the
When the sound source data is selected, the
If the first operation mode is selected, that is, the mode is not the sound source data sharing mode, the
If the second operation mode is selected, that is, in the sound source data sharing mode, the
More specifically, when the second operation mode is selected, the
When the stream reproduction in steps S530 and S560 is completed, the score data received from the server is displayed on the display unit of the mobile terminal and/or the target device (S570).
Specifically, in the first mode, only the score data corresponding to the sound source data and the sound signal output from the
As described above, the user can choose to stream the sound source data to his or her own mobile terminal only, or to stream it simultaneously to both the mobile terminal and the target device and sing along with other users, thereby satisfying various user preferences and providing convenience.
Hereinafter, a control method that enables a plurality of users to sing together even when the sound source data is set to be reproduced only on the user's own mobile terminal will be described. In this regard, FIG. 6 is a flowchart illustrating a control method in which a plurality of users sing along with sound source data stream-reproduced on a single mobile terminal, according to an embodiment of the present invention.
Referring to FIG. 6, when sound source data capable of stream reproduction is selected by the
The
At this time, the output acoustic signal may include a plurality of voice characteristics. That is, when the sound signal input along with the stream-reproduced sound source data is voice input simultaneously by a plurality of users, the output acoustic signal includes a plurality of voice characteristics.
The
As a result of the determination, if the sound signal is a sound signal input by a single user, the sound signal is directly transmitted to the
On the other hand, if it is determined in step S620 that the sound signal was input by a plurality of users, that is, if the output acoustic signal includes a plurality of voice characteristics, the controller extracts each of the voice characteristics and provides the extracted voice characteristic information to the server 200 (S640).
For example, when the output sound signal includes the sounds of the
As described above, since the selected sound source data is stream-reproduced on a single mobile terminal, a plurality of users can sing together, providing the enjoyment of singing with others and a realistic environment similar to an actual karaoke room.
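The per-user scoring described above can be illustrated with a toy sketch. The patent does not disclose how voice characteristics are actually separated or how the server computes scores; the frame tagging, pitch values, and scoring rule below are invented purely for the example.

```python
# Toy sketch: given frames already tagged with a per-user voice characteristic,
# group the frames by user and compute one score per user.
from collections import defaultdict

def split_by_voice(frames):
    """frames: list of (user_id, sample) pairs extracted from the acoustic
    signal; returns {user_id: [samples]} for per-user scoring."""
    by_user = defaultdict(list)
    for user_id, sample in frames:
        by_user[user_id].append(sample)
    return dict(by_user)

def score(samples, reference):
    """Invented scoring rule: 100 minus the mean absolute deviation from a
    reference melody line, floored at 0."""
    dev = sum(abs(s - r) for s, r in zip(samples, reference)) / len(reference)
    return max(0.0, 100.0 - dev)

# Two users ("A" and "B") singing simultaneously; values are pitch in Hz.
frames = [("A", 440.0), ("B", 438.0), ("A", 442.0), ("B", 430.0)]
reference = [440.0, 440.0]
scores = {user: score(s, reference)
          for user, s in split_by_voice(frames).items()}
```

Here user A's frames deviate by 0 and 2 Hz (score 99.0) while user B's deviate by 2 and 10 Hz (score 94.0), so the display unit would show two distinct score entries, one per detected voice characteristic.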
Hereinafter, with reference to FIGS. 1, 4, 7, and 8B, a control method for sharing, with other users, a recording file created while singing along with stream-reproduced sound source data will be described. In this regard, FIG. 7 is a flowchart illustrating a control method for sharing a recording file, generated along with stream-reproduced sound source data, with other electronic devices, according to an embodiment of the present invention.
Referring to FIG. 7, when sound source data capable of stream reproduction is selected, stream reproduction of the selected sound source data is performed.

Accordingly, a sound signal input through the microphone may be output together with the stream-reproduced sound source data.
As stream reproduction of the sound source data proceeds, a recording file for the reproduced sound source data and the output sound signal is generated.
When the recording file is generated, the generated recording file may be provided simultaneously to the main body and the target device connected to the server.

When the recording file is generated as described above, an image object corresponding to the generated recording file may be output on the display unit.
Referring to FIG. 8B, when the screen information related to stream reproduction of the selected sound source data is output to the display unit, an image object corresponding to the generated recording file may be output together with the screen information.
Referring to FIG. 7 again, different functions for the provided recording file may be performed according to the type of touch input applied to the image object.
Specifically, when a first touch input is applied to the output image object, a playback function for the provided recording file is performed, and when a second touch input is applied to the output image object, an editing function for the provided recording file is performed to generate a second recording file. The generated second recording file is provided to the mobile terminal main body and the target device connected to the server according to a second control signal.
On the other hand, when the editing function for the recording file is performed and the second recording file is generated (S760), the above-described steps (S730 to S760) can be repeatedly performed.
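The touch-input dispatch just described (first touch → playback, second touch → edit, then distribute the second recording file) can be sketched as below. The touch labels, the string-based "edit", and the device lists are illustrative assumptions, not the patent's actual implementation.

```python
def on_touch(touch, recording, devices):
    """Dispatch a touch input applied to the recording-file image object."""
    if touch == "first":
        # First touch input: perform the playback function for the file.
        return ("play", recording)
    if touch == "second":
        # Second touch input: perform the editing function to create a
        # second recording file, then provide it to the main body and the
        # target device (corresponding to the second control signal).
        second = recording + "-edited"
        for device in devices:
            device.append(second)
        return ("edit", second)
    return ("none", None)
```

Because the edited file is itself a recording file, dispatching a further second touch on it would repeat the edit-and-distribute cycle, mirroring the repetition of steps S730 to S760.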
Referring to FIG. 8B, an image object corresponding to the generated recording file may be output on the display unit.
At this time, if the touch input of the user is applied to the image object, a pop-up window for controlling the recording file may be output.
When the 'play' key is selected in the pop-up window, the provided recording file is played back.
As described above, according to the mobile terminal and the control method thereof according to the embodiment of the present invention, a selected sound source can be shared with a target device connected to a server, and by simultaneously reproducing the sound source data on the mobile terminal and the target device, a function of singing together on a plurality of mobile terminals located apart from each other can be implemented.
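The core sharing flow summarized above (sharing request message → response message from the target device → simultaneous stream reproduction) can be sketched as follows. The `Device`/`Server` classes and method names are stand-ins chosen for the sketch; the patent only specifies the message exchange, not this API.

```python
class Device:
    """Stand-in for the mobile terminal main body or the target device."""
    def __init__(self, accept=True):
        self.accept = accept
        self.playing = None

    def respond(self, sound_id):
        # The target device answers the sharing request relayed by the server.
        return self.accept

    def play(self, sound_id):
        self.playing = sound_id

class Server:
    """Stand-in relay corresponding to the patent's server 200."""
    def share(self, source, target, sound_id):
        # Forward the sharing request message for the sound source data to
        # the target device and wait for its response message.
        if target.respond(sound_id):
            # On a positive response, the sound source data is stream-
            # reproduced simultaneously on the main body and the target device.
            source.play(sound_id)
            target.play(sound_id)
            return True
        return False
```

The point of the design is that the server, not the source terminal, mediates the request/response exchange, so the two devices only need a connection to the server to start synchronized playback.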
Further, according to the embodiments disclosed herein, the above-described method can be implemented as processor-readable code on a medium on which a program is recorded. Examples of the processor-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the method may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).
The above-described mobile terminal and its control method are not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
Claims (11)
A wireless communication unit for connecting to a server through a network and receiving sound source data capable of stream reproduction; And
A control unit which transmits, to the server, a sharing request message of the sound source data for a target device, and, when a response message of the target device connected to the server is received from the server, controls the wireless communication unit so that the sound source data is simultaneously stream-reproduced on the main body and the target device.
Further comprising a user input unit for receiving a control command for executing one of a first mode and a second mode which differ according to whether the sound source data is shared with the target device,
Wherein,
In the first mode, the sound source data is stream-reproduced based on a first control signal, and in the second mode, the sound source data is simultaneously reproduced on the main body and the target device based on a second control signal.
A microphone;
A sound output unit for outputting the sound signal input through the microphone together with the stream-reproduced sound source data; And
Further comprising a display unit for displaying score data, received from the server, corresponding to the sound source data and the output sound signal when stream reproduction of the sound source data is completed.
The display unit includes:
In the first mode, displays score data corresponding to the sound source data and the sound signal output from the main body, and
In the second mode, displays score data corresponding to the sound source data and the sound signals output from the main body and the target device, respectively.
When the sound signal includes a plurality of voice characteristics,
Wherein the control unit extracts a plurality of voice characteristics included in the sound signal and provides information about the extracted voice characteristics to the server,
Wherein the display unit displays a plurality of score data corresponding to the plurality of voice characteristics received from the server, respectively.
Wherein,
Generates a recording file for the sound source data and the sound signal output through the sound output unit, and simultaneously provides the generated recording file to the main body and the target device connected to the server, according to the first control signal.
Wherein the display unit is configured to enable touch input and outputs an image object corresponding to the recording file according to the first control signal,
Wherein,
When a first touch input is applied to the image object, a playback function for the recording file is performed, and when a second touch input is applied to the image object, an editing function for the recording file is performed to generate a second recording file, and the second recording file is provided to the main body and the target device according to a second control signal.
The wireless communication unit includes:
And transmits the sound source data to the target device using a near-field wireless signal when the target device approaches the main body.
Wherein the sound source data is sound source data selected from among sound source data stored in the memory of the main body, in the server, and in a memory interworking with the server.
Selecting sound source data capable of reproducing a stream from the server;
Transmitting, to the server, a sharing request message of the selected sound source data for the target device; And
And controlling the sound source data to be simultaneously reproduced on the main body and the target device based on a control signal when the response message of the target device connected to the server is received from the server.
Receiving a control command for executing one of a first mode and a second mode which differ according to whether the sound source data is shared with the target device; And
Controlling the sound source data to be stream-reproduced based on a first control signal in the first mode, and controlling the sound source data to be simultaneously reproduced on the main body and the target device based on a second control signal in the second mode.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130033747A KR20140118220A (en) | 2013-03-28 | 2013-03-28 | Mobile terminal and control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020130033747A KR20140118220A (en) | 2013-03-28 | 2013-03-28 | Mobile terminal and control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20140118220A true KR20140118220A (en) | 2014-10-08 |
Family
ID=51991105
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020130033747A KR20140118220A (en) | 2013-03-28 | 2013-03-28 | Mobile terminal and control method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20140118220A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024058568A1 (en) * | 2022-09-16 | 2024-03-21 | 삼성전자주식회사 | Singing mode operation method and electronic device performing same |
- 2013-03-28 KR KR1020130033747A patent/KR20140118220A/en not_active Application Discontinuation
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101588730B1 (en) | Mobile terminal and method for communicating using instant messaging service thereof | |
CN102467343B (en) | Mobile terminal and the method for controlling mobile terminal | |
KR101917691B1 (en) | Mobile terminal and control method thereof | |
KR101887453B1 (en) | Mobile terminal and control method thereof | |
US9182901B2 (en) | Mobile terminal and control method thereof | |
KR101990040B1 (en) | Mobile terminal and cloud system using the mobile terminal | |
KR101917696B1 (en) | Mobile terminal and control method thereof | |
KR101688145B1 (en) | Method for reproducing moving picture and mobile terminal using this method | |
KR20140062886A (en) | Mobile terminal and control method thereof | |
KR101870181B1 (en) | Mobile terminal and control method thereof | |
KR101952178B1 (en) | Mobile terminal and cloud system using the mobile terminal | |
KR101672215B1 (en) | Mobile terminal and operation method thereof | |
KR101977259B1 (en) | Mobile terminal and cloud system using the mobile terminal | |
KR20160087969A (en) | Mobile terminal and dual lcd co-processing method thereof | |
CN106331797A (en) | Mobile terminal and method for controlling the same | |
KR101739387B1 (en) | Mobile terminal and control method thereof | |
KR20160006518A (en) | Mobile terminal | |
KR20140118220A (en) | Mobile terminal and control method thereof | |
KR20110136589A (en) | Mobile terminal and operating method thereof | |
KR101781453B1 (en) | Electronic device, account management method thereof, and account management system using the same | |
KR20170083905A (en) | Mobile terminal and method for controlling the same | |
KR101598226B1 (en) | Method for transmitting data related moving image method for displaying moving image and mobile terminal using the same | |
KR101984088B1 (en) | Mobile terminal and cloud system | |
KR20130030691A (en) | Mobile terminal and electronic device control system using the same | |
KR101966947B1 (en) | Mobile terminal and cloud system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |