KR20170073901A - Mobile terminal and the control method thereof - Google Patents
- Publication number
- KR20170073901A (applications KR1020150182658A, KR20150182658A)
- Authority
- KR
- South Korea
- Prior art keywords
- mobile terminal
- content
- user
- terminal
- area
- Prior art date
Classifications
- H04M1/7253
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06Q50/30
- H04M2201/42—Graphical user interfaces (electronic components, circuits, software, systems or apparatus used in telephone systems)
Abstract
A mobile terminal is disclosed. The mobile terminal includes a display unit, a communication unit for communicating with another terminal, and a control unit. The control unit transmits content to the other terminal through the communication unit and displays, on the display unit, an interface for selecting a first virtual position of the mobile terminal user or a second virtual position of the other terminal user for viewing the content, relative to the area of the virtual space in which the content is played back. The control unit controls the mobile terminal to display the content in one area of the virtual space according to a first view of the playback area from the first virtual position, and controls the other terminal to display the content in one area of the virtual space according to a second view of the playback area from the second virtual position.
Description
BACKGROUND OF THE INVENTION
A terminal can be divided into a mobile (portable) terminal and a stationary terminal depending on whether it can be moved. A mobile terminal can be further divided into a handheld terminal and a vehicle-mounted terminal depending on whether the user can carry it directly.
The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide electronic game play or multimedia player functions. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts and video or television programs.
As such functions diversify, the terminal is implemented in the form of a multimedia device with complex functions such as capturing photos or video, playing music or video files, gaming, and receiving broadcasts.
To support and enhance such terminal functions, improvement of the structural and/or software parts of the terminal may be considered.
Virtual reality (VR), on the other hand, refers to an environment or situation, created by computer graphics, that resembles the real world. The user can interact with the virtual reality in real time by manipulating a device and can have a sensory experience similar to reality. Augmented reality (AR) is a computer graphics technique that overlays virtual objects or information on a real environment so that they appear to be objects present in the original environment. A technique combining the two is referred to as mixed reality (MR).
Therefore, it is necessary to implement various functions for realizing a virtual reality or augmented reality through a terminal.
It is an object of the present invention to provide the same effect as viewing content together in a real space, by displaying a view of the content image according to the virtual position of each of a plurality of users.
To achieve the above object, a mobile terminal according to an embodiment of the present invention includes a display unit, a communication unit for communicating with another terminal, and a control unit. The control unit transmits content to the other terminal through the communication unit; displays, on the display unit, an interface for selecting at least one of a first virtual position of the mobile terminal user or a second virtual position of the other terminal user for viewing the content, relative to the area of the virtual space in which the content is played back; controls the mobile terminal to display the content in one area of the virtual space according to a first view of the playback area from the first virtual position; and controls the other terminal to display the content in one area of the virtual space according to a second view of the playback area from the second virtual position.
In addition, the control unit may display a background selection screen for selecting a background screen in the virtual space other than the playback area on the display unit.
In addition, the control unit may display, on one side of the display unit, an icon indicating at least one of the emotion and the viewing state of the other terminal user for a specific scene of the content.
Further, the control unit may transmit, to the other terminal through the communication unit, text or an icon indicating at least one of the emotional state and the viewing state of the mobile terminal user for a specific scene of the content.
Also, the control unit can upload the text or the icon to a social network service (SNS) of the user's account.
In addition, the controller may display a virtual position of each of a plurality of other terminal users viewing the content.
The mobile terminal may further include a microphone, and the control unit may transmit the voice of the mobile terminal user to at least one of the plurality of other terminal users during playback of the content.
The mobile terminal may further include a sensor for sensing the posture of the mobile terminal user, and the control unit may, according to the sensed posture, transmit the voice of the mobile terminal user to, and receive the voice of, the other terminal user located on either the left or the right side of the virtual position of the mobile terminal user.
In addition, the controller may control at least one of a playback speed and a playback interval of the content transmitted to the other terminal.
In addition, the control unit may display at least one of an icon indicating a playback speed and a progress bar indicating a playback interval on one side of the display unit.
In addition, the control unit may control at least one of the playback speed and the playback interval for each of the plurality of other terminals.
In addition, the control unit may display a playback screen of another terminal on one side of the display unit.
Meanwhile, a mobile terminal according to another embodiment of the present invention includes a display unit, a communication unit for communicating with another terminal, and a control unit. The control unit receives content from the other terminal through the communication unit, displays on the display unit an interface for selecting a virtual position of the mobile terminal user for viewing the content relative to the area of the virtual space in which the content is played back, and controls the mobile terminal to display the content in one area of the virtual space according to a view of the playback area from the selected virtual position.
Meanwhile, a mobile terminal according to still another embodiment of the present invention includes a display unit, a communication unit for communicating with another terminal and a content server, and a control unit. The control unit receives connection information of the content server from the other terminal through the communication unit, receives the content from the content server, displays on the display unit an interface for selecting a virtual position of the mobile terminal user for viewing the content relative to the area of the virtual space in which the content is played back, and controls the mobile terminal to display the content in one area of the virtual space according to a view of the playback area from the selected virtual position.
Meanwhile, a method of controlling a mobile terminal according to an embodiment of the present invention includes: transmitting content to another terminal; displaying, on the display unit, an interface for selecting at least one of a first virtual position of the mobile terminal user or a second virtual position of the other terminal user for viewing the content, relative to the area of the virtual space in which the content is played back; controlling the mobile terminal to display the content in one area of the virtual space according to a first view of the playback area from the first virtual position; and controlling the other terminal to display the content in one area of the virtual space according to a second view of the playback area from the second virtual position.
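The claimed control method can be illustrated with a short sketch. This is purely an editor's illustration under assumed names (Command, control_method, and the seat/view labels appear nowhere in the patent): the sending terminal transmits the content, records each selected virtual position, and issues one display command per terminal carrying that terminal's view of the playback area.

```python
# Hypothetical sketch of the claimed control method; every name here
# is an assumption for illustration and does not appear in the patent.
from dataclasses import dataclass

@dataclass
class Command:
    terminal: str   # which terminal should render the content
    content: str    # the transmitted content
    seat: str       # the selected virtual position
    view: str       # the view of the playback area from that seat

# Assumed mapping from a seat to the view of the playback area.
VIEWS = {"left": "angled-right", "center": "head-on", "right": "angled-left"}

def control_method(content, my_seat, other_seats):
    """Transmit content and build one display command per terminal."""
    commands = [Command("self", content, my_seat, VIEWS[my_seat])]
    for terminal, seat in other_seats.items():
        commands.append(Command(terminal, content, seat, VIEWS[seat]))
    return commands

for c in control_method("concert.vr", "center", {"friend": "left"}):
    print(f"{c.terminal}: show {c.content} from the {c.seat} seat ({c.view})")
```

Each terminal then renders the same content, but from its own view, which is the core of the claim above.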
According to at least one of the embodiments of the present invention, a virtual reality can be provided in which even remote users view the same content as if in the same space.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1 is a block diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 2 is a conceptual diagram of a mobile terminal according to an embodiment of the present invention;
FIGS. 3 to 4 show various examples in which a mobile terminal according to an embodiment of the present invention is mounted on a wearable device;
FIG. 5 is a view illustrating a wearable apparatus equipped with a mobile terminal according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating a method of controlling a mobile terminal according to an embodiment of the present invention;
FIGS. 7 to 8 are diagrams for explaining a method of transmitting content to another terminal in a mobile terminal according to an embodiment of the present invention;
FIGS. 9 to 10 are diagrams for explaining a method of selecting a background screen in a mobile terminal according to an embodiment of the present invention;
FIGS. 11 to 12 are diagrams for explaining a method of selecting a virtual position of a user in a mobile terminal according to an embodiment of the present invention;
FIGS. 13 to 15 are views for explaining a view of content according to a selected virtual position in a mobile terminal according to an embodiment of the present invention;
FIGS. 16 to 18 are views for explaining screens of another terminal receiving content according to an embodiment of the present invention;
FIGS. 19 to 25 are views for explaining a screen on which content is played back in a mobile terminal according to an embodiment of the present invention;
FIGS. 26 to 32 are diagrams for explaining voice transmission/reception among a plurality of users in a mobile terminal according to an embodiment of the present invention; and
FIGS. 33 to 41 are diagrams for explaining a method of controlling content playback in a mobile terminal according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably in consideration of ease of drafting, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art is omitted when it is determined that it may obscure the gist of the embodiments. The accompanying drawings are provided only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by them and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that no intervening elements are present.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
The terms "comprising" or "having," as used in this application, are intended to specify the presence of stated features, integers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, parts, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smartphone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except where applicable only to mobile terminals.
FIG. 1 is an exemplary block diagram of the mobile terminal.
More specifically, the
The
The
The
The
The
In addition, the
In addition to the operations associated with the application program, the
In addition, the
The
At least some of the components may operate in cooperation with each other to implement a method of operation, control, or control of the mobile terminal according to various embodiments described below. The method of operation, control, or control of the mobile terminal may also be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the various components of the
First, referring to the
The
The wireless signal may include various types of data depending on a voice call signal, a video call signal, or a text / multimedia message transmission / reception.
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi (Wireless Fidelity) Direct, DLNA (Digital Living Network Alliance), WiBro for example, a
The
The short-
Here, the other
The
Next, the
The
The
Meanwhile, the
First, the
Examples of the
For convenience of explanation, the act of recognizing that an object is located on the touch screen without the object contacting the touch screen is referred to as a "proximity touch," and the act of the object actually contacting the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen means the position at which the object corresponds vertically to the touch screen when the object is proximity-touched. The
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and the like.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
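As a rough illustration of the conversion just described, and only that (the baseline, threshold, and function name are editorial assumptions, not values from the patent), a capacitance change at one sensor node can be turned into an input signal like this:

```python
# Illustrative only: convert a capacitance reading at one touch-sensor
# node into a touch event. BASELINE and THRESHOLD are assumed values.
BASELINE = 100.0   # idle capacitance reading, arbitrary units
THRESHOLD = 15.0   # minimum change treated as a touch

def to_touch_event(row, col, reading):
    """Report position, touch state, and the sensed change (a proxy for pressure)."""
    delta = reading - BASELINE
    return {"row": row, "col": col, "touched": delta > THRESHOLD, "delta": delta}

print(to_touch_event(3, 7, 130.0))   # a finger raises capacitance well past the threshold
print(to_touch_event(3, 7, 104.0))   # small drift stays below the threshold: no touch
```

In a real controller this event would then be processed and forwarded to the control unit, as the surrounding description outlines.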
If there is a touch input to the touch sensor, the corresponding signal (s) is sent to the touch controller. The touch controller processes the signal (s) and transmits the corresponding data to the
On the other hand, the
Meanwhile, the touch sensor and the proximity sensor described above can be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
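A minimal sketch of how a few of the touch types listed above might be distinguished from raw touch data (the time, distance, and speed thresholds are assumptions for illustration; the patent does not specify them):

```python
# Assumed thresholds for telling apart the touch types named in the text.
LONG_MS = 500       # boundary between short touch and long touch
MOVE_PX = 10        # boundary between a stationary touch and a moving one
FLICK_SPEED = 1.0   # px/ms boundary between drag and flick

def classify(duration_ms, distance_px, speed_px_per_ms):
    """Classify one completed touch from its duration, travel, and speed."""
    if distance_px < MOVE_PX:
        return "long touch" if duration_ms >= LONG_MS else "short touch"
    return "flick touch" if speed_px_per_ms > FLICK_SPEED else "drag touch"

print(classify(120, 2, 0.0))    # quick stationary contact
print(classify(800, 4, 0.0))    # held stationary contact
print(classify(150, 300, 2.0))  # fast movement across the screen
print(classify(900, 300, 0.3))  # slow movement across the screen
```

Multi-finger gestures such as pinch-in/pinch-out would need two such tracks compared against each other, which this single-touch sketch omits.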
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. Meanwhile, the
The
The
The
Also, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), a projection system (holographic system) can be applied.
The
The
In addition to the vibration, the
The
The
The signal output from the
The
Meanwhile, the identification module is a chip for storing various kinds of information for authenticating the usage right of the
The
The
The
Meanwhile, as described above, the
The
The
Also, the
As another example, the
In the following, the various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
2 is a conceptual diagram of a
Here, the terminal body can be understood as a concept of referring to the
The
A
Electronic components may be mounted on the
As shown, when the
These
Unlike the above example in which the
Meanwhile, the
The
2, a
Also, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed or placed on another side. For example, the
The
The
In addition, the
The
On the other hand, the touch sensor is formed as a film having a touch pattern, and is disposed between the
In this way, the
The first
The
The
The
The
In this figure, the
The contents input by the
On the other hand, a back input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap with the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. Also, when the
Meanwhile, the
The
The
A second camera (not shown) may be disposed on the rear surface of the terminal body. In this case, the second camera (not shown) has a photographing direction which is substantially opposite to that of the
The second camera (not shown) may include a plurality of lenses arranged along at least one line. The plurality of lenses may be arranged in a matrix form. Such a camera can be called an array camera. When the second camera (not shown) is composed of an array camera, images can be taken in various ways using a plurality of lenses, and a better quality image can be obtained.
Flash 124 may be disposed adjacent to a second camera (not shown). The flash 124 illuminates the subject toward the subject when the subject is photographed by the second camera (not shown).
A second sound output unit (not shown) may be additionally disposed in the terminal body. The second sound output unit (not shown) may implement a stereo function together with the first
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the
The terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the
The battery 191 may be configured to receive power through a power cable connected to the
In the figure, the
The
Hereinafter, embodiments related to a control method that can be implemented in the
3 to 4 are various examples in which a
3 to 4, the
The
The
Meanwhile, the
6 is a flowchart illustrating a control method of the
First, the
Specifically, the
Thereafter, the
Specifically, the
In addition, the
In the above description, the
Then, the
The
In the above description, the case where the
In addition, the
Hereinafter, the present invention will be described with reference to specific examples.
7 to 8 are diagrams for explaining a method of transmitting contents to another terminal (not shown) in the
7 is a transmission screen for transmitting contents to another terminal (not shown).
First, the
The background of the virtual space displayed in the
Also, the user can set the background of the virtual space on the background selection screen, which will be described with reference to FIGS. 9 to 10. FIGS. 9 to 10 are diagrams for explaining a method of selecting a background screen in the
When the
Meanwhile, the
Meanwhile, the
Meanwhile, the
When the
The virtual location of each of the user of the
Meanwhile, the
Meanwhile, the
Meanwhile, the transmission screen is not limited to the above-described contents, and various modifications are possible.
13 to 15 are views for explaining a view of contents according to a selected virtual position in the
For example, as shown in FIG. 12, assume that seats are selected on the left, center, and right portions of the screen on the position selection screen 133-1, in which the stage or screen is arranged at the front.
In this case, if the content is reproduced, a screen as shown in FIG. 13 may be displayed on the
Thus, each user can experience, through the terminal, the same effect as actually enjoying the content on site.
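The per-seat views of FIGS. 13 to 15 can be approximated geometrically. In this hedged sketch (the coordinates and function names are editorial assumptions, not the patent's method), each seat yields a yaw angle toward the center of the playback area, so a left seat looks slightly to the right and a right seat slightly to the left:

```python
# Assumed geometry, for illustration only: seats lie in a plane facing
# the playback area; the view is the yaw from the seat to screen center.
import math
from dataclasses import dataclass

@dataclass
class Seat:
    x: float  # horizontal offset from the screen's center axis
    z: float  # distance from the playback area

def view_yaw_deg(seat: Seat) -> float:
    """Yaw in degrees toward the screen center; positive turns rightward."""
    return round(math.degrees(math.atan2(-seat.x, seat.z)), 1)

seats = {"left": Seat(-2.0, 4.0), "center": Seat(0.0, 4.0), "right": Seat(2.0, 4.0)}
for name, seat in seats.items():
    print(name, view_yaw_deg(seat))
```

A renderer would then rotate (and perspective-skew) the content plane by this yaw per terminal, producing the differing left/center/right screens the figures describe.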
16 to 18 are views for explaining screens of another terminal (not shown) receiving contents according to an embodiment of the present invention. In this figure, the screen shown in another terminal (not shown) receiving the content will be described, and a description of a part overlapping with the above description will be omitted.
When the content is transmitted from the
First, the background area is used to set the background of the virtual space, as described above.
In the
In the
On the other hand, the
While the
Meanwhile, the
Also, if the operation according to the guide message is performed in another terminal (not shown), a message informing that the
16, the case where the virtual location of the user of another terminal (not shown) is set by the user of the
Specifically, the
In this case, if the content viewing is charged, the connection information may include payment information for the content. Accordingly, when the user of the
On the other hand, when the content viewing is free, that is, when the user does not need a separate payment when viewing the content, the user of the
FIG. 17 shows various examples of a connection screen for a content server (not shown) displayed on another terminal (not shown) by the above-described method. As shown in FIGS. 17(a) and 17(b), or as shown in FIG. 17(c), the other terminal (not shown) receives all the access information necessary for content viewing from the
19 to 25 are views for explaining a screen in which contents are reproduced in the
FIG. 19 shows a case where a video messenger is displayed in a content reproduction area in a virtual space before contents are reproduced, as described above.
Then, as shown in FIG. 20, the selected virtual space is displayed, and the content can be reproduced in the content reproduction area formed in the virtual space.
Here, as shown in FIG. 20, an icon may be displayed on one side of the virtual space screen. The icon may indicate an emotion or a viewing state of a user of the
Accordingly, the icon corresponding to the state of the user of the
On the other hand, as shown in FIG. 21, a message window M1 may be displayed on one side of the virtual space screen. In the message window M1, a dialog or a caption at the time when the content is being reproduced is displayed, and the dialog or the caption selected by the user of the
On the other hand, as shown in FIG. 22, a message indicating that the user of another terminal (not shown) has watched the scene selected by the user of the mobile terminal may be displayed.
On the other hand, as shown in FIG. 23, a dialog window C can be displayed on one side of the virtual space screen. Therefore, a chat between each user can be performed while enjoying contents.
On the other hand, as shown in FIG. 24, a message indicating the emotion or viewing state of the user of the other terminal (not shown) may be displayed on one side of the virtual space screen.
On the other hand, as shown in FIG. 25, a message indicating the emotion or viewing state of the user of the other terminal (not shown) may be displayed in the message window M4.
As a result, each user can experience the effect of sharing emotions as if viewing the content in the same space, even when the users are far apart.
FIGS. 26 to 32 are diagrams for explaining voice transmission and reception among a plurality of users in the virtual space.
Referring to FIG. 26, the virtual locations of four users are selected on the location selection screen 133-1. That is, it is assumed that the first to fourth users A1 to A4 view the content at virtual first to fourth positions P1 to P4, respectively. Specific details are as described with reference to FIGS. 11 and 12.
Hereinafter, voice transmission and reception between the first to fourth users A1 to A4 according to the seat arrangement described in FIG. 26 will be described. In this case, even if the users are shown adjacent to each other in the drawing, it should be understood that they are adjacent only on-line, that is, in the virtual space, and that they may be far apart from each other offline.
Referring to FIG. 27, the second user A2 speaks while wearing the terminal with the head directed to the front. In this case, the voice of the second user A2 may be delivered to all of the remaining users.
The head direction of each user can be sensed through the sensing unit of the terminal worn by that user.
On the other hand, referring to FIG. 28, the second user A2 speaks with the head directed to the right. The dotted arrow in FIG. 28 may mean that the second user A2 has turned the head to the right with respect to the center axis of the virtual screen S. In this case, the voice of the second user A2 may not be transmitted to the first user A1 on the left side, but only to the third user A3 and the fourth user A4 on the right side. The concentric circle D shown in FIG. 28 may mean that the voice uttered by the second user A2 is being delivered only to the third user A3 and the fourth user A4. In this case, a text or an image indicating that the received voice is that of the second user A2 may be displayed on the display unit of each terminal worn by the third user A3 and the fourth user A4.
Referring to FIG. 29, the second user A2 speaks in a state in which the body is inclined to the right while the head is directed to the right. In this case, the voice of the second user A2 can be delivered only to the adjacent third user A3 among the users on the right side. The concentric circle D shown in FIG. 29 may mean that the voice uttered by the second user A2 is being delivered only to the third user A3. In this case, the third user A3 can listen to the voice of the second user A2 while still communicating with the other users.
Referring to FIG. 30, the second user A2 tilts the body to the right while directing the head to the right, and the fourth user A4 also tilts the body to the left while directing the head to the left. In this state, voice transmission / reception can be performed only between the second user A2 and the third user A3.
Referring to FIG. 31, the first user A1 tilts the body to the right while directing the head to the right, and the third user A3 likewise tilts the body to the left while directing the head to the left. In this state, voice transmission and reception may be performed only between the first user A1 and the third user A3, and this conversation may not be heard by the second user A2, who faces the front. The concentric circles D1 and D2 shown in FIG. 31 may mean that only the first user A1 and the third user A3 are in conversation.
Referring to FIG. 32, the seat is empty between the second user (A2) and the fourth user (A4). That is, the third user A3 has not attended or has withdrawn from the virtual space during the reproduction of the content. In this case, the second user (A2) and the fourth user (A4) can talk by directing only the head to the direction of the other party without leaning.
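The head-and-tilt voice routing illustrated in FIGS. 27 to 32 can be sketched as follows. This is a simplified interpretation by the editor, not the patent's implementation: the seat list, the function name, and the assumption that a front-facing voice reaches all users are illustrative.

```python
def voice_recipients(seats, speaker, head, tilt=False):
    """Return the users who hear `speaker`, per FIGS. 27-32 (simplified).

    seats: user ids ordered left-to-right; None marks an empty seat.
    head:  'front', 'left' or 'right' (sensed head direction).
    tilt:  True if the speaker also leans the body toward `head`.
    """
    i = seats.index(speaker)
    if head == "front":
        # FIG. 27 (assumed): a front-facing voice reaches everyone.
        return [u for u in seats if u is not None and u != speaker]
    # Seats on the chosen side, ordered nearest-first.
    side = seats[i + 1:] if head == "right" else list(reversed(seats[:i]))
    occupied = [u for u in side if u is not None]  # empty seats are skipped (FIG. 32)
    if tilt:
        # FIG. 29: leaning narrows delivery to the nearest neighbour.
        return occupied[:1]
    # FIG. 28: all users on that side hear the voice.
    return occupied
```

A usage example for the arrangement of FIG. 26: with seats `["A1", "A2", "A3", "A4"]`, user A2 facing right reaches A3 and A4, and A2 facing right while leaning reaches only A3.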
On the other hand, even if a user turns the head to the left or right, the display unit of the terminal worn by that user may remain unchanged. That is, even if the user rotates the head in one direction while the content is displayed in one area of the display unit, the content can continue to be displayed in that area.
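The decoupling described above, in which the content pane stays fixed on the display while head yaw feeds only the voice routing, can be sketched as follows. The yaw thresholds and all names are the editor's illustrative assumptions:

```python
def frame_state(head_yaw_deg, content_rect):
    """Per-frame state: the content rectangle is returned unchanged
    regardless of head rotation; yaw is reduced to a coarse heading
    consumed only by the voice-routing logic.

    head_yaw_deg: sensed head yaw in degrees (negative = left).
    content_rect: (x, y, w, h) of the fixed content playback area.
    """
    # +/-15 degrees is an assumed dead zone for "facing front".
    heading = ("left" if head_yaw_deg < -15 else
               "right" if head_yaw_deg > 15 else "front")
    return {"content_rect": content_rect, "voice_heading": heading}
```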
In the above description, it is assumed that the first to fourth users A1 to A4 are sequentially seated in the first to fourth virtual positions P1 to P4, respectively, but the present invention is not limited thereto. Therefore, the above description can be applied to the case where at least two of the first to fourth users A1 to A4 are seated in the same virtual position.
For example, it is assumed that the third user A3 and the fourth user A4 are both seated at the third position P3. In this case, the user of the
The seat arrangement thus modified may be displayed only on the
As a result, each user can view the content while conducting a conversation with a desired partner.
FIGS. 33 to 41 are diagrams for explaining a method of controlling content reproduction in the virtual space.
Referring to FIG. 33, when a user of the
Referring to FIG. 34, when the user of the
However, there may be cases where a specific user joined late or paused content playback for a certain period of time. Such a user therefore resumes from a point earlier than the playback point currently watched by the remaining users.
Therefore, as shown in FIG. 35, when a specific user touches the touch pad (223 of FIG. 5) with one finger or faces one face of the
In this case, a popup window PU is displayed on one side of the background so that the image displayed on the remaining terminals can be seen. In addition, as shown in FIG. 36, an already-played first point S1 and a not-yet-played second point S2 may be displayed on the progress bar PB, respectively.
In addition, the specific user needs to end content viewing at the same point as the remaining users.
Therefore, as shown in FIG. 37, the specific user can select a third point S3 at which to synchronize with the terminals of the remaining users. Accordingly, the specific user's terminal replays the content from the first point S1, and from the third point S3 onward the content is enjoyed in the same manner as on the terminals of the remaining users.
However, so that all participants' terminals display the same image from the third point S3, the specific user's terminal may increase the playback speed between the first point S1 and the third point S3. In this case, a number or icon indicating the playback speed may be displayed on one side of the screen. When the third point S3 is reached, the popup window PU displayed on the specific user's terminal disappears, and the playback speed returns to that of the other users' terminals.
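The increased playback rate can be derived directly: the remaining terminals reach the third point S3 after (S3 - S2) seconds at normal speed, so the lagging terminal must cover (S3 - S1) seconds of content in that same wall time. A minimal sketch, with an assumed function name:

```python
def catchup_speed(s1, s2, s3):
    """Playback-rate multiplier for the lagging terminal.

    s1: point (seconds) where the lagging terminal resumes playback.
    s2: point the remaining terminals are watching now (s1 < s2 < s3).
    s3: sync point chosen for all terminals to coincide.

    The group reaches s3 after (s3 - s2) seconds at 1x, while the
    lagging terminal must cover (s3 - s1) seconds of content in the
    same wall time, hence the ratio below.
    """
    if not s1 < s2 < s3:
        raise ValueError("expected s1 < s2 < s3")
    return (s3 - s1) / (s3 - s2)
```

For example, resuming at 100 s while the group is at 160 s, with a sync point at 220 s, requires 2x playback: 120 s of content in 60 s of wall time.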
In the above description, the playback interval and playback speed are controlled by the specific user himself or herself, but the present invention is not limited thereto. They may also be controlled by the mobile terminal that transmits the content.
Referring to FIGS. 38 to 41, a case where a specific user pauses content reproduction on his or her own terminal will be described; description of parts overlapping the foregoing will be omitted.
Referring to FIG. 38, a progress bar PB is displayed, as described above. In particular, a plurality of control buttons B are displayed, and a pause, rewind, or fast-forward button can be selected.
In this case, as shown in FIG. 39, a play button, the progress bar PB, and a popup window PU for replaying the content can be displayed, as described above.
In this case, the first point S1 may be the position of the frame at which the specific user stopped reproduction, and the second point S2 may be the position of the frame currently watched by the remaining users. Therefore, as the pause period of the specific user's terminal becomes longer, the interval between the first point S1 and the second point S2 increases.
Then, as shown in FIG. 40, the specific user can select the play button to resume the content. In this case, as described above, the specific user's terminal can reproduce the content at an increased speed, so as to restore playback sync with the remaining terminals, until the first point S1 catches up with the second point S2. As shown in FIG. 41, the popup window PU disappears at the point where the first point S1 and the second point S2 coincide, as described above.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal.
100: mobile terminal 110: wireless communication unit
120: Input unit
140: sensing unit 150: output unit
160: interface unit 170: memory
180: control unit 190: power supply unit
Claims (15)
A display unit;
A communication unit for communicating with another terminal; And
The content is transmitted to the other terminal through the communication unit,
Displays, on the display unit, an interface for selecting at least one of a first virtual position of the mobile terminal user and a second virtual position of the other terminal user for viewing the content with respect to an area of the virtual space displayed on the display unit,
Controls the mobile terminal to display the content in one area of the virtual space according to a first view (view) of the playback area at the first virtual position,
And controlling the other terminal to display the content in one area of the virtual space according to a second view of the playback area at the second virtual position.
Wherein,
And displays a background selection screen for selecting a background screen of the virtual space excluding the playback area on the display unit.
Wherein,
And displays an icon indicating at least one of the emotional state and the viewing state of the other terminal user with respect to a specific scene of the content on one side of the display unit.
Wherein,
Wherein the mobile terminal transmits, to the other terminal through the communication unit, either a text or an icon indicating at least one of the emotional state and the viewing state of the mobile terminal user with respect to a specific scene of the content.
Wherein,
And uploads either the text or the icon to the Social Network Service (SNS) of the user account.
Wherein,
And displays a virtual position of each of a plurality of other terminal users viewing the content.
Further comprising: a microphone;
Wherein,
And transmits the voice of the user of the mobile terminal to at least one of the plurality of other terminal users during reproduction of the content.
And a sensor for detecting a posture of the mobile terminal user,
Wherein,
In response to the detected posture, transmits the voice of the mobile terminal user to, and receives the voice of, the other terminal user located on either the left or the right side of the virtual position of the mobile terminal user.
Wherein,
And controls at least one of a playback speed and a playback interval of the content transmitted to the other terminal.
Wherein,
Wherein the display unit displays at least one of an icon representing the playback speed and a progress bar indicating the playback duration on one side of the display unit.
Wherein,
And controls at least one of the playback speed and the playback interval for each of the plurality of other terminals.
Wherein,
And displays a reproduction screen of the other terminal on one side of the display unit.
A display unit;
A communication unit for communicating with another terminal; And
Receiving content from the other terminal through the communication unit,
Displaying on the display unit an interface for allowing a user to select a virtual location of the user of the mobile terminal for viewing the content with respect to an area of the virtual space displayed on the display unit,
And controlling the mobile terminal to display the content in one area of the virtual space according to a view of the playback area at the selected virtual location.
A display unit;
A communication unit for communicating with another terminal and a content server; And
Receiving connection information of the content server from the other terminal through the communication unit,
Receiving content from the content server using the received connection information,
Displaying on the display unit an interface for allowing a user to select a virtual location of the user of the mobile terminal for viewing the content with respect to an area of the virtual space displayed on the display unit,
And controlling the mobile terminal to display the content in one area of the virtual space according to a view of the playback area at the selected virtual location.
Transmitting content to another terminal;
Displaying, on a display unit, an interface for selecting at least one of a first virtual position of the mobile terminal user and a second virtual position of the other terminal user for viewing the content with respect to an area of a virtual space in which the content is reproduced;
Controlling the mobile terminal to display the content in one area of the virtual space according to a first view for the playback area in the first virtual position; And
And controlling the other terminal to display the content in one area of the virtual space according to a second view of the playback area at the second virtual location.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150182658A KR20170073901A (en) | 2015-12-21 | 2015-12-21 | Mobile terminal and the control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170073901A true KR20170073901A (en) | 2017-06-29 |
Family
ID=59280235
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150182658A KR20170073901A (en) | 2015-12-21 | 2015-12-21 | Mobile terminal and the control method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170073901A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101667736B1 (en) | Mobile terminal and method for controlling the same | |
CN105549839B (en) | Mobile terminal and control method thereof | |
CN106850395B (en) | Mobile terminal and control method thereof | |
KR20170131104A (en) | Mobile terminal and method for controlling the same | |
KR20170131101A (en) | Mobile terminal and method for controlling the same | |
US9939642B2 (en) | Glass type terminal and control method thereof | |
KR20170011190A (en) | Mobile terminal and control method thereof | |
KR20160014226A (en) | Mobile terminal and method for controlling the same | |
KR20150134972A (en) | Mobile terminal and method for controlling the same | |
KR101598710B1 (en) | Mobile terminal and method for controlling the same | |
KR20170025177A (en) | Mobile terminal and method for controlling the same | |
KR20160143134A (en) | Head mounted display | |
KR20170010485A (en) | Terminal device and controlling method thereof | |
KR20180028211A (en) | Head mounted display and method for controlling the same | |
KR20160023212A (en) | Glass type mobile terminal and method for controlling the same | |
KR20170055867A (en) | Mobile terminal and method for controlling the same | |
KR20160087969A (en) | Mobile terminal and dual lcd co-processing method thereof | |
KR20160019279A (en) | Mobile terminal and method for controlling the same | |
US20170006235A1 (en) | Mobile terminal and method for controlling the same | |
KR20180056062A (en) | Display device and method for controlling the same | |
KR20170058756A (en) | Tethering type head mounted display and method for controlling the same | |
KR20160006518A (en) | Mobile terminal | |
KR20140147057A (en) | Wearable glass-type device and method of controlling the device | |
KR20160092820A (en) | Mobile terminal and method for controlling the same | |
KR20170035755A (en) | Mobile terminal and method for controlling the same |