KR20150049168A - Mobile terminal and operating method thereof - Google Patents

Mobile terminal and operating method thereof

Info

Publication number
KR20150049168A
Authority
KR
South Korea
Prior art keywords
subject
mobile terminal
frame
image
laser
Prior art date
Application number
KR1020130129379A
Other languages
Korean (ko)
Inventor
최용준
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020130129379A
Publication of KR20150049168A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B1/00: Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38: Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827: Portable transceivers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • H04N13/204: Image signal generators using stereoscopic image cameras
    • H04N13/246: Calibration of cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Telephone Function (AREA)

Abstract

The present embodiment provides a mobile terminal comprising a femto-photography camera that senses, frame by frame, the photons returning from a subject hidden by an obstacle, and a controller that analyzes the frames to implement the subject as a 3D image. The femto-photography camera includes an ultra-fast laser that emits a laser beam with a femtosecond pulse period, a mirror system that splits the laser beam and changes its firing position, and a streak camera that senses, as frames, the photons of the laser beam reflected and scattered by the reflecting wall and the hidden subject.
According to the present invention, a subject hidden by an obstacle can be photographed as a 3D image.

Description

MOBILE TERMINAL AND OPERATING METHOD THEREOF

The present invention relates to a mobile terminal and an operating method thereof, and more particularly, to a mobile terminal equipped with a femto-photography camera and an operating method thereof.

A terminal can be divided into a mobile (portable) terminal and a stationary terminal according to whether it can be moved. A mobile terminal can be further divided into a handheld terminal and a vehicle-mounted terminal according to whether the user can carry it directly.

Such a terminal has various functions and may take the form of a multimedia device with composite functions such as capturing photographs and videos, playing music or video files, gaming, and receiving broadcasts.

In order to support and enhance the functions of such a terminal, improvements to the structural and/or software parts of the terminal may be considered.

Recently, a femto-photography camera, which visualizes the propagation of light and renders it as an image, has been developed.

The femto-photography camera can capture a hidden subject that exists behind a wall corner or a building, and can implement the obstructed subject as a 3D image.

The development of such femto-photography cameras and research on their applications have been actively pursued. In particular, applying a femto-photography camera to a mobile terminal is expected to yield various advantages.

The present invention is intended to obtain high-quality photographs effectively by mounting a femto-photography camera on a mobile terminal and providing various shooting modes.

The present embodiment provides a mobile terminal comprising a femto-photography camera that senses, frame by frame, the photons returning from a subject hidden by an obstacle, and a control unit that analyzes the frames to implement the subject as a 3D image. The femto-photography camera includes an ultra-fast laser that emits a laser beam with a femtosecond pulse period, a mirror system that splits the laser beam and changes its firing position, and a streak camera that senses, as frames, the photons of the laser beam reflected and scattered by the reflecting wall and the hidden subject.

According to another aspect of the present invention, there is provided a method of operating a wearable mobile terminal equipped with a femto-photography camera, the method comprising: receiving a video call execution command and activating the femto-photography camera; firing a laser beam toward a subject via a reflecting wall; sensing, as a frame, the laser beam scattered and reflected by the reflecting wall and the subject; continuously changing the firing position of the laser and sensing further frames; analyzing the sensed frames to implement the subject as a 3D image; and transmitting the 3D image to the video call counterpart.

According to the embodiments of the present invention, a subject hidden by an obstacle can be photographed as a 3D image.

In addition, according to an embodiment of the present invention, the user can easily make a video call while wearing the wearable mobile terminal.

Further, according to another embodiment of the present invention, a photograph can be taken at a high angle from above the subject.

Finally, according to another embodiment of the present invention, images of all four sides of a subject can be photographed and realized as a 3D image.

FIG. 1 is a block diagram of a mobile terminal equipped with the femto-photography camera of the present invention.
FIG. 2 is a schematic view of the femto-photography camera photographing a subject obstructed by an obstacle, according to an embodiment of the present invention.
FIG. 3 shows frames sensed by the streak camera, according to an embodiment of the present invention.
FIG. 4 shows a sequence (a) to (i) of capturing a subject with a complicated shape using the femto-photography camera and then rendering the subject as a 3D image, according to an embodiment of the present invention.
FIG. 5 illustrates a video call using the femto-photography camera mounted on a wearable mobile terminal, according to the first embodiment of the present invention.
FIG. 6 shows a subject being photographed in the high-angle shooting mode according to the second embodiment of the present invention.
FIG. 7 shows a photograph taken in the high-angle shooting mode according to the second embodiment of the present invention.
FIG. 8 shows a subject being photographed in the stereoscopic shooting mode according to the third embodiment of the present invention.
FIG. 9 shows a 3D image of a subject photographed in the stereoscopic shooting mode according to the third embodiment of the present invention.

Hereinafter, the present embodiment will be described in detail with reference to the accompanying drawings. It should be understood, however, that the scope of the inventive concept of the present embodiment can be determined from the matters disclosed herein, and that the spirit of the present invention covered by the present embodiment is not limited to the disclosed embodiments but extends to variations such as additions, deletions, and modifications thereof.

The suffixes "module" and "unit" used for components in the description of the mobile terminal according to the present invention are given or used interchangeably only for ease of drafting the specification, and do not themselves have distinct meanings or roles.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a tablet PC, a notebook computer, a digital broadcasting terminal, a PDA (Personal Digital Assistant), a PMP (Portable Multimedia Player), and a navigation device. However, the configuration according to the embodiments described herein can also be applied to fixed terminals such as digital TVs and desktop computers, except for cases applicable only to mobile terminals.

The configuration of the mobile terminal to which the idea of the present invention can be applied

FIG. 1 is a block diagram showing the configuration of a mobile terminal according to the present embodiment.

The mobile terminal 100 includes a wireless communication unit 110, a femto-photography camera 120, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a control unit 180, a power supply unit 190, and the like. Since the components shown in FIG. 1 are not all essential, the mobile terminal may be implemented with additional components or with some of the illustrated components omitted.

Each component of the mobile terminal is examined in order.

The wireless communication unit 110 enables wireless communication between the mobile terminal and a wireless communication system or between mobile terminals, and may include one or more modules. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 can receive broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, as well as a broadcast signal in which a data broadcast signal is combined with a TV or radio broadcast signal.

The broadcast-related information may be information related to a broadcast channel, a broadcast program, or a broadcast service provider, and may also be provided through a mobile communication network. When broadcast-related information is provided through the mobile communication network, it can be received by the mobile communication module 112.

The broadcast-related information may exist in various forms, for example, an electronic program guide (EPG) of Digital Multimedia Broadcasting (DMB) or an electronic service guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).

For example, the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T). Of course, the broadcast receiving module 111 may also be adapted to broadcasting systems other than those described above.

The broadcast signal and/or the broadcast-related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include voice call signals, video call signals, or various forms of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built into the mobile terminal 100 or externally attached. WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like can be used as wireless Internet technologies.

The short-range communication module 114 is a module for short-range communication, and can perform short-range communication using technologies such as Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), and Ultra Wideband (UWB).

The location information module 115 acquires the location of the mobile terminal, and may be, for example, a Global Positioning System (GPS) module.

In particular, the femto-photography camera 120 of the mobile terminal 100, which is the core of the present invention, will now be described in detail.

The femto-photography camera 120 refers to a camera that fires a laser beam at a hidden subject that cannot be observed directly, such as one located beyond a wall corner or behind a building, and then analyzes the light reflected back using a mathematical algorithm to form a 3D image.

The femto-photography camera 120 includes an ultra-fast laser 121 that emits a laser beam with a pulse period on the order of femtoseconds, a mirror system 122 that splits the laser beam and adjusts its firing position, and a streak camera 123 that senses the photons reflected and scattered back from the hidden subject.

FIG. 2 schematically shows a femto-photography camera according to an embodiment of the present invention photographing a hidden subject obscured by a corner.

Referring to FIG. 2, the laser 121 first fires a laser beam with femtosecond-scale pulses at the reflecting wall 20 (r1), and the beam is reflected and scattered on the reflecting wall 20 so that its photons reach the hidden subject 10 (r2). At this time, the laser 121 may be a titanium-sapphire laser that emits a laser beam with a pulse width of 12 nanoseconds or less.

The photons reaching the hidden subject 10 are reflected and scattered again according to the shape of the subject and travel back to the reflecting wall 20 (r3); the photons reflected and scattered by the reflecting wall 20 then reach the streak camera 123 (r4).
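
To make the r1 to r4 path concrete, the sketch below (an illustration only, not part of the patent; the geometry and the Python helper name are assumptions) computes the total photon travel time for a hypothetical set of segment lengths. It shows why the returning photons must be timed with extreme precision: even over a 3 m round trip, the total flight time is about 10 nanoseconds, and millimetre-scale shape differences correspond to picosecond-scale timing differences.

```python
C = 299_792_458.0  # speed of light in m/s

def travel_time_ps(*segments_m):
    """Total photon travel time over the r1..r4 path, in picoseconds."""
    return sum(segments_m) / C * 1e12

# Hypothetical geometry (metres): laser -> wall (r1), wall -> subject (r2),
# subject -> wall (r3), wall -> streak camera (r4).
t = travel_time_ps(1.0, 0.5, 0.5, 1.0)
print(f"total path 3 m -> {t:.1f} ps")
```

A 1 mm difference in the subject's shape changes the path by roughly 2 mm and the arrival time by only a few picoseconds, which is what the streak camera must resolve.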

The streak camera 123 senses the photons that have been reflected and scattered along the r1 through r4 path with picosecond-level accuracy.

More specifically, the streak camera 123 passes the photons reflected from the hidden subject 10 through a slit and converts the light into electrons; a rapidly sweeping high voltage then deflects the electrons in the vertical direction onto a phosphor screen, so that the instantaneous light intensity is converted into a spatial luminance distribution representing the shape of the hidden subject 10. In this way the streak camera can measure ultra-fast light phenomena, and the photons reflected and scattered from the hidden subject 10 can be output as a frame.
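
The streak camera's output can be pictured as a two-dimensional frame with one spatial axis (position along the slit) and one temporal axis (photon arrival time). The minimal sketch below bins hypothetical photon events into such a frame; the function name, bin counts, and time window are illustrative assumptions, not values from the patent.

```python
import numpy as np

def streak_frame(slit_pos, arrival_ps, n_x=32, n_t=64, t_max_ps=100.0):
    """Bin photon events into a streak-camera frame: the horizontal axis is
    the position along the slit (0..1), the vertical axis is the arrival
    time in picoseconds mapped onto n_t bins."""
    frame = np.zeros((n_t, n_x))
    for x, t in zip(slit_pos, arrival_ps):
        ix = min(int(x * n_x), n_x - 1)          # spatial bin
        it = min(int(t / t_max_ps * n_t), n_t - 1)  # temporal bin
        frame[it, ix] += 1.0
    return frame

# Two hypothetical photon events arriving 45 ps apart
f = streak_frame([0.25, 0.75], [10.0, 55.0])
```

Each sensed frame of FIG. 3 can be thought of as one such space-time histogram for one firing direction.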

FIG. 3 shows frames sensed by the streak camera, according to an embodiment of the present invention.

Referring to FIG. 3, when laser beams are fired in different directions at a hidden subject 10 with an area of about 2 cm², the photons sensed by the streak camera 123 for each direction are represented as separate frames.

The reason the laser 121 fires the laser beam at the reflecting wall 20 in different directions is that the information carried by the light returning from a beam fired in a single direction is insufficient to form an image; the shape of the hidden subject 10 cannot be determined from it alone.

That is, when a frame is sensed by firing the laser beam in the first direction as in FIG. 3(a), the hidden subject 10 could still have any of various shapes (p, q, r).

Therefore, the mirror system 122 is provided in front of the laser 121 in order to adjust the firing direction of the laser beam.

That is, the mirror system 122 provided in front of the laser 121 splits and reflects the emitted laser beam, and can thereby control the position on the wall at which the photons are irradiated.

After the streak camera 123 senses the photons of the laser beams fired in three directions via the mirror system 122, the control unit 180 synthesizes the resulting frames to determine the single shape of the hidden subject, as shown in FIG. 3.

At this time, the controller 180 uses a time-of-flight technique and a computational reconstruction algorithm. These algorithms mathematically calculate the path of the light by analyzing when the photons from the scattered and reflected laser beams return: the path of each photon is analyzed in view of its intensity and the time the light took to return, and the hidden subject 10 is reconstructed using computer graphics techniques.
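
A common way to realize such a computational reconstruction is ellipsoidal backprojection: each photon's measured time of flight constrains the hidden surface to points whose wall-to-point-to-wall distance matches that flight time, and votes from many photons accumulate into a per-voxel heat value. The sketch below illustrates this idea under an assumed geometry; it is a simplified stand-in, not the patent's actual algorithm, and all names and coordinates are hypothetical.

```python
import numpy as np

C_MM_PER_PS = 0.2998  # speed of light, in mm per picosecond

def backproject(events, voxels, wall_in, wall_out, tol_mm=1.0):
    """For each measured event (arrival time in ps, intensity), add the
    intensity to every candidate voxel whose wall_in -> voxel -> wall_out
    distance matches the distance implied by the time of flight."""
    heat = np.zeros(len(voxels))
    for t_ps, intensity in events:
        d_meas = t_ps * C_MM_PER_PS  # distance the photon travelled (mm)
        for i, v in enumerate(voxels):
            d = np.linalg.norm(v - wall_in) + np.linalg.norm(v - wall_out)
            if abs(d - d_meas) < tol_mm:
                heat[i] += intensity
    return heat

# Hypothetical 2-voxel scene (all coordinates in mm)
wall_in = np.array([0.0, 0.0, 0.0])     # where the pulse hits the wall
wall_out = np.array([100.0, 0.0, 0.0])  # wall patch seen by the streak camera
voxels = [np.array([50.0, 50.0, 0.0]), np.array([50.0, 100.0, 0.0])]
heat = backproject([(471.7, 1.0)], voxels, wall_in, wall_out)
```

Here the single event's flight time matches only the first voxel's round-trip distance, so only that voxel accumulates heat; with many events fired from several mirror positions, the heat concentrates on the hidden surface.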

FIGS. 3(b) to 3(c) illustrate frames reconstructed using the above algorithms. Blue pixels indicate that the photon intensity is strong, and red pixels indicate that it is weak. From this, it can be inferred that the parts of the hidden subject 10 where the photon intensity is weaker are closer to the wall.

FIG. 4 shows the process (a) to (i) of capturing a subject with a complicated shape using the femto-photography camera and implementing it as a 3D image through a rendering process, according to an embodiment of the present invention.

FIG. 4(a) shows the hidden subject, FIG. 4(b) shows the frames captured by the streak camera, and FIG. 4(c) shows those frames as a heat map representing the subject. FIG. 4(d) shows the state after filtering, in which the heat map is differentiated with respect to x and y to obtain the z value of the subject, so as to recover the photon intensity and the subject's depth; FIG. 4(e) represents the depth of the subject; FIG. 4(f) represents a confidence frame; and FIG. 4(g) shows the 3D image in which the appearance of the hidden subject is implemented by rendering the above frames.
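
The step from heat map to depth and confidence frames can be sketched as follows: for each pixel, take the dominant arrival-time bin as the depth estimate and the dominance of that peak as the confidence. This is a simplified stand-in for the filtering and differentiation described above, not the patent's exact procedure; the function and parameter names are assumptions.

```python
import numpy as np

def depth_and_confidence(heatmap, dt_ps=1.0, c_mm_per_ps=0.2998):
    """From a per-pixel arrival-time histogram (heat map of shape [H, W, T]),
    estimate depth from the brightest time bin and a confidence value from
    how dominant that peak is relative to the pixel's total signal."""
    peak_bin = heatmap.argmax(axis=-1)               # brightest time bin per pixel
    depth_mm = peak_bin * dt_ps * c_mm_per_ps / 2.0  # one-way distance
    total = heatmap.sum(axis=-1)
    conf = np.where(total > 0, heatmap.max(axis=-1) / np.maximum(total, 1e-9), 0.0)
    return depth_mm, conf

# A single hypothetical pixel whose photons mostly arrive in time bin 4
hm = np.zeros((1, 1, 10))
hm[0, 0, 4], hm[0, 0, 5] = 3.0, 1.0
depth, conf = depth_and_confidence(hm)
```

Pixels whose photon arrivals are spread across many bins get low confidence and can be suppressed before the final rendering step.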

Through this series of processes performed by the controller 180, the femto-photography camera 120 can capture a subject hidden by an obstacle. The present invention aims to provide a variety of shooting modes by mounting such a femto-photography camera 120 in a mobile terminal.

The femto-photography camera 120 may also perform general camera functions. That is, it can process image frames of still images or moving pictures obtained by its image sensor in the video call mode or the shooting mode, and the processed image frames can be displayed on the display unit 151.

The image frames processed by the camera 120 may be stored in the memory 160 or transmitted externally through the wireless communication unit 110. The camera 120 may include a camera photographing the front side where the display 151 is located and a camera photographing the rear side, and may also receive video and audio signals captured by a plurality of cameras.

The sensing unit 140 of the mobile terminal checks various states, operations, and motions of the mobile terminal, and the operation of the mobile terminal can be controlled according to the information it senses. For example, the sensing unit 140 detects states of the mobile terminal such as its open/closed state, its position, and the presence or absence of user contact, and generates a sensing signal for controlling the operation of the mobile terminal. The sensing unit 140 may also sense whether the power supply unit 190 supplies power, whether the interface unit 170 is connected to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141 and a motion sensor 142 as shown in the figure, and a fingerprint sensor for sensing a fingerprint of the user may also be included.

The motion sensor 142 may sense the motion, orientation, direction, and speed of the mobile terminal, and may include, for example, a geomagnetic sensor, an acceleration sensor, a gyro sensor, and an altimeter.

The geomagnetic sensor detects the direction of the magnetic field generated by the earth and can determine orientation like a compass. The acceleration sensor detects changes in the motion (acceleration) of a moving body, a physical quantity defined as the change in velocity. The gyro sensor senses rotation through the Coriolis force generated perpendicular to the axis of rotation, on a principle similar to that of the acceleration sensor. The altimeter measures the pressure difference that varies with altitude. The fingerprint sensor can scan the user's fingerprint on the front or rear surface of the mobile terminal, and can perform image extraction and data processing on a specific area of the fingerprint.

The output unit 150 of the mobile terminal generates output related to sight, hearing, or touch, and may include a display unit 151, an audio output module 152, an alarm unit 153, and a haptic module 154.

The display unit 151 displays information processed by the mobile terminal. For example, when the mobile terminal is in the call mode, it displays a UI or GUI related to the call; in the video call mode or the shooting mode, it displays the captured or received image, or the UI and GUI associated with it.

The display unit 151 may be a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light emitting diode (OLED), a flexible display, or a three dimensional display. Some of these displays may be transparent or light transmissive so that they can be seen through. This can be referred to as a transparent display, and an example of such a transparent display may be TOLED (Transparent OLED). In addition, the rear structure of the display 151 may also be of a light transmission type. In this case, the user can see an object located behind the terminal body through an area occupied by the display 151 of the terminal body.

Depending on the implementation of the mobile terminal 100, two or more display units 151 may be provided. For example, a plurality of displays may be arranged on one surface of the mobile terminal, spaced apart or integrated, or may be arranged on different surfaces.

In the case where the display 151 and a sensor for sensing a touch operation (touch sensor) form a mutual layer structure (touch screen), the display 151 may be used as an input device as well as an output device. The touch sensor may take the form of, for example, a touch film, a touch sheet, or a touch pad.

The touch sensor may be configured to convert a change in pressure applied to a specific area of the display 151, or a change in capacitance generated in a specific area of the display 151, into an electrical input signal. The touch sensor can be configured to detect not only the touched position and area but also the pressure at the moment of touch.

When a touch input is detected by the touch sensor, a corresponding signal is sent to a touch controller, which processes the signal and transmits the corresponding data to the controller 180. Through this process, the controller 180 can determine whether the display has been touched and which area of the display was touched.

The proximity sensor 141 detects, without mechanical contact and using an electromagnetic field or infrared rays, the presence or absence of an object approaching a predetermined detection surface or an object existing in the vicinity of that surface. The proximity sensor 141 may be disposed near the touch screen or inside the mobile terminal.

Examples of the proximity sensor 141 include a transmission-type photoelectric sensor, a direct-reflection-type photoelectric sensor, a mirror-reflection-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the proximity of the pointer can be detected by the change of the electric field according to the pointer's approach. In this case, the touch screen may be classified as a proximity sensor.

Hereinafter, for convenience of explanation, the act of recognizing that the pointer is positioned over the touch screen without contacting it is referred to as a "proximity touch", and the act of the pointer actually touching the touch screen is referred to as a "contact touch".

The proximity sensor 141 may sense a proximity touch and a proximity touch pattern (e.g., proximity touch distance, direction, speed, time, position, and movement state). Information corresponding to the sensed proximity touch operation and proximity touch pattern may be output on the touch screen.

The audio output module 152 of the output unit 150 can output audio data received from the wireless communication unit 110 or stored in the memory 160 in the call signal reception mode, the call or recording mode, the voice recognition mode, the broadcast reception mode, and the like. The audio output module 152 also outputs sound signals related to functions performed in the mobile terminal (call signal reception sound, message reception sound, etc.), and may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 153 of the output unit 150 outputs a signal for notifying the occurrence of an event of the mobile terminal. Examples of events generated in the mobile terminal include reception of a call signal, reception of a message, input of a key signal, and touch input. In addition to video or audio signals, the alarm unit 153 can notify the occurrence of an event by vibration of the mobile terminal. Since video or audio signals can also be output through the display 151 or the audio output module 152, the display and the audio output module may be classified as part of the alarm unit 153.

The haptic module 154 of the output unit 150 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 154 is vibration. The intensity and pattern of the generated vibration can be controlled: for example, different vibrations may be synthesized and output together, or output sequentially.

The haptic module 154 can be implemented not only to transmit the tactile effect through direct contact but also to allow the user to feel the tactile effect through the muscular sense of the finger or arm. A plurality of haptic modules 154 may be provided in the mobile terminal, and in this case, it is possible to generate vibrations of various strengths and patterns.

The memory 160 of the mobile terminal may store programs for the processing and control of the controller 180, and may temporarily store input/output data (e.g., a phone book, messages, audio, still images, and moving pictures). The memory 160 may also store the frequency of use of each of the data. In addition, data on vibrations and sounds of various patterns to be output when a touch is input on the touch screen may be stored in the memory 160.

The memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (e.g., SD memory), a RAM, an SRAM (Static RAM), a ROM, an EEPROM (Electrically Erasable Programmable ROM), a PROM (Programmable ROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal may also operate in connection with a web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 of the mobile terminal serves as a passage to external devices connected to the mobile terminal. It receives data or power from an external device and transmits it to each component inside the mobile terminal, or transmits internal data to the external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port.

The identification module is a chip that stores various information for authenticating the usage right of the mobile terminal, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. A device equipped with the identification module (identification device) can be manufactured in the form of a smart card and connected to the mobile terminal through a port.

The interface unit 170 may serve as a passage through which power from an external cradle is supplied to the mobile terminal when the mobile terminal is connected to the cradle, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is correctly mounted on the cradle.

The control unit 180 of the mobile terminal controls the overall operation of the mobile terminal and performs control and processing related to, for example, voice calls, data communication, and video calls. The control unit 180 may include a multimedia module 181 for multimedia playback, which may be implemented within the control unit 180 or as a separate component.

The control unit 180 may perform a pattern recognition process for recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively.

The power supply unit 190 of the mobile terminal receives external power and internal power under the control of the controller 180 and supplies power required for operation of the respective components.

The various embodiments described herein may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing the described functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.

According to a software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each software module may perform one or more of the functions and operations described herein. The software code may be implemented as a software application written in a suitable programming language, stored in the memory 160, and executed by the control unit 180.

Hereinafter, various embodiments will be described with reference to FIGS. 5 to 9.

As described above, the embodiments of the present invention mount the femto-photography camera 120 module on the mobile terminal to provide various shooting methods.

FIG. 5 illustrates a video call using a femto-photography camera mounted on a wearable mobile terminal according to a first embodiment of the present invention.

A wearable mobile terminal 100 is a mobile terminal 100 that is worn on the user's body and has the shape of, for example, a watch, glasses, clothing, a necklace, or a ring.

However, in the case of the wearable mobile terminal 100, because the terminal is worn on the user's body, it may be difficult to change the angle of the camera 120 toward a subject.

Referring to FIG. 5, the wearable mobile terminal 100 is a necklace-type mobile terminal 100 worn around the neck of the user 10.

In the case of the necklace-type mobile terminal 100 worn around the neck of the user 10, photographing a subject in front of the user with the camera 120 mounted on the necklace-type mobile terminal 100 is not difficult, but it is difficult to photograph the user 10 himself, because the terminal is in close contact with the user's upper body.

Particularly, in recent years, video calls have become common. If the camera 120 of the wearable mobile terminal 100 cannot photograph the user 10, this may cause great inconvenience to the user.

In order to overcome this problem, in the first embodiment, a femto-photography camera 120 is mounted on the wearable mobile terminal 100 so that the desired user can easily be photographed.

Hereinafter, a process of providing the video call mode in the necklace-type terminal 100 will be described.

When the user inputs a video call execution command to the necklace-type terminal 100, the control unit 180 activates the femto-photography camera 120.

The activated femto-photography camera 120 fires a laser wave from the laser 121 toward a reflecting wall.

The emitted laser wave is scattered and reflected by the reflecting wall, and its photons reach the user 10, are scattered and reflected by the user 10 and the reflecting wall again, and return to the streak camera 123, which senses them as a frame.

Thereafter, the mirror system 122 mounted in front of the laser 121 changes the firing position of the laser wave, and the streak camera 123 repeatedly senses the photons scattered and reflected by the laser waves fired in the various directions as frames.

The control unit 180 analyzes the various frames thus sensed and implements a 3D image. The control unit 180 transmits the 3D image to the video call counterpart, so that the user can be provided with the video call mode while wearing the necklace-type terminal 100.
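For illustration only, the capture loop described above (fire the laser at the reflecting wall, sense a frame with the streak camera, move the mirror, repeat) can be outlined in code. This sketch is not part of the claimed invention; the names and the frame format are hypothetical, and only the time-of-flight relation between photon arrival time and path length is assumed.

```python
# Hypothetical sketch of the capture loop (illustrative names, not from the
# patent): the laser fires at the reflecting wall, the streak camera senses
# the returned photons as a frame, and the mirror system changes the firing
# angle for the next shot.

C = 299_792_458.0  # speed of light in m/s

def sense_frame(firing_angle_deg, arrival_times_s):
    # One streak-camera frame: each photon arrival time implies a path length.
    return {"angle": firing_angle_deg,
            "path_lengths_m": [C * t for t in arrival_times_s]}

def capture_scan(arrival_times_by_angle):
    # Sweep the mirror over firing angles and collect one frame per angle.
    return [sense_frame(angle, times) for angle, times in arrival_times_by_angle]

# A two-angle scan; a 20 ns round trip corresponds to a path of about 6 m.
frames = capture_scan([(0.0, [2.0e-8]), (5.0, [2.1e-8])])
```

A real implementation would further reconstruct geometry from these path lengths; the control unit 180 is described as performing that analysis to implement the 3D image.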

At this time, if the image transmitted from the video call counterpart is projected onto the reflecting wall using a small beam projector, the user can also make the video call while watching the counterpart's image.

FIG. 6 shows a state in which a subject is photographed in a high-angle photographing mode according to a second embodiment of the present invention, and FIG. 7 shows a photograph taken in the high-angle photographing mode according to the second embodiment of the present invention.

Since the femto-photography camera 120 can photograph the field of view as seen from the reflecting wall, when the reflecting wall is located above the subject, the subject can easily be photographed at a high angle. Here, a high angle means a line of sight of the camera 120 looking down at the subject 10 from above.

Referring to FIG. 6, the mobile terminal 100 is located at the same height as the subject 10, and there is a ceiling (not shown) above the subject 10 and the mobile terminal 100.

If the ceiling is used as a reflecting wall and the femto-photography camera 120 photographs the subject 10, the subject 10 can be photographed at a high angle, as viewed from the ceiling.

In more detail, a process of providing the high-angle photographing mode in the mobile terminal 100 will be described.

First, when the user inputs a command to enter the high-angle photographing mode, the controller 180 activates the femto-photography camera 120.

When the control unit 180 receives the user's high-angle photographing command, it causes the laser 121 of the femto-photography camera 120 to fire a laser wave at the ceiling.

The emitted laser waves are scattered and reflected by the ceiling, and their photons reach the subject 10; the photons scattered and reflected by the subject 10 and the ceiling again then reach the streak camera 123.

Thereafter, the streak camera 123 senses the returned photons as a frame.

The mirror system 122 provided in front of the laser 121 changes the firing position of the laser wave in various directions, and the streak camera 123 continuously senses the photons scattered and reflected by the laser waves fired in the various directions as frames.

The control unit 180 analyzes the various frames sensed as described above and implements the image, thereby obtaining an image photographed in the high-angle photographing mode.
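The patent does not give the depth arithmetic for the high-angle mode, but one plausible reading can be sketched for illustration (all names are hypothetical): a photon travels from the terminal to the ceiling, down to the subject, back to the ceiling, and back to the streak camera, so a known terminal-to-ceiling distance together with the measured round-trip time yields the ceiling-to-subject distance.

```python
# Hedged sketch of the time-of-flight arithmetic behind the high-angle mode
# (illustrative only; the patent does not specify formulas).
# Photon path assumed: terminal -> ceiling -> subject -> ceiling -> camera.

C = 299_792_458.0  # speed of light in m/s

def ceiling_to_subject_m(round_trip_s, terminal_to_ceiling_m):
    total_path = C * round_trip_s              # full photon path length
    bounce_legs = 2.0 * terminal_to_ceiling_m  # up to the ceiling and back down
    return (total_path - bounce_legs) / 2.0    # the two remaining equal legs

# With the ceiling 2 m above the camera, a round trip of about 33.4 ns
# places the subject roughly 3 m below the reflection point.
depth = ceiling_to_subject_m(33.4e-9, 2.0)
```

This is the same relation the streak camera's frame timing would have to encode for the controller 180 to recover the high-angle view.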

FIG. 8 shows a state in which a subject is photographed in a stereoscopic shooting mode according to a third embodiment of the present invention, and FIG. 9 shows a 3D image of a subject photographed in the stereoscopic shooting mode according to the third embodiment of the present invention.

The stereoscopic shooting mode is a mode in which a subject is captured as a 3D image.

The femto-photography camera 120 is capable of photographing a subject as a 3D image in which a three-dimensional effect is expressed. Also, the back side of the subject can be photographed if there is a reflecting wall.

Hereinafter, a method of implementing a subject as a 3D image using the femto-photography camera 120 will be described.

First, when the user inputs a shooting command for a subject in the stereoscopic shooting mode, the controller 180 activates the femto-photography camera 120 to start stereoscopic shooting.

At this time, in order to photograph the back side of the subject, there must be a reflecting wall 20 that reflects photons toward the back side of the subject. Therefore, the controller 180 detects the presence of the reflecting wall using the femto-photography camera 120, and if there is none, notifies the user through the touch screen that the stereoscopic mode is not available.

The femto-photography camera 120 causes the laser 121 to fire a laser wave at the subject 10 and at the reflecting wall 20 behind the subject 10.

Some of the emitted laser waves are scattered and reflected directly by the subject 10, while others are reflected by the reflecting wall 20 onto the back side of the subject and scattered back again. The streak camera 123 senses these photons as a frame.

The mirror system 122 provided in front of the laser 121 changes the firing position of the laser wave, and the streak camera 123 continuously senses the photons scattered and reflected by the laser waves fired in various directions as frames.

The controller 180 analyzes the frames sensed in this way and displays a 3D image on the touch screen.
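As an illustrative data-structure sketch (hypothetical, not the patent's algorithm), the stereoscopic mode can be thought of as merging two photon paths: direct returns sample front-surface points, wall-bounced returns sample back-surface points, and the two sets together form the 3D point set the controller displays.

```python
# Minimal sketch of merging the two photon paths in stereoscopic mode
# (hypothetical structure, not from the patent): direct returns sample the
# front of the subject, wall-bounced returns sample its back side.

def merge_point_cloud(front_points, back_points):
    # Tag each (x, y, z) point with the surface it was sensed from.
    cloud = [(point, "front") for point in front_points]
    cloud += [(point, "back") for point in back_points]
    return cloud

# One front point and two back points recovered via the reflecting wall 20.
cloud = merge_point_cloud([(0.0, 0.0, 1.0)],
                          [(0.0, 0.0, 1.4), (0.1, 0.0, 1.4)])
```

A viewing-angle UI such as the one described below would then simply rotate this merged point set before rendering.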

In addition, the controller 180 may provide a UI for changing the viewing angle of the 3D image upon receiving a user's touch-drag input or the like.

Claims (5)

A mobile terminal comprising: a femto-photography camera that fires a laser wave at a hidden subject and senses the returned photons as a frame; and
a controller that analyzes the frame to implement the subject as a 3D image,
wherein the femto-photography camera comprises: a laser that fires the laser wave at a reflecting wall facing the subject; a mirror system that splits the laser wave and changes its firing position; and a streak camera that senses the photons reflected back, as the frame.
The mobile terminal according to claim 1,
wherein the outer shape of the mobile terminal is that of a watch, a necklace, a ring, or glasses that can be worn on the body.
A control method of a mobile terminal in which a wearable mobile terminal equipped with a femto-photography camera provides a video call, the method comprising:
receiving a video call execution command and activating the femto-photography camera;
firing, by the laser, a laser wave at the subject via the reflecting wall;
sensing the laser wave scattered and reflected by the reflecting wall and the subject as a frame;
continuously changing the firing position of the laser and sensing the reflected laser waves as frames;
analyzing the sensed frames and implementing the subject as a 3D image; and
transmitting the 3D image to the video call counterpart.
A control method of a mobile terminal that provides a high-angle photographing mode in which a photograph is taken with a line of sight looking down at the subject from above, the method comprising:
receiving a user's command to enter the high-angle photographing mode and activating a femto-photography camera;
firing a laser wave at a ceiling located above the subject;
sensing the photons scattered and reflected by the ceiling and the subject as a frame;
changing the firing position of the laser wave and sensing the photons scattered and reflected by the changed laser wave as a frame; and
analyzing the sensed frames and outputting a photograph taken from a high-angle perspective.
A control method of a mobile terminal that provides a stereoscopic shooting mode in which all sides of a subject are photographed and implemented as a 3D image, the method comprising:
receiving a stereoscopic mode entry command from a user and activating a femto-photography camera;
firing a laser wave at the subject and at a reflecting wall located behind the subject;
sensing the photons of the laser wave scattered and reflected by the subject and the reflecting wall as a frame;
changing the firing position of the laser wave and sensing the photons scattered and reflected by the changed laser wave as a frame; and
analyzing the sensed frames and implementing the subject as a 3D image.
KR1020130129379A 2013-10-29 2013-10-29 Mobile terminal and operating method thereof KR20150049168A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130129379A KR20150049168A (en) 2013-10-29 2013-10-29 Mobile terminal and operating method thereof


Publications (1)

Publication Number Publication Date
KR20150049168A true KR20150049168A (en) 2015-05-08

Family

ID=53387304

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130129379A KR20150049168A (en) 2013-10-29 2013-10-29 Mobile terminal and operating method thereof

Country Status (1)

Country Link
KR (1) KR20150049168A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018164316A1 (en) * 2017-03-07 2018-09-13 링크플로우 주식회사 Omnidirectional image capturing method and device for performing method
US10419670B2 (en) 2017-03-07 2019-09-17 Linkflow Co. Ltd Omnidirectional image capturing method and apparatus performing the method


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination